
One Researcher’s Comments on the Unity Oversight Committee Survey and Findings


Editor’s Note: We asked researchers for their analysis of the Unity Oversight Committee Survey process and results. Here is one response.

As the conflict has continued around the ordination of women pastors and issues of compliance with voted actions of General Conference Sessions and of the General Conference Executive Committee, the Unity Oversight Committee requested that the Office of Archives, Statistics and Research (ASTR) conduct a global survey. The results of the survey, titled “Questionnaire on Compliance,” have been published in the Adventist News Network post dated March 23, 2018, and on Spectrum. The General Conference said that the findings represent the profile of global Seventh-day Adventist opinion on the issue of unity and compliance in the SDA Church. Because it is in the interest of us all to understand the profile of opinion among us on this and other issues, I offer these comments on the methodology and findings of this undertaking, in the hope that they will help to clarify the relationship between the stated purpose of this study and its findings.

Study Purpose

Because this study is so important in the development of Church policy on a looming, divisive issue, it is essential to consider the authenticity of its findings. The key to this is the way the findings were generated by the data said to support them. And the very foundation of data generation is the methodology by which they were produced.

However, these issues cannot be engaged without first considering the study’s purpose. What was it after? What did it seek to discover or elucidate? The prelude to the survey’s questions states this:

“The General Conference Unity Oversight Committee would like to explore the opinion of the world field, represented by division and union presidents, on the issue of compliance with voted actions of General Conference Sessions and of the General Conference Executive Committee.

“We request that you, as a division/union president, record what you believe is the view of the majority of members in your territory (as opposed to your personal view) on the following questions.” [Underlining added by author.]1

Another indication of the survey’s purpose can be found in a statement about how the data will be used by the Committee.

“…the survey provides quantitative data, allowing the committee ‘to more accurately judge where the world Church leaders and members stand on these issues,’ according to Mike Ryan, chair of the committee. ‘This information will serve as a guide to the Unity Oversight Committee in defining consequences for unions who have not complied with votes of the GC Session and of the GC Executive Committee,’ he added.”2

This statement presupposes that there is a shared understanding among the leadership and general membership on the meaning of “compliance” and that division and union presidents can accurately know and represent the view of the majority of members in their territory.

Two Parts of the Study

The methodology of the study—the way its data were generated—has two main parts: a survey of 150 top Seventh-day Adventist leaders and a “qualitative” part involving conversations with a number of these leaders. Five aspects of the study are addressed here. Two are aspects of the survey: its sample and its instrument. Two are aspects of the qualitative component: the extent to which it was systematic and the extent to which it was documented. Finally, the findings of the two study components are addressed as they relate to the study’s stated purpose.

The Survey Sample and Questionnaire/Instrument

A basic issue in any sample survey is the extent to which the sample represents the population from which it is drawn. The best sample is a strict probability sample, in which every element of the population has a known (and, in the simplest design, equal) probability of being selected into the sample. This standard is rarely achieved in full, because response rates in even a strict probability sample are seldom 100 percent. The question then becomes the extent to which the almost inevitable compromise with this standard corrodes the representativeness of the sample.
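
To make the distinction concrete, here is a minimal, purely illustrative sketch in Python. The population size and the opinion splits are invented for the example and have nothing to do with the actual study; the point is only to show how a convenience sample can misstate the views of a population that a probability sample would capture reasonably well.

```python
import random

# A toy illustration only (not the committee's data): 1,000 hypothetical
# members, where the easiest-to-reach tenth happens to lean differently
# from the membership as a whole.
random.seed(1)
easy_to_reach = ["yes"] * 80 + ["no"] * 20      # the 100 most convenient to poll
everyone_else = ["yes"] * 450 + ["no"] * 450    # the remaining 900 members
population = easy_to_reach + everyone_else

def yes_share(sample):
    """Fraction of a sample answering 'yes'."""
    return sum(r == "yes" for r in sample) / len(sample)

# Probability sample: every member has a known, equal chance of selection.
probability_sample = random.sample(population, 100)

# Convenience sample: only the 100 members who were easiest to reach.
convenience_sample = easy_to_reach

print(f"Whole population:   {yes_share(population):.2f}")          # 0.53
print(f"Probability sample: {yes_share(probability_sample):.2f}")  # typically near 0.53
print(f"Convenience sample: {yes_share(convenience_sample):.2f}")  # 0.80
```

Nonresponse has a similar effect: if the members who decline to answer differ systematically from those who respond, even a well-drawn sample begins to behave like the convenience sample above.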

The sample in this study is not a probability sample at all, but one apparently based on the convenience of the investigators: it was easy for them to poll 150 of the most senior Church leaders, who were assumed to be able to accurately know and report the opinions of congregants in their massive units. It is a problematic leap from leadership beliefs about the opinions of members to the opinions of the members themselves. It is misleading to assert that any leader can accurately know and report the range of opinion of hundreds of thousands of others in the group, particularly when no systematic attempt to gather information within these large groups has been made. Claiming to know the opinions of those in one’s union or division does not make it so, and it is a gross misrepresentation of the data to claim that it does. It is like saying that all the Cardinals and Bishops of the Catholic Church can accurately know and report the range of Catholic opinion on matters like contraception or abortion.

The questionnaire, attached as Exhibit 1, is also problematic.3 The construction of instruments, often called questionnaires or interview schedules, is an extremely important step in the sample survey process. The most credible organizations engaged in this kind of work are generally the better-known and more seasoned university survey research shops. They often work for months, and sometimes years, to create reliable and well-validated items—questions—for their surveys. This means simply that the items measure what we think they measure.

The questions in this survey are derived from various actions proposed in the document titled “Procedures for Reconciliation and Adherence in Church Governance Phase II,” which was discussed at last year’s Annual Council and referred back to the Committee.4 Likely, the committee wanted the wording of the questions to be consistent with the language of the compliance document. Yet the wording is central to the scientific quality of the survey process, findings, and conclusions. The six items in the Unity Oversight Committee survey are too long and too vague to meet this standard, though some seem more valid than others. (See Exhibit 1.)

In question 1, the meanings of some of the major terms are not clear and subject to manifold interpretations: “listen sensitively,” “counsel,” “not in compliance.” In question 2, the concept of “organizational consequences” is unclear. Questions 3, 4, 5, and 6 are clearer, but they could certainly be sharpened and made more valid with a substantial period of application and honing. But if this could not be done because of the urgency of launching the survey, researchers would have been well-advised to consult existing, well-validated survey items and to base their new items on these.5 Even assuming that the items are reliable, meaning that they would consistently generate the same results when measuring the same opinions, the validity of the six items of the survey is questionable. We cannot know with real confidence that they measure what we think they measure. And without the assurance that the convenience sample of 150 Adventist leaders represents the range of opinion of 20 million of us and that the survey items measure what we think they measure, we cannot be at all sure of the apparent survey results.

In addition, the use of a five-point response scale for each question (such as strongly favor, favor, no opinion, oppose, strongly oppose), instead of binary “yes” or “no” responses, would have captured a more varied range of positions on the compliance issues studied. Similarly, the addition of demographic data, such as age, ethnic background, length of service, and education, would have allowed for more nuanced findings on the opinion items.
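
As a small illustration of that point, the following hypothetical snippet (the response counts are invented, not drawn from the survey) shows how a five-point scale preserves gradations of opinion that a forced “yes”/“no” coding discards.

```python
from collections import Counter

# Hypothetical responses from 100 leaders on a five-point scale
# (these counts are illustrative only, not actual survey data).
responses = (["strongly favor"] * 10 + ["favor"] * 45 +
             ["no opinion"] * 15 + ["oppose"] * 25 + ["strongly oppose"] * 5)

# The five-point tally keeps the gradations of opinion visible.
print(Counter(responses))

# Forcing the same opinions into "yes"/"no" discards intensity and, in this
# coding, drops the undecided respondents entirely.
collapsed = Counter(
    "yes" if r in ("strongly favor", "favor") else "no"
    for r in responses
    if r != "no opinion"
)
print(collapsed)  # Counter({'yes': 55, 'no': 30}); the nuance is gone
```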

Qualitative Data Collection

There are a number of well-accepted qualitative data collection methods in social analysis. One of them is the use of nominally scaled items in sample surveys, and some would argue that the six items in this survey are of this type. Other accepted qualitative methods include focus groups, in-depth interviews, simulations, and anthropological field studies and their cousin, participant observation studies. What all of these methods have in common is that they must be systematic, and their procedures and results must be documented. Since there is no readily available documentation of the “personal visits and dialogues” with church leaders said to comprise the qualitative component of the unity project, it is impossible to know whether these conversations were appropriately systematic and documented. Therefore, it is difficult to be confident in the data generated or to draw conclusions about the consistency between the information from the “listening sessions” and the findings from the questionnaire. This is especially true given the lack of anonymity in both the “quantitative” and “qualitative” responses.

The Findings

The findings of systematic social research are typically reported in such a way that there is a clear and logical link between the research operations and the conclusions drawn. As in all such studies, the very foundation of the data is the methodology by which they were generated. In this study, the sample is not representative of the global body of the Seventh-day Adventist Church: no one from the general membership was in the sample, only a small number of its higher leaders. And the validity of the survey instrument is questionable. In the qualitative component of the study, we have no assurance that the data collection was systematic or well documented. For these reasons, we can have little confidence in the study findings as a whole.

The Presentation of the Findings

The appropriateness of the presentation of the study findings is open to challenge by those who uphold the standards for the conduct of systematic social research.6 In the first place, the reportage of the findings indicates the proportion of the global Seventh-day Adventist population represented by Church leaders responding “Yes” or “No” to survey questions, strongly suggesting that the responses represent the indicated proportion of the entire population under study. This is potentially misleading. Second, the identities of study respondents apparently may be known to some of the researchers in such a way that individuals’ responses can be linked to them. Any sample survey with such sensitive questions, ones that could lead to punishment of those who answer in ways that do not support leadership, should be absolutely anonymous, in the sense that the responses of individuals could not be known to researchers or anyone else. Otherwise, the survey can only be construed as an open plebiscite of followers by their leaders. How could that be presented as an adequate measure of opinion on sensitive issues?

It is this researcher’s hope that these observations will enhance our purpose in promoting the work of our Church in advancing the gospel.

Exhibit 1. Survey Instrument


Notes & References:

1. The instructions for the “Questionnaire on Compliance,” the questionnaire for this survey, are included below as Exhibit 1.

2. Adventist News Network. Survey results presented to Unity Oversight Committee: Qualitative Research Continues. https://news.adventist.org/en/all-news/news/go/2018-03-23/survey-results-presented-to-unity-oversight-committee-qualitative-research-continues/ [accessed April 5, 2018]

3. See Exhibit 1.

4. Adventist News Network. Procedures for Reconciliation and Adherence in Church Governance Phase II. https://news.adventist.org/fileadmin/news.adventist.org/files/content/procedures-for-reconciliation-and-adherence-in-church-governance-phase-ii.pdf [accessed April 11, 2018]

5. Among the many sources of well-validated survey items are the Survey Research Center at the University of Michigan and the National Opinion Research Center at the University of Chicago.

6. Note that the findings have been reported in the Adventist News Network post of March 23, 2018, and in the Spectrum post of that date. Adventist News Network. Survey results presented to Unity Oversight Committee: Qualitative Research Continues. https://news.adventist.org/en/all-news/news/go/2018-03-23/survey-results-presented-to-unity-oversight-committee-qualitative-research-continues/ [accessed April 11, 2018]. Spectrum Magazine. Unity Oversight Committee Survey Results. https://spectrummagazine.org/print/8646 [accessed April 11, 2018]

 

William W. Ellis, Ph.D., is Professor of Political Studies at Washington Adventist University. Earlier in his career, he held tenured faculty positions in political science at Northwestern University, the University of Michigan, and Howard University, as well as senior research and management positions in industry and the federal government. At Northwestern University, then a leader in quantitative political research, he taught some of the basic graduate courses in research methodology, including an advanced graduate course in multivariate analysis, which he developed. In addition to his doctoral studies at New York University, he was trained in survey research and other research methods at the Survey Research Center of the University of Michigan, a leader in its field.

Image courtesy of Adventist News Network.

 

Further Reading:
Unity Oversight Committee Survey Results
General Conference Re-asks the Questions of 2017
Unity Oversight Committee Releases Statement Regarding Way Forward
Unity Oversight Committee Continues to Gather Data

 

