Everyone has opinions. Some offer their views free for the asking. Some need no invitation. Most folk never get asked and never volunteer. Some opinions are worth having; others are worthless. The same is true of the surveys and polls to which many of us respond. Town planners, the Martha's Vineyard Commission, the daily newspaper, the political party: they all want to know what we think. And that would be all right if they didn't intend to use our responses to their questionnaires to make public policy decisions. When our answers will be used to make important civic choices, we need to consider the poll or survey results with a critical eye. Deciding which poll or survey is authoritative, which is interesting but inconclusive, and which is meaningless is a challenge facing newspaper reporters and newspaper readers, and that important subset, voters.
Last week, we published an account of some results from a youth risk survey, administered to middle and high school students on Martha's Vineyard by the youth task force, an arm of the Dukes County Health Council. This survey resembled others given to Vineyard students in recent years, but earlier versions of the survey were administered by the school system. Students in grades six through 12 were surveyed, using a questionnaire modified to suit the middle school and high school settings. Parents could review the questionnaires in advance and ask that their children be excused from participating. The surveys were conducted during a school period set aside for the purpose. The surveys required no personal or identifying information from respondents. A total of 1,075 students completed the survey. Participation was voluntary.
Although participation in the youth risk survey was high, Social Science Research, the professional firm that compiled and analyzed the survey results for the youth task force, warns that "It is important to keep in mind that the survey results can be generalized only to students who were present when the survey was administered. The results may not reflect responses that might have been obtained from students who were absent or truant on the day that the survey was administered, nor from students who have dropped out of school."
For instance, if there were 50 students out sick on survey day, and all of them were teetotalling teens committed to just saying no to sex, the results concerning alcohol use and sexual experience would have been substantially different.
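The arithmetic behind that caveat is easy to check. As a back-of-the-envelope sketch (the 40 percent figure below is invented for illustration, not taken from the actual survey; the 1,075 is the real number of respondents), even a modest group of absentees who all answer one way can move a reported rate by a couple of percentage points:

```python
# Hypothetical illustration: how 50 absent abstainers could shift a survey result.
# The 40% drinking rate is invented for the example; 1,075 is the actual
# number of students who completed the survey.
present = 1075
reported_drinkers = round(0.40 * present)   # 430 hypothetical drinkers

absent = 50                                 # all assumed, for the example, to abstain
total = present + absent                    # 1,125 students in all

rate_measured = reported_drinkers / present # rate among those surveyed
rate_if_all_counted = reported_drinkers / total

print(f"Measured rate: {rate_measured:.1%}")            # 40.0%
print(f"Rate if absentees counted: {rate_if_all_counted:.1%}")  # 38.2%
```

The shift grows with the number of absentees and with how sharply their behavior differs from that of the students in the room, which is exactly the generalization problem the analysts flag.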
This is not to say there is nothing to be learned from the youth risk survey. Of course, there is much to be learned about what young Vineyarders are up to and how their parents and the community generally might undertake to help them make good choices about tempting behaviors. But readers must keep their wits about them. According to Sheldon R. Gawiser and G. Evans Witt, writing for the National Council on Public Polls, whose mission is to help educate journalists on the use of public opinion polls, "One major distinguishing difference between scientific and unscientific polls is who picks the respondents for the survey. In a scientific poll, the pollster identifies and seeks out the people to be interviewed. In an unscientific poll, the respondents usually 'volunteer' their opinions, selecting themselves for the poll. The results of the well-conducted scientific poll provide a reliable guide to the opinions of many people in addition to those interviewed - even the opinions of all Americans. The results of an unscientific poll tell you nothing beyond simply what those respondents say. The method pollsters use to pick interviewees relies on the bedrock of mathematical reality: when the chance of selecting each person in the target population is known, then and only then do the results of the sample survey reflect the entire population. This is called a random sample or a probability sample. This is the reason that interviews with 1,000 American adults can accurately reflect the opinions of more than 210 million American adults."
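That "bedrock of mathematical reality" can be made concrete. For a true probability sample, the standard 95 percent margin of error on a reported percentage depends on the sample size, not the population size, which is why 1,000 interviews can stand in for 210 million adults. A quick sketch using the standard textbook formula (this formula is not quoted from the NCPP piece; it is the conventional large-sample approximation):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion p estimated from a
    simple random sample of size n. p=0.5 is the worst (largest) case."""
    return z * math.sqrt(p * (1 - p) / n)

# Sample size, not population size, drives the precision:
for n in (100, 1000, 10000):
    print(f"n={n:>6}: +/- {margin_of_error(n):.1%}")
# n=   100: +/- 9.8%
# n=  1000: +/- 3.1%
# n= 10000: +/- 1.0%
```

Note that none of this mathematics rescues a self-selected poll: the formula assumes every person in the target population had a known chance of being chosen, which is precisely what a volunteer sample lacks.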
The nature of the survey sample is not the only critical issue that reporters and their readers must consider in deciding the value of poll or survey results. The NCPP writers list 20 important questions that reporters should ask of the folks touting survey results. They are questions that readers and voters should ask as well. So, when someone like Mark London, executive director of the Martha's Vineyard Commission, assures you, as he did the West Tisbury selectmen recently, that the composition of the MVC's regional planning committee will represent business as well as environmental concerns because the MVC surveys have found that business people and environmentalists pretty much look at planning goals for the Vineyard the same way, bring all your critical faculties to bear. Here's an instance when twenty questions - and maybe more - are required.