Resources: Conducting a Survey
Clemente News Scouts
How to Conduct a Survey
A survey is, by definition, an assessment based on information gathered about a topic, usually with some mathematical reasoning behind it. For the purposes of a small newsletter, in which surveys may be taken only sporadically, a survey is a poll – a tally of “who says what” about a specific opinion question or topic.
Conducting a survey is somewhat like conducting an interview, except for the number of people involved and the purpose of the process. In an interview, one person (usually someone noteworthy for an opinion on the matter) is asked a series of general and/or specific questions that guide the reader to a better understanding of the topic and of the person describing it.
In a survey, however, many people are asked both general and specific questions, covering both fact and opinion. The results of a survey are usually displayed in charts, graphs, or other visual media that include percentages, ratios, or other relational values. Interviewing remains an integral part of the survey because of the principle and method behind the way information must be gathered and presented.
One person’s opinion may be sufficient for a topic that affects many people regardless of age, gender, ethnicity, or other factors. However, when an opinion question affects people differently because of those factors, it is more accurate to conduct a survey, which includes a wider range of people. A survey may include as few as two people, so the basics of interviewing still play a major role here.
Most surveys take in a much wider range of people, though, and it is necessary to take a poll. A poll is just a tally – a head count or count of votes. More specifically, a poll is a questioning of persons to obtain information or opinions to be analyzed. The analyzed data is then translated into graphs or charts so it may be easily compared and contrasted.
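As a rough sketch of this tallying step (using made-up responses, not data from any actual survey), the head count and the conversion into percentages for a chart might look like:

```python
from collections import Counter

# Hypothetical poll responses to a single yes/no/not-sure question.
responses = ["yes", "no", "yes", "not sure", "yes", "no", "yes", "no"]

tally = Counter(responses)  # the head count per answer
total = sum(tally.values())

# Convert raw counts into percentages – the "relational values"
# that make the results easy to compare in a graph.
percentages = {answer: round(100 * count / total, 1)
               for answer, count in tally.items()}

print(dict(tally))   # raw head count
print(percentages)   # percentages ready for a chart
```

The same arithmetic works whether the tally is done by hand or on a computer: count each answer, then divide by the total number of respondents.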
Some background information about the survey is helpful for readers. Again, the principles of interviewing are important. Readers should be told WHAT the topic or opinion question was, WHO conducted the survey and WHO was included in it, WHEN the survey was conducted, WHY it was conducted and WHY the named persons were included, and WHERE the poll took place. It is not necessary to explain HOW the information was analyzed.
Actually conducting the survey is a matter of finding an opinion question that affects the general public (Know the audience!) and knowing who can best answer general or specific questions about it. Al Día Newspaper runs a feature called “Que Hable La Gente” (Let the People Speak) that describes the opinions of five persons interviewed briefly about a given topic that affects the public in some way. The interviewees are not the people with the most pull or weight in the system, but people from the affected public itself.
Case Study: In the November issue of Clemente News, an article “Do Our Safety Measures… Measure Up?” was printed with the Principal’s Message. It contained an example of the basic interview and of a survey. The reporter asked questions specific to the concerns of two main groups responsible for the security and safety of the school, while the graph indicated the views of the students there as well.
To conduct a successful survey, first determine an opinion question of real impact on the audience; otherwise, the results may not be taken seriously. Then ask both general and specific questions (on paper or of a “test” subject) that will help answer the question or unearth other opinions. Choose a time frame in which the survey can be conducted quickly and easily: too much time lapse will make a topic “stale” and no longer newsworthy, and too much work put into a survey when only a minimum of information is needed simply wastes time and effort.
In the Clemente News case study, the opinion question was “Do You Feel Safe in Your School?” There are five grade levels in the school, divided into four groups called “Small Learning Communities.” Therefore, the results of the survey had to represent ALL THE STUDENTS, though not ALL THE STUDENTS were interviewed. Information was gathered from each grade in each community. Another consideration was the bilingual students in each community. Because the survey was a simple one meant to capture the voice of the general student body, it was not necessary to explore the demographics of the students by reporting their ethnicity, age, and gender, as this data would not have affected the results.
Teachers were asked to fill out a form containing the specific questions to be asked, with space for the teacher to fill in how many students answered “yes,” “no,” or “not sure.” (Note: Surveys of this type MUST be kept basic and simple, or too much information will result in a hastily prepared article.) All the forms were returned to the personnel conducting the survey. A count was done per class. (This was simplified by the spaces left on the form for the teacher to fill out.) Then the data was broken down by grade level (5th, 6th… etc.) and by small learning community (A, B, C… etc.). Finally, a grand total of all the students included in the survey was displayed.
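The roll-up described above can be sketched in a few lines. This is only an illustration with invented class tallies, not the actual Clemente News data:

```python
# Each entry represents one returned class form:
# (grade, community, yes, no, not_sure). Numbers are invented.
forms = [
    ("5th", "A", 12, 5, 3),
    ("5th", "B", 10, 8, 2),
    ("6th", "A", 14, 4, 2),
    ("6th", "B", 9, 7, 4),
]

def subtotal(rows):
    """Sum the yes / no / not-sure columns for a group of forms."""
    return tuple(sum(row[i] for row in rows) for i in (2, 3, 4))

# Subtotals by grade level and by small learning community.
by_grade = {g: subtotal([r for r in forms if r[0] == g]) for g in ("5th", "6th")}
by_community = {c: subtotal([r for r in forms if r[1] == c]) for c in ("A", "B")}

# Grand total of all the students included in the survey.
grand_total = subtotal(forms)

print(by_grade)
print(by_community)
print(grand_total)
```

The same grouping can of course be done by hand on the backs of the forms; the point is that every class tally feeds into each of the three views – by grade, by community, and school-wide.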
Although it is nearly impossible to name specifically WHO shared in such a survey (in this case, 306 students and 13 teachers), it is still necessary to acknowledge generally who contributed or helped out. In “Do Our Safety Measures… Measure Up?” acknowledgement went to “teachers who helped gather data used” in the article. This encourages the contributors to continue to help the people responsible for such a publication.
NOTE: A common way to express thanks to contributors is to send them “premier” copies, compliments of the newsletter editors/reporters.