Survey Research

Directions:
Select the BEST response alternative for each of the questions below.


1. When a participant tends to respond to survey questions using a particular perspective/strategy rather than providing answers directly related to the questions, this is called:
A) a response set.
B) researcher bias.
C) selection bias.
D) All of the above.
2. In the context of the previous example (question 1), research volunteers who try to present themselves in a positive way while answering a questionnaire are demonstrating:
A) cognitive dissonance.
B) social desirability.
C) the carryover effect.
D) a self-serving bias.
3. When a survey asks participants to include their age, sex, and marital status, it is collecting:
A) attitudes and beliefs.
B) facts and demographics.
C) behaviors.
D) objectives.
4. When a survey asks participants questions such as, "Are you satisfied with the emergency services in your city?" it is collecting:
A) attitudes and beliefs.
B) facts and demographics.
C) behaviors.
D) objectives.
5. A recent RMU survey asked students to respond to the ________ question, "Don't you think it is time to slightly increase the absurdly low tuition rates at RMU?"
A) double-barreled
B) confusing
C) loaded
D) simplistic
6. An advantage of closed-ended questions over open-ended questions is that they:
A) let people answer with anything they want.
B) take more time to categorize responses.
C) cost more to create and score.
D) are easier to code.
7. If this quiz were a survey, it would be a fair example of using ________ questions.
A) loaded
B) open-ended
C) closed-ended
D) double-barreled
8. If you were asked to evaluate the internet services provided by RMU using 7-point scales with bipolar adjectives such as fast-slow and unreliable-reliable, you would be answering questions with ________ scales.
A) graphic rating
B) comparative rating
C) semantic differential
D) behavioral measurement
9. It is a common strategy for researchers to place the most interesting and important questions ________ their surveys.
A) at the beginning of
B) in the middle of
C) at the end of
D) scattered randomly throughout
10. While asking a student some questions, the researcher nods his head and smiles whenever the student provides an answer that he likes. The researcher's behavior is most probably an example of:
A) carryover effect.
B) social desirability.
C) sampling bias.
D) interviewer bias.
11. Researchers should use ________ sampling when they want to use the results of their survey to make precise statements about a specific population.
A) haphazard
B) quota
C) probability
D) purposive
12. When the administration was looking for student feedback on which presidential candidate to hire at RMU, they classified students by major, class, and whether they were commuters or lived on campus. They then randomly selected students from each subgroup in proportion to its share of the overall student population. All this work was needed in order to use a sampling technique called ________ sampling.
A) quota
B) haphazard
C) simple random
D) stratified random
13. Claire wanted to know how many students at RMU are interested in developing and joining a horticulture club. She stands near the entrance of ROMO's and asks students passing by about their views on and interest in horticulture. Claire's sampling technique is an example of:
A) cluster sampling.
B) stratified random sampling.
C) convenience sampling.
D) quota sampling.
14. A researcher stands outside the RMU library and asks only female students if they feel safe on campus. The type of sampling technique being used is:
A) simple random sampling.
B) stratified random sampling.
C) cluster sampling.
D) purposive sampling.


End of Quiz!

Your score out of 100%: %

Correct answers are marked by a "C" in the box before each question; incorrect answers are marked by an "X".