First Monday

Expressiveness and conformity in Internet-based polls

Abstract
Expressiveness and conformity in Internet-based polls by Sebnem Cilesiz and Richard Ferdig
This paper reports a research study that examined whether people are expressive and whether they conform to external opinion in Internet-based opinion polls. Four versions of an opinion poll were administered via the Internet to 69 college students. The specific conditions used to operationalize expressiveness and conformity are explained in the paper. The results suggest that social science findings regarding interpersonal relations do not necessarily apply to Internet environments. The major implication of the study is that poll data collected via the Internet constitute reliable information.

Contents

Introduction
Background
Research questions
Rationale
Methodology
Findings
Discussion and conclusion

 


 

++++++++++

Introduction

Internet-based polls are widely used by researchers, journalists, and educators to collect data on public opinion and attitudes (Taylor et al., 2001). Easy access to participants and the ease of survey creation and online data processing make Internet-based polling convenient, and the absence of personal interaction during data collection can make it more economical in terms of time and labor. Additionally, since the poll is administered via a computer, it is easier to incorporate interaction (e.g. confirmation of responses), employ more appropriate layouts (e.g. one or multiple questions per page), and manipulate the structure and delivery of the survey (e.g. skipping sections deemed irrelevant based on a user’s responses).

Although Internet-based polls are becoming more common, there are still many disadvantages associated with their use. These include the need for participants to be computer literate, the vulnerability of participants’ privacy due to the ease of collecting certain information (e.g. IP address, time of response), and the impossibility of drawing random samples (Gunn, 2002). However, many of the disadvantages of Internet-based polls are linked to the lack of research on the topic (Venier, 1999; Carey, 1998). Smith stated, "At any given moment there are thousands of surveys and polls being conducted on the Web, yet surprisingly little scholarly research is reported about this new technique" [1]. The need for more research stems from the desire to better utilize the advantages of the Internet while reducing the aforementioned disadvantages. In addition, the Internet is a relatively new medium for data collection, and we need to determine whether this particular collection instrument makes a difference in the results we obtain. In other words, we need to address whether participants respond differently to Web-based surveys than to other forms of surveys (Gunn, 2002). A more complete understanding of these issues will allow us to enhance data collection and polling processes.

 

++++++++++

Background

In their seminal work, The Media Equation, Reeves and Nass argued that people have social relations with media (Reeves and Nass, 1998). They suggested that, for people, the mediated world equals the unmediated world; in other words, people take what they see on screen as real. They stated that "individuals’ interactions with computers, television, and new media are fundamentally social and natural, just like interactions in real life" [2]. In a series of research studies called "Social Responses to Communication Technologies", they found that many social science findings regarding interpersonal relations also apply to people’s relations with media. For example, one finding was that people are "polite" to computers and do not criticize computers "to their face." This framework provides an important and unique way to examine Internet-based polling because of two critical findings about the collection of opinion data.

The first finding comes from the well-known Spiral of Silence Theory (Noelle-Neumann, 1993). Noelle-Neumann suggested that people have a greater disposition to express their opinions if they expect the social environment to accept them. This theory has led to several research studies and has had a tremendous impact on public opinion research. For example, Hayes, Shanahan, and Glynn found that people are more willing to express their opinions when they perceive greater support for those opinions from the social environment (Hayes et al., 2001). Shoemaker, Breen, and Stamper discussed the reasons for such willingness, arguing that people avoid being isolated from society (Shoemaker et al., 2000). In other words, people avoid confronting any mismatch between their opinions and those of society, and they tend not to express their opinions when they do not expect society to support their beliefs.

The second social science finding this study tests concerns conformity. A social rule holds that people tend to conform to society even when society’s opinion differs from their own. Jones, Hendrick, and Epstein suggest that people conform, even at the expense of changing their opinions, out of a desire for social approval and a fear of rejection or ridicule (Jones et al., 1979). This study observes whether people report conforming opinions or change their opinions to fit the computer’s.

 

++++++++++

Research questions

Adhering to Reeves and Nass’ suggestion, we chose to test whether two social science findings on opinions and public opinion transfer to computer-based environments. The first finding is that people tend to express their opinions when they expect them to be supported by society; the second is that people choose to change their opinions to fit society’s. We therefore formulated two research questions, the answers to which will provide information on administering online data collection and online polling.

  1. Are people more willing to express their opinions or ideas in a computer-based polling environment when the computer offers an opinion? In other words, do the Spiral of Silence Theory and the Media Equation hold true together?
  2. Do people tend to change their opinions, or at least report doing so, when their original opinions do not match the opinion the computer displays? In other words, does Jones et al.’s (1979) argument hold true in computer-based environments?

 

++++++++++

Rationale

Beyond affirming or invalidating the Media Equation, this study has several additional implications. First, since debates about electronic polling occur every time there is an election (Ledbetter, 2000), research on people’s reactions in online polling environments could make such polling more dependable by enabling us to collect data more accurately.

Second, this research may help us find ways to get distance students more involved in discussions. If people with relatively similar ideas are grouped together in small group interactions, everyone will be more inclined to express and discuss their opinions freely. Of course, this is not always the direction we want to take in teaching, because we do not want students to interact only with people who share their attitudes; rather, we want them to be exposed to various perspectives and beliefs. However, distance education programs already face challenges in getting students to participate in online discussions (Nonnecke, 2000; Preece, 2000). Given that people are unwilling to express their ideas in environments where they do not expect to be supported, grouping people with similar backgrounds and ideas may overcome this barrier. We could therefore possibly minimize dropouts in distance education programs by forming homogeneous groups in the early days of courses. Moreover, once rapport is established between people with similar views, students may be open to learning from each other or from teachers. Jones et al. stated that "we turn to others similar to ourselves on pertinent attributes to evaluate our attitudes and allow ourselves to be influenced by them" [3].

A final implication relates to attitude change, both in business advertising and in education. Changing people’s attitudes and opinions in certain ways is a major goal of the advertising industry (Gaeth et al., 1997; Petty et al., 1997; Siomkos et al., 2001). There is also a large need for attitude change in education (Loehrer, 1998; Milem, 1998; Sullivan and Johns, 2002), in such areas as environmental education, drug abuse prevention programs, and character education. An affirmative answer to the second research question may imply that we can present desired opinions in polls as "social opinion" and expect participants to conform to them, thus potentially changing their attitudes.

 

++++++++++

Methodology

The main research instrument used in this study was a public opinion poll administered online. The questions were on relatively controversial topics, most of which were selected from a poll used by Hayes et al. in their study (Hayes et al., 1997); they chose topics based on their current popularity in the media. The other questions in this study were selected from current polls on the Web. This selection process made it likely that potential participants had already thought about the survey topics and formed opinions about them [see Appendix for the questions]. All questions had three possible answers: "Yes", "No", and "I do not want to respond". Participants could not skip a question, but "I do not want to respond" was presented as a legitimate response, since we wanted to capture willingness to share an opinion as well as the opinion itself.

The participants were 69 college students, aged 18 and above, who volunteered for the study and attended a large southeastern U.S. university. Subjects were not selected on the basis of gender, age, ethnic origin/nationality, or academic specialization. Although the polls were available on the Internet, participants were asked to respond in a computer lab attended by one of the researchers, to ensure that correct instructions were given and that the responses were their own.

Participants were randomly assigned to one of four versions of the poll, which had 17, 18, 17, and 17 respondents, respectively. Each version included the same questions, but differed in whether it presented opinions and/or confirmations. Each page included one question, all participants received the questions in the same order, and participants had to respond to each question to move on to the next. Participants were not allowed to go back and change their answers.

Version 1 simply asked eight questions in the sequence provided in the Appendix. Version 2 followed the same pattern; however, subjects were given an opinion prior to answering each question. For instance, the third question displayed "In my opinion English spelling should be modernized." Version 3 asked the same questions in sequence, but each question was repeated: when a participant responded to a question and clicked to go to the next one, the computer displayed the message "Your input has not been recorded. Please enter your response to Question X again," along with the same question below it. Finally, in Version 4, subjects were asked the same questions twice, as in Version 3, but the computer displayed an opinion the second time each question was asked (see Table 1).

 

Table 1: Polling variability.

Group 1 (17 participants): Participants were given eight questions in order, with one opportunity to respond.

Group 2 (18 participants): Participants were given eight questions in order, with one opportunity to respond. Prior to responding, they were given an opinion on the question being answered.

Group 3 (17 participants): Participants were given eight questions in order. After responding, they were told their answer was not recorded, and they were asked to repeat their answer.

Group 4 (17 participants): Participants were given eight questions in order. After responding, they were told their answer was not recorded, and they were asked to repeat their answer, as with Group 3. The second time they were asked, however, an opinion was offered, as with Group 2.

 

Version 1 was administered to observe the general distribution of the population’s opinions. Version 2 was administered to observe the extent to which responses aligned with the computer’s opinion. Version 3 tested whether giving people a second chance to respond made a difference in their responses (whether they would change their response given an opportunity to do so). Finally, Version 4 tested whether presenting an opinion the second time would change participants’ responses.
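The four conditions reduce to two factors: whether the computer displays an opinion, and whether each question is asked twice. The following Python sketch is a hypothetical reconstruction of that branching logic, not the authors’ actual instrument; all names and structure are our assumptions.

```python
def run_poll(questions, opinions, show_opinion, repeat_questions, get_answer):
    """Administer one poll version (hypothetical sketch).

    show_opinion     -- Versions 2 and 4: display the computer's opinion.
    repeat_questions -- Versions 3 and 4: ask each question twice, telling
                        the participant the first response was not recorded.
    In Version 2 the opinion accompanies the single ask; in Version 4 it
    appears only on the second ask, as described in the study.
    get_answer(question, opinion_or_None) returns the participant's response.
    """
    responses = []
    for question, opinion in zip(questions, opinions):
        # Opinion on the first ask only in Version 2 (opinion shown, no repeat).
        first = get_answer(
            question, opinion if show_opinion and not repeat_questions else None
        )
        if not repeat_questions:
            responses.append(first)
            continue
        # "Your input has not been recorded. Please enter your response again."
        # Opinion on the second ask only in Version 4.
        second = get_answer(question, opinion if show_opinion else None)
        responses.append((first, second))
    return responses


# The four versions map onto the two boolean flags:
VERSIONS = {
    1: dict(show_opinion=False, repeat_questions=False),
    2: dict(show_opinion=True,  repeat_questions=False),
    3: dict(show_opinion=False, repeat_questions=True),
    4: dict(show_opinion=True,  repeat_questions=True),
}
```

For example, `run_poll(questions, opinions, **VERSIONS[4], get_answer=...)` asks each question twice and supplies the computer’s opinion only on the repeated ask.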

 

++++++++++

Findings

Finding 1

As could be expected, opinions varied in all surveys. Although the topics were diverse, participants seemed willing to share opinions on all of them. Except for Question 4, there were no sharp preferences or strong desire to withhold an opinion (see Figure 1).



Figure 1: Version 1 results.

Finding 2

In Version 2, where participants were shown the computer’s opinion before answering each question, the responses varied and did not follow a pattern. In other words, there was no significant correlation between the computer’s and the participants’ responses. There were no strong preferences except for a high "yes" on Question 4, which did not match the computer’s opinion (see Figure 2).



Figure 2: Version 2 results.

Finding 3

In Version 3, respondents were asked to enter their answers a second time. There were no significant changes between the first and second entries: on only three questions did any of the 17 participants change their answer (a total of three changes). Participants tended to keep their responses consistent between the first and second times they responded (see Figure 3).



Figure 3: Version 3 results.

Finding 4

In Version 4, participants were given the opportunity to re-enter their answers after seeing the computer’s opinion. There were only four changes across all eight questions and all 17 participants, and there was no significant difference (p < .05) between the first and second responses. Moreover, participants kept their responses exactly the same for Questions 4 through 8 (see Figure 4).



Figure 4: Version 4 results.

 

++++++++++

Discussion and conclusion

The high response rates, despite the availability of the "I do not want to respond" option and the controversial nature of the topics, demonstrate that people tend to participate and share their opinions in Internet-based polls. This suggests that Internet-based polls are effective and should be used where they are advantageous or convenient.

However, this study provides evidence counter to the social science findings about conformity. Specifically, our results suggest that people adhere to their initial opinions even when given a second chance to answer (Groups 3 and 4). Moreover, people do not necessarily conform to external opinion in an Internet-based polling environment, even when shown the opinion of the computer administering the survey (Groups 2 and 4). That people do not respond to such manipulations or external interventions implies that they most likely express their "own" opinions. This finding suggests that Internet-based polls are highly valid and reliable instruments, and, in turn, that they can confidently be used where advantageous and convenient.

It is interesting to note that, for Groups 3 and 4, all changes occurred in the first three questions. Participants thus seemed to detect a pattern in the questioning and maintained their responses on Questions 4 through 8. It is therefore possible that the results discussed above are due to the specific structure of the instrument used in this study. Different results might have been obtained with more questions, some asking for confirmation or presenting an opinion and the rest serving as dummy questions to disguise the pattern. Given that this is the first study examining expressiveness and conformity in Internet-based polls, there is room for replications that alter the conditions. Future research could ask more questions but stagger the conditions (i.e. a second chance to answer, or a computer opinion). To continue the social science research, future work should also consider presenting the opinion provided as a societal opinion rather than the computer’s (e.g. "your city is currently in favor of X").

The results of this study add to the literature suggesting that Internet-based polls are good instruments: they yield high response rates as well as high validity and reliability. These are essential characteristics in the areas where Internet-based polls are used, such as data collection for researchers, gauging and reporting public opinion for journalists, and attitudinal testing for teachers.

 

About the Authors

Sebnem Cilesiz is a graduate student in Educational Technology at the University of Florida.
Web: http://grove.ufl.edu/~cilesiz/portfolio/
E-mail: cilesiz@grove.ufl.edu

Richard Ferdig is Assistant Professor of Educational Technology at the University of Florida.
Web: http://ferdig.coe.ufl.edu/
E-mail: rferdig@coe.ufl.edu

 

Notes

1. Smith, 1997, p. 1.

2. Reeves and Nass, 1998, p. 5.

3. Jones et al., 1979, p. 215.

 

References

J. Carey, 1998. "Electronic voting: pros and cons," Telecommunications, volume 32, number 3, p. 26.

G. Gaeth, I. Levin, and S. Sood, 1997. "Consumers’ attitude change across sequences of successful and unsuccessful product usage," Marketing Letters, volume 8, number 1, pp. 41-53. http://dx.doi.org/10.1023/A:1007933226810

H. Gunn, 2002. "Web-based surveys: Changing the survey process," First Monday, volume 7, number 12 (December), at http://www.firstmonday.org/issues/issue7_12/gunn/, accessed 16 February 2003.

A.F. Hayes, J. Shanahan, and C.J. Glynn, 2001. "Willingness to express one’s opinion in a realistic situation as a function of perceived support for that opinion," International Journal of Public Opinion Research, volume 13, number 1, pp. 45-58. http://dx.doi.org/10.1093/ijpor/13.1.45

R.A. Jones, C. Hendrick, and Y.M. Epstein, 1979. Introduction to Social Psychology. Sunderland, Mass.: Sinauer Associates, Inc.

J. Ledbetter, 2000. "Should voter data be released? New media, old media disagree," Columbia Journalism Review, volume 39, number 1, pp. 71-72.

M.C. Loehrer, 1998. How to Change a Rotten Attitude: A Manual of Building Virtue and Character in Middle and High School Students. Thousand Oaks, Calif.: Corwin Press.

J.F. Milem, 1998. "Attitude change in college students," Journal Of Higher Education, volume 69, number 2, pp. 117-140. http://dx.doi.org/10.2307/2649203

E. Noelle-Neumann, 1993. The Spiral of Silence. Second edition. Chicago: University of Chicago Press.

R.B. Nonnecke, 2000. "Lurking in email-based discussion lists," Unpublished doctoral dissertation, South Bank University, London.

R.E. Petty, D.T. Wegener, and L.R. Fabrigar, 1997. "Attitudes and attitude change," Annual Review of Psychology, volume 48, pp. 609-647. http://dx.doi.org/10.1146/annurev.psych.48.1.609

J. Preece, 2000. Online Communities: Designing Usability and Supporting Sociability. New York: Wiley.

B. Reeves and C. Nass, 1998. The Media Equation: How People Treat Computers, Television and New Media Like Real People and Places. New York: CSLI/Cambridge University Press.

P.J. Shoemaker, M. Breen, and M. Stamper, 2000. "Fear of isolation: Testing an assumption from the spiral of silence," Irish Communications Review, volume 8, pp. 65-78.

G. Siomkos, S. Rao, and S. Narayanan, 2001. "The influence of positive and negative affectivity on attitude change toward organizations," Journal of Business and Psychology, volume 16, number 1, pp. 151-161. http://dx.doi.org/10.1023/A:1007800124297

C.B. Smith, 1997. "Casting the Net: Surveying an Internet Population," Journal of Computer-Mediated Communication, volume 3, number 1, at http://www.ascusc.org/jcmc/vol3/issue1/smith.html#abstract, accessed 3 March 2003.

E. Sullivan and R. Johns, 2002. "Challenging values and inspiring attitude change: Creating an effective learning experience," Social Work Education, volume 21, number 2, pp. 217-231. http://dx.doi.org/10.1080/02615470220126444

H. Taylor, J. Bremer, and C. Overmeyer, 2001. "Using Internet polling to forecast the 2000 elections," Marketing Research, volume 13, number 1, pp. 26-30.

P. Venier, 1999. "Polling online," Marketing Magazine, volume 104, number 3, p. 22.

 

Appendix

Poll Questions

Note: The words in brackets indicate the opinion presented by the computer in Versions 2 & 4.

  1. Do you think that all new university students should be tested for transmitted diseases? [yes]
  2. Do you think the USA should offer social services (e.g. healthcare, education) to illegal immigrants and their families? [yes]
  3. Do you think that English spelling should be modernized? [no]
  4. Do you think students should not be graded in K-12 schools? [no]
  5. Do you think the federal government should be allowed to restrict the content of communications over the Internet? [yes]
  6. Do you think all pornography should be stopped? [yes]
  7. Do you think genetic engineering should be stopped? [no]
  8. Do you think local police should be allowed to conduct unannounced locker searches at high schools? [no]


Editorial history

Paper received 13 April 2003; accepted 23 June 2003.



Copyright ©2003, First Monday

Copyright ©2003, Sebnem Cilesiz and Richard Ferdig

Expressiveness and conformity in Internet-based polls by Sebnem Cilesiz and Richard Ferdig
First Monday, volume 8, number 7 (July 2003),
URL: http://firstmonday.org/issues/issue8_7/cilesiz/index.html