Introduction

The purpose of this chapter is to compare and contrast survey research administration between direct paper-and-pencil (manual) and Internet-based (electronic) data collection methods (Lippert, 2002). Social dynamics play an important role in influencing respondent participation. A review of the existing literature suggests that the medium and administration context produce differences in survey instrument performance parameters, i.e., response rate, participation ease, attractiveness of survey, novelty effect, administrative costs, response flexibility, response time, population size, sample bias, instrument validity, the management of non-response data, and response error. This chapter attempts to identify, describe, and map the differences between survey data collection media as a function of selected social variables.

Differences exist between electronically and manually administered surveys. The survey medium can affect responses to survey questions (Ayidiya & McClendon, 1990) and can produce response rate differences (Heberlein & Baumgartner, 1978). Response rates vary widely across data collection methods. Internet-based surveys can produce double-digit response rates (McCooey, 2000). Ease of use, as reported by Cook, Heath and Thompson (2000), is cited as a response enabler when answering Web-based surveys. Novelty effects of Internet-based surveys encourage participant response by attracting users to investigate available features (Dillman, Tortora, Conradt & Bowker, 1998). Administrative costs for Internet-based surveys are lower than those associated with paper administration (Parker, 1999). Response flexibility, as a function of respondent options, is greater in paper-based administrations (Matz, 1999). Web-based surveys offer reduced response time from initial distribution to time of reply (Oppermann, 1995). Large and geo-spatially dispersed populations of respondents are more efficiently accessed through Web-based surveys (Mehta & Sivadas, 1995; Schmidt, 1997a). Respondents to Web-based surveys exhibit self-selection bias because only technology-active individuals participate (Gorman, 2000). Content validity may be reduced through Internet data collection formats (Dillman & Bowker, 1996). Internet-based data collection permits greater response reliability (Quality Progress, 1999). Higher frequencies of non-response data are found with Web-based formats (Schmidt, 1997a). Response error represents a class of variables that includes various data response problems (Fiske, 1971; Sudman & Bradburn, 1974).
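
Because several of these parameters are quantitative, a researcher comparing media will often test whether observed response rates differ significantly. The sketch below is a minimal illustration, not drawn from the chapter or its cited studies: the invitation and response counts are hypothetical placeholders, and the two-proportion z-test is one standard choice among several.

```python
# A minimal sketch (not from the chapter): a two-proportion z-test comparing
# hypothetical paper vs. Web response rates. All counts are illustrative only.
from math import sqrt

paper_responses, paper_invites = 312, 800   # hypothetical 39% response rate
web_responses, web_invites = 186, 800       # hypothetical 23% response rate

p1 = paper_responses / paper_invites
p2 = web_responses / web_invites

# Pooled proportion and standard error under the null of equal rates
pooled = (paper_responses + web_responses) / (paper_invites + web_invites)
se = sqrt(pooled * (1 - pooled) * (1 / paper_invites + 1 / web_invites))
z = (p1 - p2) / se

print(f"paper rate = {p1:.1%}, web rate = {p2:.1%}, z = {z:.2f}")
# |z| > 1.96 indicates a difference at the 0.05 level (two-tailed)
```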

Stanton (1998) examined three parameters, or problems, in Web-based applications of survey research: participant motivation, response consistency, and sampling problems. Participant motivation refers to the phenomena that shape a respondent's willingness or rationale for participating in a data collection effort. Response consistency is the internal reliability of responses for a fixed population; for example, sampling all women with doctorates born after 1962 suggests that there will be similarity among the participants' responses. Sampling problems concern the ability and convenience to collect data under controlled sampling conditions, for example, the solicitation of specific individuals through the creation of criteria-related listservs.
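
Response consistency in this sense is commonly operationalized as internal reliability, for instance via Cronbach's alpha. The following minimal sketch is not taken from Stanton (1998); the respondents-by-items Likert scores and the helper function are illustrative assumptions only.

```python
# A minimal sketch: Cronbach's alpha as one measure of internal reliability.
# The score matrix below is hypothetical, not data from any cited study.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Five hypothetical respondents answering four 5-point Likert items
scores = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
])
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")
# Values near 1 suggest the items elicit consistent responses
```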

As an extension of the Stanton (1998) classification parameters for Web-based survey administration, this chapter introduces 12 instrument performance parameters. Figure 1 depicts the 12 parameters as they cross-link to the Stanton (1998) criteria. A descriptive summary and comparison of the differences between paper and Internet-based survey administrations, as a function of social dynamics, is then presented.

Figure 1: Survey Instrument Performance Parameters

Few comparative analyses exist that contrast the effects of one survey medium with another (Ayidiya & McClendon, 1990). By examining the advantages and disadvantages of the two primary survey data collection media, electronic and manual administration, information systems researchers can be better informed about expected variances in response parameters, enabling more systematic decisions about data collection formats. Narrative, tabular, and graphic summary representations are provided.

Further differences in instrument performance parameters can be identified for various survey methods, such as electronic mail, Web-based, telephone, paper-and-pencil, and postal mail administrations. Researchers should consider response inconsistencies that might result from employing different survey administration media, and should understand and control for administration variance before fielding any survey instrument.


