Internet security: Who is leaving the ‘virtual door’ open and why?
First Monday

Internet security: Who is leaving the virtual door open and why? by Daniel M. Downs, Ilir Ademaj, and Amie M. Schuck

The purpose of the present paper is to study Chicago residents’ knowledge about Internet security and their utilization of prevention and detection tools. The results from hierarchical linear models suggest that there are significant gender, race, age, and community differences in knowledge about firewalls, spyware, phishing and data encryption, as well as in the utilization of prevention and detection tools such as anti–virus programs, pop–up blockers and parental control software. Further, diffusion of innovation theory and utopian and dystopian perspectives toward technology help to explain some, but not all, of the variation in people’s knowledge about Internet threats and their use of security measures. These findings should help experts identify those people who may be more susceptible to cyber victimization, and they highlight the importance of users’ behavior in the realm of Internet security.


Social inequality
Diffusion of innovations theory
Utopic vs. dystopic perceptions of computers in society
Knowledge and utilization of prevention and detection tools
Who is knowledgeable and likely to use security tools?
Explaining gaps in knowledge and use




The Internet has increasingly become an important part of modern life. About 166.5 million Americans spend nearly 66 hours per month online, viewing over 2,335 Web pages (Nielsen, 2008). Every day, more people are turning to the Internet to get information about current events and politics, pursue hobbies and interests, shop, manage their finances, and keep in touch with family, friends and business associates. The Internet, however, has become more than simply a place to get information or a source of entertainment. The Internet has become an important part of the American and world economies. E–commerce is one of the fastest growing economic activities, and in the U.S. it accounted for over US$1,500 billion in manufacturing shipments (or about 31.2 percent of all shipments) (U.S. Census Bureau, 2008). The Internet has spawned a social and economic revolution, forever changing how people communicate with one another and conduct business.

As our reliance on the Internet has grown, so have our concerns about Internet security and cyber crime. Symantec (2008) estimates that there are over one million viruses, worms, Trojan horses and other malicious code on the Internet (up over 570 percent since 2006) (see Appendix A for definitions of relevant Internet security and cyber crime terms). Robert Mueller, the director of the U.S. Federal Bureau of Investigation (FBI), has made cyber attacks and high-technology crimes the number three priority of the organization, behind only terrorism and espionage. While it is difficult to know how much Internet security and cyber crime are costing society, government data suggests that in 2005 businesses spent about US$67 billion dealing with spyware, viruses, data theft and other computer–related crimes (U.S. Federal Bureau of Investigation, 2005), and that in 2007 consumers lost about US$239 million (or about US$2,529.90 per victim) to Internet–related fraud and identity theft (Internet Crime Complaint Center, 2008).

There are also concerns that inadequate Internet security may threaten national security. For example, in April 2007 a spree of denial–of–service (DoS) attacks crippled many of Estonia’s public and private Web sites including sites for the prime minister and many of the country’s banks. Representatives from Estonia believe that the attacks originated from somewhere inside Russia and were in retaliation for the government’s removal of a Soviet–era war memorial. The conflict between Estonia and Russia is not the only example of political hostilities being played out in the virtual world: in the past five years, we have also seen cyber conflicts break out between Israel and Palestine, India and Pakistan, and the U.S. and China. According to the FBI, most cyber attacks attributed to terrorists are unsophisticated and limited to e–mail bombing and defacing Web sites. However, the FBI predicts that in the future terrorist organizations will hire or train hackers for the purpose of using sophisticated cyber attacks in tandem with conventional attacks (Lourdeau, 2004).

While inadequate Internet security may threaten national security, it is important to note that the vast majority of cyber crime is economically motivated. Cyber criminals and the underground cyber crime economy have increasingly become more professional, organized and commercially driven. According to the FBI, cyber criminals have been adapting organizational structures similar to the mafia (McMillan, 2006), and the illegal underground cyber economy has taken on the characteristics of more traditional economies such as specialization of production, outsourcing, differential pricing and flexible business models (Symantec, 2008).

As companies and governments work to secure their networks, cyber criminals have turned their attention to targeting ordinary users. Research suggests that education and the use of simple security measures, such as a firewall and an up–to–date anti–virus program, are some of the best ways to prevent victimization. Although there is a significant amount of literature on the technical aspects of Internet security and cyber crime, we know very little about ordinary users’ Internet security knowledge or their use of prevention and detection tools.

In general, the research suggests that about half of consumers do not know how to protect themselves from cyber criminals (McAfee–NCSA Online Safety Study, 2007). In a study conducted jointly by AOL and the National Cyber Security Alliance (NCSA) (2004), 81 percent of the respondents reported having some type of anti–virus software on their computer; however, only 68 percent reported updating the program at least weekly. In a more recent study conducted by McAfee and the NCSA (2007), researchers found that 87 percent of respondents reported they had an anti–virus program, but only about 52 percent had updated their program in the last week. Further, 44 percent of respondents did not understand how a firewall worked, and one in four had not heard of the term phishing. While these studies are valuable, they are limited in that they do not provide us with much information about who is knowledgeable and why people choose to use prevention and detection tools.

The purpose of this paper is to advance the literature by examining knowledge about Internet security and utilization of prevention and detection tools in a random sample of Chicago residents. We are also interested in testing whether factors related to social inequality, general computer usage, or perceptions of the social consequences of the Internet can help explain people’s use of security tools and their knowledge about Internet threats. Below, we place this paper in the context of prior research and describe the methodology of the study. Then, we present the data on respondents’ knowledge about spyware, firewalls, phishing, and data encryption, and their use of anti–virus protection programs, pop–up blockers and parental control software. Finally, we conclude by offering suggestions for future directions in research and policy.



Social inequality

One possible explanation for users’ knowledge and security practices may be drawn from theories of social inequality. In the field of sociology there is a long history of studying how inequalities are institutionalized: that is, the ways in which people are unevenly rewarded for their social contributions because of their membership in socially defined categories (i.e., class, race, gender, religion, etc.) rather than according to their abilities and talents.

In the 1990s, the term ‘digital divide’ was coined to highlight the growing gap between those who had access to telecommunications technology (i.e., telephones, computers, Internet, etc.) and knowledge about information technology (IT) and those who did not. Many feared that those without access and knowledge (i.e., the ‘have nots’) would be left behind and face adverse consequences such as being unable to compete in the employment market, or having less of a voice in government as more agencies were projected to move to e–government models of participation (see Lynch, 2002). Within this context, advanced digital technologies were viewed as commodities and, rather than having an equalizing effect in society, tended to reinforce social inequalities. Research suggests that the diffusion of telecommunications technology and IT knowledge follows traditional patterns of social inequality, including socioeconomic status, education, race, gender, age, disability status, and geography (see Norris, 2001). Disparities in access to, and usage of, technology appear to mirror disparities in other social phenomena such as life expectancy, access to quality health care, homeownership and involvement with the criminal justice system.

Social inequality is both a cause and a consequence of access to advanced digital technology. Access to computers and the Internet fortifies the social influence of those with the most resources (van Dijk, 1999). Members of society that have access to advanced technologies and are proficient are capable of building social networks and influencing governments (Willis and Tranter, 2006). This further perpetuates divisions in society, since the disadvantaged will not have access to the technological developments, and thus will remain left behind in society (Nicholls, 1999).

Social inequality is important because it may also perpetuate a division between those who have the know–how to protect themselves from cyber attacks and those who do not. That is, disadvantaged social groups may become more marginalized with regard to knowledge about emergent Internet threats and may lack the proficiency to utilize protection measures, leaving them more vulnerable to cyber crimes. Drawing from the structured social inequality perspective, we would hypothesize that gender, race, education and income will all be related to knowledge about Internet security and the use of prevention and detection tools.



Diffusion of innovations theory

Social inequality may help to explain the differences in the initial distribution of advanced digital technologies; however, after most people have access, a digital divide of knowledge and skill may still exist. Diffusion of innovation theory may help explain the differences in computer knowledge and skills that persist even among those who have access to advanced digital technologies (Rogers, 2003).

According to diffusion of innovation theory, the progression of advanced digital technologies in society will create new groups of ‘haves’ and ‘have nots’. Instead of the traditional patterns of social stratification (i.e., gender, income, education, occupation, race, religion, etc.), diffusion of innovation theory focuses on the evolution of new technologically centered stratifications, such as the early adopter vs. the late adopter. For example, Rogers (2003) asserts that groups of people adopt innovations at different times and at different rates, following the S–curve rate of adoption. The S–curve shows the cumulative percentage of innovation adopters over time (i.e., slow at the beginning, more rapid as adoption increases, then tapering off until only a small number of laggards have not adopted) (see Rogers, 2003). As a result, early adopters are believed to gain IT proficiency more rapidly than late adopters.
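Rogers’ S–curve can be illustrated with a simple logistic model. The sketch below is purely illustrative; the population size, growth rate, and midpoint are arbitrary assumptions, not estimates drawn from the diffusion literature:

```python
import math

def cumulative_adopters(t, population=1000, growth_rate=1.0, midpoint=5.0):
    """Logistic (S-shaped) model of cumulative adoption: uptake is slow at
    first, accelerates around the midpoint, then tapers off as only
    laggards remain un-adopted."""
    return population / (1.0 + math.exp(-growth_rate * (t - midpoint)))

# Cumulative adopters at times 0..10 trace the S-curve described by
# Rogers (2003): about 7 early adopters at t=0, half the population at
# the midpoint t=5, and 993 adopters by t=10.
curve = [round(cumulative_adopters(t)) for t in range(11)]
```

Plotting `curve` against time produces the characteristic S shape; the gap in proficiency between early and late adopters corresponds to the horizontal distance between positions on this curve.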

As a consequence, early adoption, in comparison to late adoption, may be associated with greater know–how with regard to Internet security. Early adopters may be more familiar with and capable of dealing with threats to Internet security as they evolve, compared to late adopters. Moreover, being a late adopter may impede one’s ability to fix a computer if it becomes infected or understand key concepts in Internet security (e.g., phishing and spyware), which may make one more susceptible to victimization. Drawing from this perspective, we expect that more Internet usage and greater general computer skills, which are generally associated with early adopters, will be related to more knowledge and use of prevention and detection tools.



Utopic vs. dystopic perceptions of computers in society

Knowledge and use of security measures may also be related to users’ perceptions of the utility of emerging technologies, and whether the Internet fosters or hinders things like human expression, social interaction, democratic participation, and solutions to social problems (DiMaggio, et al., 2001; Katz and Rice, 2002; Norris, 2001; Wilhelm, 2000). Public discourse about the impact of computers and the Internet is often ideologically driven and portrays these technologies both as a panacea for solving society’s problems (i.e., the utopian perspective) and as instruments of harm that will impede democratic deliberation, further fragment society and increase the populace’s feelings of anomie (i.e., the dystopian perspective) (Fisher and Wright, 2001).

Utopian–based arguments largely revolve around the positive impacts of developing computer–mediated communication networks that facilitate civic engagement and democratic participation by transcending traditional geographic and social boundaries. In contrast, dystopian–based arguments tend to focus on how technology causes members of society to become more socially isolated from one another due to fewer face–to–face interactions. While there is a small but growing body of research on the positive and negative effects of computers and Internet usage (see Katz and Rice, 2002), much less is known about how users’ cultural orientations toward technology influence their knowledge and computing practices. Drawing from this perspective, we hypothesize that more utopian views of technology will be associated with more knowledge about Internet security and use of prevention and detection tools, while more dystopian views will be associated with less knowledge and use of security measures.




Data for this study came from the Chicago Internet Project, which was originally designed to test the feasibility of a new policing measurement system using Internet–based surveys of community residents (Rosenbaum, et al., 2007). The original project involved two telephone surveys and seven waves of Web–based surveys that were conducted between January 2005 and November 2005 in Chicago. Respondents were selected using a complex sampling design, which involved first sampling Chicago police beats, and then using the reverse telephone directory to randomly sample residents within each of the selected beats. Residents were excluded from the study if they did not have access to the Internet, were under 17 years old, or if they regularly participated in Chicago Police Department community policing meetings (see Rosenbaum, et al., 2007 for more information).

The Chicago Police Department is divided into administrative districts and then further into beats. Beats are the small geographic units that the department uses to organize activities and deploy resources. Each beat averages about 9,500 residents and 3,600 households (Skogan, 1999).



Knowledge and utilization of prevention and detection tools

Respondents were asked nine questions regarding their knowledge about Internet security issues and their use of prevention and detection tools. The items are presented in Table 1 with the response categories and descriptive statistics. About one–third of the respondents stated that they understood spyware and firewalls very well, whereas just 23 percent of respondents were familiar with phishing. Only 23 percent of the respondents strongly agreed that they could fix their computer if it got infected. About the same percentage strongly agreed that they felt confident using file encryption software to protect their information. Most of the respondents reported having anti–virus software and some type of pop–up blocker on their computers. Additionally, about half reported updating their anti–virus software at least weekly. Of those who had children (n=260), about 22 percent reported using some type of parental control software on their computers.


Table 1: Internet security questions and descriptive statistics (N=437)

General knowledge
(Response categories: Very well / Somewhat well / Not very well / Not at all well)
How well do you understand what [...] is and how well it works?

(Response categories: Strongly agree / Somewhat agree / Somewhat disagree / Strongly disagree)
I feel confident fixing my computer if it got infected with a virus or spyware: 23.2% / 31.0% / 21.8% / 24.1%
I feel confident using file encryption software to protect my information: 23.4% / 26.9% / 26.9% / 22.7%

Internet security practices
(Response categories: Yes / No / Don’t know)
Do you currently have anti–virus software on your computer? 94.7% / 2.8% / 2.6%
(Response categories: Daily/Weekly / Monthly / Yearly / Never/don’t know)
How often do you update your anti–virus or virus scanning software? 47.2% / 23.8% / 12.3% / 16.7%
(Response categories: Yes / No / Don’t know)
Do you use a pop–up blocker as part of your Internet service or browser toolbar? 84.3% / 9.9% / 5.8%
Do you currently use any parental control software on your computer? 22.7% / 77.3%




Who is knowledgeable and likely to use security tools?

The next step in the analysis was to determine who reports more knowledge and utilization of prevention and detection tools. To simplify the analyses, a general knowledge scale was created using the five Internet security knowledge items presented in Table 1. The general knowledge scale is a unidimensional construct with high internal consistency (Cronbach’s alpha=.842). The scale was coded so that higher values indicate more knowledge about security issues. Descriptions of the variables are presented in Table 2.
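The alpha coefficient can be computed directly from item responses. The sketch below uses made–up responses to three 4–point items (not the study’s data) to show the standard calculation:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of items, each a list of respondents'
    scores: alpha = (k/(k-1)) * (1 - sum(item variances) / variance of
    the total score)."""
    k = len(items)
    n = len(items[0])

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance

    # Each respondent's total score across the k items.
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum(variance(it) for it in items) / variance(totals))

# Hypothetical responses from five people to three 4-point knowledge items;
# the items move together, so internal consistency is high.
items = [
    [4, 3, 2, 1, 4],
    [4, 2, 2, 1, 3],
    [3, 3, 1, 1, 4],
]
alpha = cronbach_alpha(items)  # 0.9375 for these made-up data
```

A value above .80, as reported for the general knowledge scale, is conventionally taken to indicate that the items can be summed into a single scale.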


Table 2: Descriptive statistics for measures used in multivariate analyses
(Columns: Mean / Standard deviation / Min / Max / Definition)

Dependent variables
General knowledge: 2.67 / .80 / 1 / 4 / Five–item Internet security knowledge scale
Updating anti–virus: 3.02 / 1.12 / 1 / 4 / Index of how often anti–virus software is updated (1=never/don’t know, 2=yearly, 3=monthly, 4=weekly/daily)
Pop–up blocker: .84 / .36 / 0 / 1 / Dichotomous measure for using a pop–up blocker (0=no, 1=yes)
Parental control software: .23 / .42 / 0 / 1 / Dichotomous measure for using parental control software (0=no, 1=yes)

Independent variables
Male: .38 / .49 / 0 / 1 / Dichotomous measure of gender (0=female, 1=male)
African–Americans: .24 / .43 / 0 / 1 / Dummy coded measure of race (0=non–African–American, 1=African–American)
Latinos: .04 / .20 / 0 / 1 / Dummy coded measure of race (0=non–Latino, 1=Latino)
Other: .03 / .17 / 0 / 1 / Dummy coded measure of race (0=non–other, 1=other)
Age: 47.50 / 13.82 / 18 / 87 / Coded in actual years
Education: 4.37 / 1.74 / 0 / 7 / Index of educational attainment (0=less than high school, 1=high school graduate, 2=some college, 3=associates degree/2 year program, 4=bachelors degree, 5=some graduate school, 6=masters degree, 7=doctoral degree)
Income: 3.68 / 1.09 / 1 / 5 / Index of income (1=less than US$20,000, 2=US$20,000–US$40,000, 3=US$40,000–US$60,000, 4=US$60,000–US$100,000, 5=more than US$100,000)
Usage at home: 4.46 / 1.04 / 1 / 5 / Index of Internet usage at home (1=never, 2=just a few times a year, 3=several times a month, 4=several times a week, 5=every day)
Usage at work: 3.54 / 1.83 / 1 / 5 / Index of Internet usage at work (1=never, 2=just a few times a year, 3=several times a month, 4=several times a week, 5=every day)
General computer skills: .70 / .46 / 0 / 1 / Dichotomous measure of skills related to word processing, accessing the Internet and using e–mail (0=not good computer skills, 1=good computer skills)
Utopian views: 2.54 / .71 / 1 / 4 / Three–item scale with items designed to tap into positive feelings about computers
Dystopian views: 2.00 / .75 / 1 / 4 / Two–item scale with items designed to tap into negative feelings about computers

Community level
Crime rate: (logged) rate per 1,000 residents
Percent in poverty: (logged) percentage of residents in poverty
Percent linguistically isolated: .67 / .48 / 0 / 1.57 / Percentage of residents who do not speak English very well


Because of the nested data structure (i.e., people in police beats) hierarchical linear modeling (HLM; Raudenbush and Bryk, 2002) was used to account for the complex error structure of the data (see Appendix B for more information). The coefficients for general knowledge and updating anti–virus protection software were estimated using a normal sampling model with an identity link function, and are interpreted like ordinary least squares regression coefficients (i.e., for every one unit change in the independent variable the estimated coefficient is the change in the dependent variable). The coefficients for using a pop–up blocker and using some type of parental controls were estimated using a binary outcome model with a logit link function, and are interpreted similar to logistic regression coefficients (i.e., for every one unit change in the independent variable the estimated coefficient is the logged odds change in the dependent variable).
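The logit–link interpretation can be made concrete by adding a coefficient to the log–odds and converting back to a probability. The sketch below uses the usage–at–home coefficient for pop–up blockers (.365) as an illustration; the 50 percent baseline probability is an arbitrary assumption for the example, not a quantity estimated in the models:

```python
import math

def shifted_probability(base_prob, coef):
    """Apply a logistic-regression coefficient: a one-unit increase in the
    predictor adds `coef` to the log-odds; convert the shifted log-odds
    back into a probability."""
    log_odds = math.log(base_prob / (1 - base_prob)) + coef
    return 1 / (1 + math.exp(-log_odds))

# A coefficient of .365 moves a 50% baseline probability of using a
# pop-up blocker to about 59%.
p = shifted_probability(0.50, 0.365)
```

Because the logistic function is nonlinear, the same coefficient implies a smaller absolute change in probability when the baseline is near 0 or 1 than when it is near .50.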

The results are presented in Table 3. Compared to females, males reported more knowledge about Internet security issues, updated their anti–virus software more frequently, and were more likely to use a pop–up blocker when surfing the Web. These findings are consistent with theories of social inequality and mirror other research on gender differences in computer skills (see Bimber, 2000; Hargittai and Shafer, 2006; Ono and Zavodny, 2003).


Table 3: Hierarchical linear modeling (HLM) results for socio–demographics and community variables
Note: *p<.05 **p<.01 ***p<.001. Entries are coefficients with standard errors in parentheses.
(Columns: Knowledge / Updating anti–virus [linear models]; Use pop–up blocker / Use parental controls [binary models])

Individual level predictors
African–Americans (vs. whites): -.060 (.127) / -.569** (.194) / .358 (.364) / -.272 (.748)
Latinos (vs. whites): -.476* (.185) / -.282 (.419) / -.445 (.684) / 1.198 (1.256)
Other (vs. whites): .243* (.098) / .402 (.220) / 2.514* (.973) / -.419 (.567)
Variance explained: 15% / 5%
N=427 / N=427 / N=427 / N=246

Beat level predictors
Crime rate
Percentage in poverty
Percentage linguistically isolated
Variance explained: <1% / 15%
N=50 / N=50 / N=50 / N=49


These findings suggest that educators need to be more proactive in bridging the gender gap regarding knowledge and use of Internet security tools. It is, however, important to note that, in general, research suggests that men are more likely than women to be victims of cyber crime (e.g., 57.6 percent vs. 42.4 percent) and that, when victimized, men tend to suffer greater financial losses (median loss US$765 vs. US$552) (Internet Crime Complaint Center, 2008; also see McMillan, 2008). Research also suggests that there are gender differences in the types of cyber victimization experienced. For example, in a study conducted by AVG, women fell victim to fraudulent e–mail messages, credit card fraud and theft of bank details, whereas men were more likely to fall victim to phishing scams (Hendrey, 2008) and business investment schemes, which are associated with higher financial losses (McMillan, 2008). Gender differences in risk may help to explain why women are less knowledgeable and less likely to engage in security practices than men. In general, women may feel that they are at less risk of becoming cyber victims than men are.

In addition to differential risk of cyber victimization, the gender gap may also exist for other reasons. For example, research suggests that, compared to men, women tend to: (1) have less general interest in computers and technology (Campbell, 1990; Levin and Gordon, 1989); (2) hold less positive attitudes about computers and lower perceptions of self–efficacy when using them (Campbell, 1990; Torkzadeh and Van Dyke, 2002); (3) have less leisure time in which to use computers and build skills (Kennedy, et al., 2003); (4) be overtly or covertly discriminated against in terms of educational and occupational opportunities to learn about advanced technology (Eccles, 1994; Margolis and Fisher, 2002); and, (5) use the Internet in ways that are less likely to result in greater knowledge about computers and technology [1].

Because of the pronounced differences between women and men, closing the gender gap may be difficult and require more than education. For example, Internet security experts may need to combine education with active components such as automatic updates (e.g., those built into the Firefox browser) or incentives (e.g., entering everyone who updates their anti–virus program into a weekly drawing). Further, a significant amount of Internet security and cyber victimization information is delivered through the Internet itself, and alternative distribution channels may be needed in order to reach more women.

The findings in Table 3 suggest that females are more likely than males to report using some type of parental control software. These findings are not surprising given that women tend to function as childcare providers and are often responsible for the protection of their children in a variety of ways.

Moving on to the influence of race and ethnicity on knowledge about Internet security and security practices, the results are somewhat surprising. In general, there are far fewer racial differences than expected. Compared to whites, African–Americans reported updating their anti–virus program less frequently. There were, however, no differences between African–Americans and whites in terms of their security knowledge, their use of a pop–up blocker or their use of parental controls.

Compared to whites, Latinos reported less general knowledge about Internet security. Differences in English proficiency might explain some of the disparity between Latinos and whites. With search engines in Spanish such as AOL Hispanic, Yahoo Telemundo, and HispanoClick and more online advertisers dedicating specific budgets to Latinos, Hispanics will likely begin accessing the Internet more regularly (Wentz, 2005). As Hispanics gain more computer and Internet experience, they may become more knowledgeable with regard to Internet security. However, with the existence of a language barrier, Hispanics may remain more at risk of Internet threats and cyber attacks.

Interestingly, respondents with higher incomes were more likely to use parental control software, whereas respondents with more education were less likely to do so. These findings may reflect the current controversy over Internet filtering software. While Internet control software may prevent young children from being exposed to inappropriate content, it may also block them from viewing important information, particularly online health resources.

At the community level, respondents from poorer areas reported updating their anti–virus program less frequently than respondents from more affluent areas. One possible explanation for this finding may be related to the type of Internet connection users maintain. For example, it is estimated that about 83 percent of poor families (i.e., family income less than US$20,000 per year) reported using dial–up to connect to the Internet compared to 78 percent of non–poor families (i.e., family income greater than US$20,000) [2]. A dial–up connection is significantly slower than a broadband or cable connection. Users with dial–up services may be less likely to update their anti–virus programs because of the amount of time it takes to download updates.
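The time cost of updating over dial–up is easy to quantify. The sketch below is a rough illustration; the 20 MB update size is an assumed figure, not one reported in the studies cited:

```python
def download_minutes(size_mb, speed_kbps):
    """Minutes needed to transfer size_mb megabytes (decimal) at a line
    speed of speed_kbps kilobits per second."""
    bits = size_mb * 8 * 1_000_000      # megabytes -> bits
    return bits / (speed_kbps * 1000) / 60

# An assumed 20 MB anti-virus definition update:
dialup = download_minutes(20, 56)       # 56 kbps modem: roughly 48 minutes
broadband = download_minutes(20, 3000)  # 3 Mbps broadband: under a minute
```

Even ignoring line overhead and dropped connections, a task that takes under a minute on broadband can occupy a dial–up line for most of an hour, which is consistent with the explanation offered above.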

Respondents from poorer communities were more likely to report using a pop–up blocker than residents from more affluent communities. This may also be related to the type of connection users maintain. Many of the dial–up providers include pop–up blockers as part of their service. For example, NetZero and PeoplePC — two of the largest dial–up providers — explicitly advertise that they include a free pop–up blocker as part of their service. Further, other dial–up providers — like AOL, Earthlink, and MSN — also include comprehensive security suites as part of their services.

Interestingly, respondents from communities that are more linguistically isolated reported updating their anti–virus program more often. In other words, respondents from communities where a significant number of residents have some difficulty with English reported updating their anti–virus program more frequently.



Explaining gaps in knowledge and use

The last step in the analysis was to examine several possible explanations for differences in Internet security knowledge and use of prevention and detection tools. The results are presented in Table 4. As stated earlier, based on the diffusion of innovation perspective, we would expect more Internet usage and greater general computer skills will be associated with more knowledge and a greater usage of security tools. Consistent with this perspective, more Internet usage at home was associated with more knowledge, updating anti–virus protection software more frequently, and being more likely to use a pop–up blocker. Accordingly, these differences may be due to differential rates of diffusion with earlier adopters being able to understand and apply technical knowledge.


Table 4: Hierarchical linear modeling (HLM) results explaining Internet security knowledge and practices
Note: *p<.05 **p<.01 ***p<.001. Entries are coefficients with standard errors in parentheses.
(Columns: Knowledge / Updating anti–virus [linear models]; Use pop–up blocker / Use parental controls [binary models])

Individual level predictors
Usage at home: .134*** (.032) / .138* (.065) / .365** (.128) / -.026 (.205)
Usage at work: .063** (.021) / .068 (.038) / … / …
General computer skills: .427*** (…) / … / …* (.367) / -.364 (.362)
Utopian views: .134* (.052) / .003 (.090) / -.063 (.268) / -.131 (.260)
Dystopian views: -.142** (…) / … / … / …
African–Americans (vs. whites): .061 (.124) / -.601** (.205) / .506 (.343) / -.344 (.667)
Latinos (vs. whites): -.411* (.174) / .315 (.200) / -.188 (.675) / 1.208 (1.257)
Other (vs. whites): .006 (.150) / -.246 (.005) / 2.666* (1.046) / -.380 (.595)
Variance explained: 30% / 6%
N=414 / N=414 / N=414 / N=246

Beat level predictors
Crime rate
Percentage in poverty
Percentage linguistically isolated
Variance explained: <1% / 15%
N=50 / N=50 / N=50 / N=49


Research indicates that compared to late adopters, early adopters have more exposure to interpersonal channels and have more social participation, which may be keys to Internet security knowledge and practices (Rogers, 2003). Another possible explanation may be that when using one’s own computer at home there may be a greater propensity to use detection tools (i.e., in order to protect personal property and feel a sense of security). In addition, increased computer usage may be related to a greater chance of victimization, consequently security becomes a focal concern.

As predicted, individuals with more utopian views report more knowledge while those with more dystopian views indicate less knowledge. People that have utopian views may be more knowledgeable of Internet threats through general usage of the computer and Internet. Carey, et al. (2002) posit that as people increase their computer competence, their attitude becomes increasingly positive. On the other hand, individuals who have pessimistic views about computers may not be willing to develop technological competencies needed to protect themselves from Internet threats.

Utopian and dystopian views were not significantly related to any of the other measures. The incongruence between knowledge and behavior may be the result of culture. With the Internet ever present, knowledge of Internet threats becomes more customary; however, it may be difficult to put that knowledge into practice. Implementing Internet security may require educational efforts about specific and proper procedures. Individuals may gain some awareness about Internet security since many computers come with anti–virus and other programs installed; still, this does not mean these same individuals have the appropriate skills to use these and other programs.

Interestingly, except for age, the estimates for socio–demographic variables changed very little when other factors were added. For example, more Internet usage, general computer skills and utopian or dystopian attitudes did not explain away gender or race differences. These results suggest that diffusion and cultural perspectives on technology may not be very helpful in explaining traditional patterns of social inequality seen in technology usage and knowledge.

The notable exception was age, which was no longer significantly related to any of the dependent variables after the usage, general knowledge and utopian/dystopian variables were added. These findings suggest that differences between younger and older respondents in knowledge and utilization of pop–up blockers may be explained by factors related to diffusion and cultural perspectives.




The purpose of this paper was to examine the knowledge and use of prevention and detection tools in a random sample of Chicago residents. A secondary purpose was to test the relationships between Internet security knowledge and practice, and individual demographic factors, community composition characteristics and factors, drawn from theories of diffusion and cultural perspectives on technology.

In general, the findings from this study suggest that social inequality, diffusion of technology, and cultural perspective theories are important to understanding knowledge about Internet security and use of prevention and detection tools. The findings also suggest that these theories are better at explaining knowledge than they are at explaining behavior. More research is needed to better understand Internet security knowledge and why some translate information and education about risk of cyber victimization into action while others do not.

Awareness is an effective weapon against many forms of identity theft and cyber crime, which have the potential to be more economically damaging than many traditional crimes. Educational efforts that teach individuals how to protect their computers properly are warranted. The government should consider engaging in an organized public service campaign to teach people how to protect their computers and Internet connections from hackers, viruses and theft. Education alone, however, is probably not sufficient and will need to be coupled with other active components such as automatic updates and incentive programs.

Like all research, this study is not without limitations. First, and foremost, the research was conducted in one city and was based on responses from a relatively affluent sample, which may be high on the technological learning curve. As a consequence, the generalizability of the results may be limited. More research is needed to determine the degree to which these findings can be generalized to other cities and other populations. Second, all of the data are based on self–reports and suffer from all of the issues related to self–reports (i.e., the possibility of under– and over–reporting of knowledge and behavior). Future research should combine self–reported information with independent assessments in order to gain a better understanding of the nature of systematic error in self–reports. Finally, this research is limited in the number and type of Internet security issues surveyed. In recent years, the explosion in popularity of social networking sites such as MySpace, Facebook and Twitter has presented Internet security experts with a whole new set of user–focused threats.

In spite of these limitations, this study helps to identify those that may be more susceptible to Internet security threats and cyber victimization, and suggests several possible reasons why. Additionally, it highlights the importance of behavior in the realm of Internet security, and emphasizes the need to continue a dialog about how to increase a basic understanding of Internet security topics coupled with the utilization of prevention and detection tools.


About the authors

Daniel M. Downs, MA in Experimental Psychology, is a PhD student in Criminology, Law and Justice at the University of Illinois at Chicago. His areas of interest include youth attitudes toward the law, legal cynicism, Internet security and cyber crime, and quantitative methodology.

Ilir Ademaj holds a BA in Criminology, Law, and Justice from the University of Illinois at Chicago. His areas of interest include international law, genocide studies, human rights and non–profit organizations. He plans to pursue a J.D. at DePaul University.

Amie M. Schuck, PhD in Criminal Justice, is an Assistant Professor in the Department of Criminology, Law and Justice at University of Illinois at Chicago. Her areas of research interest include Internet security and identity theft, public safety partnerships, youth attitudes toward the police, and quantitative methodology.
E–mail: amms [at] uic [dot] edu



1. Women are more likely to use the Internet to communicate, search health issues, and obtain religious information, whereas men are more likely to use the Internet to get news, do job–related research and download software; see Pew Internet and American Life Project, 2006.

2. U.S. National Telecommunications and Information Administration, September 2001.



America Online (AOL) and the National Cyber Security Alliance (NCSA), 2004. “AOL/NCSA online safety study,” at, accessed 22 December 2008.

Bruce Bimber, 2000. “Measuring the gender gap on the Internet,” Social Science Quarterly, volume 81, number 3, pp. 868–875.

N. Jo Campbell, 1990. “High school students’ computer attitudes and attributions: Gender and ethnic differences,” Journal of Adolescent Research, volume 5, number 4, pp. 485–499.

Jane M. Carey, Ines Chisholm, and Leslie Irwin, 2002. “The impact of access on perceptions and attitudes towards computers: An international study,” Education Media International, volume 39, number 3, pp. 223–235.

Paul DiMaggio, Eszter Hargittai, Russell Neuman and John Robinson, 2001. “Social implications of the Internet,” Annual Review of Sociology, volume 27, pp. 307–336.

Jacquelynne S. Eccles, 1994. “Understanding women’s educational and occupational choices,” Psychology of Women Quarterly, volume 18, number 4, pp. 585–609.

Dana R. Fisher and Larry Michael Wright, 2001. “On utopias and dystopias: Toward an understanding of the discourse surrounding the Internet,” Journal of Computer–Mediated Communication, volume 6, number 2, at, accessed 22 December 2008.

Eszter Hargittai and Steven Shafer, 2006. “Differences in actual and perceived online skills: The role of gender,” Social Science Quarterly, volume 87, number 2, pp. 432–448.

Andrew Hendrey, 2008. “Aussies top US, Europe in falling victim to cyber crime,” at, accessed 14 July 2008.

Internet Crime Complaint Center (IC3), 2008. “2007 Internet crime report, 2008,” Washington, D.C.: Bureau of Justice Assistance, at, accessed 15 June 2008.

James E. Katz and Ronald E. Rice, 2002. Social consequences of Internet use: Access, involvement, and interaction. Cambridge, Mass.: MIT Press.

Tracy Kennedy, Barry Wellman, and Kristine Klement, 2003. “Gendering the digital divide,” IT & Society, volume 1, number 5, pp. 72–96, at, accessed 22 December 2008.

Tamar Levin and Claire Gordon, 1989. “Effects of gender and computer experience on attitudes toward computers,” Journal of Educational Computing Research, volume 5, number 1, pp. 69–88.

Keith Lourdeau, 2004. “FBI Deputy Assistant Director, testimony before the U.S. Senate Judiciary Subcommittee on Terrorism, Technology, and Homeland Security” (24 February), at, accessed 22 December 2008.

Beverly P. Lynch, “The digital divide or the digital connection: A U.S. perspective,” First Monday, volume 7, number 10, at, accessed 15 July 2008.

Jane Margolis and Allan Fisher, 2002. Unlocking the clubhouse: Women in computing. Cambridge, Mass.: MIT Press.

“McAfee–NCSA Online Safety Study,” 2007. at, accessed 22 December 2008.

Robert McMillan, 2008. “Men fall harder than women for Internet fraud, study finds,” IDG News Service, at, accessed 22 December 2008.

Robert McMillan, 2006. “FBI: Cybercriminals taking cues from mafia,” PCWorld (7 August), at, accessed 22 December 2008.

S. Nicholls, 1999. “The digital divide,” The Australian, cited at, accessed 11 June 2008.

Nielsen, 2008. “Nielsen online reports topline U.S. data for May 2008,” news release, at, accessed 15 June 2008.

Pippa Norris, 2001. Digital divide: Civic engagement, information poverty, and the Internet worldwide. New York: Cambridge University Press.

Hiroshi Ono and Madeline Zavodny, 2003. “Gender and the Internet,” Social Science Quarterly, volume 84, number 1, pp. 111–121.

Pew Internet and American Life Project, 2006. “How women and men use the Internet,” at, accessed 22 December 2008.

Stephen W. Raudenbush and Anthony S. Bryk, 2002. Hierarchical linear models: Applications and data analysis methods. Second edition. Newbury Park, Calif.: Sage.

Everett M. Rogers, 2003. Diffusion of innovations. Fifth edition. New York: Free Press.

Dennis P. Rosenbaum, Amie M. Schuck, Lisa M. Graziano, and Cody Stephens, 2007. “Measuring police and community performance via Web–based surveys: The Chicago Internet project,” Final report, National Institute of Justice, Center for Research in Law and Justice, University of Illinois at Chicago; version at, accessed 22 December 2008.

Wesley G. Skogan, Susan M. Hartnett, Jill DuBois, Jennifer T. Comey, Justine H. Lovig, and Marianne Kaiser, 1999. On the beat: Police and community problem solving. Boulder, Colo.: Westview Press.

Symantec, 2008. “Symantec global Internet security threat report: Trends for July–December 07,” Cupertino, Calif.: Symantec, at, accessed 15 June 2008.

Gholamreza Torkzadeh and Thomas P. Van Dyke, 2002. “Effects of training on Internet self–efficacy and computer user attitudes,” Computers in Human Behavior, volume 18, number 5, pp. 479–494.

U.S. Census Bureau, 2008. “E–stats: Measuring the electronic economy,” at, accessed 15 June 2008.

U.S. Federal Bureau of Investigation, 2005. “2005 FBI computer crime survey,” at, accessed 22 December 2008.

U.S. National Telecommunications and Information Administration, 2001. Household’s Internet connection type, by selected characteristics of reference person: Total, urban, rural, and central city (September), at, accessed 8 August 2008.

Jan van Dijk, 1999. The network society: Social aspects of new media. Thousand Oaks, Calif.: Sage.

Laurel Wentz, 2005. “U.S. hispanics online use surges” (19 July), at, accessed 18 July 2008.

Anthony G. Wilhelm, 2000. Democracy in the digital age: Challenges to political life in cyberspace. New York: Routledge.

Suzanne Willis and Bruce Tranter, 2006. “Beyond the digital divide: Internet diffusion and inequality in Australia,” Journal of Sociology, volume 42, number 1, pp. 43–59.


Appendix A: Internet security terms

Back doors: Hidden reentry points left in a compromised system; if the original entry point is detected and closed, a back door makes reentry easy and difficult to detect.

Botnet: A collection of software robots that run autonomously. They run on groups of remotely controlled “zombie” computers all linked over the Internet — nearly any machine can function as a zombie computer.

Firewall: A device configured to permit, deny, encrypt, or proxy all computer traffic between different security domains based upon a set of rules and other criteria.
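As an illustrative sketch (not from the paper), the “permit or deny based upon a set of rules” behavior described above can be modeled as first–match evaluation of an ordered rule list with a default–deny policy; the rule fields and addresses below are hypothetical:

```python
# Minimal model of rule-based packet filtering: rules are checked in order,
# the first matching rule decides, and unmatched traffic is denied by default.
# All rule fields, addresses, and ports here are made-up examples.
import ipaddress

RULES = [
    {"action": "deny",  "src": "10.0.0.0/8", "port": None},  # block a source range
    {"action": "allow", "src": None,         "port": 443},   # permit HTTPS from anywhere
    {"action": "allow", "src": None,         "port": 80},    # permit HTTP from anywhere
]

def evaluate(packet, rules=RULES):
    """Return 'allow' or 'deny' for a packet dict with 'src' and 'port' keys."""
    for rule in rules:
        if rule["src"] is not None and \
           ipaddress.ip_address(packet["src"]) not in ipaddress.ip_network(rule["src"]):
            continue  # source constraint not met; try the next rule
        if rule["port"] is not None and packet["port"] != rule["port"]:
            continue  # port constraint not met; try the next rule
        return rule["action"]  # first matching rule wins
    return "deny"  # default policy: deny anything unmatched
```

Real firewalls match on many more fields (protocol, direction, connection state), but the ordered–rules, first–match structure is the core idea.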

Malicious applets: Tiny programs, sometimes written in the popular Java computer language, that misuse your computer’s resources, modify files on the hard disk, send fake e–mail, or steal passwords.

Malware: Software designed to infiltrate or damage a computer system without the owner’s consent. It includes computer viruses, worms, trojan horses, rootkits, spyware, and adware.

Pop–up blocker: Prevents advertisers and Web sites from displaying information or installing software.

Pharming: When someone hijacks a domain name and redirects users to a fabricated Web site (e.g., a bank) where individuals enter their personal information.

Phishing: Attempts to criminally obtain sensitive information (e.g., social security numbers and credit cards) by pretending to be a legitimate business.
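As a sketch of why phishing succeeds (not from the paper), two naive heuristics often used in user education flag links whose target is a raw IP address or whose displayed text names a different domain than the actual destination; real detection is far more involved, and the function below is purely illustrative:

```python
# Two simple heuristics for suspicious links, for illustration only.
import re
from urllib.parse import urlparse

def looks_suspicious(displayed_text: str, actual_url: str) -> bool:
    """Flag a link whose target host is a raw IP, or whose displayed
    text names a different domain than the actual link target."""
    host = urlparse(actual_url).hostname or ""
    # Heuristic 1: the link points at a bare IP address, not a domain name.
    if re.fullmatch(r"\d{1,3}(\.\d{1,3}){3}", host):
        return True
    # Heuristic 2: the text shown to the user names a different host
    # than the one the link actually resolves to.
    shown = urlparse(displayed_text).hostname
    if shown is not None and shown != host:
        return True
    return False
```

Heuristics like these illustrate the user–focused nature of the threat: the deception lies in the gap between what is displayed and where the link actually goes.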

Spam: Unsolicited commercial advertising sent through e–mail, often consisting of get–rich–quick schemes.

Spoofing: Faking an e–mail address or Web page to trick users into passing along critical information like passwords or credit card numbers.

Spyware: Software that runs on a user’s computer and collects information without the user’s knowledge; it is used to track browsing history and capture personal information entered on Web sites.

Trojan horse: A program that, unknown to the user, contains instructions that exploit a known vulnerability in some software.


Appendix B: Hierarchical linear models (HLM)

Individual level model

Internet security knowledge and practiceij = Β0j + Β1j(Χ1ij) + Β2j(Χ2ij) + … + Βqj(Χqij) + rij,

Where Β0j is the intercept; Χqij is the value of independent variable q associated with respondent i in police beat j; and Βqj is the partial effect of that independent variable on the Internet security knowledge and practice dependent variable. The error term, rij, is the unique contribution of each individual, which is assumed to be independent and normally distributed with constant variance σ2.

Beat level model

Β0j = ϒ00 + ϒ01 (crime rate)
     + ϒ02 (percentage in poverty)
     + ϒ03 (percentage linguistically isolated) + U0j,

where ϒ00 is the overall average value for the Internet security knowledge and practice dependent variables and ϒ01 through ϒ03 are the regression coefficients of beat level effects of the crime rate (logged), the percentage in poverty (logged) and the percentage linguistically isolated (logged) on the dependent variables.
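The two–level model above partitions variance into a within–beat component (σ²) and a between–beat component (the variance of U0j). A minimal sketch of that decomposition, using simulated data rather than the paper’s survey (all parameter values below are made up), fits the unconditional model Yij = ϒ00 + U0j + rij with simple moment estimators:

```python
# Simulate two-level data and recover the variance components of the
# unconditional model Y_ij = gamma00 + U_0j + r_ij. The beat count, group
# size, and parameter values are arbitrary choices for illustration.
import random
from statistics import mean, variance

random.seed(42)
J, n = 50, 20                       # 50 "beats", 20 respondents per beat
gamma00, tau00, sigma2 = 3.0, 0.5, 1.0   # grand mean, between- and within-beat variance

beats = []
for _ in range(J):
    u0j = random.gauss(0, tau00 ** 0.5)   # beat-level random effect U_0j
    beats.append([gamma00 + u0j + random.gauss(0, sigma2 ** 0.5)
                  for _ in range(n)])      # individual error r_ij

# Pooled within-beat sample variance estimates sigma^2.
sigma2_hat = mean(variance(beat) for beat in beats)
# Variance of beat means estimates tau00 + sigma^2 / n, so subtract the correction.
tau00_hat = variance([mean(beat) for beat in beats]) - sigma2_hat / n
```

With the beat–level covariates of the full model added, the regression coefficients ϒ01 through ϒ03 would absorb part of the between–beat variance, which is how the model attributes knowledge gaps to community composition.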


Editorial history

Paper received 12 August 2008; accepted 6 December 2008.

Creative Commons License
This work is licensed under a Creative Commons Attribution–Noncommercial–No Derivative Works 3.0 United States License.

Internet security: Who is leaving the ‘virtual door’ open and why?
by Daniel M. Downs, Ilir Ademaj, and Amie M. Schuck
First Monday, Volume 14, Number 1 - 5 January 2009

A Great Cities Initiative of the University of Illinois at Chicago University Library.

© First Monday, 1995-2019. ISSN 1396-0466.