All around the world, Internet regulation is on the rise as more and more countries implement such policies, from Asian authoritarian regimes to Western democracies. At the same time, the great majority of Internet users are not aware that they access a filtered version of the World Wide Web, due to the “non–transparent” policies of many governments, a situation that sets a very dangerous precedent for the future of the Internet.
In this paper, the authors promote and encourage the participation of Internet users in the design of Internet Regulation Systems (IRSs), as a way to develop effective and ethically sound systems. This can be done via well–formatted surveys conducted at a national level in order to measure public opinion and identify users’ needs. To justify their approach, the authors discuss the results of the available related surveys conducted around the globe. Finally, in order to attract researchers to the field, they launched a portal for the international project WebObserver.net (http://webobserver.net/), through which they provide all the tools researchers need to conduct such surveys easily and quickly.
Overview of Internet regulations around the world
Open vs. silent (invisible) Internet regulation systems
What do Internet users worldwide think about Internet regulations?
WebObserver.net: An international project to measure public opinion
Internet regulations in Greece
Conclusions and future plans
There is a common (but sadly false) impression today that the Internet is the only medium that, thanks to its nature, cannot be regulated. “The Internet treats censorship as a malfunction and routes around it,” John Gilmore, co–founder of the Electronic Frontier Foundation, said a decade ago. Unfortunately, since then many things have changed to the detriment of freedom of speech on the Web.
According to Reporters Without Borders, the number of Internet journalists imprisoned each year worldwide is on the rise. Sixty–six Internet journalists were imprisoned in 2008, 95 in 2009, 116 in 2010 and already 119 in 2011 (up to 4 March). The imprisonment of Internet journalists is the simplest and most straightforward method of Internet censorship, but not the only one in use around the globe. There are many more technologically advanced and effective Internet regulation methods, which target not only journalists but Internet users at the national level.
For example, without even trying to be discreet, the Egyptian government earlier this year ordered ISPs to cut off international connections to the Internet (see Figure 1). In 2010, Google announced that it had stopped complying with China’s Internet censorship rules, officially admitting that for many years it had delivered censored search results to the world’s biggest Internet market. Many telecommunication companies around the world tend to comply with government–level pressure to filter Internet access. For example, Research in Motion (RIM) filtered Internet content for all of its users in Indonesia.
Figure 1: Internet traffic in Egypt during recent Internet filtering actions .
This paper focuses on one of the most sophisticated and technologically advanced methods of Internet regulation in use today: the Internet Regulation Systems (IRSs), implemented at the national level by governments, delivering to Internet users filtered versions of content on the World Wide Web.
The development of Internet regulations
When the World Wide Web was created over two decades ago, Internet users were able to access Web sites through a very simple and direct procedure (Figure 2). By the end of 1994, there were already 3.2 million servers and 3,000 sites that Internet users could access freely. But freedom of Internet access soon changed, with China banning 100 Web sites and Germany attempting to block a site.
Figure 2: Usual way of accessing the Internet .
More countries have begun to replace this simple way of accessing a Web site (Figure 2) with more sophisticated procedures that give ISPs the opportunity to regulate Internet traffic with ease and efficiency.
A decade ago, regulation at a national level was used solely by authoritarian regimes, but since then many Western democracies have implemented (or tried to implement) similar systems. By 2009, the number of countries that had experienced some form of Web censorship had doubled compared to 2008. This trend reveals the gradual implementation of Internet regulation systems at a national level.
The need for Internet regulation and the role of Internet users
On the other hand, there is a need for certain online information to be regulated due to regional and international laws. Many university researchers, IT experts and media representatives are engaged in a discussion regarding what kinds of restrictions might be implemented: transparent (open) to the user or non–transparent (silent). Surprisingly, few surveys have been conducted to understand how Internet users might react to various Web censoring systems.
In this paper, we present a brief history of Internet regulation systems, the technological aspects of current systems and the main reasons behind their implementation. Then, we proceed to discuss the categorization of Internet regulation systems (a. transparent or open; and, b. non–transparent or silent) and why a critical point has been reached in the development of such policies. We present data regarding the opinions of Internet users on this issue, based on available surveys conducted worldwide. Finally, we discuss the results of a related survey conducted in Greece regarding Internet regulation in general and the possible future implementation of such a system in that country. We propose that a balance is needed between Internet regulation policies and freedom of Internet access, which can be found by conducting carefully designed surveys at a national level.
To that end, we present a new international effort called WebObserver.net (http://webobserver.net/) to assist researchers in conducting surveys in their countries, measuring the opinions of Internet users and producing valuable data for designing effective and broadly accepted Internet regulation systems in each country. WebObserver.net offers all the necessary tools to produce survey data that can assist the development of Internet policies.
A brief history of Internet regulation systems
Internet regulation systems have a long history of development and implementation in many countries worldwide .
Every year China spends extensive resources building and maintaining one of the largest and most sophisticated filtering systems worldwide. Saudi Arabia uses a Web proxy system to block requests for banned Web sites, based on a list built from citizens’ reports. Norwegian Telenor and KRIPOS (the Norwegian National Criminal Investigation Service) introduced a child pornography blocking system in Norway in October 2004, which sends the user a Web page containing information about the filter and a link to KRIPOS. Telenor introduced a content blocking system in Sweden in 2005, based on the Norwegian system. Similar content blocking systems have been implemented in many European countries.
Even though many countries had individually started limited and voluntary implementations of ISP–level blocking programs, the landmark model for large–scale blocking came from the U.K. with the implementation of the CleanFeed model. It was created in 2003 by BT (British Telecom) in consultation with the U.K. Home Office and was implemented in BT’s network on 9 June 2004. The U.K. government gave Internet service providers (ISPs) a deadline in 2008 to block all access to Web sites that host illegal images of child abuse.
Soon, many other countries followed the U.K.’s model. In 2006, Canada’s largest ISP announced the launch of Project CleanFeed Canada, based on the U.K. CleanFeed model, in order to block access to child pornography sites. In 2007, Australia’s Telecommunications Minister Stephen Conroy announced that a mandatory content blocking system would be implemented and that it would focus on child abuse content and “inappropriate material”.
Technical aspects: Accessing the Internet now and then
There are many different ways to control what Internet users inside a country can access, from sophisticated systems that filter search results from search engines to the simplest and most efficient: a few government–controlled access points within a country, as in Cuba.
Content blocking (or content filtering) mechanisms, using “black lists”, are frequently employed in many countries. Filtering by ISPs and network operators can take a variety of forms, including packet dropping, DNS poisoning and content filtering. All three mechanisms use black lists to determine which sites to filter. Black lists are collections of domain names or very specific URLs; the exact procedure for creating these lists differs from country to country. For example, authoritarian regimes prefer to have government officials prepare the lists, while some democracies, such as the United Kingdom, depend on Internet users to report specific sites.
Figure 3: Packet dropping system.
Packet dropping (Figure 3) is considered quite simple. Its operation is based on a list of IP addresses to be blocked. Requests for these addresses are discarded, thus no connection to a given server is made. The main disadvantage of packet dropping is that it is not accurate.
An important advantage of packet dropping is that it can identify the type of traffic and thus implement selective filtering, that is, block HTTP packets for a particular IP address while leaving e–mail (SMTP) unblocked. But, since this approach blocks all Web content at a specific IP address, it tends to neutralize more content than necessary. Some experiments demonstrated a significant risk of over–blocking with systems based exclusively on IP addresses.
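The mechanism just described can be sketched in a few lines. This is a hypothetical illustration only (the addresses come from the documentation ranges, and the blacklist is invented), not the design of any real system:

```python
# Hypothetical blacklist of IP addresses (documentation-range addresses).
BLOCKED_IPS = {"203.0.113.7", "198.51.100.25"}

HTTP_PORTS = {80, 443}  # Web traffic
SMTP_PORT = 25          # e-mail, deliberately left unblocked

def should_drop(dst_ip: str, dst_port: int) -> bool:
    """Drop Web packets addressed to a blacklisted IP, while letting
    e-mail (SMTP) to the same address through: the selective filtering
    described above."""
    return dst_ip in BLOCKED_IPS and dst_port in HTTP_PORTS

# Over-blocking in action: every virtual host sharing 203.0.113.7 is
# blocked, whether or not it hosts any targeted content.
```

Because the decision inspects only the IP header, it is computationally cheap; the price is that any innocent site sharing a blocked address is lost as well.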
Figure 4: DNS poisoning system.
Systems based on DNS poisoning (Figure 4) interfere in the DNS lookup process for the blocked sites’ hostnames in order to prevent the correct IP address from being returned. The main advantage is that this blocking doesn’t affect other domain names hosted on the same server, as in the case of packet dropping. On the other hand, its main disadvantage is that it over–blocks. As Clayton notes, “Thus it would not be an appropriate solution for blocking content hosted somewhere like geocities.com; blocking one site would also block about three million others”. Additionally, DNS poisoning blocks other services such as e–mail. There is also an issue of under–blocking in cases where a user types in an IP address rather than a hostname: the browser uses the IP address directly and never performs a DNS lookup.
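A minimal sketch of the lookup-interference idea, with invented hostnames and a toy resolver table standing in for real DNS:

```python
# Hypothetical blacklist and toy resolver table (illustrative names only).
BLOCKED_HOSTS = {"blocked.example.org"}
SINKHOLE_IP = "0.0.0.0"  # bogus answer returned instead of the real address
DNS_TABLE = {
    "blocked.example.org": "203.0.113.7",
    "innocent.example.net": "203.0.113.7",  # shares a server with the blocked site
}

def poisoned_lookup(hostname: str):
    """Return a bogus address for blacklisted hostnames; resolve others
    normally. Unlike packet dropping, innocent.example.net still resolves
    correctly even though it shares an IP with the blocked site."""
    if hostname in BLOCKED_HOSTS:
        return SINKHOLE_IP
    return DNS_TABLE.get(hostname)
```

The under-blocking case is also visible here: a user who types 203.0.113.7 directly never consults the resolver at all, bypassing the filter entirely.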
Figure 5: Content filtering system.
Content filtering systems (see Figure 5) are based on one–by–one URL examination. They are very accurate, blocking exactly what is contained in a specific list, such as images, videos, and audio clips. At the same time, they are very demanding in processing power and, for this reason, expensive. Hence, content filtering is not preferred by ISPs.
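As an illustrative sketch (the URLs are invented), URL-level examination blocks only the exact listed resources, leaving the rest of the same host reachable:

```python
# Hypothetical URL blacklist: individual resources, not whole hosts or IPs.
BLOCKED_URLS = {
    "http://host.example/gallery/illegal-image.jpg",
    "http://host.example/clips/illegal-video.mp4",
}

def is_blocked(url: str) -> bool:
    """Exact one-by-one URL examination. Accurate, since only listed
    resources are blocked, but the full URL of every request must be
    inspected, which is what makes the approach expensive at ISP scale."""
    return url in BLOCKED_URLS
```

Note that `is_blocked("http://host.example/index.html")` is false even though the same host appears in the list: this precision is exactly what packet dropping and DNS poisoning cannot offer.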
The crucial disadvantages of all of these systems have forced the development of hybrid systems in order to combine the advantages of several mechanisms. BT and the U.K. Home Office attempted, through CleanFeed, to develop an accurate and inexpensive system, a two–stage system using both packet dropping and content filtering mechanisms (see Figure 6).
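The two-stage idea can be sketched as follows (hypothetical addresses and URLs; the real CleanFeed design is considerably more involved): a cheap IP-level first stage passes almost all traffic untouched, and only traffic to suspect addresses is redirected to an expensive URL-level second stage.

```python
# Stage 1 data: IP addresses known to host at least one blocked resource.
SUSPECT_IPS = {"203.0.113.7"}
# Stage 2 data: the precise URLs to block at those addresses.
BLOCKED_URLS = {"http://host.example/gallery/illegal-image.jpg"}

def hybrid_filter(dst_ip: str, url: str) -> str:
    # Stage 1: cheap packet-level test; traffic to non-suspect IPs is
    # never examined further, keeping the system inexpensive.
    if dst_ip not in SUSPECT_IPS:
        return "pass"
    # Stage 2: only the redirected traffic undergoes exact URL
    # examination, keeping the system accurate: other pages on the
    # same (shared) address are not over-blocked.
    return "block" if url in BLOCKED_URLS else "pass"
```

The combination buys the accuracy of content filtering at close to the cost of packet dropping, since the expensive stage sees only a tiny fraction of total traffic.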
Figure 6: CleanFeed’s design, based on a figure in .
All of these sorts of content blocking systems are used in many countries, such as China, Saudi Arabia, the United Kingdom and Canada. According to the OpenNet Initiative (ONI), there were at least 26 countries in 2006 that were using content blocking systems and ONI predicted that many more would follow in the years to come. According to Reporters Without Borders, in 2009 Internet users in approximately 60 countries experienced some form of Web censorship .
Reasons behind Internet regulation policies
The reasons behind the implementation of these systems differ from state to state. According to the OpenNet Initiative , they can be categorized as a) political; b) social; c) conflict and security; and, d) Internet tools (such as proxy servers and anonymizers).
For example, South Korea employs an Internet regulation system mainly to block online content related to conflict and security reasons. Singapore, Sudan and Oman focus on content related to specific social issues in each country. Different Internet regulation systems implemented around the world target specific content regarding free expression and media freedom, political transformation and opposition parties, human rights, environmental issues, public health, gay/lesbian content, pornography, gambling, minority faiths, search engines, anonymizers and circumvention, and hate speech . For an extensive list, please refer to Figure 7.
Figure 7: Categories subject to Internet filtering. Source: .
Regarding the kind of online content targeted in each country, refer to Figure 8, based on research by the OpenNet Initiative.
Figure 8: Summary of filtering. Source: .
While it is perhaps easy to understand the reasons behind the implementation of these systems in authoritarian regimes, it is quite complicated to understand the logic of Western democracies. A number of questions have to be addressed. Why do democratic governments need to use content blocking systems? Was public opinion considered in those countries in advance of the implementation of these tools?
CleanFeed is a mandatory content blocking system — non–transparent for BT users — that appeared online in June 2004, designed by the U.K. government and British Telecommunications plc (BT). It is used by U.K. Internet service providers to block access to sites that host illegal child abuse content. The list of banned sites is prepared and circulated by the Internet Watch Foundation (IWF); in addition, there is a U.K. hotline for reporting illegal online content.
While blocking sites is considered a highly controversial issue, it is widely accepted that many countries employ various procedures to regulate specific illegal content such as child pornography. The issue is not child pornography, but the ease with which other kinds of content could be blocked. Martin Bright of the Observer stated that the U.K.’s CleanFeed was the first mass censorship of the Web implemented in a Western democracy. Indeed, CleanFeed has set a dangerous precedent for Internet freedom of speech, as the system could block any kind of content. These concerns have only grown with the development of non–transparent (or silent) Internet regulation systems in Western democracies.
Transparent vs. non–transparent systems
There are two different ways of filtering online content: the open way and the silent way. The Saudi Arabian government uses the open way, which means that the user can understand that a site is blocked and even react to that decision. For example, when a site is blocked, a user sees a page that indicates that access to that site has been denied. Users can decide to fill out online forms stating why they think a given site should be unblocked. Requests are sent to the government’s Internet services unit for consideration. Hence, this form of censorship is considered to be honest censorship .
Other systems simply provide the user with an error message. This message does not inform the user whether the site has been blocked or whether there was a problem with the connection. This method — non–transparent or silent — is used by CleanFeed and other systems.
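The contrast can be sketched as two styles of HTTP response (hypothetical status codes and wording; real deployments vary):

```python
def block_response(mode: str) -> dict:
    """Illustrative responses for the two filtering styles: an 'open'
    system tells the user the site is blocked and how to appeal; a
    'silent' system returns something indistinguishable from an
    ordinary network error."""
    if mode == "open":
        return {"status": 403,
                "body": "Access to this site has been denied. "
                        "You may submit a request to have it reviewed."}
    # Silent mode: the user cannot tell censorship from a failed connection.
    return {"status": 504, "body": "Gateway Timeout"}
```

In the open style, the block page itself becomes the channel through which users can contest a decision; in the silent style, no such channel exists because the user never learns a decision was made.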
Why is it dangerous?
There are many critical questions that arise from the use of silent Internet regulation systems. Why is the user not informed that a site is blocked? What procedure should someone use to clarify the situation? Who will be responsible if a site without illegal content becomes blocked? The release of CleanFeed’s statistics certainly drew a variety of concerns and complaints .
Edwards (2006) noted that the U.K. Home Office had admitted to asking ISPs to block sites that ‘glorify terrorism’ even before such content was criminalised by the Terrorism Act. Edwards also noted that the Home Office retained ‘flexibility’ for such action, and argued that CleanFeed–like technology could be the most perfect invisible censorship mechanism ever invented.
From U.K. to Western democracies
More democracies worldwide have been following the U.K. model. In 2006, Canada’s largest ISP announced the launch of Project CleanFeed Canada, based on the U.K. CleanFeed model, in order to block access to child pornography. In 2007, Australia’s Telecommunications Minister Stephen Conroy announced that a mandatory content blocking system would be implemented, focusing on child abuse content and “inappropriate material”. The IWF’s list was the basis of Australia’s CleanFeed list, initially containing 9,000 sites. Earley (2009) revealed that many non–child abuse sites were on Australia’s black list, such as a site promoting voluntary euthanasia.
Given the nature of national policies regulating the Internet, it is important to examine public opinion on these issues. We will examine the results of a few related surveys, in order to demonstrate that Internet users in some countries are willing to accept, under certain conditions, the implementation of Internet regulations.
In 1998, a survey asked respondents to react to the statement: “I believe that certain information should not be published on the Internet”. Among the 4,247 respondents from the U.S., 24.84 percent answered “agree strongly” and 22.30 percent “agree somewhat,” while 15.26 percent answered “disagree somewhat” and 28.37 percent “disagree strongly.” In sum, slightly more respondents agreed with the non–publication of certain information online (47.14 percent) than disagreed (43.63 percent).
In 2007, the Australian Broadband Survey (based on 17,881 participants) found that 74.3 percent of them disagreed and only 13.4 percent agreed with the question: “Do you support the government’s policy for mandatory ISP–level content filtering (opt–out)?” In another survey in 2008, among 19,763 participants a staggering 88.9 percent answered “yes” to the question: “The federal Labor government plans to require ISPs to filter adult material from the Internet. There will be an ability for customers to opt out of this filter. Would you opt out?” A similar survey conducted in 2009 by Galaxy found that only five percent of 1,100 participants wanted ISPs to be responsible for protecting children online, and only four percent wanted Australia’s government to be responsible.
A survey in 2010 found that Internet users were divided over Internet regulations. For example, Canadians are relatively supportive of Internet regulation (51 percent disagree that the Internet should never be regulated), along with Australians and many European Union Web users in France, the U.K., Spain, Germany, and Portugal. On the other hand, there was disagreement in Mexico (72 percent agree that no government should regulate the Internet) and in non–EU countries such as Nigeria (77 percent) and South Korea (83 percent).
In the U.K., there was no “Internet regulation” survey until 2007, when a limited–scale survey was conducted. In summary, the survey showed that 90.21 percent of the participants were unaware of the existence of the CleanFeed system, and of the few who had heard about it before, only 14.81 percent understood it completely. Even fewer had learned about CleanFeed from official statements of the participating bodies (11.1 percent from the U.K. government’s statements and 22.2 percent from BT’s statements). Almost 61 percent (60.87 percent) of the participants did not trust BT, and 65.22 percent did not trust the IWF, to be responsible for a silent content blocking system in the U.K.
In the U.K. survey, a majority of the participants preferred an open content blocking system targeting child abuse content, rather than no Internet regulation. More specifically, 65.2 percent of the surveyed U.K. Internet users would prefer to see a message stating that a given site was blocked, 57.3 percent would like to have access to a form for unblocking a given site, and 68.5 percent would prefer more frequent briefings by BT, the IWF and the U.K. government.
Taking into account the absence of substantial related surveys around the world, an international project was started in 2010 focusing on measuring the opinion of Internet users globally. WebObserver.net gathers in a single site most related surveys conducted in the past; it is designed to conduct new surveys in Greece and Germany, with the aid of scholarly researchers in each country.
Figure 9: The first page of the International project WebObserver.net’s portal.
About the project
WebObserver.net is an international project that measures what Internet users think about emerging Internet regulations and informs citizens about this crucial issue. Surveys are conducted in different languages worldwide with the cooperation of researchers and scholars in different countries. These surveys have many questions in common, providing a means to collect data on Internet regulations and measure the opinions of users around the globe.
Participation in the project
WebObserver.net is looking for participants in many countries in order to conduct surveys regarding Internet regulations. Priority countries include the U.S., Germany, France, the United Kingdom, Canada, Australia, Russia, Italy and Turkey. The project provides researchers with a variety of questions in English plus various tools. Collaborators can translate these questions for their own target audiences. For further details, visit www.WebObserver.net (http://webobserver.net/) or contact the project’s moderator at nkoumart [at] jour [dot] auth [dot] gr.
Survey in Greece
A survey was conducted in June 2010 at the Aristotle University of Thessaloniki with the aid of WebObserver.net. This pilot survey was based on a limited but highly educated sample of Internet users, which included M.A. and Ph.D. students and teaching staff in the Department of Journalism and Mass Media Communication. This audience was quite well acquainted with Web technologies and sensitive to Internet regulations.
The survey demonstrated that 49.1 percent of the participants were well informed about Internet regulations, while an additional 30.9 percent were aware of these issues. Only 2.7 percent knew about Internet regulations from official statements by participating bodies. The majority of the participants were positively inclined towards the implementation of an Internet regulation system in Greece (47.3 percent answered “yes” and 30.9 percent answered “yes, but only under certain conditions”) rather than no Internet regulation at all (only 21.8 percent answered “no”).
Figure 10: Do you agree with the implementation of an Internet regulation system in Greece?
If an Internet regulation system was implemented, 37.9 percent wanted it to be operated by university–based institutes; 19.5 percent by non–governmental organizations, such as Reporters Without Borders; 12.6 percent by institutes outside universities; and, 11.5 percent by a government service supervised by a related ministry (Figure 11).
Figure 11: In Greece, who would operate an Internet regulating system?
Regarding what kind of content should be targeted by such a system, 35.1 percent pointed to pornographic sites (hosting illegal pornographic content); 34 percent to hate speech sites; 13.4 percent to defamatory online content; and 7.2 percent to sites providing illegal access to multimedia, such as movies, music, and books (Figure 12).
Figure 12: In Greece, what kind of content should be targeted?
Other responses were gathered in this survey that could be a starting point for testing Greek public opinion on Internet regulations. A replication of the survey will be conducted in the near future.
The 2010 Greek survey results agree with earlier independent surveys in Australia and the U.K. First, the majority of surveyed Greeks indicated that they preferred some form of Internet regulation over no regulation at all, similar to attitudes measured in related surveys in Canada. On the other hand, Internet users in Australia did not agree with the planned government policy for an opt–out content filtering system (74.3 percent in the 2007 survey). Secondly, the Greek survey found that few Internet users learned about Internet regulation systems through official announcements: only 2.7 percent of those Greeks surveyed knew about these systems from official statements, compared to 11.1 percent of U.K. Internet users who learned about CleanFeed from the government and 22.2 percent who learned about it from BT. Thirdly, Greek, British, and Australian users generally don’t trust their governments to be responsible for the implementation and operation of these systems. Only 11.5 percent of Greek users trusted a government service supervised by the related ministry to implement an Internet regulation system, while in Australia the percentage falls to four percent. U.K. users trust neither of the chosen participating bodies, and only five percent of Australian Internet users trust ISPs to be responsible for protecting children online.
Internet regulation systems have a substantial history of limiting Internet access. In Western democracies, there has been a specific focus on illegal content, ranging from child abuse content to simply “inappropriate material.” The use of these systems in many states has been invisible, without clear notice to citizens about their operation. We wonder whether governments as a whole are ready to examine public opinion on these issues, and to modify policies accordingly. WebObserver.net provides a mechanism for understanding public opinion about these issues, as well as a resource providing access to more details about Internet regulations in various states. We hope that scholars in different states will use this tool to understand local opinion on these important issues and, in turn, provide details on public sentiment to policy–makers.
About the authors
Nikolaos Koumartzis is a Ph.D. research student in the Computer Lab of the Department of Journalism and Mass Communication at the Aristotle University of Thessaloniki and moderator of the international project WebObserver.net. His research interests include Internet regulation policies and online content filtering systems implemented at the national level.
E–mail: nkoumart [at] jour [dot] auth [dot] gr
Andreas Veglis is an associate professor in the Computer Lab of the Department of Journalism and Mass Communication at the Aristotle University of Thessaloniki. His research interests include distributed publishing systems, Web applications, and information technology in journalism. Web: http://pacific.jour.auth.gr/veglis/
E–mail: veglis [at] jour [dot] auth [dot] gr
2. Reporters Without Borders, “Netizens imprisoned,” at http://en.rsf.org/press-freedom-barometer-netizens-imprisoned.html, accessed 28 September 2011.
3. J. Cowie, 2011. “Egypt leaves the Internet,” at Renesys blog (27 January), at http://www.renesys.com/blog/2011/01/egypt-leaves-the-internet.shtml, accessed 28 September 2011.
5. International Telecommunication Union (ITU), 2011. “Indonesia: RIM to filter Internet for BlackBerry users,” at http://www.itu.int/ituweblogs/treg/Indonesia+RIM+To+Filter+Internet+For+BlackBerry+Users.aspx, accessed 28 September 2011.
6. D. O’Brien 2011. “Watching Egypt disappear from the Internet,” Committee to Protect Journalists (28 January), at http://www.cpj.org/internet/2011/01/watching-egypt-disappear-from-the-internet.php, accessed 28 September 2011.
9. R. Clayton, 2005. “Technical report: Anonymity and traceability to cyberspace,” Technical report, UCAM–CL–TR–653, University of Cambridge, Computer Laboratory (November), at http://www.cl.cam.ac.uk/techreports/UCAM-CL-TR-653.html, accessed 28 September 2011.
10. N. Koumartzis, 2008. “BT’s CleanFeed and online censorship in UK: Improvements for a more secure and ethically correct system,” M.A. thesis, London College of Communication (University of the Arts, London), at http://webobserver.net/?p=146, accessed 28 September 2011.
11. Reporters Without Borders, 2010. “Web 2.0 versus control 2.0” (18 March), at http://en.rsf.org/web-2-0-versus-control-2-0-18-03-2010,36697, accessed 15 October 2010.
12. U.K. Office of Public Sector Information (OPSI), “Protection of Children Act 1978,” at http://www.opsi.gov.uk/Acts/acts1978/PDF/ukpga_19780037_en.pdf, accessed 6 September 2008; Out–Law.com, 2004. “How do child porn laws affect UK businesses?” (28 September), at http://www.out-law.com/page-4927, accessed 6 September 2008.
13. J. Zittrain and B. Edelman, 2003. “Documentation of Internet filtering worldwide” (24 October), at http://cyber.law.harvard.edu/filtering/; OpenNet Initiative, 2008. “Europe,” at http://opennet.net/research/regions/europe, accessed 25 July 2010; Libertus.net, “ISP voluntary/mandatory filtering,” at http://libertus.net/censor/ispfiltering-gl.html, accessed 29 August 2008.
15. King Abdulaziz City for Science and Technology, Internet Services Unit, 2006. “Local content filtering procedure,” at http://www.isu.net.sa/saudi-internet/contenet-filtring/filtring-mechanism.htm, accessed 20 August 2010.
16. Telenor Group, 2004. “Telenor and KRIPOS introduce Internet child pornography filter,” Telenor press release (21 September), at http://www.telenor.com/en/news-and-media/press-releases/2004/telenor-and-kripos-introduce-internet-child-pornography-filter, accessed 28 September 2011.
17. Telenor Group, 2005. “Telenor and Swedish National Criminal Investigation Department to introduce Internet child porn filter” (18 May), at http://telenor.com/en/news-and-media/press-releases/2005/telenor-and-swedish-national-criminal-investigation-department-to-introduce-internet-child-porn-filter, accessed 2 September 2010.
20. M. Bright, 2004. “BT puts block on child porn sites,” Observer (5 June), at http://www.guardian.co.uk/technology/2004/jun/06/childrensservices.childprotection, accessed 25 August 2010.
21. M. Ballard, 2006. “Govt sets target for blocking child porn sites” (18 May), at http://www.theregister.co.uk/2006/05/18/uk_site_blocking/, accessed 28 September 2010.
22. CTV News, 2006. “New initiative will see ISPs block child porn sites” (23 November), at http://www.ctv.ca/CTVNews/Canada/20061123/isps_childporn_061123/, accessed 25 October 2010.
23. ABC, 2007. “Conroy announces mandatory internet filters to protect children,” ABC News (31 December) at http://www.abc.net.au/news/stories/2007/12/31/2129471.htm, accessed 15 October 2010.
24. M. Meiss and F. Menczer, 2008. “Visual comparison of search results: A censorship case study,” First Monday, volume 13, number 7, at http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/2019/1988, accessed 28 September 2011; T. Moroi and N. Yoshiura, 2008. “Discovering Web pages censored by search engines in Japan,” CIMCA ’08: Proceedings of the 2008 International Conference on Computational Intelligence for Modelling Control & Automation, pp. 1,171–1,176.
26. R. Clayton, 2005. “Technical report: Anonymity and traceability to cyberspace,” Technical report, UCAM–CL–TR–653, University of Cambridge, Computer Laboratory (November), at http://www.cl.cam.ac.uk/techreports/UCAM-CL-TR-653.html, accessed 28 September 2011; N. Koumartzis, 2008. “BT’s CleanFeed and online censorship in UK: Improvements for a more secure and ethically correct system,” M.A. thesis, London College of Communication (University of the Arts, London), at http://webobserver.net/?p=146, accessed 28 September 2011.
27. B. Edelman, 2003. “Web sites sharing IP addresses: Prevalence and significance,” Berkman Center for Internet & Society, Harvard Law School (12 September), at http://cyber.law.harvard.edu/archived_content/people/edelman/ip-sharing/, accessed 20 August 2008.
28. M. Dornseif, 2003. “Government mandated blocking of foreign Web content,” Security, e–learning, e–services: Proceedings of the 17. DFN–Arbeitstagung uber Kommunikationsnetze, pp. 617–648, and at http://md.hudora.de/publications/200306-gi-blocking/200306-gi-blocking.pdf, accessed 20 August 2008.
30. Figure 1, in R. Clayton, 2008. “Failures of a hybrid content blocking system,” op. cit.
31. Reporters Without Borders, 2010. “Web 2.0 versus control 2.0” (18 March), at http://en.rsf.org/web-2-0-versus-control-2-0-18-03-2010,36697, accessed 15 October 2010.
32. R. Deibert, 2008. Access denied: The practice and policy of global Internet filtering. Cambridge, Mass.: MIT Press.
34. ONI, “Table 2.4: Spectrum of Cyberspace Content Controls in the CIS,” In: R. Deibert, J. Palfrey, R. Rohozinski, and J. Zittrain, 2010. Access controlled: The shaping of power, rights, and rule in cyberspace. Cambridge, Mass.: MIT Press, p. 23.
35. ONI, “Table 1.5: Summary of filtering,” In: R. Deibert, 2008. Access denied: The practice and policy of global Internet filtering. Cambridge, Mass.: MIT Press, p. 19.
36. N. Koumartzis, 2008. “BT’s CleanFeed and online censorship in UK: Improvements for a more secure and ethically correct system,” M.A. thesis, London College of Communication (University of the Arts, London), at http://webobserver.net/?p=146, accessed 28 September 2011; M. Ballard, 2006. “Govt sets target for blocking child porn sites” (18 May), at http://www.theregister.co.uk/2006/05/18/uk_site_blocking/, accessed 28 September 2010.
37. M. Bright, 2004. “BT puts block on child porn sites,” Observer (5 June), at http://www.guardian.co.uk/technology/2004/jun/06/childrensservices.childprotection, accessed 25 August 2010.
40. T. Richardson, 2004. “ISPA seeks analysis of BT’s ‘CleanFeed’ stats: Web filtering figures ‘could be misleading’,” Register (21 July), at http://www.theregister.co.uk/2004/07/21/ispa_bt_cleanfeed/, accessed 25 October 2010.
41. L. Edwards, 2006. “From child porn to China, in one Cleanfeed,” SCRIPTed, at http://www.law.ed.ac.uk/ahrc/script-ed/vol3-3/editorial.asp, accessed 28 September 2011.
42. CTV News, 2006. “New initiative will see ISPs block child porn sites” (23 November), at http://www.ctv.ca/CTVNews/Canada/20061123/isps_childporn_061123/, accessed 25 October 2010.
43. ABC, 2007. “Conroy announces mandatory internet filters to protect children,” ABC News (31 December), at http://www.abc.net.au/news/stories/2007/12/31/2129471.htm, accessed 15 October 2010.
44. D. Pauli, 2008. “No opt–out of filtered Internet,” Computerworld Australia (13 October), at http://www.computerworld.com.au/article/263637/no_opt-out_filtered_internet/, accessed 28 September 2011.
45. D. Earley, 2009. “Rudd’s Internet blacklist includes dentist, kennel, tuckshop,” Courier–Mail (20 March), at http://www.news.com.au/couriermail/story/0,23739,25214413-3102,00.html, accessed 20 October 2010.
46. C. Depken, 2006. “Who supports Internet censorship?” First Monday, volume 11, number 9, at http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/1390/1308, accessed 28 September 2011.
49. A. Moses, 2009. “Web censorship plan heads towards a dead end,” Sydney Morning Herald (26 February), at http://www.smh.com.au/news/technology/biztech/web-censorship-plan-heads-towards-a-dead-end/2009/02/26/1235237810486.html?page=fullpage, accessed 23 October 2010.
50. BBC World Service, 2010. “Four in five regard Internet access as a fundamental right: Global poll,” poll conducted for BBC World Service (7 March), at http://www.globescan.com/news_archives/bbc2010_internet/, accessed 28 September 2011.
51. N. Koumartzis, 2008. “BT’s CleanFeed and online censorship in UK: Improvements for a more secure and ethically correct system,” M.A. thesis, London College of Communication (University of the Arts, London), at http://webobserver.net/?p=146, accessed 28 September 2011; M. Ballard, 2006. “Govt sets target for blocking child porn sites” (18 May), at http://www.theregister.co.uk/2006/05/18/uk_site_blocking/, accessed 28 September 2010.
Received 25 November 2010; revised 6 February 2011; accepted 12 August 2011.
Copyright © 2011, First Monday.
Copyright © 2011, Nikolaos Koumartzis and Andreas Veglis.
Internet regulation: The need for more transparent Internet filtering systems and improved measurement of public opinion on Internet filtering
by Nikolaos Koumartzis and Andreas Veglis.
First Monday, Volume 16, Number 10 - 3 October 2011