This article outlines the nature of hacking and then draws implications from this for understandings of technology and society in the digital age. Hacking is analysed as a material practice related to computers and networks, taken up by two core groups: crackers, who break into other people’s computers and networks, and free software and open source programmers, who produce software based on an understanding of property as distribution. Hacking works constantly to develop determinations between technology and society in both directions. This conclusion is then theorised in relation to Hutchby’s concept of affordances and is compared to classic accounts of technological determinism. Accounts of technology and society in the digital age need to consider both technological and social determinations, to recognise that such determinations are particularly fluid in relation to programming, and to accept that understanding power and politics in relation to technology needs a concept of both technological and social determination.
Hacking and cracking
Hacking, free software and open source software
Hacking the social and hacking the non–hack
Power and hacking
The possibilities for power and society in times of digital media and the Internet are still not well understood. The changed possibilities for relations of power, particularly for something we might call “media power”, implied by radically changed relations of production, distribution, alteration and redistribution of any object that can be digitised at times seem obvious and at times hard to bring into focus. Does electronic voting change the nature of democracy? Does the emergence of citizen journalism and grassroots documentary–making significantly alter relations of power in the media? Do peer–to–peer file distribution systems fundamentally change the political economy of music?
Alongside this uncertainty about what is and is not significant in such changes, we can immediately see that technological determinism is often implied in accounts of the changes that come with the rise of digital media and the Internet. The emergence of new technologies is seen as producing new possibilities, and in questions — such as “how would citizen journalism, like Indymedia (http://www.indymedia.org), be possible without radically altered technologies of recording video and distributing it via the Internet?” — there are embedded notions of technology and causation.
Hacking is a significant contributor to such discussions, offering an empirical example that engages intimately with relations of power, technology and society. Hacking’s location at the heart of the central technologies of information societies also offers analysis a privileged vantage point for tracing the nature of twenty–first century societies. However, the nature and meaning of hacking is uncertain. It remains unclear exactly what “to hack” means. There are three existing groups or ideas about hacking that offer a beginning point: the hacker who breaks into computer systems; the hacker who writes software; and hacking as the essence of twenty–first century creativity. To take advantage of hacking’s central role in twenty–first century relations of power and technology it is necessary to develop a two–stage argument. The first stage will explore the nature of hacking, providing a broad definition. The second will take these conclusions and explore what they may tell us about power, technology and society.
The main works that explore the overarching social and political consequences of hacking are by Wark and Himanen. Himanen (2001) argues that hacking should be understood as a new philosophy of business. He believes hackers have created a new way of working appropriate to the twenty–first century that can be captured in seven values: passion, freedom, social worth, openness, activity, caring and, the highest value, creativity.
“… creativity — that is, the imaginative use of one’s own abilities, the surprising continuous surpassing of oneself and the giving to the world of a genuinely valuable contribution.” 
In opposition to this business orientation, Wark (2004) believes hackers are the new radicals of the twenty–first century. Hackers in their pursuit of free creativity turn out to be, for Wark, the revolutionary class of the twenty–first century.
“To hack is to differ … . Hackers create the possibility of new things entering the world. Not always great things, or even good things, but new things. In art, in science, in philosophy and culture, in any production of knowledge where data can be gathered, where information can be extracted from it, and where in that information new possibilities for the world [are] produced, there are hackers hacking the new out of the old.”
These are, in one way, opposed views of the political and social meaning of hacking, with one seeing hacking as an ally of business and the other seeing hacking as the agent of revolution. Yet, from a different angle, Wark and Himanen are similar because both see the hack as a moment of creativity or of making a difference. This means that both definitions of hacking have become detached from material practices related to computer, communication and network technologies and have made hacking into an abstract process of creativity or of producing differences. Their analysis eliminates empirical specificity, lifting hacking out of time and space and making it into an abstract “difference engine”. For this reason their work has limited value despite their seeming identification of the core of hacking.
To move an understanding of hacking forward we can adopt the claim that hacking means creativity but remember that hacks are also material practices that generally take place in relation to computer, communication and network technologies. Hacks involve interactions with such technologies and are thus material, as opposed to the abstract practices proposed by Wark and Himanen. A hack is a form of material practice that creates a difference in computer, communication and network technologies. To this initial definition we should add two conclusions that emerge from the most significant existing empirical work on hacking, found in Taylor (1999), Turkle (1984) and Thomas (2002). All three find that hackers are attracted to illicit acts, both in individual moments when they may transgress a law or social convention and in collective moments when they set up alternative technical infrastructures. From their work we can also draw a second proposition: that hackers are deeply concerned with differentiating and identifying themselves in relation to each other through arguments over the quality of hacks, couched in technical language. Drawing on this prior work, I suggest an initial definition: a hack is a material practice that involves making a difference in computer, communication and network technologies, which may well be illicit and which is subject to seemingly technical criteria of excellence through which community relations are negotiated.
Hacks are material practices because they require a whole infrastructure of computers, wires, programming languages, etc., and of positions of the body (such as the ability or not to touch–type). From this basis the repeated practices of entering commands into a computer can occur, creating moments when a hack happens. To place this initial definition into the world we can look at, in turn, the two central types of material practices of hacking and their communities: cracking and free programming (free software and open source). To understand the nature of hacking we need to engage with these two types of hacks, which I will do by outlining their key features. From this it will also be possible to see a central dynamic that both share and which, I argue, constitutes hacking as hacking. Once these are understood, a range of related practices that draw on this central dynamic can also be added, to ensure that the richness and diversity of hacking are visible. Following this it will be possible to draw some key lessons from the nature of hacking and to pose these in relation to issues of technological determinism in the digital age.
Hacking and cracking
Cracks are alterations to the technologies of computers and networks that turn the existing state of such computers and networks against their current uses, opening up illicit and unintended access to the cracker. In the mid to late 1990s, particularly through popular media interest, what I am calling cracks were largely what was known as hacking. However, with the return to prominence of hacking as clever programming, it is important to both include cracking within hacking and to distinguish it. To do so I will use the term cracking for the kinds of practices I am about to describe but will also argue that this is one of two key groups that define the overall nature of hacking.
The essentials, though not the richness, of cracking can be explained in the interaction between acts and community. Acts can be briefly outlined through four different types of cracks: day zero exploit, day zero+one exploit, social engineering and script–kidding. The acts are different types of “cracks”, and the community in which these actions are taken can be understood as being driven by a particular interaction between peer recognition and peer education. I will briefly outline these in turn.
A day zero exploit is a way of altering technologies to produce an unexpected change in authority over a computer/network technology that has never been achieved before. The famous crack of the home network of U.S. government IT security consultant Tsutomu Shimomura, which Kevin Mitnick was accused of carrying out, was a day zero exploit. The technique employed in this case was IP spoofing, which had been theorised but of which no instance had been publicly recorded. Shimomura scoured the log files of his computer network to reconstruct the attack; having understood it, he was then able to use his reconstruction to demonstrate that an IP spoofing attack had happened. A day zero plus one (or more) exploit is the utilisation of an existing technique. For example, someone seeing Shimomura present a talk about the attack on his network, who then explored the Internet and found further theoretical documentation of how to do such an attack, could then try to find targets and launch an attack. They may well also seek advice from other crackers, through online fora or in chatrooms. Though lacking the absolute originality of a day zero exploit, running day zero plus one exploits often requires a significant level of expertise and innovation within known vulnerabilities.
Social engineering is the cracking of computers without computers. Classic techniques include trashing or dumpster diving (looking through rubbish bins for information); shoulder surfing; and roleplaying (tricking people into giving out their details). Such “classic” social engineering might involve phoning someone up, pretending to be, for example, a computer engineer from the same company or institution as the person being phoned, and then talking them into giving out their username and password. As Wall (2007) has argued, social engineering has since moved into automation. One example is when hacked Web sites open up, unrequested, a window that tells the user their computer is likely to be infected with a virus. Then, no matter what the user does with this window, a graphic automatically opens up which shows a virus scan being run on the computer. This scan is actually doing nothing; it is simply a graphic, but it appears legitimate and at its finish it tells the user it has found an infected file and then, again automatically, asks the user to install software to clean the computer. Now, no matter what the user clicks on to get rid of the demand to install software (cancel, close window), the box requesting an installation of a “virus removal programme” continually reappears unless the user knows how to kill off the process. The supposed programme is in fact itself a virus, not a virus remover, and installation will infect a previously uninfected computer. Though this example uses network technologies and computers, the essence of the hack is social in its attempt to trick a user into infecting their own PC. Script–kidding involves taking known exploits and automating them. This reduces the knowledge necessary to launch an attack to the pressing of buttons on GUI programmes.
For example, the programme Back Orifice automates the installation of Trojan programmes on Microsoft Windows networks, allowing the person running it to act as a systems administrator and view all that is happening across a network. This is an often derided form of cracking, with “script–kiddy” used as an insult, because automation means less knowledge is necessary than for other types of cracks.
We can already see that at heart cracking is about taking a social ambition (to control a computer or network) and reordering technologies to fulfil that ambition, while at the same time starting from a technological basis (only being able to take certain actions due to the technologies to hand) from which only certain social ambitions may be realised. There are a number of elements here that need untangling and which this argument will return to, the core three being social, technological and causation/determination. Cracking is about using social norms to determine technological constraints and it is about using technological determinations to cause different social norms.
We can also see that across the four types of crack there is both a reliance on knowledge and a willingness to share knowledge. This combination points us to the community of crackers, and to the connection between peer education and peer recognition that occurs in cracking and gives cracking its dynamism. Peer recognition refers to the ways crackers validate each other as members of the cracking community, a process that exists in some form in all communities. Crackers have particular difficulties creating a process of recognising other peers because it can be difficult to prove the things they have done; it is easy for a cracker to claim they entered a particular site or invented a new technique, but how can they convince other crackers? The primary way of solving this problem is to explain techniques, which may be portable techniques that new crackers can take off to test or may be techniques specific to a particular site. If the proof of doing something is, fundamentally, showing someone else how to do it, then inevitably crackers are drawn into not just explaining something once but coaching others into repeating the crack. In this way a form of peer education is intimately tied into one of the basic processes of any community — how to recognise other members.
Crackers who wish to remain secret can do so but they run the risk of being excluded from their community; for example, losing chances at being enrolled in secured chatrooms or simply missing out on elite pizza–eating contests at conferences. The only alternative to secrecy, and the only consistent and convincing route to peer recognition for crackers, is to teach other crackers. This embeds a strong dynamic within cracking that constantly drives techniques and exploits upwards. This drive to greater complexity means that crackers are always attacking the existing state of technology and re–ordering it at the everyday level. One of the few quantitative academic investigations of cracking demonstrates this upward spiral by analysing reports of cracks to CERT (http://www.cert.org/), the major Internet security organisation operated with U.S. government funding. Howard (1997) created a taxonomy of security breaches and analysed CERT records between 1989 and 1995. The two trends in severe attacks that he identified were:
“First, the sophistication of intruder techniques progressed from simple user commands, scripts and password cracking, through the use of tools such as sniffers (1993) and toolkits (1994), and finally to intricate techniques that fool the basic operation of the Internet Protocol (1995). The second trend was that intruders became increasingly difficult to locate and identify.” 
The increasing sophistication of cracks is primarily the result of an ongoing dynamic in which peer recognition is fused with peer education.
This everydayness of cracking is an important quality, and the ephemeral nature of cracks is one consequence: defaced Web sites can be repaired by copying backed–up files, often in minutes; sites removed from the Internet by denial of service attacks are usually back again within hours or a day; cracked open networks can be patched, requiring new exploits. This makes crackers like ghosts, constantly interacting with technology, having their own actions determined by the technologies they are using while revising the technologies of others, but all at an everyday level. Crackers cause and are caused by intermingled technological and social determinations. This intermingling of the social, the technological and determination is a dynamic we will see repeated in the second key component of hacking, the free software and open source movement (FOSS).
Hacking, free software and open source software
The free software and open source movement (FOSS) always existed alongside crackers but was for a time barely in public view, particularly during the early 1990s when sensational accounts of cracks dominated media understandings of hacking. However, in the last 10 years the understanding of hacking as programming has become important again. Here we find experts making programmes in collaborative and open ways, with a novel understanding of property. Such programmes are significant particularly in the backrooms of IT, where they construct much of the virtual world; for example, BIND is the dominant DNS server and Apache runs around 60 percent of Web sites (Netcraft, 2007). On top of these numbers are some qualitatively important programmes such as the Emacs text editor and the GNU C compiler. In addition are some minority programmes, such as the Firefox Web browser or GNU/Linux operating system, which are growing in importance and already have both significant symbolic effects (in proving the ability of FOSS methods to create complex, stable programmes) and market effects (providing significant alternatives in quality and freedom to commercial dominance).
Two quick examples will help to stake out the territory occupied by FOSS. Both Linux and Apache are collaborative software writing projects. Apache began when a number of administrators of existing Web sites met to improve their work and began to integrate different programmes they were using. From this beginning a collaborative network of programmers developed which contributes code to the ever developing Web server programme called Apache. Programming is distributed among volunteers, who may also work for corporations willing to donate programmer time, but decisions are controlled through a council and formal voting rules. Linux began as a project by Linus Torvalds to add some functionality to an existing programme of his (a terminal emulator). By the time he had added more functions, he realised he had a working kernel (the central “butler” of an operating system) that could be added to various existing free software programmes to create a complete operating system, now usually called simply Linux (though the combination of Torvalds’ kernel and free software programmes means it might be more accurately called GNU/Linux). Torvalds released the Linux kernel and from the beginning received corrections from other users that he was happy to integrate. Linux is now structured so that anyone is able to look at the code and contribute changes, though the scale has grown so vast that there is now a series of “lieutenants” around Torvalds who oversee specific areas of code, with Torvalds remaining at the top as the final arbiter of what goes into new versions of the Linux kernel.
The massive programming efforts that characterise Linux, Apache and a series of other programmes — from full office software suites in OpenOffice, to powerful graphics programmes in GIMP and so on — are part of a movement and community. Weber’s (2004) influential work on the nature of open source identified the three components of FOSS to be property, community and politics. I have adapted these three to community, object and property because I believe that politics permeates property and community to such an extent that it cannot be sensibly separated. I also think that Weber underestimates the role of the object of programming. While still largely agreeing with much of Weber’s work, I argue the three components of FOSS are a community of collaborative experts, the importance of objects and new meanings of property.
Free software and open source programmers often refer to themselves as being part of a movement or a community, and they may well discuss how one of their defining characteristics is total access to source code, both to view and to change it. Source code refers to the series of instructions that make up any functioning software programme; access to the code means the ability to understand fundamentally how a programme works and to intervene in the programme, changing how it works. This access creates the FOSS characteristic of voluntary selection into tasks that are perceived to be important, interesting or likely to generate esteem. Such a form of contribution is codified in Raymond’s (2001) famous comparison of the cathedral and the bazaar as different models of programme production. Raymond compared the solo programming efforts of “genius” programmers, who produced to a single conception like building a cathedral, to a babbling bazaar in which all kinds of small scale programming efforts go on in parallel, finally, somehow, producing overall programmes. Though there are intermediate forms, as Apache demonstrates with its combination of distributed programmers and a formalised council and voting, Raymond’s bazaar captures how FOSS has developed into a distributed, collaborative programming effort from a wide range of coders. The community is composed overwhelmingly of those who can programme and so presumes a quite particular form of expertise.
The second element of FOSS is the importance of objects, or as hackers sometimes put it, “does it run?”. One often under–appreciated facet of FOSS is the way it has a particular object around which there are possibilities for the resolution of social and technological debates. How do the communities of distributed, collaborating experts organise themselves to focus on particular projects and so achieve things like Linux or Apache? A key answer is that controversies revolve around whether programmes “run”; this is not meant in a simple technological sense, rather “to run” becomes a framework within which a controversy or direction can be resolved.
There are cultural factors at work in what it means to “run” but the object that is a software programme produces a way of closing debate by executing a command and drawing some conclusions from whatever happens next. “Does it run” captures how FOSS has a means of resolving disputes and moving projects forward by focusing cultural and technological concerns on specific and tangible tests of software. That the object is a software programme whose inner workings are open to change means that FOSS produces a dynamic relationship to the opening and closing of its particular technologies. This dynamism is driven not just by collaborating experts who have a particularly protean form of technology at their fingertips but also derives from the third component of FOSS, its re–conception of property.
As Weber argues, FOSS is fundamentally organised around property as the right to distribute, in opposition to our normal notion of property as the right to exclude. We usually use property relations, and all their attendant institutions such as law courts and the police, to exclude people from using the things that are our property; my car can only be used by me and by those I offer it to, and I control access to my house. FOSS builds on the rights of exclusive use of property, and hence existing laws and legal frameworks, to invert “property as exclusion” and enforce distribution. This applies particularly to source code and the right to change source code, but it also requires that any changes to source code be redistributed to the world. In this moment exclusion is turned into distribution on the basis of the property owner’s right to define what exclusion means in relation to their property. The moment of inversion is that someone who owns a programme effectively claims: “I have exclusive rights to define use of this programme within the law and I define legitimate use as free access, free ability to change and the demand to redistribute changes; I therefore also define illegitimate use as denying others access to changes.” Though the essential insight is defined in this way, to enforce it in existing legal contexts requires significant legal work, which has been undertaken through the definition of a range of licences that aim to codify the right of distribution based on the right of exclusion. The most famous such licence is the GNU General Public License (of which there have been three versions) but there are others.
Taken together, the trio of a community of experts, resolution through objects and property as distribution defines a powerful engine for programme production, one which also embeds within itself the nature of digital objects. As with cracking, we see in FOSS a constant interaction of technology and society, including moments of determining and of being determined. A FOSS hacker can take a particular technology, in the source code of a programme, and so be determined by what that code allows, but can then, because they have expertise, community support and access to the code, start to alter the programme, thereby injecting various social imperatives into that technology and so redefining whatever determinations will affect a subsequent user of that programme. Unlike cracking, FOSS’s determinations tend to result in infrastructures rather than everyday moments. While the making and unmaking of social and technological determinations occurs in the everyday moments of programmers typing on computers to churn out code and then passing such code around to be integrated into programmes, the result has been a wide range of programmes that form much of the infrastructure of digital life (like my use of OpenOffice to write this, which is in turn running on a version of Linux, which in turn relies on BIND, among other programmes, to connect to the Internet). This is clearest in the internals of the Internet, where many FOSS programmes run in ways invisible to most of us.
Cracking and FOSS create the core relations of hacking, both of them engaging in varied and contradictory ways with society, technology and causation, one primarily at the everyday level of computers and networks and one primarily within digital infrastructures. It is this intermingling of social and technological determinations that marks out the dynamism of hacking and its broad social significance.
Hacking the social and hacking the non–hack
Between cracking and FOSS we have the core of hacking in their joint explorations of how to engage in social and technological determinations of computer and network technologies and socialities, but this is not all there is to hacking. Around the material practices of crackers and FOSS hackers there are a range of other activities that demonstrate the breadth and richness of hacking. I will group these roughly under the two headings of hacking the social and hacking the non–hack and briefly introduce them.
Hacking the social refers to groups that take up the interaction of technological and social causation in computer and networked technologies created by hackers and then apply it to changing society. The key distinction of these groups from cracking/FOSS is often not personnel, techniques or cultures but that, for hacking, the constant and simultaneous use and denial of both technological and social determinisms is in itself the key social change, whereas for the following groups it is a tool to effect different types of social change. Instead of playing within constantly revised determinations of technology and society, some take the ability to make such revisions and apply it to making social change through political activism, war, terrorism or crime.
Hacktivists are political activists, most often associated with the alter–globalisation movement, who utilise hacking techniques to create grassroots activist political campaigns. Hacktivists produce both ephemeral electronic civil disobedience actions, such as blocking online sites with mass electronic action, and infrastructures of secure, anonymous communication, often to support human rights workers (Jordan and Taylor, 2004). Cyberwar is the use of cracks by one nation–state against another nation–state. For example, it is widely believed that the Chinese government has been hacking both the U.S. and European governments for some time, seeking out illicit information and in one incident crashing the U.K. Parliament’s network (AFP, 2005; Norton–Taylor, 2007; Pilkington and Johnson, 2007). Cyberterror is the use of cracking techniques in a strategy of seeking social change through attacks that produce psychological as much as physical damage. There are almost no recorded cyberterror attacks, though the potential for damaging communication systems, infiltrating civilian infrastructure or attacking military targets seems clear (Verton, 2003; Weimann, 2006). Finally, cybercrime is the use of cracks to generate personal gain, usually financial gain. The possibilities here range from widespread “phishing” attacks that seek to compromise individuals’ computers to cracking open bank systems to illicitly transfer money (Wall, 2007; Yar, 2006).
All four groups here take an existing social phenomenon and reinvent it by injecting hacking’s ability to order and reorder social and technological determinations; they all apply hacking to society. The second cluster of activities related to hacking I have termed hacking the non–hack, and these particularly help in seeing the boundaries of hacking, for example by separating programming from hacking. There are three groups here.
The first group is the Creative Commons, largely based on the arguments and organisation of Lawrence Lessig (2001). The Creative Commons tries to extend hacking’s property principles to all copyrightable materials. Here one of hacking’s key innovations, the inversion of property as exclusion to property as distribution, is taken outside of hacking itself and applied widely. A second group are non–programmers. Eric Raymond’s role as a publicist and marketer for open source included helping to organise a new name for the movement in “open source”. Eben Moglen founded the Software Freedom Law Center (http://www.softwarefreedom.org/) to provide free legal advice to FOSS hackers and was one of two people who worked on writing the third version of the GNU General Public License (Moody, 2006). Here is hacking without programming. The third group is the programming proletariat who, like most FOSS hackers, programme the code that makes up software programmes but, unlike FOSS hackers, do not choose or control which technological and social determinations they engage with. In the great coding factories of corporations like Microsoft, Oracle and Google are many programmers who are often allowed symbols of hacking’s freedom — such as workplaces laid out like university campuses, for example at Microsoft’s Redmond home — but who actually have none of the freedom hackers have. Here is programming without hacking (Bronson, 1999).
Three points emerge from considering these seven groups that draw on the central dynamics of hacking. First, we see from “hacking the social” that hacking’s approach to society and technology can be politicised and utilised in the service of groups of non–hackers who have non–hacking aims. It is accordingly important to hold onto and explore a bit further what is meant by “determination”, because it is hacking’s ability to determine and redetermine that these politics find useful.
Second, we can see boundaries being drawn around hacking, but these boundaries cannot be simply equated with programming or software. Rather, we have seen programming non–hackers and non–programming hackers. This means it is important to keep in focus the dynamics of hacking when trying to understand what hacking is, as opposed to some of the obvious markers of hacking (such as programming skills).
Finally, the example of Creative Commons emphasises that it is the protean nature of programming that underpins the dynamism hackers can give to determining and redetermining. Creative Commons shows this when it develops non–hacking positions because of the different nature of its object. With artistic products, a work is perhaps finished when the artist says so, and a revision is an entirely new object, not a revised or “version 1.1” object. For this reason Creative Commons allows what is rejected in FOSS: a license that says you can take but cannot change the content of the object. Software continually develops or can be developed while remaining the same object, whereas the re–use of the words or sounds of a poem or song tends to create a new poem or song. Such malleability while remaining the same thing is one factor underlying the simultaneous and contradictory determinations of society and technology that lie at the heart of hacking. It is because crackers and FOSS programmers can constantly take a particular piece of software and then open it up that they can be determined by a technology as they simultaneously re–determine what that technology can do.
We now have the whole of hacking before us: the central dynamic produced by cracking and FOSS, together with the various groups who hack the social or hack the non–hack. We can also see that this central dynamic is based on the nature of software and programming, that it can be politicised in non–hacking ways through the notion of determination, and that the boundaries of hacking are determined by this dynamic and not by what are revealed as more superficial markers, such as whether someone is a programmer or not. Having reached this position, we can now turn to look at how all this relates to our understandings of technology and society in the information age. The most useful starting point for such an analysis is to return to the concept of technological determinism.
“Old fashioned” technological determinism looks something like this:
“In view of the simplicity of technological engineering and the complexity of social engineering, to what extent can social problems be circumvented by reducing them to technological problems? Can we identify Quick Technological Fixes for profound and almost infinitely complicated social problems, ‘fixes’ that are within the grasp of modern technology, and which would either eliminate the original social problem without requiring a change in the individual’s social attitudes, or would so alter the problem as to make its resolution more feasible?” 
Weinberg’s (1986) answer to his questions was a qualified “yes”, citing the hydrogen bomb as a technological fix for the problem of war. Weinberg’s version of technological determinism sees implementing a technology as an intervention from outside the social that corrects a social problem or directs society in a certain way. Winner’s (1977) classic discussion of technological determinism argues that an extreme view of technological determinism requires two hypotheses: “(1) that the technical base of society is the fundamental condition affecting all patterns of social existence and (2) that changes in technology are the single most important source of change in society.”  He notes that almost nobody holds this extreme version but many people hold related lesser versions. For example, a version is common in which technology is not “the single most important source of change” but is still an essential and important source of social change.
These kinds of technological determinism place technology at a “social” or macro level, in which technological changes, usually a series of retrospectively linked technological changes, force widespread social change. Such views have been widely disproven across the social sciences, for a number of reasons but chiefly because they position technology outside of society, whereas extended work in the social studies of technology and science demonstrates that technologies are as much social enterprises as anything else (Mackenzie and Wajcman, 1999). Yet despite the widely accepted refutation of technological determinism that social studies of technology offer, hacking suggests a difficulty with views that fail to take account of some kind of mutually determinative power between technologies and societies. An absolute rejection of technological determinism in any form makes activities such as FOSS programming very difficult to understand, because of the interaction between determinations that constitutes it.
It follows that it is important to acknowledge a problem with accounts of technology that imply or argue that technologies do not have specific material and determinative effects. Hutchby’s work is relevant here, and he suggests that the concept of affordances provides an answer.
“… different technologies pose different affordances, and these affordances constrain the way that they can be read. … The concept of affordances is associated with the work of Gibson in the psychology of perception. For Gibson, humans, along with animals, insects, birds and fishes, orient to objects in their world (rocks, trees, rivers, etc.) in terms of what he calls their affordances: the possibilities that they offer for action. … Affordances may differ from species to species and from context to context. However, they cannot be seen as freely variable. While a tree offers an enormous range of affordances for a vast variety of species, there are things a river can afford which the tree cannot, and vice versa.” 
Hutchby suggests that while technologies can be read or utilised in variable ways, they also possess “affordances” that constrain the range of actions that can be taken with a particular technology. This is useful for this account of hacking because it captures the sense in which technologies both are and are not open to social influence, and it reasserts the influence of technology on society. It is this mutual sense of determination and redetermination that is at the core of hacking. A further aspect of Hutchby’s argument is worth drawing out: though he does not use the term, the everyday seems to be the site of these mutual determinations.
“… the uses and the ‘values’ of things are not attached to them by interpretative procedures or internal representations but are a material aspect of the thing as it is encountered in the course of action. We are able to perceive things in terms of their affordances, which in turn are properties of things; yet those properties are not determinate or even finite, since they only emerge in the context of material encounters between actors and objects.” 
It seems reasonable to connect Hutchby’s locating of affordances in the moment when action is taken to the notion of the everyday. In this way, we can retain the overall conclusion of social studies of technology, which make a strong case for the difficulty of distinguishing society and technology at a macro or conceptual level, while retaining the inter–relations of affordances (or determinations) that we can see at work in the everyday. Hacking suggests not only that technological determination is a productive attitude for hackers to take in their everyday lives, but also that social scientists need to be careful in locating the different levels at which they analyse inter–relations of technology and society. The most important consequence of this is the recognition that technological determinism is alive and well in all our everydays, and we can understand better what this means if we utilise a concept like Hutchby’s affordance, which understands determination as a constraint on the range of possible actions and not as a compulsion on human actors to take one specific action only.
Hacking shows us that determinism or affordance is present and productive. Hacking also shows that there is not one form of determinism covering all forms of technological and social relations; rather, there is a constant interweaving of affordances, of both restraints on and productions of possible actions in socio–technological contexts. Hacking embodies a politics in its constant renegotiations of technological and social affordances. It is for this reason that I think the term “determinism” remains useful, because it does not entirely lose the idea of compulsion. By paying attention to the combined negative and productive aspects of determinations or affordances, and by looking at the range of potential actions that are disabled and enabled by particular moments of technological or social determination, we can identify and explore the politics of socio–technological moments. For example, we should see in hacktivists’ attempts to construct secure, private online communications an attempt to instantiate a technological determination by which the Internet enables secure communication. Or, in the constant programming of those working in FOSS, we should see software that puts into effect a politics whose ethics involves both a wider freedom and technical excellence.
Affordances productively point us toward the ways technologies are open to re–use and re–interpretation by different people, leading to different uses of technology, while at the same time noting that an individual technology is not open to just any use; you cannot make ice cream with a word processing programme, though you can edit an ice cream–making recipe. This re–insertion of the material into everyday actions (leaving aside the issue of whether the actor is human or not) allows a politics of technology to emerge in the ways different technologies create different sets of possible actions, which in turn build different social norms and institutions. This politics is inaccessible to those who reject any discussion of determination and seek the neutrality of connections and associations instead of the politics of affordances and determinisms. To keep this notion of politics to the fore in the analysis I have proposed, it seems appropriate to pull the notions of affordance and determination together. For this reason, hacking can be thought of as a community engaged in constant determinative affordances between technology and society in the realm of computer and network technologies.
Power and hacking
In conclusion, there are three wider points that emerge from my account of hacking. These relate to the nature of technological determinism, the specificity of hacking and software programming and the nature of socio–technological power. All three of these are drawn from the account of hacking I have given which sees hacking as a living community that has at its heart a complex interaction between cracking and FOSS, around which some seek to hack the social and others explore the boundaries of hacking, programming and non–hacking. The overall meaning of hacking emerges in a constant negotiating and renegotiating of determinative affordances between technology and society and between society and technology.
The first wider point is that social and technological determinisms exist, but they exist in everyday moments: the moments when actors of all kinds take actions that depend on a certain technology and may intervene in that technology. A consequent issue, beyond the scope of this article, is the extent to which such everyday moments coalesce into social institutions or structures of some kind. At first glance, such wider, perhaps “macro”, socio–technological structures seem likely, given the widespread impact of such things as FOSS’s revisions of property relations or work on encryption. However, it is beyond the scope of the present argument to establish this; instead, the key point is that we will fail to grasp socio–technologies if we fail to put into play not only the everyday social shaping of technology but also the everyday technological shaping of society.
Second, everyday moments of determinative affordances are malleable and unstable, and this is particularly the case with software (and, to an extent, any digitisable object). The openness to revision that is part of software programming is the main producer of the accelerated interaction between technological and social everyday moments that hacking thrives on. Not all objects are so open to reworking; not all technological forms can be so quickly and easily altered and then re–distributed as software can.
Third, the power and politics of the socio–technological world, as seen from the standpoint of hacking, are driven by combinations of determinative affordances, in the sense of the production, in both technologies and societies, of only certain possible courses of action; determination is neither linear nor singular but exists in the production of fields of possibilities. It is not determination in the sense of being forced to do one thing, but it is determination in the creation and denial of different options for action. Hacking allows us to see ways of emphasising the politics of technology.
Finally, at the end of this argument I have reached hacking and power. The politics of hacking lies both in the particular, for example in the free information politics of crackers or the property–as–distribution politics of FOSS, and in the general, in the identification of everyday determinative affordances as the central process of technological politics. For analysts of information, communication and society, hackers and hacking offer both an important and complex object and a way to rethink approaches to society, technology and socio–technological power.
About the author
Tim Jordan is a Reader in Sociology at the Open University. He has researched and published in both Internet studies and social movement studies. In relation to Internet studies, he has recently worked on politically motivated hacking with Paul Taylor (Hacktivism and cyberwars: Rebels with a cause? — Routledge, 2004) and is building on this and previous work on hacking communities to create an overview of hacking (Hacking: Digital media and technological determinism — Polity, 2008). He is currently exploring the world of massive multiplayer online games both as a player and as an analyst. For social movement studies, he was a co–founder of the journal Social Movement Studies: Journal of Social, Cultural and Political Protest (Taylor and Francis) and published Activism! Direct action, hacktivism and the future of society (Reaktion Books, 2002).
E–mail: t [dot] r [dot] jordan [at] open [dot] ac [dot] uk
1. Himanen, 2001, p. 141.
2. Wark, 2004, pp. 3–4.
3. Wark, 2004, p. 72 f/n.
4. More detailed evidence for the following account can be found in Jordan (2008).
5. Shimomura, 1996, pp. 86–91; Littman, 1996.
6. Weblog, 2008; Wall, 2007, pp. 130–141; Mitnick and Simon, 2005, pp. 221–245.
7. Jordan and Taylor, 2004, pp. 111–114.
8. Howard, 1997, section 16.6.
9. Weber, 2004, pp. 186–187.
10. Torvalds, 2001; Weber, 2004, pp. 163–168.
11. Weber, 2004, pp. 1–4.
12. Moody, 2002, pp. 166–167.
13. Weinberg, 1986, p. 32.
14. Winner, 1977, p. 76.
15. Hutchby, 2001, pp. 14–23.
16. Hutchby, 2001, p. 26.
17. Hutchby, 2001, p. 27.
AFP, 2005. “Hacker attacks in U.S. linked to Chinese military: researchers” (12 December), at http://www.spacewar.com/news/cyberwar-05zzq.html, accessed 1 June 2007.
P. Bronson, 1999. The nudist on the late shift: And other tales of Silicon Valley. New York: Secker and Warburg.
P. Himanen, 2001. The hacker ethic: A radical approach to the philosophy of business. New York: Random House.
J.D. Howard, 1997. “An analysis of security incidents on the Internet 1989–1995,” PhD dissertation, Carnegie Mellon University, at http://www.cert.org/research/JHThesis/, accessed 10 October 2007.
I. Hutchby, 2001. Conversation and technology: From the telephone to the Internet. Cambridge: Polity.
T. Jordan, 2008. Hacking: Digital media and technological determinism. Cambridge: Polity.
T. Jordan and P. Taylor, 2004. Hacktivism and cyberwars: Rebels with a cause? London: Routledge.
L. Lessig, 2001. The future of ideas: The fate of the commons in a connected world. New York: Vintage Books.
J. Littman, 1996. The fugitive game: Online with Kevin Mitnick. Boston: Little, Brown.
D. Mackenzie and J. Wajcman (editors), 1999. The social shaping of technology. Second edition. Buckingham: Open University Press.
K.D. Mitnick and W.L. Simon, 2005. The art of intrusion: The real stories behind the exploits of hackers, intruders & deceivers. Indianapolis, Ind.: Wiley.
G. Moody, 2006. “‘A lawyer who is also idealist — How refreshing’,” Guardian (30 March), at http://technology.guardian.co.uk/weekly/story/0,,1742104,00.html, accessed 3 July 2007.
G. Moody, 2002. Rebel code: Linux and the open source revolution. London: Penguin.
Netcraft, 2007. “January 2007 Web server survey,” at http://news.netcraft.com/archives/2007/01/05/january_2007_web_server_survey.html, accessed 15 February 2007.
R. Norton–Taylor, 2007. “Titan rain: How Chinese hackers targeted Whitehall,” Guardian (5 September), p. 1, and at http://www.guardian.co.uk/technology/2007/sep/04/news.internet, accessed 1 July 2009.
E. Pilkington and B. Johnson, 2007. “China flexes muscles of its ‘informationised’ army,” Guardian (5 September), p. 12, and at http://www.guardian.co.uk/technology/2007/sep/05/hacking.internet, accessed 1 July 2009.
E.S. Raymond, 2001. The cathedral and the bazaar: Musings on Linux and open source by an accidental revolutionary. Revised edition. Sebastopol, Calif.: O’Reilly.
T. Shimomura, with John Markoff, 1996. Take–down: The pursuit and capture of Kevin Mitnick, America’s most wanted computer outlaw — by the man who did it. New York: Hyperion.
P. Taylor, 1999. Hackers: Crime in the digital sublime. London: Routledge.
D. Thomas, 2002. Hacker culture. Minneapolis: University of Minnesota Press.
L. Torvalds, with D. Diamond, 2001. Just for fun: The story of an accidental revolutionary. New York: HarperBusiness.
S. Turkle, 1984. The second self: Computers and the human spirit. London: Granada.
D. Verton, 2003. Black ice: The invisible threat of cyber–terrorism. New York: McGraw–Hill.
D.S. Wall, 2007. Cybercrime: The transformation of crime in the information age. Cambridge: Polity.
M. Wark, 2004. A hacker manifesto. Cambridge, Mass.: Harvard University Press. (This text is written as numbered epigrams without page numbers; references are therefore to the epigrams.)
S. Weber, 2004. The success of open source. Cambridge, Mass.: Harvard University Press.
Weblog, 2008. “‘In the wild’ social engineering,” at http://www.roundtripsolutions.com/blog/2008/02/08/320/in-the-wild-social-engineering/, accessed 3 March 2008.
G. Weimann, 2006. Terror on the Internet: The new arena, the new challenges. Washington, D.C.: United States Institute of Peace Press.
A. Weinberg, 1986. “Can technology replace social engineering?” In: A. Teich (editor). Technology and the future. London: St. Martin’s Press, pp. 30–39.
L. Winner, 1977. Autonomous technology: Technics–out–of–control as a theme in political thought. Cambridge, Mass.: MIT Press.
M. Yar, 2006. Cybercrime and society. London: Sage.
Paper received 13 February 2009; accepted 10 June 2009.
This work is in the Public Domain.
Hacking and power: Social and technological determinism in the digital age
by Tim Jordan.
First Monday, Volume 14, Number 7 - 6 July 2009