Studying the materiality of media archives in the age of digitization: Forensics, infrastructures and ecologies
First Monday

by Zack Lischer-Katz



Abstract
Discourses on the immateriality of digital information have given way to a radical rethinking of digital materiality within information studies, cinema and media studies, and other related fields. This rethinking occurs as media archives increasingly convert their collections into digital form, stored less and less on shelves in climate-controlled vaults and instead stored and accessed through data servers. This newfound complexity, and the potential for the media archive to become “black boxed,” calls for a critical approach to media archives that takes into account these new realities and their material entanglements at multiple scales of analysis. Newfound scholarly interest in multiple scales of materiality, from the micro (computer code, transmission protocols, etc.) to the macro (large, sociotechnical infrastructures), may offer new directions for rethinking the nature of media archives in the age of digitization. This paper reviews the literature on these recent trends and identifies three intertwined research approaches for analyzing emergent phenomena related to the digitization of media archives: Critical forensic, infrastructural/institutional, and global/ecological perspectives. These approaches help to extend postmodern conceptualizations of the archive by showing how the archive as infrastructure is bound up in unfolding political, ecological and epistemological struggles.

Contents

Introduction
New materialist perspectives on media
New directions for conceptualizing media archives in the age of digitization
Directions for future inquiry
Conclusion

 


 

Introduction

Media archives are increasingly digitizing their holdings and adding them to data centers, contributing to the online agglomeration of media collections through commercial streaming media services. Our experience of media culture is increasingly mediated by Internet-based services that rely on large-scale infrastructures. Digitization seemingly replaces the materiality of media artifacts with a simultaneously evanescent and omnipresent immateriality; media content is increasingly accessed through search boxes across multiple devices, seemingly everywhere and nowhere at once. In fact, the materiality of the archive has merely been displaced: it now dwells within the infrastructures that undergird the Internet, leading to new entanglements with ecological, political and technological processes that have global implications.

Since the early 2000s, a growing number of scholars in information studies, media and communication studies and related fields have begun to radically reconceptualize the materiality of digital media and infrastructures. Information infrastructures — the fiber optic cables, network switches, and servers — all exist somewhere on earth, frustrating the modernist urge to separate information from its material support and to dislocate it from place, time and context. Instead, information is shown to still require some form of material support, only now it is increasingly moved to off-site storage in data warehouses. Ignoring these large-scale infrastructures is now seen as increasingly risky: not only does the rhetoric of de-materialization risk concealing the political economies that shape and sustain information infrastructures, including world intellectual property regimes (Vaidhyanathan, 2006) and digital rights management technologies (Gillespie, 2007), and the shaping of scholarly knowledge production (Manoff, 2006; 2013), but it also conceals the ecological toll imposed by computing technology, its carbon footprint and toxic materials, which pose a threat to both human civilization and its vast archives of recorded knowledge (Cubitt, et al., 2011; Davis, 2015). Storing data requires considerable energy resources, both for the electricity that runs the servers and their peripherals and for cooling and dehumidifying the ambient air around them. This extends the needs of earlier analog archival collections of film and videotape, which are not necessarily replaced by digital storage but must continue to be maintained alongside it. Starosielski (2014) suggests an ongoing linkage between storing archival analog media collections and the ecological impact of the necessary heating and cooling systems:

The need for cooling is shaping the geography of global Internet distribution, relocating some of its nodes to the colder climates of Oregon and Scandinavia. [...] This is not a strictly digital phenomenon: Archives also need to be cooled, and our ability to access media history via nitrate film, for example, is dependent on extensive temperature control. In a related vein of media research, climatic zones, including the cooled archives of digital content, the development of “polar media” across the Arctic regions, and the regulation of heat inside museums, have come to define a new set of parameters for analysis. [1]

For media archives that are digitizing existing tape-based analog originals, the movement from analog to digital will require an enormous amount of data storage capacity that will need to be refreshed and migrated several times each decade. The amount of energy and mineral resources required to operate data centers and produce new storage media for the long-term preservation of media content is quite high. It has become widely accepted in the video preservation community that digitization is the only way to ensure long-term preservation of analog video content. It is estimated that globally, there are roughly 400 million analog video and audiotapes that are at risk of soon becoming unplayable due to decay or the unavailability of playback equipment (Tadic, 2016). Linda Tadic (2016) has estimated that storing the data from all those digitized tapes would require 14.6 exabytes [2] of storage space, and with the need to store multiple copies to ensure long-term preservation, that number doubles or triples. Tadic (2016) also points out that those 400 million tapes will need to go somewhere once they have been digitized, either to landfills or to recycling plants, as the equipment used to play back tapes becomes increasingly scarce and tapes become virtually unplayable artifacts. The ecological impact of digitization and the long-term storage of digitized media collections will likely be significant.
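To put these figures in perspective, a back-of-envelope calculation based on Tadic’s published estimates can be sketched as follows. The assumptions that one exabyte equals 10^18 bytes and that storage is averaged evenly across all tapes are mine, not the source’s:

```python
# Back-of-envelope check of Tadic's (2016) storage estimate.
# Assumptions (not from the source): 1 exabyte = 10**18 bytes,
# and data is averaged evenly across all tapes.

TAPES = 400_000_000        # at-risk analog audio/videotapes (Tadic, 2016)
ONE_COPY_EB = 14.6         # estimated storage for a single digitized copy

# Implied average data volume per digitized tape, in gigabytes
per_tape_gb = (ONE_COPY_EB * 1e18) / TAPES / 1e9
print(f"Implied average per tape: {per_tape_gb:.1f} GB")   # ~36.5 GB

# Redundant copies multiply the total storage requirement
for copies in (2, 3):
    print(f"{copies} preservation copies: {ONE_COPY_EB * copies:.1f} EB")
```

The implied average of roughly 36.5 GB per tape is consistent in order of magnitude with an hour or two of digitized standard-definition video per tape.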

The digitization process also effectively moves the media artifacts away from the direct engagement of archivists, scholars, and the public. By moving media archives to data servers, often off-site and/or hosted by a cloud service, we no longer have direct experience of the material conditions of the media. Miller (2015) suggests that the study of media is in fact more intimately tied to industrial production, mining and the search for rare metals than ever before:

We were brought up believing that mining and manufacturing were the world’s principal polluting culprits. The difficult news for media and cultural studies and the art world is that our beloved electronics are also crucial components of this destructiveness. Their toxic parts, forms, and norms pervade our world, from old fat-screen television sets to modish computing clouds [... .] The deleterious effects of these technologies is felt in the mines and factories that produce them, the offices and cars that house them, and the municipal dumps and fire pits that bury them. [3]

How does our understanding of media archives change when we start to look to the infrastructural systems, the granular materialities of codes and their magnetic inscriptions, and their entanglement with these global ecologies, networks of global distribution and ways of knowing the world? Answering this question involves opening up the “black box” [4] of the archive in order to fully understand the multiple scales, temporalities and logics that shape media archives, and are typically hidden from analysis within the archive’s opacity. Without opening the “black box” of the archive and tracing its infrastructural entanglements with complex micro and macro systems and processes, we risk losing our critical awareness of the epistemological, institutional and/or ecological factors that shape the ontology and epistemology of archival media.

Taking into account the multiple scales of the materiality of media is particularly important because the media archive no longer stops at the boundaries of its walls (if it ever did). Instead, it uncoils over a landscape, serpentine and flickering; it does not restrict itself to one epistemological position, but generates and multiplies many that may twist and claw for dominance; and the archive itself is only one institution among many embedded in networks of standards, specifications, protocols, and other entities that construct the conditions for the encoding, storage, transmission and display of digital media. Integrating new materialist perspectives on media into thinking about media archives will provide the necessary critical tools with which to consider the new reality of the media archive as digitized collections become enmeshed in complex media infrastructures.

 

++++++++++

New materialist perspectives on media

New materialist perspectives on media and communication question several key assumptions of traditional media scholarship. First, they question the “division between content and medium” [5], in which media technology is seen as merely a channel for communicating content. Packer and Crofts Wiley (2012) suggest: “A division between content and medium remains in much contemporary scholarship. In such work, media technologies only seem to matter in terms of how the meaning of content might be altered by differing modes of technological transmission” [6]. Second, new materialists seek to uncover the hidden role played by the infrastructures that support media systems, enabling the storage, transmission and display of media content. Infrastructures include traditional media production models (e.g., the studio system), but also networked digital infrastructures, HVAC systems, and electrical grids, all of which are implicated in the transmission of media. With digitization, access to media archives becomes networked, distributed and ubiquitous, making it difficult to see the materiality of the media we consume, except when things break down. We stop seeing media infrastructures, because like all infrastructures, when they are fully functioning they become “transparent” to users (Star and Ruhleder, 1996). As Bowker and Star (1999) suggest, “the easier they are to use, the harder they are to see. As well, most of the time, the bigger they are, the harder they are to see” [7], and they only become visible when they break down (Star and Ruhleder, 1996). The “materialist turn” can in some ways be seen as a corrective to this tendency to lose sight of our information infrastructures.

The turn to materiality offers a range of new approaches that consciously work against attempts to create subject/object divides, and instead embrace the entanglement of human perception and media technologies within historically, geographically and ecologically situated materialities. This turn to materiality is also a push back against the discourses of immateriality that have accompanied the last three decades of digital culture and have found their apotheosis in the ethereal and all-knowing “cloud.” New materialist approaches are particularly important in rethinking our approach to media archives given the radically new relationships between technology, people and information, and in light of what Coole and Frost (2012) call “the saturation of our intimate and physical lives by digital, wireless, and virtual technologies” [8].

 

++++++++++

New directions for conceptualizing media archives in the age of digitization

With technological change, maintaining the media archive has become increasingly integrated into complex systems that conceal their inner workings and obfuscate a full picture of their political, ecological and epistemological dimensions. Storage of physical media on shelves in climate-controlled vaults gives way to a more complex world of media preservation. With the digitization of nearly all legacy and contemporary media formats, the material support of media has been increasingly black-boxed behind layers of hardware, software, network systems, standards and protocols. Librarians and archivists, for their part, are very conscious of the “materiality of bits” (Blanchette, 2011), and go to great lengths to produce effective policies and systems to organize and maintain the logical integrity of physical bits stored on magnetic disks and tapes, alongside their traditional custodianship of physical documents and artifacts. However, they may at times treat their preservation formats, tools and systems as transparent media that produce unproblematic, “authentic” copies for access and preservation [9]. Like any well-designed tool, digitization technologies become extensions of our bodies/minds (like Heidegger’s hammer, the tool becomes an extension of one’s self). The epistemological status of the copies they produce is not entirely unproblematic; such copies are products of particular social and technical conditions of production that operate under particular epistemological assumptions about what counts as evidence, knowledge and authenticity [10]. Additionally, even as media preservationists are concerned with the technical systems that support digital and analog collections (heating and cooling, shelving, computer networks, software, hardware, etc.) and their integration with other systems, this focus on information infrastructures may overlook larger ecological and geopolitical forces that can impact long-term sustainability and the multiple scales at which the materiality of media technologies operates. Thus, in the following sections I integrate the “material turn” with the sociotechnical construction of knowledge in media archives in the age of digitization in order to generate three important directions forward for critically approaching the materiality of media archives, which I term the critical forensic perspective, the infrastructural/institutional perspective and the global/ecological perspective.

These cases are drawn from the first decade and a half of the twenty-first century, which follows an extended period of digital library research wherein libraries and archives experimented with scanning their collections of books and photographs. By the early 2000s, media archives were experimenting with ways of digitizing their own collections of film, audio and video materials. These examples were chosen for their variety of sites and scales, and are drawn from recent historical events in order to draw attention to emergent trends that warrant these critical approaches to the materiality of media archives.

Critical forensic perspective on media archives

The digitization of archival media transforms the materiality of the media artifact, and the proliferation of born-digital media (i.e., media originally created in digital form) has encouraged the adoption of a new set of analytic tools for the media archive. These new approaches need to be critically considered because the digitization of analog media collections and the use of the techniques of digital forensics are producing new ways of seeing and knowing media artifacts held by archives. New epistemologies are brought into play within the digitization process, as preservationists and system designers embed assumptions about visual perception within their tools and techniques of digitization. The techniques of digital forensics are increasingly being applied in digital humanities scholarship. Shep (2016) suggests “new media forensics, part of the digital humanists’ intellectual toolkit, depends on recognizing the fundamental materiality of digital forms, extracting evidence of its existence, and interpreting its individual, unique manifestations” [11]. The assumptions of forensics transform our understanding of digital artifacts, making them analyzable as “palpable bits and bytes of electronic hardware and software that are ubiquitous, that leave traces, and that can be read as evidence of the creation, dissemination, reception, and preservation of these new communication forms” [12].

However, caution must be exercised when considering forensics as a guiding approach to archives. The epistemological basis of forensic science embeds particular assumptions about knowledge and particular systems of verification and evidence that are based on hierarchical relations of power, positivist constructions of knowledge, and the role of evidence. By adopting these techniques, we have to be cautious about importing epistemological traditions of knowledge production that may operate in conflict with hermeneutic traditions of humanistic and interpretive social scientific research paradigms. Archives have traditionally been sites for maintaining political and social control, so it is important to ensure that the tools we employ to analyze them do not replicate those power imbalances.

A critical approach to the tools of digital forensics by archivists and media scholars requires thinking through how the forensic imagination may impose forms of knowing that reproduce particular power relations. What power-effects are inherent in the methods by which forensic science makes and validates its truth claims? How do these truth claims fit into systems of power via the material practices of law enforcement, the courts and forensic scientists? Algorithms cannot testify in a court of law, but expert witnesses do, so how does the construction of expertise through the methods of digital forensics construct particular subjects? These and other related questions are necessary to consider when utilizing digital forensic tools, as can be seen in the case of the Jeremy Blake archive, which shows the intersection of media, art and digital archiving, the ingest of this new content into a traditional archive, and the parallel mystery of the investigation of Blake and his wife’s apparent suicides.

In 2009, the collection of the late visual artist, Jeremy Blake, was donated to New York University’s Fales Library (Darms and Giffin, 2010). The main part of this collection consisted of hard drives and disks from Blake’s computer, which he had used to compose much of his visual artwork. In the processing of this collection, the individual files, as well as the file structure, were cataloged. To future researchers, the hierarchical arrangement of files is critical for supporting claims about the artist’s production activities, and is analogous to the concern for preserving “original order” within traditional archival collections. Ensuring that no data is changed as the files are accessed is also of primary concern, analogous to the archivist’s conception of assuring authenticity. Thus, digital forensics facilitates the translation of archival imperatives to digital objects. While the issue of uniqueness, once critical to maintaining the social legitimacy of archives, would seem to crumble in the age of infinite reproducibility, digital archivists have actually seen the idea of authenticity become increasingly critical to their work, albeit in a very different way. To make sure that no bits in a file change while it is under archival custodianship, the preservation community has adopted the tools of legal forensics to begin to tackle this problem of ensuring authenticity via maintaining data integrity. According to Kirschenbaum, et al. (2010),

The same forensics software that indexes a criminal suspect’s hard drive allows the archivist to prepare a comprehensive manifest of the electronic files a donor has turned over for accession; the same hardware that allows the forensics investigator to create an algorithmically authenticated “image” of a file system allows the archivist to ensure the integrity of digital content once captured from its source media ... . [13]

This passage suggests a potentially ambivalent melding of the tools of law enforcement and the tools of preserving cultural heritage collections. The fact that the new digital tools of the archivist were developed in different professional contexts with very different ethical concerns suggests the need for critical analysis of how these techniques may import problematic epistemologies into humanistic research contexts. This seemingly odd alignment of legal and archival imperatives suggests a movement towards similar truth claims and a normalizing of relationships to the past. Of course, while we might want to see archival ethics as independent of criminal justice and legal proceedings, it is also important to recall that archival documents were initially born out of legal documents and institutional records. Thus, perhaps it is more historically accurate to suggest that the maintenance of the digital archive via forensic tools actually reaffirms the originary position of the archive as the seat of the law. Derrida (1995), in his etymological analysis of the word “archive,” reminds us,

This name [Arkhe] apparently coordinates two principles in one: the principle according to nature or history, there where things commence — physical, historical, or ontological principle — but also the principle according to the law, there where men and gods command, there where authority, social order are exercised. [14]

Thus, to return to the case of the Jeremy Blake collection, we can see a tension between aesthetic expression and the regimes of forensic knowing that make these collections palatable for institutional ingestion. Given the authoritative, law-enforcing roots of digital forensic techniques, further unraveling of these ethical tensions seems quite necessary to understanding the long-term impact on archival ethics and professional conduct. We might wonder if these legal techniques might someday help to unravel the mysterious double suicide of Blake and his wife, which consequently led to the posthumous deposit of these materials at the Fales Library. While the tools of digital forensics are clearly useful for preserving authenticity and integrity in digital records, there are still many unexamined ethical questions around the epistemological assumptions and power relations that sustain them, which deserve further critical inquiry.
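The data-integrity practice at the heart of this forensic turn, recording a cryptographic hash of each file at accession and re-verifying it later, can be sketched minimally as follows. The function names and file layout here are hypothetical illustrations, not any particular archive’s system:

```python
# Minimal sketch of a fixity (data-integrity) workflow of the kind the
# preservation community borrowed from digital forensics: hash every file
# at accession, then re-hash later to detect any changed bits.

import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash a file in chunks so arbitrarily large media files fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

def build_manifest(collection_dir: Path) -> dict[str, str]:
    """Map each file's relative path to its SHA-256 recorded at accession."""
    return {str(p.relative_to(collection_dir)): sha256_of(p)
            for p in sorted(collection_dir.rglob("*")) if p.is_file()}

def verify(collection_dir: Path, manifest: dict[str, str]) -> list[str]:
    """Return the relative paths whose current hash no longer matches."""
    return [rel for rel, expected in manifest.items()
            if sha256_of(collection_dir / rel) != expected]
```

A manifest built at accession time thus plays the role of the forensic investigator’s “algorithmically authenticated image”: any later divergence between recorded and recomputed hashes is treated as evidence that the record is no longer authentic.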

As Leonardi (2010) points out: “materiality is not a property of artifacts, but a product of the relationships between artifacts and the people who produce and consume them” [15]. Alternative epistemologies for understanding the materiality of digital archives make different epistemological and ontological assumptions about the relations between things and people. Such approaches include close readings of file formats and digital codecs and the ways in which they interlink human eyes with media infrastructures (e.g., Mackenzie, 2010); historical-genealogical studies of the ways in which epistemological assumptions about human perception shape media infrastructures (e.g., Sterne, 2012); as well as radical feminist interventions in archival practice that offer new temporal schemas for understanding the relation between past, present and future (Eichhorn, 2013). These alternative epistemologies offer ways of approaching the materiality of digital artifacts that fill in some of the blind spots of digitization and digital forensics.

Infrastructural/institutional perspective on media archives

As we move beyond epistemological questions related to the digitized media artifacts themselves, we begin to see an array of associated problems that arise between the digital codes of these artifacts and the larger systems that support them. A critical approach to the materiality of media archives must also investigate the preservation infrastructures and standards that support archival practice.

Infrastructures increasingly shape the patterns of media distribution and how media appear, as well as shaping the practices of archivists working with media and the construction of legitimized institutional knowledge in preservation institutions. By their nature, social institutions work to stabilize and reproduce particular practices and forms of knowledge. In a sense, institutions are social infrastructures in themselves. Technical infrastructures are intertwined with the social infrastructures of institutions, often mediated by standards, protocols, documents and artifacts that bind social and technical aspects of infrastructure.

Networked information and media infrastructures both enable and require the exercise of political power from afar. Like the roads of Rome, networked infrastructures bind the empire together and enable their own management and maintenance. Packer and Crofts Wiley (2012) suggest, “media technologies are envisioned as part of a series of historical attempts to implement differing governmental rationalities wherein media are mechanisms for extending and organizing governance and formation of subjects” [16]. Digital archivists are likely cognizant of the infrastructures they depend on for access to digital collections, but their awareness needs to be extended beyond the pragmatics of preservation to include the larger political economic and ecological phenomena within which these infrastructures are enmeshed. Standards are increasingly understood to be particularly important tools in sustaining infrastructures; they circulate around the globe and ensure that infrastructures can function by promoting uniformity across different institutions. Fuller (2005) suggests that they imply “exchange, trade, command, commonality ... .” [17] and wonders, “what arises when two or more standard processes, with their own regimes, codes, modes of use and deportment, systems of transduction, and so on, become conjoined?” [18].

According to Bush (2013), standards are “where language and the world meet,” implying that the phenomena that we call standards straddle the area between what can be linguistically specified and categorized and the properties of the physical world [19]. Standards also invoke categories and make distinctions about things and actions in the world, shaping infrastructure and institutions (Bowker and Star, 1999). Because standards are both technical and textual, we can study their circulation as documents, and the processes by which they become embedded in particular technological forms, and then diffused or integrated into local practices. Winner (1986) has pointed out how infrastructures can embed social biases within the built environment, and Boczkowski (2004) suggests that “the actions of relevant social groups in the construction of a new artifact may be informed by issues affecting the diffusion of pre-existing infrastructures such as the presences of unfavorable technological standards” [20].

Standards (including Internet protocols, see Galloway, 2004) are common forms of codified knowledge that circulate across communities to ensure uniformity and sameness within processes or products across space and time. In the case of formal technical standards, such as JPEG2000, they are represented in written documents developed through standardizing organizations (e.g., International Organization for Standardization).

Standards play an increasingly important role in shaping preservation practice, integrating the work of archivists into digitization systems and digital repositories. Donaldson and Yakel (2013), in their work on the adoption of the metadata standard PREMIS, suggest “standards in archives are ubiquitous. They reflect the most current knowledge about professional practices and increase interoperability, consistency, and the safety and security of collections” [21]. Digitization standards are particularly important, as more and more institutions are adopting digitization as a strategy for access and long-term preservation. Conway (2010) suggests that “in the age of Google, nondigital content does not exist, and digital content with no impact is unlikely to survive” [22]. Digitization is providing access to the archives of the future, and standards are effectively shaping how collections will appear to future generations.

Standards also play an important role in the circulation of digitized media. They help establish common formats for distribution and access, and for long-term archiving (which could be seen as a sort of temporal distribution to a future, indeterminate time, when it is hoped that the content can be decoded and displayed). While standards produce control and uniformity across “cultures, time and geography” (Timmermans and Epstein, 2010), they can also be used to exclude and marginalize individuals and organizations who choose not to adopt the standard, or who do not fit the standard. As Brunsson and Jacobsson (2000) suggest, “if one does not follow certain standards people may doubt that one is really a particular kind of actor” [23]. An individual’s standing as a certain type of social actor (e.g., archivist) may be tied to adopting a particular standard. Because of these embedded power dynamics, the process of adopting standards within a community can become fraught with controversy and can reveal institutional and epistemological tensions between social actors.

The case of JPEG2000, and the controversy surrounding its adoption in the preservation field for preserving analog video material, offers an example of the ways in which seemingly mundane standards can embody political, social, and epistemic forces that have real effects on preservation institutions, archival practice and the form of digitized media artifacts. Lischer-Katz (2014) conducted a discourse analysis on a corpus of 435 postings collected from the Association of Moving Image Archivists (AMIA) listserv between the years 2000 and 2013. These e-mail messages contained discussions within the moving image preservation community about adopting the JPEG2000 file format for encoding digitized analog video content for long-term preservation. These postings were analyzed in terms of how knowledge was constructed around the topic. The “debate” that took place over whether or not to adopt this standard displayed rhetorical techniques and the presentation and evaluation of evidence on both sides. This fits with Timmermans and Epstein’s (2010) claim that the process of standardization itself can be seen as a form of knowledge production, suggesting “standardization also raises questions about the role of science and expertise in regulation: What evidence is sufficient or necessary to implement standards?” [24]

A key finding from the discourse analysis of the debate around JPEG2000 (2004–2013) was that, over time, the detractors of JPEG2000 moved the focus of their arguments from the particular claims made by supporters to the grounds on which these claims were being made: they worked to continually shift the burden of proof from the rhetoric of expert opinion back to direct experience with the technology, privileging individual experience over the knowledge of experts (Lischer-Katz, 2014). While an economic approach might suggest that the lack of widespread adoption of JPEG2000-based technology is due to economies of scale and the limited revenue streams of small organizations, these shifting epistemic grounds may be linked back to historical changes within the preservation field, as preservationist cultures shift with the adoption of new technologies and the rise of mass digitization projects that call into question the dominance of preservationists’ professional agency in the face of standards promoted by powerful institutions.

The case of JPEG2000 shows how focusing on standards and their role in mediating infrastructures and institutions can identify points of tension between different parts of the preservation community, drawing attention to otherwise unseen power differentials in the field. As more and more media collections are digitized, understanding the work of archives and the materiality of preserved media artifacts hinges on taking into account the important role played by standards.
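The materiality of a standard like JPEG2000 is literally inscribed in a file’s opening bytes: the JPEG 2000 Part 1 specification defines a twelve-byte signature box for JP2 files and a start-of-codestream marker for raw codestreams. A minimal, illustrative sketch of signature-based format identification follows (a simplified stand-in for real identification tools such as DROID or `file`, not a production implementation):

```python
# Magic-byte signatures defined by the JPEG 2000 standard (ISO/IEC 15444-1).
JP2_SIGNATURE = bytes([0x00, 0x00, 0x00, 0x0C, 0x6A, 0x50, 0x20, 0x20,
                       0x0D, 0x0A, 0x87, 0x0A])   # JP2 signature box
J2K_SOC = bytes([0xFF, 0x4F, 0xFF, 0x51])          # SOC marker followed by SIZ


def identify_jpeg2000(path):
    """Return a coarse format label based on the file's opening bytes."""
    with open(path, "rb") as f:
        head = f.read(12)
    if head.startswith(JP2_SIGNATURE):
        return "JP2 file (JPEG 2000 Part 1)"
    if head.startswith(J2K_SOC):
        return "raw JPEG 2000 codestream"
    return "not JPEG 2000"
```

Even this toy example makes the point of the section concrete: whether a digitized videotape “counts” as a preservation master is partly decided by byte sequences fixed in a standards document, far from any individual archivist’s practice.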

Global/ecological perspective on media archives

The following section offers connections between the materiality of standards and codes, their forensic and institutional dimensions, and their embeddedness in global systems and ecological forces. The Library of Congress, the de facto national library of the United States, likely holds the world’s largest collection of media artifacts (U.S. Library of Congress, 2016), and consideration of the Library’s ecological impact and its geospatial association with artifacts of the military-industrial complex offers an extreme case study of a media archive enmeshed in global processes. In 2007 the Library moved its millions of moving image and sound materials from various locations in the eastern United States into a decommissioned Federal Reserve bunker in Culpeper, Virginia, now known as the Packard Campus of the National Audio-Visual Conservation Center (NAVCC). Built in 1969, the bunker was originally designed to safely house millions of dollars in paper currency that could be used to jump-start the U.S. economy in the event of a nuclear holocaust. The underground site has been completely gutted and refitted with the latest archival cooling and heating technologies, and a forest of trees has been replanted above it to hide the structure and return the hilltop to a pastoral vista (the hilltop was also a strategic site in the Civil War, prized for its panoramic views and the military intelligence that could be gathered from distant horizons).

While the huge size of the collection and the expensive construction project are impressive in and of themselves, what is more intriguing is the immense storage system that has been built to house the Library’s data. The long-term goal of the Library of Congress is to digitize these collections in their entirety and make them available for the “life of the republic plus 4,000 years,” according to James Snyder (2010), Senior Systems Administrator at the Library of Congress NAVCC. Digitizing the millions of media objects in the collection will likely generate exabytes (millions of terabytes) of data by 2017 (Snyder, 2011). Because of the nature of preserving digital information, this huge collection must be constantly copied and checked to keep it accessible over time. As the Library of Congress Web site explains, “the digital archive is based on the concept of continual migration and verification. Migration to progressively higher density storage — meaning progressively greater storage capacity — will continue indefinitely into the future” (U.S. Library of Congress, 2007). With such ambitious preservation goals, the Library is attempting to establish a media archive for the end of time, one that can survive format obsolescence, digital decay and nuclear war, in order to repopulate the earth with America’s media patrimony (Lischer-Katz, 2013).
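The “continual migration and verification” the Library describes rests on fixity checking: each new copy is re-hashed and compared against the source before the old carrier is trusted or retired. A minimal sketch of this workflow follows (the function names are illustrative assumptions; this is not the NAVCC’s actual system):

```python
import hashlib


def file_checksum(path, algorithm="sha256", chunk_size=1 << 20):
    """Compute a checksum by streaming the file in chunks,
    so even very large preservation files never need to fit in memory."""
    digest = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


def verify_copy(source, copy):
    """A migration is trusted only once the copy's checksum matches the
    source's -- the 'verification' half of migration-and-verification."""
    return file_checksum(source) == file_checksum(copy)
```

Because each migration to higher-density storage repeats this hash-and-compare cycle, the archive’s persistence is an ongoing computational (and energy-consuming) activity rather than a one-time act of storage, which is precisely the material point the passage above is making.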

The construction of this archival facility within a decommissioned Cold War bunker, coupled with the implementation of large-scale, industrial-grade tape and spinning-disc storage, dramatizes the relationship between military funding and information research. It reminds us that the Internet, too, had its origins as a mechanism for nuclear survival. Materialism makes us reconsider the meaning of sites, architectural forms, and specific infrastructural formations in the context of large-scale social and political forces. We might consider the tension between the Library’s ecologically friendly forest reclamation project on top of the construction site and the fact that the facility draws large quantities of electricity to run its servers, data tape warehouses, and the HVAC systems that keep its servers and analog materials cool and dry. This tension is also played out in the Library’s motion picture film lab, which has spent years trying to filter its wastewater down to pollution levels acceptable under local community water standards (based on discussions with staff at the facility carried out by the author in 2011). Thinking about archives in this way emphasizes the large ecological impact of preservation. According to Bozak (2012) in The Cinematic Footprint, “however sophisticated digital technology becomes, and however politically affecting as a result, it remains plugged into a turn-of-the-century system of energy generation that is so outdated it should long ago have been declared, like the commodities it has yielded, not just thoroughly inadequate and antiquated, but obsolete” [25]. Archives rely on these same systems of energy distribution to preserve energy-intensive media artifacts, yet the problems of long-term energy use associated with preserving archival collections have only recently been addressed within the archiving community. Clearly, a “green” approach to archiving is in order, with an emphasis on renewable energy sources, or perhaps even a preservation “carbon” tax. For a system that requires considerable amounts of electricity, the viability of archives is bound up in the sustainability of energy production, linking media archives to global systems of ecological and political forces.

 

++++++++++

Directions for future inquiry

As individual and collective memory become increasingly represented through machine storage and our “intimate and physical lives” become saturated “by digital, wireless, and virtual technologies” [26], issues related to the ethics, political economy and ecological sustainability of information infrastructures become linked to the physical and symbolic construction and maintenance of the archive, suggesting new avenues for research. Using the analytics of materiality and infrastructure can help us address different scales of infrastructure, as well as assess the role of preservation within systems that operate under neoliberal policies and enact systems of cybernetic control.

Considering materiality in media studies opens up new avenues of research into technical systems related to media archives, including the resources and labor that go into producing media devices and the ecological concerns (e.g., Bozak, 2012; Hogan, 2015) pervading today’s media systems. Gitelman (2006) suggests that the ways in which media artifacts can be known, and the systems that enable them to be played back, help to shape their meaning. This notion is already well understood by bibliographic theorists (e.g., see Galey’s 2012 bibliographic analysis of the materiality of an eBook) and digital humanists. Emphasizing the materiality of digital media is still a radical move that directly questions what Kirschenbaum (2008) terms the dominant “medial ideology,” in which digital inscription and its associated labor are effaced by the appearance of “light, reason, and energy unleashed in the electric empyrean” [27]. Bits have materiality (Blanchette, 2011), and bits require labor and infrastructure, use resources and produce waste (Grossman, 2006), as well as requiring particular knowledge bases, systems of maintenance and educational programs in order to function and remain displayable and legible.

The field of media archiving and preservation is concerned with ensuring the long-term physical and intellectual persistence of media artifacts deemed to have significant value within a particular institutional context (archive, library or museum), so the call to consider materiality and infrastructure at first seems like an affront to a professional practice already focused on the material support of media. However, Kirschenbaum’s (2008) concept of medial ideology may be at work within the discourses of preservation. In his 2010 address to the Association of Moving Image Archivists annual conference, film curator Greg Wilsbacher identified what he called “the Cartesian dilemma” in the field of media archiving: a duality between form and content, in which the dominant belief in the field holds that the informational content of a media artifact can be identified and fully extracted from its physical carrier. Wilsbacher (2010) argues instead for an approach that considers how material dimensions, such as “hidden” information along a film’s sprocket holes, or a “trick” groove on the runout of some Beatles’ 33 1/3 RPM records, are integrally tied to the various dimensions of an artifact’s meaning. Some dimensions of this materiality must inevitably be lost in the process of copying. This has implications both for preserving original media types and for extending conceptions of what constitutes “content” when migrating decaying media artifacts to a more stable medium.
Additionally, given this increasingly strong linkage between archives and media scholarship, archives need to seriously consider the layers of access that media scholars will need in order to properly study media artifacts, including access to particular machines for accurately displaying complex media formats, access to tools to open up digital packages and examine code, and access to the machines themselves as objects of study (see Ernst’s 2013 approach to media archaeology).

Finally, a focus on the materiality of media infrastructures produces a rich model for conceptualizing archiving along the lines of a communication network that has both extension in space — between repositories, document creators and scholars and other users — and in time, to an indeterminate array of unknown future users and uses. Developing the concept of archival infrastructure is critical to understanding the concerns of media archivists and how they mobilize particular arrangements of techniques, tools and systems to “ensure” accessibility of content into the future. This infrastructure is built and maintained in a state of uncertainty since future states of the world always maintain some degree of indeterminacy that predictive models can never overcome. For instance, particular types of video files may become unplayable in future computer systems, electrical grids may no longer be reliable enough to run the tape libraries for digital archives, and display systems may change [28].

 

++++++++++

Conclusion

Postmodern perspectives on archives have called into question traditional notions of the archive as a neutral conduit for accessing the records of the past (see Cook, 2001; Schwartz and Cook, 2002); at the same time, digitization impacts how archival media artifacts are perceived and how meaning is made through scholarly interpretation (Manoff, 2006). The call to critically investigate the materiality of media archives is becoming particularly urgent as the predicted effects of anthropogenic climate change continue to grow in magnitude, and the long-term preservation of media archives becomes increasingly bound up in the long-term survival of humanity itself. This relationship between storage and global ecological sustainability, and the responsibility it implies for members of the communities responsible for preserving recorded human knowledge, is finally being formally acknowledged in the archives and libraries communities by several initiatives, including work done by the American Library Association’s Sustainability Roundtable, founded in 2013, and ProjectARCC: Archivists Responding to Climate Change, founded in 2015. Archivists are being called upon to acknowledge the dire preservation risks associated with anthropogenic climate change and to integrate this knowledge into their archival practice (see Tansey, 2015).

When studying viewing patterns, understanding the archive as infrastructure becomes a critical analytic strategy because it helps to denude a site of power that is responsible for shaping our knowledge of the visible past. Regimes of intellectual property protections, standards and protocols, along with large-scale digital infrastructures span the globe and shape how visual information may circulate. Opening up the “black box” of the media archive and examining its material artifacts, systems and policies, reveals a sociotechnical infrastructure that generates resonances across multiple ontological and epistemological scales, from computer code and video codecs to digital communication networks to global ecologies and political conflicts. Infrastructures, standards, codes, ecologies, etc. play essential parts in shaping the materiality of media artifacts, and understanding them is essential to fully understanding the cultural and historical specificities of media archives in the age of mass digitization. Critically investigating these materialities reveals the central position of the archive as infrastructure in shaping the global circulation of knowledge, which suggests that the digitization of media (old and new) brings the media archive as an active agent, rather than a passive repository, into the struggle for global control over the viewing patterns of our records of visual memory. End of article

 

About the author

Zack Lischer-Katz, just finishing up his Ph.D. at Rutgers University in library and information science, is a CLIR (Council on Library and Information Resources) Research Fellow at the University of Oklahoma, where he studies the preservation of virtual reality and 3D data for scholarship and pedagogy. He was co-organizer and programming coordinator for the 2015 and 2016 editions of Extending Play, a play and game studies conference hosted by the School of Communication and Information at Rutgers University. With co-organizer Aaron Trammell (Assistant Professor at UC Irvine), he co-edited a volume of papers from the 2015 edition of Extending Play in the Journal of Games Criticism (July 2016), entitled, “Considering the sequel to game studies ... .”
E-mail: zlkatz [at] ou [dot] edu

 

Acknowledgments

An earlier version of this paper entitled, “New materialist directions for studying media archives,” was presented at the Sixth International Conference on the Image, University of California, Berkeley, 29–30 October 2015.

 

Notes

1. Starosielski, 2014, pp. 2–3.

2. One exabyte = one billion gigabytes, or roughly the storage space of four million laptop computers (assuming each has disk storage of 250 GB).

3. Miller, 2015, p. 138.

4. Blackboxing is a concept popular in science and technology studies; Bruno Latour (1999) describes it as “the way scientific and technical work is made invisible by its own success. When a machine runs efficiently, when a matter of fact is settled, one need focus only on its inputs and outputs and not on its internal complexity. Thus, paradoxically, the more science and technology succeed, the more opaque and obscure they become” (p. 304).

5. Packer and Crofts Wiley, 2012, p. 109.

6. Ibid.

7. Bowker and Star, 1999, p. 33.

8. Coole and Frost, 2012, p. 5.

9. In 2004 the Association of Research Libraries formally endorsed digitization as a technique for preserving library materials (Arthur, et al., 2004).

10. For instance, the conversion between analog and digital media formats involves transformations between color spaces, sampling rates, and picture elements; it mediates between models of human perception and machine sensitivities, and embeds calibrations based on “taste.” Winston (1996) points to empirical research conducted by Kodak in the 1950s showing that Kodak “judges” preferred “pleasing” copies of images over the most “technically accurate” ones (p. 56), suggesting that media representations of the world (and, by extension, the process of copying media from one format to another) involve a series of aesthetic judgments.

11. Shep, 2016, p. 323.

12. Shep, 2016, p. 322.

13. Kirschenbaum, et al., 2010, p. 2.

14. Derrida and Prenowitz, 1995, p. 9.

15. Leonardi, 2010, paragraph 38.

16. Packer and Crofts Wiley, 2012, p. 109.

17. Fuller, 2005, p. 96.

18. Fuller, 2005, p. 98.

19. Bush, 2013, p. 3.

20. Boczkowski, 2004, p. 257.

21. Donaldson and Yakel, 2013, pp. 55–56.

22. Conway, 2010, p. 64.

23. Brunsson and Jacobsson, 2000, p. 132.

24. Timmermans and Epstein, 2010, p. 70.

25. Bozak, 2012, p. 4.

26. Coole and Frost, 2012, p. 5.

27. Kirschenbaum, 2008, p. 39.

28. The problem of preserving media display devices has become a significant obstacle in the archiving of media art. Some artists have strict specifications about display devices. When the devices break down, unless the curators decide to go against the artist’s documented wishes, the artwork may be lost. See Laurenson, 2006.

 

References

Kathleen Arthur, Sherry Byrne, Elisabeth Long, Carla Q. Montori, and Judith Nadler, 2004. “Recognizing digitization as a preservation reformatting method,” Association of Research Libraries (22 July), at http://chnm.gmu.edu/digitalhistory/links/pdf/preserving/8_34a.pdf, accessed 15 December 2016.

Jean-François Blanchette, 2011. “A material history of bits,” Journal of the American Society for Information Science and Technology, volume 62, number 6, pp. 1,042–1,057.
doi: http://dx.doi.org/10.1002/asi.21542, accessed 15 December 2016.

Pablo J. Boczkowski, 2004. “The mutual shaping of technology and society in videotex newspapers: Beyond the diffusion and social shaping perspectives,” Information Society, volume 20, number 4, pp. 255–267.
doi: http://dx.doi.org/10.1080/01972240490480947, accessed 15 December 2016.

Geoffrey C. Bowker and Susan Leigh Star, 1999. Sorting things out: Classification and its consequences. Cambridge, Mass.: MIT Press.

Nadia Bozak, 2012. The cinematic footprint: Lights, camera, natural resources. New Brunswick, N.J.: Rutgers University Press.

Nils Brunsson and Bengt Jacobsson, 2000. A world of standards. Oxford: Oxford University Press.

Paul Conway, 2010. “Preservation in the age of Google: Digitization, digital preservation, and dilemmas,” Library Quarterly, volume 80, number 1, pp. 61–79.

Diana Coole and Samantha Frost, 2012. “Introducing the new materialisms,” In: Diana Coole and Samantha Frost (editors). New materialisms: Ontology, agency, and politics. Durham, N.C.: Duke University Press, pp. 1–46.

Terry Cook, 2001. “Archival science and postmodernism: New formulations for old concepts,” Archival Science, volume 1, number 1, pp. 3–24.
doi: http://dx.doi.org/10.1007/BF02435636, accessed 15 December 2016.

Sean Cubitt, Robert Hassan, and Ingrid Volkmer, 2011. “Does cloud computing have a silver lining?” Media, Culture & Society, volume 33, number 1, pp. 149–158.
doi: http://dx.doi.org/10.1177/0163443710382974, accessed 15 December 2016.

Lisa Darms and Lawrence Giffin, 2010. “The Jeremy Blake (born-digital) papers,” Metropolitan Archivist, volume 16, number 2, pp. 9–10; version at http://www.nycarchivists.org/Resources/Documents/2010_2.pdf, accessed 15 December 2016.

Casey E. Davis, 2015. “Our story: Project ARCC” (6 June), at https://projectarcc.org/2015/06/06/our-story/, accessed 15 December 2016.

Jacques Derrida, 1995. “Archive fever: A Freudian impression,” translated by Eric Prenowitz, Diacritics, volume 25, number 2, pp. 9–63.
doi: http://dx.doi.org/10.2307/465144, accessed 15 December 2016.

Devan Ray Donaldson and Elizabeth Yakel, 2013. “Secondary adoption of technology standards: The case of PREMIS,” Archival Science, volume 13, number 1, pp. 55–83.
doi: http://dx.doi.org/10.1007/s10502-012-9179-0, accessed 15 December 2016.

Kate Eichhorn, 2013. The archival turn in feminism: Outrage in order. Philadelphia, Pa.: Temple University Press.

Wolfgang Ernst, 2013. Digital memory and the archive. Minneapolis: University of Minnesota Press.

Matthew Fuller, 2005. Media ecologies: Materialist energies in art and technoculture. Cambridge, Mass.: MIT Press.

Alan Galey, 2012. “The enkindling reciter: E-books in the bibliographical imagination,” Book History, volume 15, pp. 210–247.
doi: http://dx.doi.org/10.1353/bh.2012.0008, accessed 15 December 2016.

Alexander R. Galloway, 2004. Protocol: How control exists after decentralization. Cambridge, Mass.: MIT Press.

Tarleton Gillespie, 2007. Wired shut: Copyright and the shape of digital culture. Cambridge, Mass.: MIT Press.

Lisa Gitelman, 2006. Always already new: Media, history and the data of culture. Cambridge, Mass.: MIT Press.

Elizabeth Grossman, 2006. High-tech trash: Digital devices, hidden toxics, and human health. Washington, D.C.: Island Press.

Mél Hogan, 2015. “Data flows and water woes: The Utah Data Center,” Big Data & Society, volume 2, number 2, pp. 1–12.
doi: http://dx.doi.org/10.1177/2053951715592429, accessed 15 December 2016.

Matthew G. Kirschenbaum, 2008. Mechanisms: New media and the forensic imagination. Cambridge, Mass.: MIT Press.

Matthew G. Kirschenbaum, Richard Ovenden, and Gabriela Redwine, 2010. “Digital forensics and born-digital content in cultural heritage collections,” Council on Library and Information Resources (CLIR) Publication, number 149, at https://www.clir.org/pubs/reports/pub149/pub149.pdf, accessed 15 December 2016.

Zack Lischer-Katz, 2014. “Considering JPEG2000 for video preservation: A battle for epistemic ground,” iConference 2014 Proceedings, pp. 1,056–1,065.
doi: http://dx.doi.org/10.9776/14381, accessed 15 December 2016.

Zack Lischer-Katz, 2013. “Archiving until the end of the Republic,” In Media Res (22 October), at http://mediacommons.futureofthebook.org/imr/2013/10/22/archiving-until-end-republic, accessed 15 December 2016.

Pip Laurenson, 2006. “Authenticity, change and loss in the conservation of time-based media installations,” Tate Papers, number 6, at http://www.tate.org.uk/research/publications/tate-papers/06/authenticity-change-and-loss-conservation-of-time-based-media-installations, accessed 15 December 2016.

Paul M. Leonardi, 2010. “Digital materiality? How artifacts without matter, matter,” First Monday, volume 15, number 6, at http://firstmonday.org/article/view/3036/2567, accessed 15 December 2016.
doi: http://dx.doi.org/10.5210/fm.v15i6.3036, accessed 15 December 2016.

Adrian Mackenzie, 2010. “Every thing thinks: Sub-representative differences in digital video codecs,” In: Casper Bruun Jensen and Kjetil Rödje (editors). Deleuzian intersections: Science, technology, anthropology. New York: Berghahn Books, pp. 139–162.

Marlene Manoff, 2013. “Unintended consequences: New materialist perspectives on library technologies and the digital record,” portal: Libraries and the Academy, volume 13, number 3, pp. 273–282.
doi: http://dx.doi.org/10.1353/pla.2013.0018, accessed 15 December 2016.

Marlene Manoff, 2006. “The materiality of digital collections: Theoretical and historical perspectives,” portal: Libraries and the Academy, volume 6, number 3, pp. 311–325.
doi: http://dx.doi.org/10.1353/pla.2006.0042, accessed 15 December 2016.

Toby Miller, 2015. “The art of waste: Contemporary culture and unsustainable energy use,” In: Lisa Parks and Nicole Starosielski (editors). Signal traffic: Critical studies of media infrastructures. Urbana: University of Illinois Press, pp. 137–156.

Jeremy Packer and Stephen B. Crofts Wiley, 2012. “Strategies for materializing communication,” Communication & Cultural/Critical Studies, volume 9, number 1, pp. 107–113.
doi: http://dx.doi.org/10.1080/14791420.2011.652487, accessed 15 December 2016.

Joan M. Schwartz and Terry Cook, 2002. “Archives, records, and power: The making of modern memory,” Archival Science, volume 2, pp. 1–19.
doi: http://dx.doi.org/10.1007/BF02435628, accessed 15 December 2016.

Sydney J. Shep, 2016. “Digital materiality,” In: Susan Schreibman, Ray Siemens and John Unsworth (editors). A new companion to digital humanities. Hoboken, N.J.: Wiley, pp. 322–330.
doi: http://dx.doi.org/10.1002/9781118680605.ch22, accessed 15 December 2016.

James Snyder, 2011. “JPEG2000 in moving image archiving,” JPEG2000 Summit (12–13 May, Library of Congress), at http://www.digitizationguidelines.gov/still-image/documents/Snyder.pdf, accessed 15 December 2016.

James Snyder, 2010. “James Snyder: Library of Congress archive,” at http://snapshotsfoundation.com/index.php/james-snyder-library-of-congress-digital-archive-interview, accessed 31 October 2016.

Susan Leigh Star and Karen Ruhleder, 1996. “Steps towards an ecology of infrastructure: Design and access for large information spaces,” Information Systems Research, volume 7, number 1, pp. 111–134.
doi: http://dx.doi.org/10.1287/isre.7.1.111, accessed 15 December 2016.

Nicole Starosielski, 2014. “The materiality of media heat,” International Journal of Communication, volume 8, at http://ijoc.org/index.php/ijoc/article/view/3298/1268, accessed 15 December 2016.

Jonathan Sterne, 2012. MP3: The meaning of a format. Durham, N.C.: Duke University Press.

Linda Tadic, 2016. “The environmental impact of digital preservation,” presentation at the Association of Moving Image Archivists conference (Portland, Oregon, November 2015), at https://www.digitalbedrock.com/resources/, accessed 15 December 2016.

Eira Tansey, 2015. “Archival adaptation to climate change,” Sustainability: Science, Practice, & Policy, volume 11, number 2, at http://d20nn6mxpbiih2.cloudfront.net/sspp-journal/SSPP-vol11.2.1509-019.tansey.pdf, accessed 15 December 2016.

Stefan Timmermans and Steven Epstein, 2010. “A world of standards but not a standard world: Toward a sociology of standards and standardization,” Annual Review of Sociology, volume 36, pp. 69–89.
doi: http://dx.doi.org/10.1146/annurev.soc.012809.102629, accessed 15 December 2016.

U.S. Library of Congress, 2016. “Audio-visual conservation at the Library of Congress,” at http://www.loc.gov/avconservation/, accessed 15 December 2016.

U.S. Library of Congress, 2007. “Preserving the collections,” at http://www.loc.gov/avconservation/preservation/, accessed 15 December 2016.

Siva Vaidhyanathan, 2006. “Afterword: Critical information studies: A bibliographic manifesto,” Cultural Studies, volume 20, numbers 2–3, pp. 292–315.
doi: http://dx.doi.org/10.1080/09502380500521091, accessed 15 December 2016.

Greg Wilsbacher, 2010. “It plays therefore it is: The Cartesian problem for film preservationists,” presentation at the Association of Moving Image Archivists Conference, at http://scholarcommons.sc.edu/lib_facpub/5/, accessed 15 December 2016.

Langdon Winner, 1986. The whale and the reactor: A search for limits in an age of high technology. Chicago: University of Chicago Press.

Brian Winston, 1996. Technologies of seeing: Photography, cinematography and television. London: British Film Institute.

 


Editorial history

Received 15 December 2015; accepted 15 December 2016.


Creative Commons License
This paper is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

Studying the materiality of media archives in the age of digitization: Forensics, infrastructures and ecologies
by Zack Lischer-Katz.
First Monday, Volume 22, Number 1 - 2 January 2017
http://www.firstmonday.org/ojs/index.php/fm/article/view/7263/5769
doi: http://dx.doi.org/10.5210/fm.v22i1.7263





A Great Cities Initiative of the University of Illinois at Chicago University Library.

© First Monday, 1995-2017. ISSN 1396-0466.