FM Interviews
First Monday

FM Interviews: Sandra Braman

Sandra Braman

Sandra Braman has been studying the effects of new information technologies on policy for several decades. Recent work includes Change of State: Information, Policy, and Power (2006, MIT Press) and the edited volumes Communication Researchers and Policy–making (2003, MIT Press), The Emergent Global Information Policy Regime (2004, Palgrave Macmillan) and Biotechnology and Communication: The Meta–technologies of Information (2004, Lawrence Erlbaum Associates).

Braman has been working with the Ford Foundation, Rockefeller Foundation, and Open Society Institute to map the contemporary legal environment, identify emergent policy issues, and try to bring the research and policy communities more closely together. She has published about 70 scholarly journal articles, book chapters, and books; served as book review editor of the Journal of Communication; is former Chair of the Communication Law & Policy Division of the International Communication Association; and sits on the editorial boards of nine scholarly journals in four countries.

Currently Professor of Communication at the University of Wisconsin–Milwaukee, Braman previously served as Reese Phifer Professor of Telecommunication at the University of Alabama, Research Assistant Professor at the University of Illinois — Urbana, Henry Rutgers Research Fellow at Rutgers University, and Silha Fellow of Media Law and Ethics at the University of Minnesota. She earned her PhD from the University of Minnesota in 1988. During 1997–98 Braman designed and launched the first postgraduate programme in telecommunications and information policy on the African continent, as Visiting Professor and Director at the University of South Africa, and in 2008 she will be the Freedom of Expression Professor at the University of Bergen in Norway.

Sandra has been a significant contributor to First Monday for a number of years. Her papers in First Monday — “Posthuman Law: Information Policy and the Machinic World” and “Tactical Memory: The Politics of Openness in the Construction of Memory” — are important contributions. With Thomas M. Malaby, she organized a special issue of First Monday in 2006 — “Command Lines: The Emergence of Governance in Global Cyberspace” — based on their conference in April 2005 at the University of Wisconsin–Milwaukee. Sandra was also the keynote speaker at the second First Monday Conference, FM10 Openness: Code, Science, and Content, in Chicago on 15 May 2006.

This interview was conducted by First Monday’s Chief Editor Ed Valauskas, stimulated in part by Change of State.


First Monday (FM): Is there some point in time that we can identify as the beginning of information becoming vital to the functioning of the state? Or has information always been part of the bureaucracy of an efficient government?

Sandra Braman (SB): Of course information has always been important to governments — this is the stuff of many of the earliest examples of writing that we have. There is some wonderful work on what Roman leaders knew about people’s homes, room by room, and they even travelled with the equivalent of file cabinets. There are two reasons why things are different today.

First, Engels’ law applies. This is the idea that quantitative change yields qualitative change. It explains why we can say we live in an information society — the number of information technologies upon which we depend, and the number of ways in which we depend upon them, have so multiplied in number that there is qualitative change in the nature of society. Engels’ law also applies to the effects of the uses of information, and of information technologies, by the state.

... we have many ways of using information as a tool of power that were not available in the past.

Second, digital technologies — we can call them informational meta–technologies — themselves have characteristics that are very different from those of the industrial technologies and preindustrial tools of the past. As a result, we have many ways of using information as a tool of power that were not available in the past. This has changed the way we exercise very old forms of power; “smart” guns, for example, have information–rich targeting systems that can send bullets around corners. It also allows us to do things we were never able to do before — data mining massive amounts of diverse types of information for patterns was not possible when you were working with file cabinets.

FM: Information and identity, how are they linked?

SB: That’s a huge topic. Even narrowing the question to how laws dealing with information affect individual identity reveals multiple links. Information policy can influence our sense of ourselves, for example; libel law affects how others perceive you, and privacy law affects how you feel about yourself. Laws determining which language you can use in particular settings regulate your identity vis–à–vis the diverse communities with which you are involved. Rules about citizenship shape your identity as a political actor. Some statistical representations of you (such as those of the census) determine the types of resources available to you, while other statistical representations (such as those that result from data mining by the Department of Homeland Security) affect the types of constraints and additional scrutiny you may experience.

The state has an identity, too, which depends upon its historical knowledge of itself and current scientific data about everything from resources to the weather to migration, in addition to any stories those in government may be telling about who they would like us to believe we are. Information policy affects the identity of the state as well as of individuals.

FM: In many states, there is a minister of information. Why not the equivalent in the United States?

SB: It’s hard to choose the right tone for answering this question — ironic? Sardonic? Didactic? Orgiastic?

There were actually some interesting ways in which a variety of tasks involving information have been joined in single federal government jobs in the past. The first U.S. efforts to collect germplasm from around the world, in support of experimentation with what we now call biotechnology, were undertaken in the nineteenth century by the Post Office (because it moved information around) and the Navy (because it was seeking information about other places around the globe).

Most countries long had a single person at the head of a “PTT” (postal, telephone, and telegraph agency), but those positions have disappeared as government control over broadcasting and telecommunications has given way to privatization. This was a spot the U.S. never had because all telecommunications and broadcasting have (almost) always been in the private sector in this country. Positions such as the Librarian of Congress and the Archivist of the United States (head of the National Archives and Records Administration) involve specific ritual and administrative responsibilities, but neither has an over–arching portfolio.

Where one loses confidence about tone is if by this question you mean a government official who controls all information about citizens. Certainly there have been repeated efforts on the part of various agencies — for a long time the National Security Agency, and today the Department of Homeland Security — to bring all databases about U.S. citizens under a single analytical lens. To the extent to which this takes place, the head of such an agency could be called a de facto Minister of Information, if not de jure. What we don’t know as citizens is the level of such activity currently underway, given that it may be either publicly known or secret, legal or not, undertaken formally or informally, and accomplished by the government or by groups in the private sector. Nor do we know as managers or researchers the extent to which such efforts will actually be successful technologically or organizationally.

FM: In the 1990s, the Clinton Administration embraced the first wave of Web technologies, encouraging federal agencies to use the Internet and the Web to deliver information to their constituencies. There was no organized plan — each agency it seemed did “its own thing” with the Web. Was this an appropriate use of new information technologies?

SB: Since you use both the words “Internet” and “Web,” your question itself describes the period you’re referring to as a critical turning point — these were the years in which the Internet was first thrown open for use by anyone for any purpose (instead of being restricted to researchers) and hyperlinking and visuals became available. In 1993 the U.S. government stopped providing direct financial support for the Internet — but by encouraging government experimentation with the Internet as a means of delivering services and information, the Clinton Administration continued to support its development in another way. Doing so was in accord with a couple of decades of thinking that technological innovation and the development of new uses of technologies were most likely to take place if folks were simply free to experiment. So in my view the answer to your question would be “yes.”

The amount of some types of government information now generally and easily available is much more vast than was historically the case.

But there is another story here as well — a tale about how adapting the law to the Internet environment can have consequences at the most fundamental level. The U.S. government has always acted on the constitutional principle that citizens should have access to government information in one way or another, though historically this access was often minimal, and often only reactive, in response to specific requests for information. Our ability to get information from the government was greatly strengthened with passage of the Freedom of Information Act (FOIA) in 1966. Almost immediately, though, questions were raised about whether or not FOIA applied to information in electronic form (rather than print).

After a great deal of debate and courtroom action, in 1996 the FOIA was amended with the Electronic Freedom of Information Act. This legislation did two things: It explicitly applied the Freedom of Information Act to information in electronic forms. And it also required government agencies to proactively make all important information available on the Web — including any information for which requests had been submitted more than a few times. Since 1996, then, the government has been operating under a set of fairly demanding requirements to systematically provide information to the public in a timely way rather than passively responding (if at all) only when asked. The amount of some types of government information that is now generally and easily available is much more vast than was historically the case.

Of course decisions about what information to put up may be politicized, increased access to government information does not necessarily lead to an increase in the quality of democracy, and since 2000 we have seen a number of new restrictions on access, but at least a legal stance of openness has been put in place. Against this history, Clinton’s encouragement of government experimentation with the Internet created a climate supportive of the Electronic Freedom of Information Act, and was likely to have stimulated success stories that were also taken into account in 1996. Another “yes!”

FM: Your most recent book, Change of State, argues that the nature of government — the state — itself has changed, and you use a lot of examples of developments since 9/11. Are fears of terrorism, and this Bush Administration, responsible for the political transformations you discuss?

SB: The book works on two timelines. The first is the political equivalent of “geological,” dealing with changes in political forms that have been taking place since the modern state first appeared several hundred years ago. The bureaucratic welfare state dominated much of the world from about 1870 to 1970. While observers began to comment in the 1970s that the state was weakening (relative to the power of corporations), in fact what they were seeing was the beginning of the transition to the informational state.

Technological change was one factor responsible, but so were things like shifts in how we manage organizations and globalization. There were new ideas about what the government should be doing; the notion that government should be more efficient and use less paperwork, for example, has had an enormous impact on the way the state operates and how it uses information. Over these decades, the same developments took place under both Democratic and Republican administrations, and under all kinds of leaders with a great diversity of motives and dreams.

Since 2000, and particularly since 9/11, developments already underway sped up and intensified, providing particularly vivid examples of historical trends. We must remember that the entire set of ideas embedded in the USA PATRIOT Act was first developed in the early 1990s. Referred to at that time as “new security theory,” this was the U.S. response to the question of how to identify the enemy, and how to defend itself, after the close of the Cold War. In the early 1990s it was believed that the U.S. population would not tolerate many of the inversions of the law suggested by new security theory. After 9/11, however, achieving that acceptance was demonstrably easier — at least for the short run.

The roles of specific individuals have varied. In the Clinton example you raise, there was “the general idea” that promoting use of digital technologies would be a good thing. In this Bush Administration, there is a more finely honed sense of the importance of staying on story, careful attention to ensuring that every law or regulation conforms to specific informational requirements, and a willingness to turn away from — or even bury — scientific evidence that doesn’t support political goals. Just who is responsible for which of these overt examples of the use of information policy as a tool of power is less clear. Kenneth Adelman, for example, seems to have been particularly sophisticated regarding how to politically shape information and our uses of information technologies when he reviewed every piece of proposed legislation for Dick Cheney, but there must be others as well who are quite self–aware in their approach to information policy.

FM: I enjoyed the intellectual organization of Change of State with chapters separated from more detailed bibliographic essays (which provide the context for your arguments in the chapters). What led you to this scheme?

SB: There were two goals — to maximize readability, and to support a variety of ways of using the book. The general reader, policy–makers, journalists, and undergraduate students may read the chapters of the main text but not the bibliographic essays. Scholars and graduate students are likely to read both. And one could imagine an entire course on a topic such as information policy and borders, for example, that would use the pertinent bibliographic essay to structure a semester of readings to accompany exploration of the related legal topics as discussed in the chapters.

FM: Do you see Change of State as a kind of textbook for classes in information policy? Or are notions of information policy so fluid that it would be impossible to create something as static as a textbook?

SB: The book was intended to be a “cross–over,” for general readers as well as scholars and students. It is in fact already being required for classes in information policy. The problem of how to treat an ever–changing situation in a work used as a course text faces any book that finds a place in the classroom. While there is no consensus on the definition of information policy and there is great variability in how that phrase is used, the definition used in Change of State is so broad that it should remain stable. The theoretical framework, methodological approach, and categories of types of social trends that need monitoring introduced in this book should also have enduring value.

On the other hand, details of the law and effects of trends will of course continue to change over time. This hasn’t yet been discussed with MIT Press, but hopefully the book will go into second and further editions over the years when and if there are enough changes in pertinent laws and regulations to make revisions useful. When this is done, it will also be an opportunity to incorporate what has been learned by researchers since publication of the first edition as well as subsequent theoretical developments.

FM: What was the biggest surprise to you while you were writing this book?

SB: I think this came while I was studying borders. Every time we go through customs at an airport inside the U.S. we experience what it means to turn the concept of “functionally equivalent borders” into practice, but I hadn’t realized how far use of this concept — which treats spaces inside the U.S. as if they were borders — had gone. There is even now a kind of mobile border that travels with immigrants into the U.S. wherever they go. This was pretty interesting, especially when taken in combination with the fact that the Department of Homeland Security does not need to follow the law when it is defending the borders. So far — to my knowledge — this unique set of operating conditions for the Department of Homeland Security has only come into play regarding environmental and labor law, but given the breadth and ambiguity of the pertinent language, we don’t know how far this could go.

FM: Would Claude Shannon agree with John Perry Barlow that “Information is experienced, not processed”?

SB: Probably not.

Shannon was actually an electrical engineer who worked for AT&T’s Bell Laboratories. He was absorbed in the problem of how to most efficiently get information through the telecommunications network, so his interest stopped at the edges of the network. He did acknowledge that information had to be encoded to get into the network, and decoded to get out of it, but in his model those processes were managed by technologies as well.

Shannon’s work became popularized — and stimulated what might almost be described as an obsession with “information theory” — but only after a collaboration with social scientist Warren Weaver suggested that the engineering model could also be used to think about human communication. In their influential co–authored book, people began to show up in encoding and decoding. The emphasis was still so much on a mechanistic view of processing, though, that their approach is often referred to as the “transmission” model of communication.

The “sense–making” theory introduced by Brenda Dervin offers an alternative way of thinking about communication as a process that is less mechanistic and more experiential than found in Shannon and Weaver and closer to Barlow’s perspective.
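The contrast can be made concrete. In Shannon’s measure, the “information” carried by a source depends only on the probabilities of its symbols, never on what those symbols mean to anyone — which is precisely the sense in which his interest stopped at the edges of the network. A minimal sketch (an editorial illustration, not from the interview) of that measure, Shannon entropy:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)).

    Note that the calculation uses only symbol probabilities --
    the symbols' meanings never enter the model.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries one bit per toss; a heavily biased
# source carries less, regardless of what the outcomes "mean".
print(entropy([0.5, 0.5]))   # 1.0
print(entropy([0.9, 0.1]))   # ~0.469
```

Whether the message is a love letter or line noise, the entropy is the same if the symbol statistics are the same — exactly the feature that experiential accounts such as Barlow’s and Dervin’s push against.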

FM: Do you see open source, open content, open access, open curriculum, and other flavors of “openness” as signs of change in personal and institutional attitudes towards information?

SB: The digital environment has forced us to rethink many things we used to consider unproblematic, or at least singular in nature. We now think about, for example, types of knowledge that are not communicated (“tacit”) as well as types of knowledge that are (“codified”). Openness is another example. We’ve long thought about such things as the “free flow of information” and “access to information,” but new conditions are encouraging us to reconsider just what we mean.

The problem of how to assert ownership rights in information used to confront very, very few of us, but now the problem is essentially universal.

Several factors are feeding into the openness movement. The problem of how to assert ownership rights in information used to confront very, very few of us, but now the problem is essentially universal. Most license agreements with ISPs, for example, even claim the right to use e–mail and other content sent through the ISP, so anyone who uses the Internet is a content producer from the perspective of copyright law. This in turn has led to a much deeper appreciation of the many different types of information and uses. As a relatively simple sense of information and its flows becomes much more complex, we’re seeing a lot of groping for position.

For some people and organizations that demonstrably means changes in attitude and practice. In other cases, it means differentiating between different types of information that were before treated as if they were all the same. For society as a whole, we’re seeing a great enrichment of ways of thinking about openness in all of its flavors. And certainly we’re seeing a renegotiation of boundaries among different industries and types of institutions that can result in what we experience as greater openness. Still, we shouldn’t forget that much of recent history shows first a movement towards openness, and then a backing away, and a number of open experiments have failed.

In my view it’s too early to know what the final configurations — institutional, legal, economic, and cultural — will be because we are still experiencing so much experimentation. Against the very long view, however, much of what is going on is less about changes in attitude and more about developing a much more complex understanding of when information should be open and when it should be more constrained.

FM: What will be the fallout in the music industry of the recent decision by EMI to make a portion of its music catalog available DRM–free?

SB: The interesting twist here is that it is the first time that the marketing technique of “versioning” has been applied to the law. The term versioning has become popular in the digital environment to refer to the very old practice of designing consumer goods of the same type with different characteristics to appeal to distinct demographics. The difference between first class and second class railway cars is a classic example — second class cars are deliberately made less comfortable so that there will be a motive to spend more on a first class ticket if you can afford it.

EMI is versioning its music catalog in three ways. In addition to removing the DRM restrictions that prevent sharing the music without paying the license fees or royalties required by copyright law, the company will also produce the music at a higher quality, and it will charge more. Buyers can pay a lower price for lower quality music protected by DRM, or a higher price for better music without DRM.

While in the past we may have been cynical about the ability of those who are rich to essentially buy themselves out of having to comply with certain legal requirements, doing so was never designed into either products or processes. This new move by EMI in essence commoditizes non–compliance with copyright law — you can buy yourself out of it.

... sharing music digitally tends to increase, not decrease, music sales.

Of course this development is also a recognition that sharing music digitally tends to increase, not decrease, music sales. In this, it is another example of the changes in institutional attitudes you asked about earlier.

FM: Will we ever see an international agency for information to deal with transborder issues? Or will existing agencies like the World Intellectual Property Organization continue as a de facto information court?

SB: Another area in which we are still seeing great experimentation. A decade ago it looked as if a number of information industries would be successfully globalized as a result of the World Trade Organization, but that international organization is now weakening in favor of bilateral or restricted multilateral agreements. The International Telecommunication Union and UNESCO continue to grope towards resolution of digital divide issues, but the effect is often more rhetorical than actual.

ICANN is of course the important new player, but it is not a traditional international organization. Instead, it is in essence a private sector entity that has created what we can think of as a parallel system of law to regulate the Internet. We are still in the midst of struggles among ICANN, national governments, and international organizations for that particular turf.

A lot of decision–making for transborder issues is taking place outside of any international organization. Contract law remains particularly important, often serving now as precedent for public law. Activities in cyberspace itself are generating new practices and institutions with law–like functions. And national courts are becoming internationalized both in terms of where cases are being adjudicated and the sources of precedent relied upon.

FM: You’ve said that Change of State was a 25–year project. Where does your work go from here?

SB: A number of issues that receive limited discussion need to be explored much more fully. Change of State identifies a few areas in which there has been some relaxation in the attachment between the law and facticity (claims to reliance upon fact, and specification of the procedures to be used in generating or identifying facts). The FBI, for example, is no longer required to claim that it is acting on factual information when it identifies someone as a target of suspicion and therefore of surveillance. I’m beginning to study this particular development wherever it occurs across the law.

Another area touched upon only briefly in the book that needs to be expanded upon is the problem of “inference attacks.” The security establishment uses this phrase to refer to the ability to reach conclusions considered politically or militarily dangerous to the United States on the basis of information to which one legally has the right of access. There is currently work underway to develop new techniques, whether of the law, technologies, or social practices, for protecting against inference attacks. Concomitant research needs to be undertaken on how we constitutionally distinguish among different types of information processing (modes of argument, types of inference, logics, or just plain thinking) for differential treatment under the law.

FM: Sandra, thanks for taking some time to talk to First Monday about information policy and Change of State. First Monday’s many readers around the world always look forward to your ideas and research in our virtual pages. Given the mercurial political scene in the United States and elsewhere, we guess that the next title for a new book might be Subject to Change Without Advance Notice!

About the Excerpt from Change of State
by Sandra Braman

By the early 1980s, it was clear that traditional legal and economic categories were inadequate for thinking about a policy environment that was rapidly changing as a result of technological innovation. Change of State is my response to that problem. It is really several books in one — a theoretical and conceptual framework for thinking about the nature of information policy and its role in the evolution of the state as a political form; a text on current law across almost 30 areas of information policy; an analysis of the impact on society of trends in information policy across those areas; and, a methods book on policy analysis itself.

As used here, information policy is an umbrella term that refers to all law and regulation of information creation, processing, flows, and use; more colloquially, all policy for information, communication, and culture. While U.S. law is used as the case for detailed study, the trends and effects identified are found in other countries around the world and in international decision–making as well. The theoretical and methodological framework can also be used to think about information policy for particular organizations or communities.

It is a key argument of the book that we can’t understand the effects of any particular law in isolation. Just as we experience the legal environment as a seamless whole, so our analyses must link up treatment of related matters wherever they arise. The first half of Change of State introduces the basic principles and history of information policy, along with the theoretical and conceptual framework used in the book. The more detailed legal analysis of the second half looks at the effects of the law as they are experienced in the areas of identity (of the individual and of the state), structure (of technological, social, and informational systems), borders of all three types of systems, and ways in which the law encourages, constrains, or prevents change in technological, social, and informational systems. For each legal issue, the work provides an overview of history and context, details the law as it stands today, and examines several social, political, cultural, and economic trends. The concluding chapter, republished here through the generosity of MIT Press, offers generalizations that appear when you look across all of this detail.

Three criteria were used in choosing which issues to include. Some are so obviously important that they had to be discussed (e.g., intellectual property rights and privacy). Others are quite important matters that are little understood outside of specialist communities (e.g., network interconnection and national archives). Because diverse aspects of the law change in very different ways in response to digital technologies, the book includes some issues with very long histories in which the law is changing relatively slowly but with deep cultural and social implications (e.g., libel law); some in which it is commonly believed that the law is quite stable but there has actually been dramatic change (e.g., border law); and, examples in which the impact on society can’t be discerned without examining an entire suite of inter–related issues (e.g., what the state knows about itself).


About the author

Sandra Braman is Professor in the Department of Communication, University of Wisconsin–Milwaukee.
e–mail: braman [at] uwm [dot] edu


Copyright © 2007, First Monday

Copyright © 2007, Sandra Braman

Copyright © 2007, MIT Press

FM Interviews: Sandra Braman
First Monday, volume 12, number 4 (April 2007).

