Modern disciplinary research is partly constructed, and limited, by the medium of paper. It is possible to bypass the restraints imposed by paper in modern Web publication. Still, the research sector keeps publishing as if the qualities of hard copy should be forced on the Web. This article discusses the role of paper in the construction of the boundaries of disciplines and the challenges from digital Web–based publication.
The purpose of this article is to argue that the ongoing shift from information on paper to digital Web–based information carries fundamental repercussions for the disciplinary research system. This is a new angle in a complex research area which can be called “the transformative character of research in the Digital Age”. This is a vital research topic to which numerous perspectives contribute. The most notable advances have been made within scholarly communication, a sub–discipline within library and information science.
Two anthologies, World wide research: Reshaping the sciences and humanities  and Scientific collaboration on the Internet  summarize a wealth of research on how information technologies enable both new forms of access and innovative genres of collaboration. These discussions are often tied to the concept of “e–research”, referring to the technologies that bind together researchers in new ways.
Others focus on transformations concerning basic features of research, such as the business model, peer review and post–publication evaluation . The concept of “science 2.0” is sometimes used, mostly when analysing the massive growth of data . It is also possible to view the transformation of research as a shift of power . Still, discussions that probe the shift from paper to Web in depth are missing. This is a gap I intend to deal with here.
The following article is concerned with the historical construction of the discipline as something tied to textual production on paper. I will, therefore, focus on the philosophical and practical dimensions and problems of shifting disciplinary based knowledge from paper to Web. This article, then, adds another layer to the critical discussion of the widening gap between the potentials of modern Web publication and the ideals of the traditional research publication.
1.1. Research as early and slow adopter
My main target will be, again, the conventional organisation of research in the form of the discipline. Disciplinary research is, for the purpose of this article, defined as an institutionalised research area with clearly defined objects of study, methods, boundaries, institutions, conferences and journals. This is a broad definition that allows for wide diversity. Nevertheless, it includes a generic ideal of "disciplining" the creation of knowledge within a bounded domain. As a result, disciplines tend to develop unique characteristics, expressed as consensus on state of the art concepts, standards, methods and perspectives.
My starting point is the observation that the research community has been an early adopter of digital technology, but very conservative in exploiting the potential of digital Web–based text. This is a curious situation. Why have researchers not continued to embrace new digital possibilities? One explanation, advanced in this article, is that disciplinary based scholarly work has been fundamentally tied to presentational properties of paper. It is, therefore, difficult to utilize sophisticated Web–based publishing without redefining the fundamental role of the discipline. This line of thought can be pressed further: Web–based technologies available today enable the renegotiation of authorship, publication and, indeed, the relationship between science and society.
In pursuing these arguments, I will suggest a link between presentation on paper, specialisation and the social character of research. I argue that the presentation of research results on paper is a highly restricted genre compared to what really can be done on the Web. Furthermore, the process of performing research is tightly interwoven with the presentation of results. If we change one, the other must follow. I understand this as the primary difficulty: there is a fear of renegotiating the procedures of research.
The title of this article carries two meanings. First, it alludes to the medium of paper being a neglected and vital feature for the historical construction of research practice. As someone with a long background in science and technology studies, I have a fondness for the metaphor of “social construction of scientific knowledge”. This is a perspective that has been fruitfully developed in science studies since the mid–1970s . This tradition maintains that the context of scientific practice contributes to the results of scientific investigation. To this, I here add the construction that can be attributed to the medium of paper.
The second meaning concerns paper constructs as something that should be seen as re–negotiable. By recognizing that research has been developed through a restricted medium (paper), it becomes possible to rethink many features of traditional science as “paper constructs” and go back to the drawing board.
I will deal with the following three research questions:
- How has the medium of paper contributed to the construction of specialised research disciplines?
- Which transformative aspects of Web–based digital information have been introduced in recent decades?
- How can disciplines be reorganised in order to adapt to the potential of Web–based information?
I will deal with these questions in the three parts of the article. The first contains a critical discussion of the evolution of the discipline and its privileged position in society. Although this mode of organization has shown great strength, there is an obvious problem. Disciplines have monopolized epistemic areas and erected strong boundaries. It can be argued that this is the main characteristic of the current system of knowledge production. While disciplines have become increasingly robust, there has also been a lack of crossbreeding between specialized experts. I argue that the development of conventions related to the discipline is connected to the restrictions of the medium of paper.
In the second part, I maintain that the introduction of Web–based digital texts enables us, in several ways, to move away from the traditional restrictions of paper. However, the practice of doing research has been bound to what paper actually allows. To embrace these new possibilities fully would be to radically transform the practice of research. So far, the research sector has not seemed ready to take on this challenge.
In the third part, I will discuss alternatives to the disciplinary oriented research system and suggest an organizational model that transcends the restrictions of the traditional discipline.
In this part of the paper I will review characteristics and philosophies of the research discipline, connecting these discussions to the ever present feature of publication on paper.
2.1. The strong discipline
One of the most fundamental dimensions of modern societies is the rational production, evaluation and dissemination of high–quality knowledge. With time, the production of high–quality knowledge has become the exclusive domain of the research community. This privileged position has also been legitimized through the idea of specialisation within different branches, disciplines that function as autonomous epistemic regimes. Educational systems, public libraries and even political systems have become the caretakers and disseminators of rational research based knowledge.
The fundamental ideas of research have roots in the works of Aristotle and the correspondence theory of truth . It was thought that ideas and reality in some way corresponded to each other. Although commonsensical in character, this assumption has a crux: language always expresses partial narratives of aspects of any given reality. Aristotle navigated this problem by suggesting that categories could capture the basic principles, the essence, of any object. Complexity was dealt with through hierarchies of categories. Different narratives were linked to separate concepts and held apart within the hierarchical framework.
The Aristotelian research system was, from the start, based on the notion of holding specialized areas of knowledge apart from each other. Writings on papyrus were restricted in length to what was convenient to roll together. With such a restricted medium of representation, specialized topics had to be inscribed in separate documents. Obvious digital features, such as linking different texts together, were not available. The medium of paper was, however, quite effective in communicating hierarchies and sequential narratives. This was the mindset developed as the producers of knowledge worked with paper. Therefore, knowledge could be seen as a tree–like structure, clearly demarcating entities apart from each other. Different papers and narratives were produced for various topics.
The correspondence theory of truth can be seen as the idea of catching reality on paper. Thereby, paper becomes the focal point for researchers negotiating the description of truth. In this sense, the written text appears superior to the spoken word since it can be formalized, revised, discussed and strengthened. The philosophical aspect of research would develop into the art of using textual representation that clearly describes the phenomenon based on available evidence.
The correspondence theory of truth would eventually be complemented with the coherence theory of truth, which meant that statements were seen as true if they fit together with other statements seen to be true. Implicitly, this created a system of linking texts together. The written statements of scientists could therefore be supported by other pieces of paper rather than references to the actual object of study. In turn, the coherence theory of truth stimulated the formation of specialised communities that could compare their various papers.
The idea of verification, introduced by empirical positivism, worked at two levels. On a micro level, observations between different researchers could be matched with each other. On a meso level, this created a system of trust in which researchers verified their findings through collegial discussions. In order to gain stature within the discipline, researchers were required to collect and read the papers by other specialists working on the same topic.
In practice, verification meant that researchers turned the correspondence theory of truth sideways, emphasising the coherence theory of truth. Instead of comparing the writings on paper with the phenomenon, researchers could compare their texts. This, once again, stimulated the formation of groups of scholars that had written texts on very similar issues. Otherwise, comparing became otiose. This created two distinct phases of the research process, sometimes referred to as the context of discovery and the context of justification. The added emphasis on the latter necessitated more collegial discussions. The coherence theory of truth therefore went well with other competing ideas of truth, pragmatism (it is true if it works) and consensus (it is true if we all believe so).
The restrictive character of the medium of paper is an important factor in this development. Texts on paper are physically separate entities that can be said to be held together by a primitive system of linking. It is effective enough to reasonably glue together texts that are very similar in character. However, this system of linking is wholly insufficient to deal with the links between texts that are related but not similar. Instead, our process of organizing scholarly publication serves to tie textual entities more closely together and, as a by–product, strengthen the boundaries between them.
Clearly, researchers benefit by working closely together, discussing similar issues. However, with time, social relations become intermingled with truth . We tend to believe the results of our fellow researchers to the degree that we trust them. Particularly in the natural sciences, researchers tend to be mutually dependent . A high degree of mutual dependence creates more stability, stronger boundaries and, unfortunately, less creativity.
Furthermore, for many disciplines the production of knowledge began with observable data. This emphasized measurement as a key research skill. However, in order to establish a mutual exchange of measured data, researchers had to standardize practices, exchanging formalized papers on how, exactly, good measurements should be made . Without this kind of systematic stabilization of measurement practices, it would be impossible to compare and aggregate values given by different scholars. As standardization hinged on research groups being well connected, different disciplines embarked on separate standardization processes. This, in turn, further strengthened the organizational principle of the discipline.
2.2. Philosophies of disciplinary research
The empirical positivism of the Vienna school has served as the core philosophy of twentieth century disciplines. This approach contained an interesting combination of empiricism (knowing through observing), rationalism (knowing through thinking, logic and mathematics) and, indeed, anti–realism. The positivists realized that linguistic constructions never represented the thing in itself . Instead, science made attempts at catching the most dominating drivers of what constituted each phenomenon. These were called “laws of nature” and were understood as mathematically measurable constants. In order to attain the level of laws, inconsistencies had to be cleared away. Nevertheless, it was hoped science would progress by understanding the world through these dominating principles.
Empirical positivism emphasized abstract logic and mathematics. Such skills are, by and large, paper–based and the emphasis on these further empowered researchers as textual craftsmen. It was also thought, for a substantial part of the twentieth century, that these methodological tools were universal and that they could and should be used by scholars from all different disciplines.
This notion of universal methodological standards can be seen as an attempt to significantly strengthen the weak system of linking. If all researchers spoke the same language, then it would be possible to compare texts from different disciplines. An important assumption within empirical positivism was the idea that disciplines were connected hierarchically . Therefore, any law of nature stipulated by one discipline was relevant for all below. Physics was deemed to be at the top of this scientific pyramid. This idea of a strictly held together tree of knowledge, which connected all research areas to each other, would slowly fade away during the twentieth century. The two giants of twentieth century philosophy of science, Karl Popper and Thomas Kuhn, would both, in different ways, contribute to a trend in which disciplines moved apart from each other.
Popper tended to turn empirical positivism on its head, arguing that the tedious detection and verification of law–like structures was both too slow and misdirected . He saw knowledge seeking as a personal intellectual project, suggesting that different researchers, despite the same base data, would move in separate directions. Therefore, theories were engaged in a kind of Darwinian survival of the fittest, competing to come as close to truth as possible. Final truth was never reached and verisimilitude, rather than truth, became the goal of science. Popper argued for a scientific practice characterized by openness and engagement with societal issues . Implicit in this recommendation was an idea that scientific ideas needed more than collegial support. Popper also maintained that researchers should be systematically self–critical, attempting to find flaws in their own positions. Theories should be abandoned if they could not produce predictions that matched data. With such an understanding, disciplines were allowed to disregard the positivist ideal of the hierarchically connected universal science. In other words, disciplines could ignore each other.
This angle was to be pursued by Thomas Kuhn, although he disagreed with Popper on the practical viability of falsification . Popper upheld the ideal of the self–critical and disinterested researcher and therefore found it effective for researchers to pursue daring hypotheses, readily discarding them if they were falsified. According to Kuhn, researchers were seldom that adventurous. They tended to be stuck in settings where the background theories were already fixed. Therefore, they labelled falsifications as “anomalies” and ignored them.
Although it was hardly his intention, Kuhn introduced the social dimension into the study of science, as collectives of disciplinary workers were seen as bound by disciplinary restrictions. Scientific success was reached by a process of disciplinary puzzle solving which reaffirmed the main ideas and principles of the paradigm. In this model, disciplines were seen to be self–contained and not only removed from society, but also from other disciplines. Different disciplines upheld distinct ideals, epistemic strategies and so on, to such an extent that they were unable to communicate with each other.
Kuhn emphasized the importance of the exemplary scientific text that served as a model for all other research ventures within the paradigm. Young researchers would read these texts and try to model them when writing their own.
Kuhn’s work has been widely influential not only as a description of scientific practice, but also as a normative ideal for strengthening disciplines. Furthermore, the success of Big Science in achieving both an atomic weapon and men on the Moon boosted the attractiveness of collective research groups that, nevertheless, were distinctly held apart. In practice, this emphasis on the collective nature of research has weakened the position of strong academic intellectuals and university departments. These were traditionally developed in line with the visions and competencies of the professor. Increasingly, scholars were, instead, controlled by specialised schools of thought that gained international hegemony.
A great number of empirical investigations within science studies followed up on the ideas of Kuhn . Typically, science studies suggested that the richness of the data, together with a tradition of emphasizing certain dimensions of the phenomenon (while ignoring others), created a kind of disciplinary prejudice. Furthermore, inconsistencies between data sets, that had to be compared, together with the obligation (in many disciplines) to translate all aspects of reality into mathematics, created a steady flow of interpretative choices.
The most established instrument for quality control in the research community, peer review, was in several studies seen as flawed . Although this gatekeeper function served to institutionalize and uphold valid quality criteria, it also tended to reinforce the paradigm. The peer review system can be said to be less effective in processing innovative research that challenged either the core or the boundary of the discipline.
In a series of papers, Silvio Funtowicz and Jerome Ravetz constructed a model of research that aimed at revitalizing normal science into post–normal science . The most essential instrument for this transformation was “extended peer review”. Funtowicz and Ravetz argued that what one discipline characterized as “anomalies”, appeared as the paradigmatic core of the neighbouring discipline. By allowing related researchers to become involved in extended peer review of research articles, constructive quality control and disciplinary development were subtly stimulated. Furthermore, extended peer review should also include knowledgeable practitioners outside the research community. Such non–academic experts could often supply valuable input on the implementation of knowledge.
The idea of extended peer review is an interesting strategy for revolutionizing the traditional linking system of research and both allowing and forcing connections between different disciplines. It is an idea that can be connected to the new tools available to us today. I will return to this later in this paper.
Contemporary theory of science has successfully followed Kuhn in understanding science as a collective venture. However, as philosopher of science Steve Fuller has argued, we need the ideas of Popper as well . In the context of this paper, I would like to emphasise the Popperian vision of the morally responsible scientist in the open society as relevant in understanding research in the Digital Age. Utilising modern Web resources, researchers can today move a bit more freely in relation to disciplinary collectives. Boundary spanning scientists with popular blogs directed toward the “open society” are, seemingly, acting according to several Popperian ideals. Nevertheless, the majority of research is still produced strictly within disciplinary boundaries. These are best understood through the tradition of science studies that follow Kuhn.
2.3. Stabilization and destabilization
In order to relate this historical discussion to the development of digital information, I will utilize the concepts of stabilization and destabilization. These are useful in describing the main thrust of scientific activity from a meta-perspective. I am building on ideas developed for a similar purpose by the French philosopher and anthropologist Bruno Latour in his analysis of how Louis Pasteur gained recognition for his research . Basically, science and technology studies, following the insights of Thomas Kuhn, have found that researchers favor a stable research context in which they can deliberate the finer points of an epistemic area. Researchers, therefore, strive to stabilize a research area with specific boundaries and conventions. This stabilization is necessary to regulate peaceful practices. Furthermore, there are also important external considerations. Disciplines function much like other institutions analyzed within new institutionalism . Survival is dependent on the establishment and negotiation of a position within a larger context of other institutions/disciplines. Stability can be translated into societal power as the strong discipline can be agenda setting, establishing frames of reference and key concepts. Often, stability is reliant on epistemic monopoly and a system of disciplines mutually recognizing the legitimacy of various boundaries and practices.
In its most successful form, disciplinary stabilization is a global project where institutionalization in different countries supplies mutual legitimacy. Ideally, there are professional groups that have specified needs that disciplines match. The privileged position of the academic discipline is promoted by professional groups that are involved in similar power struggles outside the Academy . These will disseminate and solidify the societal position of the discipline. Historically, professional societies have insisted on increasing levels of stabilization. Furthermore, it is just as vital that research funding agencies traditionally have been prone to favouring the most stabilized research areas: the strong disciplines.
Although the main tendency is stabilization, we also find a vital, but much weaker, movement toward destabilization. As researchers eagerly stabilize their ideas, they also critically scrutinize the foundational ideas of their intellectual opponents. Destabilization is also found in much scientific practice, in research seminars, peer review, examination boards, conferences, coffee discussions, etc. However, my main point in this section is that compared to the movement of stabilization, destabilization is relatively weak. The main vulnerability of the research system is that disciplines are allowed to follow their own path with very little interaction with neighbouring research areas that could serve as destabilizing catalysts.
I maintain that the medium of paper has contributed to the stabilization of scientific practice in many ways. Results are, in the words of Latour and Woolgar, “inscribed” on paper . Digital texts hold a potential for destabilization that is, quite simply, not possible on paper.
Arguably, many researchers find destabilizing settings extremely stimulating. However, as we have fostered a stable system of “disciplinary apartheid”, most researchers remain under–stimulated, working with the same ideas, day in and day out.
Naturally, much is gained by the stabilized research agenda. With strong perspectives, researchers produce insightful, in–depth understandings of specific phenomena. Furthermore, the stabilized agenda is a prerequisite for information/knowledge filtering, for knowing where to start looking for something interesting. Without these stabilized research agendas, we have difficulties in initiating interesting research projects. The problem that I am outlining here is not that specialized research traditions are unnecessary, only that they become too stable. We are, quite simply, too much concerned with stabilization and find very few rewards in pursuing destabilization.
I connect these ideas to the classical distinction between exploration and exploitation . Exploration can be associated with the development of new perspectives and skills, while exploitation concerns the refinement of existing perspectives and skills. Typically, there is a trade–off relationship between these two. When we are too concerned with exploration, exploitation will suffer and vice versa. As I adapt these analytical ideas to the discussion on disciplinary research, I find that the dominance of stabilizing practices leads to a focus on exploitation. In other words, we tend to become more proficient within areas of already established competence. In doing this, exploration suffers, or, is corrupted into becoming an activity performed within exploitation. We tend to explore only within our narrow competence, a subset of the disciplinary domain.
2.4. Science and society
The relationship between science and society has also followed a trajectory of increasing stabilization. Basically, science has inherited the traditional role of the Church in mediating truths to the members of society. In the old days, human behaviour was guided by the priesthood. As specialists in interpreting Holy Scripture, they were given the power to convey practical codes of conduct. Similarly, we today expect our schools to be organized according to the ideas of educational research and the content of education to be research-based. Naturally, healthcare is evidence–based and, indeed, all kinds of professions gain legitimacy through research. This is a fairly efficient construction with inbuilt mechanisms that increase epistemic quality over time. However, in the context of a modern democratic society, this model has some vital flaws. Building on the discussion on post–normal science, these can be seen as a lack of extended peer review.
There are two dimensions to the contemporary relation between science and society. First, the disciplinary apartheid, discussed above, leading to a lack of internal extension of the peer review process: disciplinary specialists monopolize feedback. Second, the hierarchical relationship between science and society as there is no external extension of the peer review process: only researchers are allowed to give input to research.
Within science and technology studies, there has been extensive criticism of the top–down character of science to society communication, leading to a lack of dialogue between researchers and non–researchers . Seemingly, the idea underpinning policy has been that researchers are to produce stabilized segments of truth which are then smoothly and efficiently placed inside the heads of non–researchers. Within the research field of public understanding of science, this has sarcastically been dubbed “the deficit model” .
2.5. The development of the read only culture
The evolution of a division of labour between a small community of knowledge producers and the rest of the world as consumers is actually part of a much larger context. Modern society is firmly built on allowing specialized professionals jurisdiction and in the twentieth century these ideals were connected to the structures of mass production, mass distribution and mass consumption.
In his discussion of how modern social technologies challenge traditional patterns of cultural consumption, Lawrence Lessig describes an older form of “read and write culture” compared to the “read only culture” of the twentieth century .
The production of culture was, argues Lessig, originally a localized feature. Creative expressions of music, storytelling and art were naturally situated in the family and the village. This situation was disrupted when new technology could communicate creative achievements of the very proficient to the mass audience. As the mass–market book became possible, local storytelling became undermined. Why listen to grandpa when I can read Jules Verne?
As mass culture developed during the twentieth century, we as a species became accustomed to the role of consumers. And this followed the same pattern regardless of content: music, movies, art, books, news and knowledge. The specialists produced stabilized products that were consumed on a massive scale.
The research system is therefore part of a larger societal pattern of division of labour. The whole system is now challenged by the development of Internet–based social technologies. However, the challenge to research carries some specific traits and in order to understand these, it is necessary to discuss the power related dimensions of information/knowledge on paper.
2.6. Information on paper as power
Information on paper has served as an effective tool for control. In this way, the scientific journal article, the scientific journal and the scientific monograph have been vital tools for creating a well regulated system for upholding expertise and authority. The publication system has followed disciplinary boundaries. By allowing disciplines to have their own mouthpieces, unchallenged by potentially destabilizing neighbours, boundaries have been reinforced.
Before proceeding with this argument, I must, once again, point to extensive advantages with this differentiated system. However, I am in this article concerned with various downsides of the system. Therefore, it is important to note that the publication system is tightly controlled by disciplinary collectives through various gatekeeper functions that regulate access to formal knowledge production, e.g., prestigious publication.
Although the scientific community allows freedom in establishing research journals, ranking systems enable strict hierarchies. In many cases, publications have not been evaluated based on their actual content, but rather on where they are published.
By and large, it has been possible to transfer this system to digital landscapes of research publication. As argued initially, the research sector was actually an early adopter of the new technologies. The most radical changes were made in the late 1990s and a few years into the new millennium. After this, relevant actors have strived toward stabilization. One of the most ambitious journals in adapting to the new technology has been the prestigious British Medical Journal. In 1997 the editors invited readers to make predictions on what online articles would look like five years hence. This was followed up in 2002 . The five most common themes suggested by readers, according to the 1997 predictions, were graded on a scale from 1 to 10 depending on the degree of manifestation in 2002. These results illustrate the lack of innovation since the late 1990s:
- Scientific articles will become living documents, always updated and never finalized. Realization: 1.
- Online articles will be supersets of the paper publication. Realization: 3.
- Links will be greatly used. Realization: 5.
- Articles will be available in different work versions and with different levels of complexity. Realization: 2.
- The peer review system will be transformed. Realization: 4.
As of writing, almost a decade later, I find this summary still valid as a description of research publication on the Web. Indeed, some of these ideas reflect the potential of a more dynamic Internet, features that we associate with the social technologies of Web 2.0. However, such tools have, by and large, struggled to gain a foothold within established research. The digital publication landscape for research is still built on Web 1.0 ideals, which closely mimic paper publication.
In this section, I will first introduce the notion of publication as expensive and therefore exclusive/excluding. I will argue that publication on paper is a restricted process that has contributed to a construction of social structures with strong boundaries and many drawbacks. Following this, I will discuss a few features of research publication on the Web that can be seen as unexploited potentials. These constitute a significant challenge to our traditional system of organizing scientific knowledge in relation to society.
3.1. Paper–based information is expensive
Editing information on paper, printing it and distributing it is quite expensive. In the twentieth century, we knew of nothing else and therefore had difficulty reflecting on it. Now that we can compare it with digital publication on the Internet, the drawbacks of the expensive paper–based publication system become clear.
As information on paper is costly, it becomes exclusive. Only those who acquire publication funds in some way or another are allowed to put their ideas in print and have them distributed to a wide audience. In the case of research, a complex funding system has been created in which commercial publication companies have constructed prestigious journals. The scientific community has, discipline by discipline, provided high–quality peer review, usually free of charge, while university libraries have paid through subscriptions. When the paper–based structure was carbon copied to the digital, the high costs of research journals were carefully reinforced .
This system is restrictive in several ways. First of all, as scientific publishing is costly, only well–written, original and high–quality research is allowed access. This forces researchers to stay on topic, within the discipline, strictly in the area where their expertise is acknowledged. Although there is much to appreciate in this structure, it tends to further encourage narrow specialisation. Originality is most commonly gained through increased specialization.
A second drawback is that research of a non–original character, such as integrative research overviews and replication, becomes non–publishable and, by extension, not funded .
A third restriction lies in the high demand for discipline–based specialist competence that is required to write a high–quality article. It is often necessary to be well seasoned in the language of a specialised area of the discipline in order to structure and compose an article fit for publication. This can serve as a gatekeeper against younger researchers.
Fourth, costly publication becomes, by necessity, formal, finished and polished. Researchers have much need for testing ideas, toying with them and engaging in a debate without having invested professional prestige in certain conclusions. Granted, there are seminars, workshops and conferences that allow these kinds of interactions. However, in the non–digital age, these are exclusive, local discussions. To the wider audience, research publications appear polished and final.
Fifth, as articles are formally finalized for expensive publication, they are fixed in time. This mode of freezing a polished set of ideas in time would appear natural within the paper publishing system. However, articles quickly become dated as updates are prohibited. More seriously still, there is an obvious risk that researchers find themselves bound by previous publications. This, in turn, may inhibit intellectual growth and an open academic debate.
Obviously, the expensive character of paper–based research publication holds many drawbacks. I will now turn to a more philosophical discussion on paper as restrictive medium for research.
3.2. Paper as discipline
Compared to digital publication, the most striking aspect of paper–based printing is its multidimensional restrictions. Paper–based space is always restricted. A journal or a book only has a limited number of pages at its disposal. Often, articles must fit within generic limits: not too short and not too long.
When writing on paper, researchers have to be efficient in delivering their original ideas. As a consequence, research results are often painfully simplified. Deliberations on methodological details, actually vital for evaluating quality, are usually held back.
Furthermore, discussions relating to uncertainties, interpretation, faulty measurements or bad data, etc. may also be compromised. Since researchers have to be brief they are usually not allowed the luxury of pursuing several possible interpretations . Articles must be original in order to be published and, therefore, the researcher must not give too much credit to previous, competing, research narratives. The article must be convincing and this, in turn, leads to clear and effective storylines.
The disciplining character of paper seduces the researcher into linear storytelling where all parts of the narrative make sense together. This is congruent with an empirical positivist perspective, as the fundamental features of phenomena are taken to be orderly arranged. This one–dimensional rational narrative fits less well with modern perspectives in the human and social sciences that acknowledge the existence, and justification, of conflicting narratives.
The research account on paper is linear as each segment of the text is fixed in a certain order. Sequencing, in this manner, automatically creates narrative structures, implicitly suggesting that this is the only relevant narrative on the topic. Arguably, for any phenomenon, there is a great number of possible narratives. As the research text is forced to fit into the disciplining shackles of paper, there is a tendency to highlight one interpretation and make it the basis of the narrative. As a consequence, the narrative actually serves to hide a number of competing interpretations and meanings.
Internet–based digital text lacks, in several different ways, these restrictions of paper–based information. It is possible to identify at least six ways in which digital information can go beyond the restrictions of paper:
- The potential of sidestepping hierarchies.
- The potential of updating.
- The potential of collective authorship.
- The potential of space.
- The potential of commenting.
- The potential of social tagging.
These themes will be explored in the next sections of this paper.
3.3. The potential of sidestepping hierarchies
The orderly shelf–based system has promoted a restricted research organization. Although research results may be relevant for other disciplines, the hierarchical system of categorization has served to glue research results to the disciplines. However, digital Web–based publication allows us to disconnect from pre–established hierarchies. This is, indeed, a revolution in publishing.
The earliest usage of the metaphor “there is no shelf” probably comes from two talks given by Clay Shirky in 2005 . Still, the principle itself had been revolutionizing Web–based digital documents ever since the fixed and hierarchical categories of Yahoo.com were overrun by Google in the late 1990s. Following the Aristotelian tradition, we have assumed that an individual document can only be in one place at a time. As a consequence, all information has been pressed into a fixed position in a hierarchical tree of knowledge. In the material library, it is not possible to place the same book in two places, even though its content may cross several boundaries.
So far, digital research publication has mostly pursued shelves. However, a search on Google Scholar sends the researcher into a mixed research landscape with no shelves. Results from medicine can be mixed with technology, social science and the humanities. Furthermore, while citation counts are highlighted, other markers of status are ignored. Therefore, a frequently cited PowerPoint presentation or blog entry can be placed next to (and perhaps above) the prestigious journal article.
Amazon.com is another major Web actor that has discarded the shelf system. Instead, the reader is supplied with a wealth of metadata and links in order to add precision. Amazon.com indiscriminately mixes not only disciplines, but also places journalists and amateurs side–by–side with researchers. Information services such as these entail a fundamental break with the tradition of organizing research by discipline. These are subtle revolutions that quietly undermine the traditional system of strong boundaries both between disciplines and between science and society.
3.4. The potential of updating
While information on paper is printed, inscribed and fixed, digital information is easily revised and updated. This actually carries extensive potential for all kinds of writers. Nevertheless, different categories of writers will relate to this in various ways. If I had written and published a novel five years ago, it would make little sense for me to revise it today. The situation is similar for journalists: their articles are supposed to be consumed when fresh.
Arguably, researchers are the writers with the most to gain from updating. They produce narratives that, as best as possible, supply insights and explanations that are meant to last. As many research fields are developing quickly, it would actually make sense for researchers to regularly revise their most significant articles. This is often done for the handful of monographs that are printed in a second edition. Frequently, it seems that monograph authors welcome the opportunity to update their data, include or exclude case studies and address the comments of critics. However, the transformation of journals from print to the digital medium has been business as usual. So far, the digital version has been seen as a copy of the printed article.
Allowing researchers the privilege to update and continuously revise their publications would indeed be revolutionary. Researchers could develop their ideas in response to new developments within (and outside) their research field. Naturally, the system of referencing would have to be modified so that there were clear references to what version of the article served as a source. This would also create more transparency as it would be possible to follow how ideas were developed from one version to another.
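Such version-aware referencing could, in principle, be quite simple. The sketch below is purely illustrative: the class and field names are my own invention and do not correspond to any existing citation standard.

```python
from dataclasses import dataclass
from datetime import date


@dataclass(frozen=True)
class VersionedCitation:
    """A citation that pins a specific revision of a 'living' article.

    All names here are hypothetical, invented for illustration.
    """
    author: str
    title: str
    version: int    # which revision of the article is cited
    published: date  # the date that revision appeared

    def format(self) -> str:
        # Render a human-readable reference that names the exact version.
        return (f"{self.author}, \"{self.title}\", "
                f"version {self.version} ({self.published.isoformat()})")


# Citing version 3 of a hypothetical living article:
cite = VersionedCitation("Doe, J.", "Paper as discipline",
                         version=3, published=date(2010, 6, 1))
print(cite.format())
```

Because each version carries its own date, a reader could in principle trace how the cited ideas developed from one revision to the next.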
Ideally, funds would be allocated to allow researchers to update their most important contributions with some regularity. Scientists could be somewhat less concerned with the steady production of new articles and be able to continuously sharpen and broaden the content and quality of certain articles. Researchers could also be allowed to experiment and publish unfinished drafts, knowing that several different narratives could be developed. They might even allow other researchers to take control over further idea development. Such a development is now possible as digital information on the Web makes truly collective authorship possible. There are substantial advantages with such arrangements and this will be dealt with in the next section.
3.5. The potential of collective authorship
“The author is dead”, argued poststructuralists such as Jacques Derrida and Roland Barthes in the 1960s and 1970s . As with many philosophical and sociological ideas, this one makes more sense with the coming of cyberspace. Derrida argued that each text was dependent on other texts, in turn building on earlier writings and so on. As a consequence, the seemingly original ideas of the individual author could always be deconstructed into a number of distinct influences. An extension of this argument is that thinking is not only a property of a mind. Thinking always includes ideas taken from other people.
The idea of authorship and intellectual property is tightly connected to the tidy practice of publishing on paper. As has been argued earlier, this allows a measure of control over production and distribution. Authorship is also bound up with the hegemony of the “read only culture”. This development has also been key to conventions for textual classification . In other words, we have based our system of organizing texts on authorship. As we increasingly settle into a Digital Age where intellectual property can be claimed by anyone, we have to re–evaluate the societal role of strong authorship. This, in turn, challenges our conventions for cataloging and organizing information.
The “read only culture” has celebrated those great people with specialized access to publication. In the twentieth century, it became important for us to know who was creating what. Whose genius, credibility, perspective, money or experience was in play in that work of art, scientific article or commercial product?
No other genre within “read only culture” has made such massive use of the idea of authorship as research. Essentially, the system is underpinned by healthy criteria of source criticism. It appears important to know who wrote what since this impacts credibility. There is also status and motivational drive connected to the attainment of publication. This works both ways as clear authorship makes it possible to hold people accountable when things go wrong. Furthermore, we can collect the ideas of an individual researcher from paper to paper. The impact and “quotability” of an article rely to a large extent on the prestige of the author. However, while all this is quite fruitful, the idea of authorship has been taken to its extreme. There are rules for authorship, citing and co–authorship. These rules have enabled the bibliometric project of measuring author, journal and article impact. Originally, bibliometrics was an interesting attempt at understanding the research system through quantitative approaches. It is now frequently used to measure individual distinction as well as to guide economic investments in research projects/institutions.
The evolution of the collective text tends to challenge our preoccupation with individual authorship. In the 1990s, it was obvious that the Internet could facilitate networking and collaboration. Initially, online collaboration was most evident in open source software; Linux is a textbook example of collective work . Social network sites and online communities, specialized in different areas, created new tools for collaboration. Eventually, the business world began exploiting these tools in a number of corporate enterprises, including social media platforms.
One of the most exciting ideas coming out of this discussion concerns the advantage of collective Web–based negotiation of text. Such a procedure leads to the development of text as a meeting of minds rather than the narrative of a single mind. This kind of collective text eventually loses all sense of authorship and becomes truly collectively produced.
In a number of experiments, sociologist James March has found that homogeneous groups tend to perform poorly when compared to groups with a more heterogeneous composition . When new members are brought into any group, performance is enhanced if the newcomers add considerable diversity. The level of diversity is much more important than the level of competence.
These ideas have been pursued systematically by Surowiecki, arguing that groups with a more diverse composition become better at problem solving . Apart from diversity, Surowiecki emphasizes the importance of independence as well as a context in which individuals are allowed independent expression. In conventional social structures with implicit and explicit hierarchies, independence of thought is fragile. People tend to adjust their opinions according to the ruling ideas of their organization.
Within online contexts, independent expression is considerably facilitated. Furthermore, collective group formation and collective work online are very different from, and easier than, group dynamics off–line . Organization can emerge bottom–up, driven by intrinsic values rather than the extrinsic rewards of traditional organizations.
The potential for collective work online is, furthermore, magnified in the 2010s. Smartphones, together with laptops, have now become drivers for mobility and cloud computing. Typically, Microsoft Office 2010 enabled cloud computing according to the ideals of Google Docs. The tools for sophisticated collective work are now integrated into our standard software.
Although collective authorship is a common genre in research, online collaborations would seem to pose a new challenge to the current system. Already, we are seeing young researchers bouncing ideas to and fro on social platforms. Eventually, it will be difficult to credit any individual with the sequentially developed breakthrough ideas. The potential of moving from collective authorship to instances of non–authorship is dramatic and challenges current structures and practices of research.
3.6. The potential of space
As remarked earlier, research writing on paper is marked by being restricted in space. Paper is expensive and authors are held to a fixed number of words. In this respect, digital publishing is quite revolutionary. Editors of digital research journals, not bound to print, do not have to put a number of print–ready articles on the back burner due to a lack of paper space. Instead, it becomes possible to publish as many articles as needed in a single issue. Of course, this also leads to a breakdown of the traditionally held form of a sequenced order of articles. Now, articles may not even be page numbered in sequence. The new issue of a journal is not meant to be picked up as a whole and browsed through. It is, instead, meant to be a searchable archive.
As publication space is open for everyone, researchers can also publish articles themselves. Researchers with fresh and controversial ideas may tire of having a tough time in peer review and instead publish on their own Web site. Research blogging is a huge area and, for the successful blogger, content on a blog may reach a wider audience than traditional journal publication. It is also much quicker.
This also makes it possible for researchers to publish their texts in different places and promote them in social networks and social media.
This is only the beginning. Researchers have not yet taken full advantage of the endless extra space, thanks to nearly unlimited storage, allowed by hypertext.
There are two major transformations waiting to happen in this area. The first concerns the opening up of narratives. As argued earlier, the restrictions of the paper–based text force the writer to choose among possible interpretations and then build a congruent narrative based on this choice. As researchers begin to master the art of writing with hyperlinks, it becomes possible for them to pursue several different interpretations. The researcher will, no doubt, increasingly grasp the opportunity of incorporating a number of side issues through hyperlinks, much as in a Wikipedia article.
In this way, researchers can counter the print tradition of hiding alternative interpretations. As argued earlier, the restrictions dictated by paper actually serve to solidify a discipline. The lack of space does not allow the writer to communicate more than a select number of the extensive complexities of a discipline. With hyperlinking and open–minded researchers involved in collective authorship, it can be possible to go far beyond what is possible to know within the confines of the single discipline.
The second major transformation concerns transparency. In research, a great deal of high–quality peer review work is done, but very little of it ever appears in print. Once a paper is accepted, critical comments on it are history. An important argument for throwing away high–quality discussion is that paper is expensive. An added reason would be that exposure of the internal strife within science could undermine public support. It might also be argued that few readers would actually be interested in this detailed discussion of research results. However, the digital age supplies both new opportunities and a new social context. If we are really serious about the quality of research, the discussion between reviewers and authors should be made available. It would then be possible to follow sophisticated discussions on the finer points of a given paper. This would create more transparency for other specialized researchers, science studies scholars, historians of science, policy–makers and others involved in the scrutiny of research.
3.7. The potential of commenting
The commenting function is closely related to discussions on updating, transparency and space. It is, however, a quite old and traditional feature in research. Paper–based journals often include “letters to the editor” or other kinds of remarks on previous publications. However, digital technology allows a much more intense form of interaction . This creates a shift in academic debate. Earlier, a researcher could comment in the following issue and thereafter, the original author could reply, in turn, when the next issue of the journal appeared. If the journal appeared in print four to six times a year, this sort of discussion could be very slow.
Digital Web–based publication can allow academic debates to proceed according to the natural dynamics of the discussion. However, the academic community has been slow in adapting to the potential of quick commenting. This is a practical problem for journals that still regard the paper–based version as the original and the digital as a mere copy: digital publication will then include interactions that are not present on paper, making the Web “copy” richer than the original.
However, instead of seeing comments as a problem, they could be viewed as a resource. Experiments with reader scoring have indicated that this system can replace the old peer review system .
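A reader-scoring scheme of the kind these experiments point to could, in its simplest form, be sketched as follows. This is a hypothetical illustration, not an existing system; the function name and the minimum-raters threshold are assumptions of mine.

```python
from statistics import mean


def aggregate_reader_scores(scores, min_raters=5):
    """Summarize reader scores for an article.

    A hypothetical sketch: an article only receives a public
    rating once enough independent readers have scored it,
    so that a single enthusiastic (or hostile) reader cannot
    dominate the result.
    """
    if len(scores) < min_raters:
        return None  # too few raters for a meaningful rating
    return round(mean(scores), 2)


print(aggregate_reader_scores([4, 5, 3, 4, 5]))  # enough raters: 4.2
print(aggregate_reader_scores([5, 5]))           # too few raters: None
```

In practice such a system would also have to address who counts as a qualified rater, which is precisely the boundary question raised below.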
Such suggestions will, of course, lead to discussions on the boundaries of the discipline. Who should be allowed to contribute as a reader? Should only the disciplinary colleagues have this access? Or should it be open for other researchers? How about professionals and other members of the general public? These questions are also relevant when discussing the digital Web–based potential of social tagging.
3.8. The potential of social tagging
The most controversial tool for research in the Digital Age is probably social tagging. Researchers are old hands at tagging, supplying keywords that identify their text. This becomes a method for identifying the various meanings of a paper. However, any research article carries a vast potential of associations and much of this is likely to be beyond the grasp of the author. So, why not allow your readers the opportunity to add their own keywords to your article?
Obviously, social tagging has been a gigantic success in a number of social media, following on the pioneering work of Delicious.com and Flickr.com. From a philosophical perspective, such work utilises a kind of collective intelligence in order to supply ideas, pictures and text with more meaning. These practices were not possible when working on paper. Typically, a librarian would document metadata on a small index card. The amount of description that was possible about a book was limited by the size of the card. In the digital age, space is not a problem. New “index cards” that connect metadata to a document can actually be larger than the very thing they describe.
This is certainly a leap forward as it serves to undo the strict labelling of authors. While we earlier were forced to hide different interpretations, restrict meanings and go with the single narrative, it now becomes possible to add as many meanings as possible. A traditional problem is that extensive tagging is time–consuming. With social tagging, this becomes a side effect of large numbers of individuals browsing the same information.
As social tagging developed, a new genre for defining/identifying was created: the tag cloud . There are two variations of the tag cloud. First, there is automatic generation and visualisation of the most frequent words in a given text. This becomes a kind of introvert tag cloud, a visualisation of the main concepts of the text, seen by the text itself. The second form is actually much more interesting. It is a visualisation of all the tags that a large number of users have given a photograph, text or Web resource.
This second kind of tag cloud is usually clickable and therefore becomes a kind of mind map for the Digital Age. As users, we can click on a large number of different interpretations that have been given a certain object and follow it through different taxonomic structures. The tag cloud actually enhances a given research article. It is a dynamic integration between text, figures and headings. At its best, it supplies both an overview on competing interpretations and easy access to these alternative narratives.
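The two variations of the tag cloud can be illustrated with a minimal sketch (the function names, sample text and tags are invented for illustration): the first counts the most frequent words in the text itself, the second counts the tags that many different readers have attached to the same article.

```python
import re
from collections import Counter


def introvert_cloud(text, top_n=3):
    """Tag cloud of the first kind: the most frequent words in the
    text itself, with a trivial stop-word filter."""
    stop = {"the", "of", "a", "is", "and", "in", "to"}
    words = [w for w in re.findall(r"[a-z]+", text.lower()) if w not in stop]
    return Counter(words).most_common(top_n)


def social_cloud(tag_lists, top_n=3):
    """Tag cloud of the second kind: the tags that a number of
    readers have attached to an article, pooled together."""
    return Counter(tag for tags in tag_lists for tag in tags).most_common(top_n)


text = "Paper disciplines research. Paper restricts research narratives."
print(introvert_cloud(text))

readers = [["open access", "publishing"],
           ["publishing", "paper"],
           ["publishing", "open access"]]
print(social_cloud(readers))
```

The second function captures what makes the social cloud interesting: the counts do not come from the text, or its author, but from the accumulated interpretations of its readers.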
Could research articles be allowed this kind of openness? What could be lost and what could be gained by allowing readers to tag research articles?
3.9. The fear of losing control
Those who aim to uphold paper as the norm for digital research publishing may defend the tradition through different arguments. Most of them are, justifiably so, rooted in a fear of losing control. It is also important to note that many societal actors rely on scientific publishing remaining as it has been. An adaptation to the potential of digital information may also be contrary to the interests of the large publishing companies that own many of the most prestigious journals. This process of adaptation is most fundamentally a policy issue, as a re–negotiation of the relationship between science and society is at stake. This is important to emphasize, as it is by no means obvious. However, I have difficulty in coming to any other conclusion than that the current societal position of science is in several ways tied to the values of paper publication. Digital tools, such as social tagging, create openings for a closer relationship between the producers and consumers of science–based knowledge. That does not knock science off the pedestal, but it tends to lower the height of the pedestal.
There is, indeed, much to be apprehensive about if science loses some of its mystique and authority. Traditionally, the rationality of science has been placed against the superstition of the masses. A variation of this theme is to pit science against religion, most clearly when Darwinism is challenged by creationism. There have also been discussions on the threats of populism, irrationalism, commonsense, anti–science, pseudo–science, amateur researchers as well as the philosophical schools of post–structuralism, post–modernism, epistemic relativism and social constructivism.
Many research disciplines actively act as gatekeepers against:
- Professionals who perceive themselves (and some of them rightly so) as competent researchers.
- Amateur researchers who usually (but not always) are far from being able to understand the nuts and bolts of research.
- Students that feel that they have understood more than their teachers (and in the Digital Age they frequently have a point).
- Colleagues from another discipline who are interested in expanding the explanatory territory of their pet theories.
- Interdisciplinary, boundary–spanning researchers, attempting to tie things together and allow the discipline to be part of a larger context.
In science and technology studies the concept “boundary work” is often used to describe how researchers police the necessary borders between science and non–science, as well as between different research areas . My argument is that the destabilising potential of digital publication has an impact on the character of traditional boundary work. I am not saying that this is the end of boundary work, only that these new technologies allow more destabilisation. And that this is good for research, as well as for society.
I would also like to emphasise that there is an urgency involved in dealing with these issues. As most large corporations have already learned, in the Digital Age, there is a loss of control over the brand . We have entered into a reality where a major part of brand development is made by customers through YouTube, Facebook, Flickr and Delicious. The success of Wikipedia signals that researchers may in the same way lose control over knowledge. Already, researchers active in areas of societal relevance are finding that non–academic readers are utilising their research for their own purposes. Most of this will entail a misreading of researcher intentions, due to the lack of scientific know–how. Some of it, however, may be in line with their original ideas. There is even a likelihood that a number of outsiders will fruitfully develop ideas from research. The bottom line is that researchers do not have the same amount of control as in the last century.
The larger issue that comes out of these technologies is an enabling of bottom–up movements that destabilise disciplinary boundaries. It seems to be time to rethink the institution of the discipline given the Digital Age. Indeed, it has been suggested that “e–research” is the turf on which disciplinary and interdisciplinary struggles will be played out . This is an exceedingly complex issue that invites a discussion far beyond the ambitions of this paper. Nevertheless, I will later in this article introduce a few ideas.
This part of the article starts with an historical overview chronicling how the research sector has responded to the development of digital opportunities in recent decades. This will be followed by a presentation of traditional alternatives to the discipline. I believe that the opportunities supplied by the new technologies invite us to pursue models beyond these traditions. I will close the article by suggesting a few new ideas. I am here concerned with structures that balance stabilizing and destabilizing aspects of scientific practice.
4.1. Research in the Digital Age
Arguably, the magnitude of the transformation from paper to the Web has been somewhat disguised by its incremental character. It is possible to detect four different steps, rather than one gigantic leap, in the evolution of digital resources beyond paper.
The first of these concerns the evolution of digital text, which started in the 1960s and became standard practice in the 1980s. It became possible to ask the question “what is text, really?” with experience beyond that of paper . D.F. McKenzie divided text into two contrasting categories . Some text was closed, fixed and finished while other text seemed to be open, unfinished and constantly reinterpreted. This also suggests an implicit contrast between analog and digital publication.
The second step in the shift away from traditional paper–based information concerned Internet distribution. Roger Chartier described this development as a “library without walls” . Text that earlier was a “prisoner” of physical containers could now move freely in a new form. “The opposition long held to be insurmountable between the closed world of any finite collection, no matter what its size, and the infinite universe of all texts written is thus theoretically annihilated... .”  The real breakthrough came in the 1990s after the World Wide Web enabled easy browsing of Internet resources.
As I have argued earlier, the research sector, like most authorities, professions, large corporations and media, was an early adopter of the Web. Researchers quickly stabilized their practices according to the logics of the Web. However, researchers seemed to view the Web as merely a new mode of distributing information. The Web of the 1990s has in retrospect sometimes been called “Web 1.0” or “the read–only Web”. In other words, it was still possible to see this development as an extension of paper–based practices. The idea of a “digital library” created an image of business as usual, and the Web was seen as a traditional top–down distribution channel.
The third step can be referred to as “Web 2.0”, a concept whose precise definition is, however, highly contested. Nevertheless, it created a fundamental difference in our way of managing information compared to the changes of the first and second steps.
The concept of Web 2.0 was coined by DiNucci, who posited a future in which the Web would be more interactive and hold a more central position in the lives of individuals. The concept was popularized by Tim O’Reilly a few years later.
Until recently, Internet researchers had difficulty in understanding the scope of these changes. In the 1990s, the Internet was perceived as cyberspace, a world removed from reality . In the early 2000s, a number of studies situated Internet use as a part of real world practices.
Increasingly, Internet studies served as an instrument to understand not only behavior on the Web, but also off–line practice. As Richard Rogers has observed, Internet giants such as Google and Amazon are far ahead of researchers in performing user studies.
Arguably, the Internet has over the past decade steadily strengthened its central position in modern life. In recent years, we have seen a convergence of three trends: growing social media activity, mobility (smartphones) and cloud computing (running applications from the Web rather than from the local computer). In the 1990s, a user had to sit down at a computer in order to temporarily access a rather slow Web. Today, most Internet users have constant, easy and fast access to Internet resources.
I see this as strengthening the fourth step in the transformation away from paper–based information and thinking: digital Web–based information becomes the default. This means that text is produced for the Web first and for paper second. Increasingly, then, all kinds of publications, including scholarly ones, will utilize a broad range of digital textual resources. Paper printouts will become inferior, one–dimensional copies.
This important shift of standard away from paper has been ongoing for several years, but the implications, discussed in the previous section of this article, are so far–reaching that research has not yet adapted.
4.2. Disciplinary, multidisciplinary, interdisciplinary and transdisciplinary work
Disciplinary research was at the outset of this paper defined as an institutionalised research area with clearly defined research objects, methods, boundaries, institutions, conferences and journals. It was tremendously successful and expansive during the twentieth century. It is an excellent model for organising the production of knowledge, as specialists discuss domain–specific problems. However, as I have discussed in the first part of this article, there are also problems in the way we erect boundaries. Some of these restrictions have been forced upon us by the firm constraints of paper publication. Therefore, today, we can do better.
What, then, are the alternatives? During recent decades, policy–makers have frequently been frustrated by narrowly specialized experts who are unable to advise on broader issues. Policy–makers have become increasingly aware that psychological, cultural and societal dimensions are relevant to issues that earlier were monopolized by the natural sciences. For instance, policy–makers can no longer be content with advice from meteorologists in relation to climate change. Also relevant are economic research (on the costs of switching to renewable energy), geological/biological/technological research (for developing renewable energy) and a wide range of social science (for understanding how to steer citizens towards sustainable energy usage).
Science policy has increasingly emphasized collaboration across boundaries. Researchers have usually responded with multidisciplinary research, which can be defined as collaboration between researchers who do not share core values and assumptions. Unfortunately, this often remains unchanged throughout the project. The outcome can then be documented in an anthology in which representatives of different disciplines are responsible for separate chapters. Usually, multidisciplinary projects contain no real driving force for crossbreeding, since project members are trained in, and loyal to, their respective disciplines. There is, of course, a certain degree of learning, mutual understanding and respect, but it grows slowly. Project members will usually find their colleagues to have interesting ideas, but view them as misguided.
Interdisciplinary research is a more intense venture. In a weak interpretation, only methods are exchanged while disciplinary loyalty holds firm. The stronger, and more interesting, version can be understood as collaboration between representatives of different disciplines who, nevertheless, make progress toward sharing core values and assumptions. For many reasons, this is more difficult than multidisciplinary research. Truly interdisciplinary research takes time, as it also includes a reprogramming of certain disciplinary dogma. Therefore, huge funding structures are needed, guaranteed for an extended period. At the same time, there must be individual researchers who are willing to temporarily or permanently abandon their institutions. Often, there are difficulties in coming home once the project is over, as researchers have not published within the discipline for many years.
Interdisciplinary research is better at producing high–quality criticism than normative suggestions or theoretical descriptions. In other words, it is basically a destabilizing practice which serves to catch up with all the problems that have accumulated under “disciplinary apartheid”. As so much fresh complexity is recognised, interdisciplinary researchers may have difficulty committing to stabilized accounts of the phenomenon studied.
Successful multidisciplinary and interdisciplinary research may often lead to institutionalisation, often under the label “interdisciplinary studies”. Typically, stabilization is thereafter pursued and the ideals of the discipline are adopted as university departments are formed, linked internationally through conferences and journals of their own.
In my mind, it is time to revitalise the basic ideas of transdisciplinarity, which aims at producing knowledge that is disembodied from any disciplinary home. This idea was launched in the early 1970s as a response to the failure of disciplines to solve societal problems. The main idea focuses on mechanisms for linking and synthesizing successful research from different disciplines in order to address the most urgent challenges of our times. One obvious problem, in the 1970s, was the lack of instruments for aggregating, reviewing and synthesizing large amounts of research publications. Transdisciplinarity appears as a more realistic initiative in the Digital Age. If disciplinary specialists regularly worked on collective Web–based text (similar to Wikipedia), we would achieve much the same results.
To sum up, I have reviewed four ideas for organizing research, all of which have their different weaknesses and strengths. Disciplinary research has been extremely successful, but with time it becomes too stabilizing, reproducing the same perspectives again and again. There may be paradigm shifts; however, the new paradigm will also have its blind spots that will be systematically ignored. Interdisciplinary research is extremely valuable, as it is truly destabilizing by bringing aboard more complexity. However, as a consequence, it has difficulties in replacing the stabilized accounts of the discipline. Multidisciplinary research is something in between and creates slow progress. Transdisciplinary work, finally, is a helpful resource for external actors, as researchers try to produce state–of–the–art research accounts that are disembodied from any individual discipline. Transdisciplinary work is not a model for reorganizing research, only a strategy for improving communication between science and society. As such, it is considerably more viable with digital tools.
4.3. Convergent and divergent research

In my own research on disciplinary boundaries, I have been influenced by the work on research organisation by Becher and Trowler. In promoting a dynamic view of epistemic boundaries, they draw a distinction between research areas that are convergent and divergent. I will use these concepts in order to identify two trends in disciplinary research.
Convergent research hosts one or many core perspectives/paradigms, and there is an inward direction noticeable in research activities, as theoretical support is taken from the core. This also enables a strong self–identity, as various researchers utilize similar theoretical resources. Consequently, convergent researchers know where their competence lies, what they can and cannot do. For better or worse, boundaries are strong, forbidding both entrance and exit.
Divergent research tends to lack core perspectives. The research area is heterogeneous and unstructured. There is a weak sense of common self–identity and a tendency for researchers to move into multidisciplinary and interdisciplinary collaborations rather than developing core values and theories. Consequently, divergent researchers are uncertain about their area of competence and the boundaries are weak.
Traditionally, these two models would be used to describe a well–functioning and a poorly functioning discipline. Divergent disciplines could be characterized as “fragmenting” or “immature”, or even as “fragmented adhocracies”. The strategic issue would be: how can this research area become more convergent, allowing stronger identity and boundaries? However, given the opportunities of the Digital Age, which destabilizes the traditional controlling instruments of the discipline, it is time to rethink the ideals of the discipline.
4.4. Combining convergent and divergent research approaches
In their criticism of strategies for constructing interdisciplinary research projects, Norman Metzger and Richard Zare argue that strong disciplines are a prerequisite for strong interdisciplinary formations. This may very well be the case. In building a more dynamic research sector, we have to work with what we have developed. In other words, the disciplines exist as the starting point in any process of transformation.
The remedy that I will outline in the following is actually in line with the practice at many university departments. However, there is a lack of formal acknowledgement that this constitutes a viable alternative to the traditional convergent ideal of the discipline. My idea is to combine convergent and divergent approaches within the disciplinary setting.
Following such an approach, it should be recognized that departments need to be constructed with a certain tension. This should include both a strong convergent and a strong divergent movement. Therefore, some researchers would be involved with the core issues of the discipline while others pursue multidisciplinary or interdisciplinary ventures. This creates a balance between stabilizing and destabilizing initiatives. Furthermore, there should be mechanisms for connecting these two movements. As a consequence, convergent researchers will be continuously destabilized by ideas, data and perspectives imported by divergent researchers. This will serve to strengthen the core research. Similarly, divergent researchers will be able to work with an increasingly strong core perspective in their boundary work.
There may be variations to this model. At one of my earlier university departments, I found that all researchers were involved with the core discussion. However, all of us were pursuing case studies with very different research ideals and perspectives to feed into our common discussions.
Another variation would be that individual researchers would alternate between working at the core and the boundary.
This model of combining convergent and divergent research can be connected to my initial discussion of post–normal science. Here, the main recipe was “extended peer review”. Once again, this meant building on the main vehicle of the discipline, but allowing much more destabilizing input, both from other disciplines and from professionals. In line with this, there is one more element that needs to be added as we adapt research to the Digital Age.
In addition to the convergent and divergent movements, research units also need to have an external movement. This means interacting with a large number of non–academics. This is an old requirement, but, as discussed earlier, the context for it has now changed radically. The Web–based external movement is an exciting challenge that all universities are already dealing with. It needs to be recognized as a valuable resource for developing the quality of research–based knowledge in society. Ignoring it is not an option.
As of this writing, our experience of working with digital texts is brief. The combination of digital documents and Web–based resources is merely a few decades old. It is therefore, perhaps, not surprising that many institutions and professions can be perceived as newcomers to the vast potentials of digital Web–based tools. The research system has been both an early adopter of, and conservative in adapting to, the new opportunities. I have argued in this text that the practice of research has in part been formatted and restricted by the limitations of text on paper.
The Web allows us to process narratives in ways that transcend what is possible on paper. However, this constitutes a challenge to fundamental features of research as well as the relationship between science and society.
We now have tools for reworking traditional boundaries. This is an opportunity that science must deal with in one way or another. The choices to be made are linked to the practices of professions. As we move away from the monopolies enabled by the paper–based publication and into a new kind of read and write culture, professionals must increasingly be seen as partners engaged in the production and distribution of knowledge.
I have argued that a combination of convergent and divergent ideals will significantly improve the quality of research as well as the interaction between science and society. Each profession will have to find its own way to work proactively with digital Web–based information in order to uphold and develop its service to society. However, the rejuvenation of professions is hampered as long as the Academy is tied to a conservative, discipline–based publication system.
About the author
Jan Nolin took his Ph.D. in theory of science and is currently professor at the Swedish School of Library and Information Science at the University of Borås. His main research interest lies in the coproduction of Internet technology and society.
Direct comments to Jan [dot] Nolin [at] hb [dot] se
1. Dutton and Jeffreys (editors), 2010.
2. Olson, et al. (editors), 2008.
3. Cope and Kalantzis, 2009.
4. Burgelman, et al., 2010.
5. Whitworth and Friedman, 2009.
6. Knorr–Cetina and Mulkay (editors), 1983.
7. Crivelli, 2004.
8. Shapin, 1994.
9. Fuchs, 1992.
10. Kula, 1986.
11. Patomäki and Wight, 2000, pp. 213–237.
12. Glymour, 1991, pp. 75–95.
13. Popper, 1959.
14. Popper, 2011.
15. Kuhn, 1970.
16. Some of this research has been collected and summarized in a few anthologies, for example, Knorr–Cetina and Mulkay (editors), 1983; and, Pickering (editor), 1992.
17. See Chubin and Hackett, 1990; Daniel, 1994; Hall and Nousala, 2010.
18. Funtowicz and Ravetz, 1992, pp. 963–976; Funtowicz and Ravetz, 1993, pp. 739–755; Funtowicz and Ravetz, 1990.
19. Fuller, 2003.
20. Latour, 1983.
21. Meyer and Rowan, 1977.
22. Abbott, 1988.
23. Latour and Woolgar, 1986.
24. March, 1991.
25. Irwin and Wynne, 1996.
26. Sturgis and Allum, 2004, pp. 55–74.
27. Lessig, 2008.
28. Delamothe, 2002.
29. Halliday and Oppenheim, 2001.
30. Collins, 1985.
31. My discussion on the limits of paper versus the digital document is informed by Weinberger, 2008.
32. Shirky, 2005.
33. Burke, 1998.
34. Chartier, 1994.
35. Tapscott and Williams, 2008.
36. Leadbeater, 2009.
37. Surowiecki, 2004.
38. Shirky, 2010.
39. March, 1991.
40. Surowiecki, 2004.
41. Shirky, 2010.
42. Kling and Callahan, 2002.
43. Mizzaro, 2003.
44. Sinclair and Cardew–Hall, 2008.
45. Gieryn, 1983, pp. 781–795.
46. Li and Bernoff, 2008.
47. Fry and Schroeder, 2010.
48. DeRose, et al., 1990.
49. McKenzie, 1999.
50. Chartier, 1994.
51. Chartier, 1994, p. 89.
52. DiNucci, 1999.
53. O’Reilly, 2007.
54. Rogers, 2009.
55. This is an extension of the argument “the digital is default” put forward by Snickars, 2011.
56. Metzger and Zare, 1999.
57. “Charter of transdisciplinarity,” 1994. Adopted at the First world conference on transdisciplinarity.
58. Becher and Trowler, 2001. An important influence on my thinking has also been the rethinking of the academic faculties by Boyer, 1997, as well as Whitley, 2000.
59. Metzger and Zare, 1999.
60. These ideas have been developed in connection with an analysis of library and information science in Nolin and Astrom, 2010, pp. 7–27.
61. Funtowicz and Ravetz, 1993, pp. 739–755.
Andrew Abbott, 1988. The system of professions: An essay on the division of expert labor. Chicago: University of Chicago Press.
Tony Becher and Paul R. Trowler, 2001. Academic tribes and territories: Intellectual enquiry and the culture of disciplines. Second edition. Buckingham: Open University Press.
Ernest L. Boyer, 1997. Scholarship reconsidered: Priorities of the professoriate. London: Jossey Bass.
Jean–Claude Burgelman, David Osimo and Marc Bogdanowicz, 2010. “Science 2.0 (Change will happen...),” First Monday, volume 15, number 7, at http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/2961/2573, accessed 27 October 2011.
Seán Burke, 1998. The death and return of the author: Criticism and subjectivity in Barthes, Foucault and Derrida. Second edition. Edinburgh: Edinburgh University Press.
“Charter of transdisciplinarity,” 1994. Adopted at the First World Conference on Transdisciplinarity, Convento da Arrábida, Portugal, (2–6 November), at http://basarab.nicolescu.perso.sfr.fr/ciret/english/charten.htm, accessed 25 June 2011.
Roger Chartier, 1994. The order of books: Readers, authors and libraries in Europe between the fourteenth and eighteenth centuries. Translated by Lydia G. Cochrane. Cambridge: Polity Press.
Daryl E. Chubin and Edward J. Hackett, 1990. Peerless science: Peer review and U.S. science policy. Albany: State University of New York Press.
H.M. Collins, 1985. Changing order: Replication and induction in scientific practice. London: Sage.
Bill Cope and Mary Kalantzis, 2009. “Signs of epistemic disruption: Transformations in the knowledge system of the academic journal,” First Monday, volume 14, number 4, at http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/2309/2163, accessed 27 October 2011.
Paolo Crivelli, 2004. Aristotle on truth. Cambridge: Cambridge University Press.
H.–D. Daniel, 1994. Guardians of science: Fairness and reliability of peer review. Translated by William E. Russey. Weinheim: VCH.
Steven J. DeRose, David Durand, Elli Mylonas and Allen H. Renear, 1990. “What is Text, Really?” Journal of Computing in Higher Education, volume 1, number 2, pp. 3–26. http://dx.doi.org/10.1007/BF02941632
Darcy DiNucci, 1999. “Fragmented future,” Print, volume 53, number 4, p. 32.
William H. Dutton and Paul W. Jeffreys (editors), 2010. World wide research: Reshaping the sciences and humanities. Cambridge, Mass.: MIT Press.
Jenny Fry and Ralph Schroeder, 2010. “The changing disciplinary landscapes of research,” In: William H. Dutton and Paul W. Jeffreys (editors). World wide research: Reshaping the sciences and humanities. Cambridge, Mass.: MIT Press.
Stephan Fuchs, 1992. The professional quest for truth: A social theory of science and knowledge. Albany: State University of New York Press.
Steve Fuller, 2003. Kuhn vs Popper: The struggle for the soul of science. Cambridge: Icon Books.
Silvio O. Funtowicz and Jerome R. Ravetz, 1993. “Science for the post–normal age,” Futures, volume 25, number 7, pp. 739–755. http://dx.doi.org/10.1016/0016-3287(93)90022-L
Silvio O. Funtowicz and Jerome R. Ravetz, 1992. “The good, the true and the post–modern,” Futures, volume 24, number 10, pp. 963–976. http://dx.doi.org/10.1016/0016-3287(92)90131-X
Silvio O. Funtowicz and Jerome R. Ravetz, 1990. Uncertainty and quality in science for policy. Dordrecht: Kluwer Academic.
Thomas F. Gieryn, 1983. “Boundary–work and the demarcation of science from non–science: Strains and interests in professional ideologies of scientists,” American Sociological Review, volume 48, number 6, pp. 781–795. http://dx.doi.org/10.2307/2095325
Clark Glymour, 1991. “The hierarchies of knowledge and the mathematics of discovery,” Minds and Machines, volume 1, number 1, pp. 75–95.
Leah Halliday and Charles Oppenheim, 2001. “Developments in digital journals,” Journal of Documentation, volume 57, number 2, pp. 260–283. http://dx.doi.org/10.1108/EUM0000000007102
Alan Irwin and Brian Wynne (editors), 1996. Misunderstanding science? The public reconstruction of science and technology. Cambridge: Cambridge University Press.
Rob Kling and Ewa Callahan, 2002. “Electronic journals, the Internet, and scholarly communication,” Annual Review of Information Science and Technology, volume 37, issue 1, pp. 127–177. http://dx.doi.org/10.1002/aris.1440370105
Karin D. Knorr–Cetina and Michael Mulkay (editors), 1983. Science observed: Perspectives on the social studies of science. London: Sage.
Thomas S. Kuhn, 1970. The structure of scientific revolutions. Second edition, enlarged. Chicago: University of Chicago Press.
Witold Kula, 1986. Measures and men. Princeton, N.J.: Princeton University Press.
Bruno Latour, 1983. “Give me a laboratory and I will change the world,” In: Karin D. Knorr–Cetina and Michael Mulkay (editors). Science observed: Perspectives on the social studies of science. London: Sage, pp. 141–169.
Bruno Latour and Steve Woolgar, 1986. Laboratory life: The construction of scientific facts. Princeton, N.J.: Princeton University Press.
Charles Leadbeater, 2009. We–think: Mass innovation, not mass production. Second edition. London: Profile.
Lawrence Lessig, 2008. Remix: Making art and commerce thrive in the hybrid economy. London: Bloomsbury Academic.
Charlene Li and Josh Bernoff, 2008. Groundswell: Winning in a world transformed by technologies. Boston: Harvard Business School Press.
James G. March, 1991. “Exploration and exploitation in organizational learning,” Organization Science, volume 2, number 1, pp. 71–87. http://dx.doi.org/10.1287/orsc.2.1.71
D.F. McKenzie, 1999. Bibliography and the sociology of texts. Cambridge: Cambridge University Press.
John Meyer and Brian Rowan, 1977. “Institutionalized organizations: Formal structure as myth and ceremony,” American Journal of Sociology, volume 83, number 2, pp. 340–363. http://dx.doi.org/10.1086/226550
Stefano Mizzaro, 2003. “Quality control in scholarly publishing: A new proposal,” Journal of the American Society for Information Science and Technology, volume 54, number 11, pp. 989–1,005.
Jan Nolin and Fredrik Astrom, 2010. “Turning weakness into strength: Strategies for future LIS,” Journal of Documentation, volume 66, number 1, pp. 7–27. http://dx.doi.org/10.1108/00220411011016344
Gary Olson, Ann Zimmerman and Nathan Bos (editors), 2008. Scientific collaboration on the Internet. Cambridge, Mass.: MIT Press.
Tim O’Reilly, 2007. “What is Web 2.0: Design patterns and business models for the next generation of software,” Communications & Strategies, number 1, p. 17.
Heikki Patomäki and Colin Wight, 2000. “After postpositivism? The promises of critical realism,” International Studies Quarterly, volume 44, number 2, pp. 213–237. http://dx.doi.org/10.1111/0020-8833.00156
Andrew Pickering (editor), 1992. Science as practice and culture. Chicago: University of Chicago Press.
Karl Popper, 2011. The open society and its enemies. London: Routledge.
Karl Popper, 1959. The logic of scientific discovery. London: Hutchinson.
Richard Rogers, 2009. “The end of the virtual: Digital methods,” text prepared for the inaugural speech, chair, new media & digital culture, University of Amsterdam, at http://www.govcom.org/rogers_oratie.pdf, accessed 2 August 2011.
Steven Shapin, 1994. A social history of truth: Civility and science in seventeenth–century England. Chicago: University of Chicago Press.
Clay Shirky, 2010. Cognitive surplus: Creativity and generosity in a connected age. London: Penguin.
Clay Shirky, 2005. “Ontology is overrated: Categories, links, and tags,” at http://www.shirky.com/writings/ontology_overrated.html, accessed 25 June 2011.
James Sinclair and Michael Cardew–Hall, 2008. “The folksonomy tag cloud: When is it useful?” Journal of Information Science, volume 34, number 1, pp. 15–29. http://dx.doi.org/10.1177/0165551506078083
Pelle Snickars, 2011. “Archival transitions: Some digital propositions,” In: Kingsley Bolton and Jan Olsson (editors). Media, popular culture, and the American century. Stockholm: National Library of Sweden, pp. 301–329.
Patrick Sturgis and Nick Allum, 2004. “Science in society: Re–evaluating the deficit model of public attitudes,” Public Understanding of Science, pp. 55–74. http://dx.doi.org/10.1177/0963662504042690
James Surowiecki, 2004. The wisdom of crowds: Why the many are smarter than the few and how collective wisdom shapes business, economics, societies, and nations. New York: Doubleday.
Don Tapscott and Anthony D. Williams, 2008. Wikinomics: How mass collaboration changes everything. London: Atlantic Books.
David Weinberger, 2008. Everything is miscellaneous: The power of the new digital disorder. New York: Holt.
Richard Whitley, 2000. The intellectual and social organization of science. Second edition. Oxford: Oxford University Press.
Brian Whitworth and Rob Friedman, 2009. “Reinventing academic publishing online. Part I: Rigor, relevance and practice,” First Monday, volume 14, number 8, at http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/2609/2248, accessed 27 October 2011.
Received 30 June 2011; revised 4 August 2011; accepted 12 August 2011.
This work is licensed under a Creative Commons Public Domain License.
Boundaries of research disciplines are paper constructs: Digital Web–based information as a challenge to disciplinary research
by Jan Michael Nolin.
First Monday, Volume 16, Number 11 - 7 November 2011