With digitization, our modes of writing change: we go from writing by putting pen to paper to typing on a variety of screens. In educational as well as leisure settings, writing by hand is increasingly marginalized, and may in the foreseeable future be abandoned altogether. Summarizing some of the research comparing handwriting and typewriting, this article aims to shed light on some salient features of handwriting and keyboard writing, and to discuss the implications of the ongoing marginalization or even abandonment of handwriting — in a short- and long-term perspective.
Is handwriting important?
Comparing handwriting and typewriting
Writing tools and substrates, and a note on note-taking
Technologies of writing in beginning writing instruction
Adequate reading and writing skills are defining components of literacy, and both are indispensable for full participation in today’s technologically advanced societies. As more and more of our daily tasks are performed using a smart phone, tablet or computer, it is essential to be able to read and write in fully digitized environments. Today, we read, write and communicate with a wide range of gadgets or devices, each with a user interface and a set of technical and material affordances that in subtle and less subtle ways shape our interaction and engagement — with the device, with each other, and with our lifeworld. Understanding the ways in which features of the medium may affect the processes involved in reading and writing is of major importance. In what ways does digitization change the ways in which we read and write, and what may be the implications — short- and long-term; positive and less positive — of such changes? What is the good, the bad, and the unknown with respect to digital technologies of reading and writing?
If we focus on the digitization of reading continuous, linear, verbal text, empirical research over the past couple of decades has shown that the technology or substrate (i.e., screens or paper) on which we read indeed makes a difference. Specifically, recent meta-analyses (Singer and Alexander, 2017; Delgado, et al., 2018; see also Kovač and van der Weel, this issue) have found that, for the reading of longer linear texts, there is a computer screen inferiority with respect to reading comprehension. What is more, this difference between paper and computer screen reading has in fact increased over the past 10 to 15 years (Delgado, et al., 2018). In addition, empirical research shows that there is a persistent preference for print, again especially with respect to longer texts, such as textbooks and study reading (N. S. Baron, et al., 2017; Mizrachi, et al., 2018).
Such findings run counter to the often-heard claims about digital natives (e.g., Palfrey and Gasser, 2008; Prensky, 2010), who assumedly prefer digital formats for their reading and assumedly perform better with screens than with paper [1]. Hence, it seems reasonable to claim that for certain kinds of reading, the medium continues to matter. A closely related question — and the topic of this article — is how digitization may affect our modes and habits of writing. With smart phones, tablets, and computers, we can write by tapping keys on mechanical or virtual keyboards [2]. What are the salient features of handwriting and keyboard writing, and what may be the implications of the ongoing marginalization or even abandonment of handwriting — in a short- and long-term perspective?
This article takes a closer look at some of the differences between conventional (paper-based) handwriting and the modes of writing enabled by digital technologies. Rather than aiming to reveal radically new insights into the future of writing, I will summarize some of the empirical research comparing various aspects of handwriting and keyboard writing, with the primary objective of highlighting some significant, but too often overlooked, aspects of the transition from pen and paper to keyboards and screens.
Is handwriting important?
The marginalization of handwriting in our everyday lives has been underway for quite some time, and news about schools abandoning handwriting keeps resurfacing from time to time, typically accompanied by heated debates in which proponents of digitization are pitted against those arguing for the cognitive and cultural benefits of writing by hand. A steady flow of popular science publications on the “lost art” of handwriting (e.g., D. Baron, 2009; Neef, et al., 2006; Florey, 2009; Hensher, 2013; Keim, 2013; Konnikova, 2014) is further testimony of the degree to which people take an interest in this topic.
In several countries, schools are implementing one-to-one iPad (or laptop) programs [3], and students in first grade may be learning to write letters by using keyboards and touch-screen affordances, such as audio feedback. In particular, the method “Write-to-Read” (Trageton, 2003) has gained traction in Scandinavian countries. In this method, children are encouraged to discover the grapheme-phoneme correspondence (the link between the letter [grapheme] and its corresponding sound [phoneme]) through experimenting with letter and text production on keyboards instead of learning to write each letter by hand. Formal and systematic writing instruction is postponed until second or third grade. Research on the “Write-to-Read” method is scarce, but one study found that students wrote “longer texts with better structure, clearer content, and a more elaborate language” than students not trained in this method (Genlott and Grönlund, 2013). The ease of writing on keyboards is also likely to be of importance for children’s motivation to write, especially for those who struggle with the fine-motor skills required for handwriting.
In contrast, experimental research in psychology and neuroscience focusing on associations between fine-motor processes and cognitive outcomes tends to find that handwriting, but not typewriting, supports the visual identification and recall of letters (e.g., James and Gauthier, 2006; Kiefer, et al., 2015; Longcamp, Boucard, et al., 2006; Longcamp, et al., 2008) (more on these differences below). Research from these fields reveals how handwritten script is an “imprint of action”; Longcamp, Tanskanen and Hari (2006) point to the rather striking fact that we are usually able to recognize handwriting accurately even though one writer’s script may vary considerably from that of others:
“Several psychophysical studies have demonstrated a striking ability of the perceptual system to reliably extract production-related information from the graphic trace [...].” [4]
In other words, during the perception of handwritten traces we apply knowledge of the implicit motor rules involved in writing by hand. An increasing marginalization, and eventually abandonment, of the teaching of handwriting may hence impair future generations’ ability to read handwritten text. It is worth pausing to reflect on the implications of such a scenario.
Comparing handwriting and typewriting
Despite the increasing influence of digital technologies in educational contexts, empirical research comparing handwriting and keyboard writing is still scarce. This may seem paradoxical, especially in light of a growing awareness of the close associations between motor action, perception and cognitive processes, largely courtesy of the embodied cognition paradigm (see, for instance, Coello and Fischer, 2016; Davis and Markman, 2012; Shapiro, 2011; Wilson, 2002). Simply watching our fingers during writing suggests that the process of writing by means of a keyboard is in fact quite different from writing by hand with pen on paper. However, the implications of such differences for cognitive and emotional outcomes are far from clear, and empirical research has only just begun mapping the effects of digitization on writing.
As mechanical or motor processes, handwriting and typewriting differ in a number of ways — some easily observable, others less so. The transition from writing by hand to writing by the use of a keyboard thus prompts a heightened awareness of the role of haptics (Mangen and Velay, 2010) — in other words, the ways in which our hands and fingers interact with technological and material interfaces [5]. For a start, when typewriting, we typically engage both hands and — if we are proficient — all 10 fingers. In comparison, writing by hand is one of the most lateralized processes, as very few are able to write equally well with both hands [6]. Other differences pertain to, for example, the coordination of fine-motor input and visual attention. When writing by hand, visual attention is mainly directed at the tip of the pen(cil), hence very close to the location of motor input. In handwriting, then, there is a close temporal and spatial contiguity between visual attention and motor input.
When writing on a keyboard, in contrast, the writer’s visual attention may shift between the motor input location (the keyboard) and the screen, or — depending on typewriting proficiency — the writer may be focusing mostly on the screen. In this respect, typewriting may be described as more abstract and phenomenologically detached than writing by hand (Mangen, 2014). This division between motor input and visual attention may at least partly explain findings in studies where participants report that, when writing on a computer, the act of typing can be separated from thinking and listening (such as note-taking during class), whereas writing by hand is felt to require and enable focus and concentration (Park and Baron, 2017). As one of the participants in Park and Baron’s (2017) study expressed it:
“[Handwriting] helps my brain to think. When I’m writing in class [on paper] it helps me to actually take in the information more, where when I’m sitting in front of a computer I just feel like — I ... blank out. So that’s why I [think that with] handwriting you have to actually engage more. You have to concentrate on what you’re actually writing, where typing, you can just blank out.” [7]
Importantly, some of the differences between writing with pen in hand and typing may have implications for our modes of reading and of literally being in touch with our own language. Reading and writing are very closely related in many respects, and the digitization of one of these processes and skills will have implications for the other, in ways that we may not necessarily be aware of.
Writing tools and substrates, and a note on note-taking
Before exploring the differences between writing by hand and writing by keyboard in more detail, let us first establish some key features of writing in general. All writing — irrespective of technology, addressee, purpose, and content — involves manually handling a device or tool to create a visual representation in some modality or sign system. From clay tablets and animal skins, via papyrus rolls and paper, to the array of digital devices available today, writers have had to engage in the dexterous handling of some implement or technology. In this sense, says Christina Haas (1996), writing can never be separated from technology:
“Whether it is the stylus of the ancients, the pen and ink of the medieval scribe, a toddler’s fat crayons, or a new Powerbook, technology makes writing possible. To go further, writing is technology, for without the crayon or the stylus or the Powerbook, writing simply is not writing. Technology has always been implicated in writing: In a very real way, verbal behavior without technological tools is not, and cannot be, writing.” [8]
As there are several technologies and devices involved in writing, it may be helpful to distinguish between (i) the writing tool, and (ii) the substrate or display on which the writing appears. Examples of writing tools are the pen or pencil, the digital stylus pen, and keyboards, whereas paper and different kinds of screens are examples of writing substrates or displays.
Importantly, both the tool and the substrate/display are technologies with distinct material affordances. The sensorimotor contingencies [9] of the tools and substrates of handwriting are different from those of digital devices, and consequently, they enable and support certain modes of use at the expense of others. For instance, the materiality of paper lends fixity and (relative) permanence to any inscription: once they have appeared on paper, words don’t move. At the same time, this fixity makes paper much less flexible than its digital counterpart with respect to the writer’s need to edit and revise. The endless malleability of a digitized text ensures that it can be smoothly revised and edited, at local as well as global levels.
Another difference between handwriting and keyboard writing pertains to writing speed. Most of us write much faster on a keyboard than by hand, and typing allows the production of large amounts of text without the fine-motor fatigue typically associated with writing by hand. For many writing purposes, this is a huge advantage. However, for certain writing tasks, the speed of keyboard writing comes with a perhaps unexpected flip side. When used for note-taking, the fact that we can type practically as fast as someone speaks may mean that note-taking turns into mere copying and, in this way, ends up supporting shallow retention of facts rather than deeper comprehension.
This is exactly what Mueller and Oppenheimer (2014) found. In their aptly titled article, “The pen is mightier than the keyboard: Advantages of longhand over laptop note taking”, they present three studies comparing note-taking by hand and by means of a laptop keyboard. Note-taking, as Mueller and Oppenheimer observe, can be either generative (e.g., summarizing or paraphrasing) or non-generative (i.e., verbatim copying); the latter tends to predict poorer performance in terms of deeper comprehension, whereas the added processing entailed in generative note-taking typically leads to greater cognitive benefits (Igo, et al., 2005; Kiewra, 1985; Slotte and Lonka, 1999). Asking students to take notes while watching a TED talk (Study 1 and Study 2) and while listening to four prose passages read from a teleprompter (Study 3), Mueller and Oppenheimer tested students’ performance on several measures, including factual recall questions assessing surface memory, and conceptual questions tapping into deeper processing — better comprehension — of the material. Results showed that, across all three studies, participants using laptops for note-taking were more inclined to take notes verbatim (even when explicitly advised not to), and that, consequently, their performance on the conceptual measure was inferior to that of those taking notes in longhand. The authors conjecture that the ease of writing on a keyboard encourages mindless transcription, whereas the extra effort entailed in writing by hand may be a “desirable difficulty” contributing positively to deeper processing of the content. Hence, laptop use in classrooms, Mueller and Oppenheimer contend, “should be viewed with a healthy dose of caution; despite their growing popularity, laptops may be doing more harm in classrooms than good.” [10]
In addition to note-taking, there are other aspects of writing which warrant attention when keyboards are beginning to replace pens and paper in schools. The next section will focus on a topic that is often the subject of contentious debate, namely beginning writing instruction for children.
Technologies of writing in beginning writing instruction
For a very long time, learning to write meant pen-and-paper practice of lowercase and uppercase manuscript letters, followed at a later stage by cursive (a.k.a. joined-up) writing. Today, however, many children get their first writing experiences using different kinds of keyboards. One-to-one iPad programs are being implemented in an increasing number of elementary schools, and as a consequence, systematic handwriting instruction may be postponed while first-grade children learn letters by using iPads or laptops. Such developments invite closer scrutiny of the ways in which the affordances of different writing tools shape and influence modes of beginning writing.
Despite evidence of close links between handwriting, other fine-motor skills, and overall cognitive development, there is a lack of knowledge about the best means of teaching writing in elementary school (Dinehart, 2015). One reason for this may be that research on writing (by hand or by keyboard), in general as well as in educational contexts, tends to privilege very different aspects (e.g., shaping each letter correctly vs. motivation to produce sentences and texts) depending on the theoretical and methodological approach. Writing — like reading — is a multifaceted skill, and no single theoretical framework or model can be assumed to account for all the processes involved. A good starting point, however, is van Galen’s (1991) overview of the different stages and levels of handwriting (see Figure 1), in which he describes writing as a hierarchy of tasks in need of completion, with idea formulation at the highest level and letter strokes at the lowest:
Figure 1: Architecture of processing modules, processing units and mediating memory stores for handwriting production. The left-hand column indicates processing modules, the central column describes the corresponding unit sizes, and the right-hand column refers to the storage nodes that mediate in the communication between the levels [11].
Writing by hand entails the formation of letters with particular shapes and sizes, in a process that, once automatized and incorporated into the body’s repertoire of skills, the Russian neuropsychologist Alexander Lurija called a “kinetic melody”:
“In the initial stages [...] writing depends on memorizing the graphic form of every letter. It takes place through a chain of isolated motor impulses, each of which is responsible for the performance of only one element of the graphic structure; with practice, this structure of the process is radically altered and writing is converted into a single ‘kinetic melody’, no longer requiring the memorizing of the visual form of each isolated letter or individual motor impulses for making every stroke. [...] The participation of the auditory and visual areas of the cortex, essential in the early stages of the formation of the activity, no longer is necessary in its later stages, and the activity starts to depend on a different system of concertedly working zones.” [12]
Handwriting requires the integration of visual, proprioceptive (haptic/kinesthetic), and tactile information — hence, motor commands and kinesthetic feedback are closely linked to visual information when we write by hand, whereas this is not the case with typewriting. This separation of motor input and visual attention may have implications for subsequent perception of the output — the letter or script. A growing number of experiments in neuroscience have shown that the graphomotor element of writing by hand — shaping the letter according to strokes, lines, dots and circles that embody distinct trajectories and “motor traces” — supports visual recognition of letters (James, 2010; James and Engelhardt, 2012; James and Gauthier, 2006; Longcamp, et al., 2003; Longcamp, et al., 2008; Longcamp, Boucard, et al., 2006; Longcamp, et al., 2011; Longcamp, Tanskanen, et al., 2006; Longcamp, et al., 2005). For instance, Longcamp, et al. investigated differences between handwriting and typewriting in children (Longcamp, et al., 2005) as well as in adults (Longcamp, Boucard, et al., 2006). Instructing the participants to learn a set of unknown letters, they found that those who had learned to write the letters by hand performed better on memory and visual recognition tasks. These behavioral studies were followed up with neuroimaging studies (Longcamp, et al., 2008), in which participants were shown the letters that they had learned to write either by practicing the pattern of strokes, lines, dots and curves by hand, or by correctly locating and visually identifying them on a specially designed laptop. The results showed that processing the orientation of handwritten and typed letters did not rely on the same brain areas: for those who had learned the letters by hand, there was activation in several brain regions known to be involved in the imagery, observation and execution of motor action.
For those who had learned the letters by typing on a keyboard, there was no such activation (Longcamp, et al., 2008; see also Mangen and Balsvik, 2016). The handwritten letter is, literally, an “imprint of action” in a way that typing on a keyboard is not (Longcamp, Tanskanen, et al., 2006), and this feature may have cognitive implications that we should acknowledge when debating the need of handwriting in the future.
Along the same lines, Markus Kiefer and colleagues (2015) have studied the effect of writing technology — handwriting versus typewriting — on preschool children’s word-level reading and writing tasks. Using closely matched letter-learning games, they had the children learn eight letters by either handwriting or typing. After training, they assessed letter recognition, letter naming and letter writing performance, as well as word reading and word writing performance. Results did not indicate any superiority of typing training in any of the tasks, and typing training was found to be inferior to handwriting training in word writing and (marginally significantly) in word reading. The authors conclude that the study supports theories positing beneficial effects of meaningful action-perception couplings, which are absent when writing on a keyboard (Kiefer, et al., 2015).
A related aspect is how learning to write by hand entails a process in which the child produces a range of (unfinished, imperfect) versions of letters on her way towards fluent writing. This exposure to a range of variants of a prototype, for instance of the letter ‘G’/‘g’, has been shown to play an important role in letter categorization. For instance, in an experiment by Li and James (2016), children were taught the names of four Greek symbols in one of six learning conditions: three involved motor production (copying symbols, tracing single-font symbols and tracing handwritten symbols), and the remaining three conditions had no motor component — children studied, through combined visual and auditory input, single-font typed symbols, multiple-font typed symbols, and handwritten symbols. Afterwards, Li and James tested the children’s categorization performance through a card-sorting task. Results showed that children who had studied multiple instances of the symbols — whether self-produced or studied visually — performed better than children who had studied only a single prototypical example of the symbol. Based on these results, the authors argue that, since variable symbols are produced through handwriting but not keyboard writing, it is important that children produce symbols through handwriting early on, as “the output of their bodies will create ‘messy,’ variable instances of categories. This will not occur with keyboarding or tracing, where the typed/traced symbol is presented in a constant (usually prototypical) font.” [13]
As our tasks and activities are increasingly screen-based, our future reading and writing will continue to be performed in digital rather than analogue or paper-based environments. In this article, I have pointed to some arguably important differences between writing by hand and writing by keyboard. This is not to say that we should abandon digital technologies in schools, or that our children and grandchildren will turn into mindless zombies incapable of deeper reflection. However, based on extant empirical research, it seems worth pausing to reflect on the ways in which different technologies — tools and substrates — affect the ways in which we engage, cognitively and emotionally, with whatever we read, write and communicate. Especially with respect to the future of writing by hand (with pen on paper) in beginning writing instruction, we may want to think twice before throwing out paper and pencils.
About the author
Anne Mangen is Professor in the Reading Centre at the University of Stavanger (Norway).
E-mail: anne [dot] mangen [at] uis [dot] no
The title of this paper was inspired by Radesky, et al.’s (2015) article in Pediatrics on mobile and interactive media use by young children.
1. See also Bennett, et al. (2008) and Helsper and Eynon (2010) for some critical perspectives on the claims about digital natives.
2. Another interesting digital mode of writing is that of writing by hand using stylus pen on a touch screen, which invites questions relating to the effects of the different substrates (paper and touch screen) on handwriting (see, e.g., Alamargot and Morin, 2015; Gerth, Dolk, et al., 2016; Gerth, Klassert, et al., 2016).
3. Examples of “iPad-schools” in Denmark include the municipalities of Gladsaxe (https://www.gladsaxe.dk) and Odder (https://odder.dk).
4. Longcamp, Tanskanen, et al., 2006, p. 681.
5. Comprising both “passive” (cutaneous/tactile) and “active” (proprioceptive/kinesthetic) sensory processes, haptics (from Greek, haptikos = “able to touch”) refers to the combined sense of touch, proprioception (the sense of the relative position of muscles, joints and tendons) and kinesthesia (the sense of movement).
6. However, as with other skilled manual processes, the non-writing hand plays a complementary role in handwriting by providing a positioning frame to the writing hand: it “sets and confines the spatial context in which the ‘skilled’ movement will take place” (Guiard, 1987, p. 492).
7. Park and Baron, 2017, p. 156.
8. Haas, 1996, preface, pp. x–xi.
9. “Sensorimotor contingencies” (Noë, 2004; O’Regan and Noë, 2001) refer to our practical and embodied knowledge of sets of structured laws governing the sensory changes brought about by one’s movement and/or manipulation of objects. As Noë (2004) elaborates, “how things look, smell, sound, or feel (etc.) depends, in complicated but systematic ways, on one’s movement. The sensory modalities differ in the distinctive forms that this dependence takes. [...] Sight has its own characteristic forms of sensorimotor dependence. How things look varies in systematic ways as one moves one’s head, eyes, or body relative to the environment. [...] The senses are modes of awareness of one and the same environment as mediated by different patterns of sensorimotor contingency.” (Noë, 2004, p. 109) Importantly, our practical knowledge of the laws governing these contingencies is based on lifelong exploration of the physical environment by means of unified, and unifying, orchestration of all sensory modalities. See also Mangen (2016).
10. Mueller and Oppenheimer, 2014, p. 1,166.
11. Van Galen, 1991, p. 183.
12. Lurija, 1973, p. 32.
13. Li and James, 2016, p. 308.
D. Alamargot and M.-F. Morin, 2015. “Does handwriting on a tablet screen affect students’ graphomotor execution? A comparison between grades two and nine,” Human Movement Science, volume 44, pp. 32–41.
doi: https://doi.org/10.1016/j.humov.2015.08.011, accessed 10 September 2018.
D. Baron, 2009. A better pencil: Readers, writers, and the digital revolution. New York: Oxford University Press.
N. S. Baron, R. M. Calixte and M. Havewala, 2017. “The persistence of print among university students: An exploratory study,” Telematics and Informatics, volume 34, number 5, pp. 590–604.
doi: https://doi.org/10.1016/j.tele.2016.11.008, accessed 10 September 2018.
S. Bennett, K. Maton and L. Kervin, 2008. “The ‘digital natives’ debate: A critical review of the evidence,” British Journal of Educational Technology, volume 39, number 5, pp. 775–786.
doi: https://doi.org/10.1111/j.1467-8535.2007.00793.x, accessed 10 September 2018.
Y. Coello and M. H. Fischer (editors), 2016. Perceptual and emotional embodiment. Foundations of embodied cognition, volume 1. New York: Routledge.
J. I. Davis and A. B. Markman, 2012. “Embodied cognition as a practical paradigm: Introduction to the topic, the future of embodied cognition,” Topics in Cognitive Science, volume 4, number 4, pp. 685–691.
doi: https://doi.org/10.1111/j.1756-8765.2012.01227.x, accessed 10 September 2018.
P. Delgado, C. Vargas, R. Ackerman and L. Salmerón, 2018. “Don’t throw away your printed books: A meta-analysis on the effects of reading media on comprehension,” Educational Research Review, volume 25, pp. 23–38.
doi: https://doi.org/10.1016/j.edurev.2018.09.003, accessed 10 September 2018.
L. H. Dinehart, 2015. “Handwriting in early childhood education: Current research and future implications,” Journal of Early Childhood Literacy, volume 15, number 1, pp. 97–118.
doi: https://doi.org/10.1177/1468798414522825, accessed 10 September 2018.
K. B. Florey, 2009. Script and scribble: The rise and fall of handwriting. Brooklyn, N.Y.: Melville House.
A. A. Genlott and Å. Grönlund, 2013. “Improving literacy skills through learning reading by writing: The iWTR method presented and tested,” Computers & Education, volume 67, pp. 98–104.
doi: http://dx.doi.org/10.1016/j.compedu.2013.03.007, accessed 10 September 2018.
S. Gerth, T. Dolk, A. Klassert, M. Fliesser, M. H. Fischer, G. Nottbusch and J. Festman, 2016. “Adapting to the surface: A comparison of handwriting measures when writing on a tablet computer and on paper,” Human Movement Science, volume 48, pp. 62–73.
doi: http://dx.doi.org/10.1016/j.humov.2016.04.006, accessed 10 September 2018.
S. Gerth, A. Klassert, T. Dolk, M. Fliesser, M. H. Fischer, G. Nottbusch and J. Festman, 2016. “Is handwriting performance affected by the writing surface? Comparing preschoolers’, second graders’, and adults’ writing performance on a tablet vs. paper,” Frontiers in Psychology.
doi: http://dx.doi.org/10.3389/fpsyg.2016.01308, accessed 10 September 2018.
Y. Guiard, 1987. “Asymmetric division of labor in human skilled bimanual action: The kinematic chain as a model,” Journal of Motor Behavior, volume 19, number 4, pp. 486–517.
doi: http://dx.doi.org/10.1080/00222895.1987.10735426, accessed 10 September 2018.
C. Haas, 1996. Writing technology: Studies on the materiality of literacy. Mahwah, N.J.: L. Erlbaum Associates.
E. J. Helsper and R. Eynon, 2010. “Digital natives: Where is the evidence?” British Educational Research Journal, volume 36, number 3, pp. 503–520.
doi: https://doi.org/10.1080/01411920902989227, accessed 10 September 2018.
P. Hensher, 2013. The missing ink: The lost art of handwriting. New York: Faber and Faber.
L. B. Igo, R. Bruning and M. T. McCrudden, 2005. “Exploring differences in students’ copy-and-paste decision making and processing: A mixed-methods study,” Journal of Educational Psychology, volume 97, number 1, pp. 103–116.
doi: https://doi.org/10.1037/0022-0663.97.1.103, accessed 10 September 2018.
K. H. James, 2010. “Sensori-motor experience leads to changes in visual processing in the developing brain,” Developmental Science, volume 13, number 2, pp. 279–288.
doi: https://doi.org/10.1111/j.1467-7687.2009.00883.x, accessed 10 September 2018.
K. H. James and L. Engelhardt, 2012. “The effects of handwriting experience on functional brain development in pre-literate children,” Trends in Neuroscience and Education, volume 1, number 1, pp. 32–42.
K. H. James and I. Gauthier, 2006. “Letter processing automatically recruits a sensory-motor brain network,” Neuropsychologia, volume 44, number 14, pp. 2,937–2,949.
doi: https://doi.org/10.1016/j.neuropsychologia.2006.06.026, accessed 10 September 2018.
B. Keim, 2013. “The science of handwriting,” Scientific American Mind, pp. 54–59, and at https://www.scientificamerican.com/article/the-science-of-handwriting/, accessed 10 September 2018.
M. Kiefer, S. Schuler, C. Mayer, N. M. Trumpp, K. Hille and S. Sachse, 2015. “Handwriting or typewriting? The influence of pen- or keyboard-based writing training on reading and writing performance in preschool children,” Advances in Cognitive Psychology, volume 11, number 4, pp. 136–146.
doi: https://doi.org/10.5709/acp-0178-7, accessed 10 September 2018.
K. A. Kiewra, 1985. “Investigating notetaking and review: A depth of processing alternative,” Educational Psychologist, volume 20, number 1, pp. 23–32.
M. Konnikova, 2014. “What’s lost as handwriting fades,” New York Times (2 June), at https://www.nytimes.com/2014/06/03/science/whats-lost-as-handwriting-fades.html, accessed 10 September 2018.
J. X. Li and K. H. James, 2016. “Handwriting generates variable visual output to facilitate symbol learning,” Journal of Experimental Psychology: General, volume 145, number 3, pp. 298–313.
doi: https://doi.org/10.1037/xge0000134, accessed 10 September 2018.
M. Longcamp, Y. Hlushchuk and R. Hari, 2011. “What differs in visual recognition of handwritten vs. printed letters? An fMRI study,” Human Brain Mapping, volume 32, number 8, pp. 1,250–1,259.
doi: https://doi.org/10.1002/hbm.21105, accessed 10 September 2018.
M. Longcamp, T. Tanskanen and R. Hari, 2006. “The imprint of action: Motor cortex involvement in visual perception of handwritten letters,” NeuroImage, volume 33, number 2, pp. 681–688.
doi: https://doi.org/10.1016/j.neuroimage.2006.06.042, accessed 10 September 2018.
M. Longcamp, M.-T. Zerbato-Poudou and J.-L. Velay, 2005. “The influence of writing practice on letter recognition in preschool children: A comparison between handwriting and typing,” Acta Psychologica, volume 119, number 1, pp. 67–79.
doi: https://doi.org/10.1016/j.actpsy.2004.10.019, accessed 10 September 2018.
M. Longcamp, C. Boucard, J.-C. Gilhodes and J.-L. Velay, 2006. “Remembering the orientation of newly learned characters depends on the associated writing knowledge: A comparison between handwriting and typing,” Human Movement Science, volume 25, numbers 4–5, pp. 646–656.
doi: https://doi.org/10.1016/j.humov.2006.07.007, accessed 10 September 2018.
M. Longcamp, J.-L. Anton, M. Roth and J.-L. Velay, 2003. “Visual presentation of single letters activates a premotor area involved in writing,” NeuroImage, volume 19, number 4, pp. 1,492–1,500.
doi: https://doi.org/10.1016/S1053-8119(03)00088-0, accessed 10 September 2018.
M. Longcamp, C. Boucard, J.-C. Gilhodes, J.-L. Anton, M. Roth, B. Nazarian and J.-L. Velay, 2008. “Learning through hand- or typewriting influences visual recognition of new graphic shapes: Behavioral and functional imaging evidence,” Journal of Cognitive Neuroscience, volume 20, number 5, pp. 802–815.
doi: https://doi.org/10.1162/jocn.2008.20504, accessed 10 September 2018.
A. R. Lurija, 1973. The working brain: An introduction to neuropsychology. Translated by B. Haigh. London: Allen Lane.
A. Mangen, 2016. “What hands may tell us about reading and writing,” Educational Theory, volume 66, number 4, pp. 457–477.
doi: https://doi.org/10.1111/edth.12183, accessed 10 September 2018.
A. Mangen, 2014. “The disappearing trace and the abstraction of inscription in digital writing,” In: K. E. Pytash and R. E. Ferdig (editors). Exploring technology for writing and writing instruction. Hershey, Pa.: IGI Global, pp. 100–114.
doi: https://doi.org/10.4018/978-1-4666-4341-3.ch006, accessed 10 September 2018.
A. Mangen and L. Balsvik, 2016. “Pen or keyboard in beginning writing instruction? Some perspectives from embodied cognition,” Trends in Neuroscience and Education, volume 5, number 3, pp. 99–106.
doi: http://dx.doi.org/10.1016/j.tine.2016.06.003, accessed 10 September 2018.
A. Mangen and J.-L. Velay, 2010. “Digitizing literacy: Reflections on the haptics of writing,” In: M. H. Zadeh (editor). Advances in haptics. Vienna: IntechOpen.
doi: http://dx.doi.org/10.5772/8710, accessed 10 September 2018.
D. Mizrachi, A. M. Salaz, S. Kurbanoglu and J. Boustany on behalf of the ARFIS Research Group, 2018. “Academic reading format preferences and behaviors among university students worldwide: A comparative survey analysis,” PLoS ONE, volume 13, number 5 (30 May), e0197444.
doi: https://doi.org/10.1371/journal.pone.0197444, accessed 10 September 2018.
P. A. Mueller and D. M. Oppenheimer, 2014. “The pen is mightier than the keyboard: Advantages of longhand over laptop note taking,” Psychological Science, volume 25, number 6, pp. 1,159–1,168.
doi: https://doi.org/10.1177/0956797614524581, accessed 10 September 2018.
S. Neef, J. Van Dijck and E. Ketelaar (editors), 2006. Sign here: Handwriting in the age of new media. Amsterdam: Amsterdam University Press.
A. Noë, 2004. Action in perception. Cambridge, Mass.: MIT Press.
J. K. O’Regan and A. Noë, 2001. “A sensorimotor account of vision and visual consciousness,” Behavioral and Brain Sciences, volume 24, number 5, pp. 939–973.
doi: https://doi.org/10.1017/S0140525X01000115, accessed 10 September 2018.
J. Palfrey and U. Gasser, 2008. Born digital: Understanding the first generation of digital natives. New York: Basic Books.
S. Park and N. S. Baron, 2017. “Space, context, and mobility: Different experiences of writing on mobile phones, laptops, and paper,” In: J. Vincent and L. Haddon (editors). Smartphone cultures. London: Routledge, pp. 150–162.
M. Prensky, 2010. Teaching digital natives: Partnering for real learning. Thousand Oaks, Calif.: Corwin Press.
J. S. Radesky, J. Schumacher and B. Zuckerman, 2015. “Mobile and interactive media use by young children: The good, the bad, and the unknown,” Pediatrics, volume 135, number 1, pp. 1–3.
doi: https://doi.org/10.1542/peds.2014-2251, accessed 10 September 2018.
L. A. Shapiro, 2011. Embodied cognition. New York: Routledge.
L. Singer and P. Alexander, 2017. “Reading on paper and digitally: What the past decades of empirical research reveal,” Review of Educational Research, volume 87, number 6, pp. 1,007–1,041.
doi: https://doi.org/10.3102/0034654317722961, accessed 10 September 2018.
V. Slotte and K. Lonka, 1999. “Review and process effects of spontaneous note-taking on text comprehension,” Contemporary Educational Psychology, volume 24, number 1, pp. 1–20.
doi: http://dx.doi.org/10.1006/ceps.1998.0980, accessed 10 September 2018.
A. Trageton, 2003. Å skrive seg til lesing: IKT i småskolen. Oslo: Universitetsforlaget.
G. P. van Galen, 1991. “Handwriting: Issues for a psychomotor theory,” Human Movement Science, volume 10, numbers 2–3, pp. 165–191.
doi: https://doi.org/10.1016/0167-9457(91)90003-G, accessed 10 September 2018.
M. Wilson, 2002. “Six views of embodied cognition,” Psychonomic Bulletin & Review, volume 9, number 4, pp. 625–636.
doi: https://doi.org/10.3758/BF03196322, accessed 10 September 2018.
Received 3 September 2018; accepted 7 September 2018.
Copyright © 2018, Anne Mangen. All Rights Reserved.
Modes of writing in a digital age: The good, the bad and the unknown
by Anne Mangen.
First Monday, Volume 23, Number 10 - 1 October 2018