Difficulties in a Digital Age
Matthew Crippen
December 15, 2016
Struggling somewhat with writing in my youth, I am grateful for computers and word processors. As someone who now writes with relative ease and publishes a fair bit, I remain grateful. At the same time, I am concerned. I will explain why by starting with a story from my days in film school.
Filmmaking, by the standards of any period in which it has existed, has always been technologically advanced. Today it is more so than ever, with increasingly complex digital devices replacing analogue video and largely supplanting older photochemical methods, which are far less cost effective and require considerable expertise to use. Although digital devices were relatively ubiquitous when I was a student, my high school was largely bereft of the kind that allow you to attempt uncounted combinations when piecing scenes together during the editing process. Limited to analogue technology, I learned to conceive sequences ahead of time and shoot according to a plan that would facilitate editing. Accordingly, even when working with Super 8mm film in the first year of my undergraduate program, which required that we literally cut and tape celluloid strips with minimal preview, I was able to edit effectively.
By contrast, I had very gifted classmates who struggled, I think because they had not worked much with older technologies. Even then, they transferred Super 8mm to digital and engaged in the labour-intensive process of trying numerous combinations on computer editing software. Accustomed to toiling thus, they never developed the habit of forming clear conceptions of how their films would come together beforehand. Absent this, they did not shoot footage lending itself to editing. Often this meant their results did not hold together especially well, and even when the final product was good, arriving at it required inordinate time and fumbling. This continued when we started shooting on 16mm and indeed on digital cameras. A lesson here is that even when you have a digital camera and editing software, you will do better if you know how to shoot and construct sequences without these technologies; and knowing this, you are likely to make far better use of digital technologies when you have the opportunity to use them.
A similar lesson follows in the case of computers and writing. Understand, first, that writing is a form of thinking, and computers are accordingly prosthetics of thought. John Dewey (1925), a prominent figure in education theory, adds another insight by noting that thinking is “pre-eminently an art; knowledge and propositions which are the products of thinking, are works of art, as much so as statuary and symphonies” (p. 378). This suggests that thinking entails building and putting together; and since we build with materials at hand, for example, already available concepts that we may twist to develop new meaning, thinking also entails reconstruction. Dewey went on to say that “[i]f defective materials are employed or if they are put together carelessly and awkwardly, the result is defective” (p. 379); it will then cast “a fog which obscures” (p. 378). Yet if the reverse is so, then the working of thought integrates and illuminates. At its height, “[e]very successive stage of thinking [becomes] a conclusion in which the meaning of what has produced it is condensed; and it is no sooner stated than it is a light radiating to other things” (p. 378).
Allowing that there are non-linguistic modes of thought, visual ones, for example, very few will deny that using words to build essays, stories and suchlike is a way of thinking. On Dewey’s account, moreover, thinking generally and writing specifically are processes that take raw materials—whether observations of the world or already existing concepts—and shape and twist them so as to generate organized, meaningful forms. From this standpoint, composing thought in words bears obvious similarity to shooting and editing films. In light of this, and also based on my experience both as a student and a professor, I want to argue that digital technologies in the form of word processors bring the same potential risks and benefits to writing as they do to filmmaking. Now I am not so old as to predate a time when computers were in classrooms and homes. However, many of my teachers were. By today’s standards they were, accordingly, “old school.” This meant, among much else, that my early education involved writing rough drafts by hand, correcting them and only then going to a word processor, if at all. From an early stage, moreover, and well into my graduate years, I never submitted significant assignments without having well-educated native speakers proofread my work first, though I admit that I was in a privileged position in this regard. From this, I learned things that computers do not teach and often misteach—effective semicolon use, to name one example. Analogous to my cinematic training, the learning process was such that I actually learned to write in the first place, as opposed to tapping out relatively random ideas with limited form and then attempting to generate structure by copying, pasting and moving various chunks around the screen, to say nothing of relying on grammar check, which is only particularly useful if you know grammar in the first place, so that it highlights errors you already know how to fix.
My point, again, is not that computers are bad, nor even that young children should not play games and so on. My grievance is with early addiction to word processors, which, to repeat, are not bad in themselves. On the contrary, I opened by emphasizing how useful I find them. I will go further and say that, in light of my personality and disorganized temperament, I likely would not have succeeded as an academic without access to computers and word processors. Computers have enabled me to return to projects started years earlier and cultivate them into publishable pieces, to better organize and draw from research notes, and to do considerably more besides. In fact, in this piece, I did some cutting and moving and a lot of trimming and rewording, yet only after writing a decent draft in the first place. What I want to urge is that writing ability would improve markedly if more emphasis remained on writing by hand during formative stages. Arguments that children must become computer literate at early ages are nonsense. We live in a digital age. The world of youth already revolves around digital devices, and children will learn to negotiate them regardless of whether they are taught to do so in school. As in the case of filmmaking, moreover, I think word processors would profit students more if they first learned how to write without them. This means learning to develop a general conception that will frame a composition and laying it down on paper in a grammatically sound way, rather than moving chunks around a screen and building essays, stories and the like through trial and error.
So in an age when many laud increasingly complex technologies as answers in education, with some urging that every elementary-level student have a laptop or tablet, I advocate moderating the use of digital devices, and especially word processors, at least at early stages. Indeed, even now, when I occasionally struggle with writer’s block, I revert to pen or pencil and find ideas flowing more fluidly. Computers and word processors are assets if used circumspectly, but it is worth remembering that the bulk of great writing in the human legacy was generated without them. As Raymond Williams (1989) put it, “[h]igh technology can distribute low culture: no problem. But high culture can persist at a low level of technology: that is how most of it was produced” (p. 119).
References
Dewey, John. 1925. Experience and Nature (Chicago: Open Court Publishing Company).
Williams, Raymond. 1989. Politics of Modernism: Against the New Conformists (London: Verso).