July 1, 1997

A Neo-Luddite Reflects on the Internet

On the subject of our latest technological revolution, cyberspace, I am a neo-Luddite. Not a true Luddite; my Luddism is qualified, compromised. I revel in the word processor; I am grateful for computerized library catalogues; I appreciate the convenience of CD-ROMs; and I concede the usefulness of the Internet for retrieving information and conducting research. But I am disturbed by some aspects of the new technology—not merely by the moral problems raised by cybersex, which have occupied so much attention recently, but also by the new technology's impact on learning and scholarship.

Revolutions come fast and furious these days. No sooner do we adapt to one than we are confronted with another. For almost half a millennium, we lived with the product of the print revolution—the culture of the book. Then, a mere century ago, we were introduced to the motion picture; a couple of decades later, to radio and then to television. To a true Luddite, those inventions were the beginning of the rot, the decline of Western civilization as we have known it. To a true revolutionary, such as Marshall McLuhan, they were giant steps toward a brave new world liberated from the stultifying rigidities of an obsolete literacy. To the rest of us, they were frivolities, diversions, often meretricious (as some popular culture has always been), but not threatening to the life of the mind, the culture associated with books.

Not that the book culture has been immune from corruption. When the printing press democratized literature, liberating it from the control of clerics and scribes, the effects were ambiguous. As the historian Elizabeth Eisenstein pointed out in her seminal 1979 work The Printing Press as an Agent of Change, the advent of printing facilitated not only the production of scientific works, but also of occult and devotional tracts. It helped create a cosmopolitan secular culture and, at the same time, distinctive national and sectarian cultures. It stimulated scholarship and high culture, as well as ephemera and popular culture. It subverted one intellectual elite, the clergy, only to elevate another, the "enlightened" class.

Yet for all of its ambiguities, printing celebrated the culture of the book—of bad books, to be sure, but also of good books and great books. Movies, radio and television made the first inroads on the book, not only because they distracted us from reading, but also because they began to train our minds to respond to oral and visual sensations of brief duration rather than to the cadences, nuances and lingering echoes of the written word. The movie critic Michael Medved has said that even more detrimental than the content of television is the way that it habituates children to an attention span measured in seconds rather than minutes. The combination of sound bites and striking visual effects shapes the young mind, incapacitating it for the longer, slower, less febrile tempo of the book.

And now we have the Internet to stimulate and quicken our senses still more. We channel-surf on television, but that is as naught compared with cyber-surfing. The obvious advantage of the new medium is that it provides access to an infinite quantity of information on an untold number and variety of subjects. How does one quarrel with such a plenitude of goods?

As an information-retrieval device, the Internet is unquestionably an asset, assuming that those using it understand that the information retrieved is only as sound as the original sources—an assumption that applies to all retrieval methods, but especially to one whose sources are so profuse and indiscriminate. Yet children and even older students, encouraged to rely upon the Internet for information and research, may not be sophisticated enough to question the validity of the information or the reliability of the source. A child whom I saw interviewed on television said that it was wonderful to be able to ask a question on one's home page and have "lots of people answer it for you." Before the age of the Internet, the child would have had to look up the question in a textbook or encyclopedia, a source that he would have recognized as more authoritative than, say, his older brother or sister (or even his mother or father).

As a learning device, the new electronic technology is even more dubious—indeed, it may be more bad than good. And it is dubious at all levels of learning. Children who are told that they need not learn how to multiply and divide, spell and write grammatical prose, because the computer can do that for them, are being grossly miseducated. More important, young people constantly exposed to "multimedia" and "hypermedia" replete with sound and images often become unable to concentrate on mere "texts" (known as books), which have only words and ideas to commend them. Worse yet, the constant exposure to a myriad of texts, sounds and images that often are only tangentially related to each other is hardly conducive to the cultivation of logical, rational, systematic habits of thought.

At the more advanced level of learning and scholarship, the situation is equally ambiguous. Let me illustrate this from my own experience. I used to give (in the pre-electronic age) two sequences of courses: one on social history, the other on intellectual history. In a course on social history, a student might find electronic technology useful, for example, in inquiring about the standard of living of the working classes in the early period of industrialization, assuming that the relevant sources—statistical surveys, diaries, archival collections, newspapers, tracts, journals, books, and other relevant materials—were online (or at least that information about their location and content was available).

This kind of social history, which is built by marshaling social and economic data, is not only facilitated, but actually is stimulated, by the new technology. One might find oneself making connections among sources of information that would have had no apparent link had they not been so readily called up on the computer screen (on the other hand, now one might not make the effort to discover other kinds of sources that do not appear).

But what about intellectual history? It may be that the whole of Rousseau's Social Contract and Hegel's Philosophy of History are now online. Can one read such books on the screen as they should be read—slowly, carefully, patiently, dwelling upon a difficult passage, resisting the temptation to scroll down, thwarting the natural speed of the computer? What is important in the history of ideas is not retrieving and recombining material, but understanding it. And that requires a different relation to the text, a different tempo of reading and study.

One can still buy the book (or perhaps print out a "hard copy" from the computer), read it, mark it up and take notes the old-fashioned way. The difficulty is that students habituated to surfing on the Internet, to getting their information in quick easy doses, to satisfying their curiosity with a minimum of effort (and with a maximum of sensory stimulation) often do not have the patience to think and study this old-fashioned way. They may even come to belittle the intellectual enterprise itself, the study of the kinds of books—"great books," as some say derisively—that require careful thought and study.

Perhaps I am exaggerating the effect of the electronic revolution, just as critics have said that Elizabeth Eisenstein has exaggerated the effect of the print one. She sometimes seems to suggest that printing was not only an agent of change, but the primary agent. Without the printing press, she has implied, the Renaissance might have petered out or the Reformation been suppressed as yet another medieval heresy. "The advent of printing," she notes, preceded "the Protestant revolt."

The electronic media cannot make that claim to priority. The intellectual revolution of our time, postmodernism, long antedated the Internet. Nonetheless, the Internet powerfully reinforces postmodernism: It is the postmodernist technology par excellence. It is as subversive of "linear," "logocentric," "essentialist" thinking, as committed to the "aporia," "indeterminacy," "fluidity," "intertextuality" and "contextuality" of discourse, as deconstruction itself. Like postmodernism, the Internet does not distinguish between the true and the false, the important and the trivial, the enduring and the ephemeral. The search for a name or phrase or subject will produce a comic strip or advertising slogan as readily as a quotation from the Bible or Shakespeare. Every source appearing on the screen has the same weight and credibility as every other; no authority is "privileged" over any other.

The Internet gives new meaning to the British expression describing intellectuals, "chattering classes." On their own home pages, subscribers can communicate to the world every passing reflection, impression, sensation, obsession or perversion.

Michael Kinsley, editor of the new cyberspace journal Slate, defensively insists that his magazine will retain the "linear, rational thinking" of print journalism. To have to make that claim is itself testimony to the non-linear, non-rational tendency of the new medium. Each article in Slate gives the date when it was "posted" and "composted" (archived). Composted! One recalls the computer-programming acronym of a few years ago—"GIGO," for "garbage in, garbage out." (As it happens, the articles in Slate are not garbage, but much on the Internet is.)

One need not be a Luddite, or even a neo-Luddite, to be alarmed by this most useful, most potent, most seductive and most equivocal invention.


GERTRUDE HIMMELFARB is professor emeritus of history at the Graduate School and University Center of the City University of New York. This article originally appeared in The Chronicle of Higher Education (November 1, 1996) and is reprinted here with the kind permission of Professor Himmelfarb.