Published April 2, 2015
The first time I ever saw the words Littera scripta manet, they were inscribed on an exquisite little poster hanging beside the desk of a typographer employed by one of the last letterpress printers in North America. The meaning in English is “The written word remains,” a declaration that had an immediate appeal for me. Attributed to the Roman poet Horace, and embraced by the great English printer William Caxton, this ancient saying bespeaks a quiet pride in the weight and dignity of the written text, and especially in the permanence, fixity, and distilled authority of fine printing. The spoken word depends upon the flickering and inconstant flame of oral transmission for its survival. Not so the written word. The page on which it appears is a tangible thing, an embodied thing, one that occupies an enduring place in the world, a distinct niche in space and time, a home on one’s bookshelf. As such, the written word is more fully in the world, even as it shows a greater capacity to transcend the world, to raise our gaze above the immediate, replacing the world’s ephemera with enduring objects of contemplation.
But all this needs to be re-thought in the age of the Internet, and of the digitization of all knowledge and communications. For if there is one thing we know, or are coming to understand, it is that the things posted on the Internet—the digital word, or image, or song, or video—also remain, whether we like it or not, and irrespective of their importance. This omnivorous, placeless, and decontextualized permanence of the Internet contrasts sharply, even jarringly, with the ephemeral and whimsical and personal and intimate character—some of it potentially embarrassing and even damaging—of so many of the things we are posting on it. The millions who expend so much of their time and energy on social media such as Twitter, Facebook, Instagram, Tumblr, and the like are largely unconcerned (to put it mildly) that today’s words might be overheard by their contemporaries. But neither are these social-media devotees thinking much about the likelihood that today’s words will be overheard twenty or more years from now, by their prospective employers or their own children, or by their political, professional, or personal rivals, words that will then be shorn of any friendly or explanatory context. Behind this wonderfully empowering technology, which seems to inspire so much fancy and froth and expressive creativity, there is also a relentless and remorseless capture of experience, which feeds an elephant that never forgets.
This is a price some are unwilling to pay. One of the reasons we value the right of privacy is that we value the right to conceal or withhold, and thereby to have some say over the terms of our engagement with the larger world. Every one of us has done and said things we wish could be forgotten, and in the fullness of time they generally are. Or so it used to be.
That erasure is far harder to be assured of in a world in which the elephant that never forgets also never sleeps, and in which the digitally stored word remains, and remains, and remains. Which is why the top court of the European Union has ruled that its existing data protection law guarantees “the right to be forgotten,” enabling citizens to compel “data controllers” such as Google to delete certain private information after a period of time. While this “right” remains contested, and is so far generally confined to the deletion of public access to unwanted and outdated online data about oneself, it could eventually be construed to extend far beyond that, to almost any imaginable data that could affect one’s reputation and self-presentation.
The impulse behind such rights talk is understandable. No one can blame the victim of a false arrest or wayward prosecution for wanting the record expunged. And the decontextualized and disaggregated character of information on the Internet lends itself to all kinds of abuses, providing factoid fodder for the kind of gotcha journalism and political opposition research—not to mention sheer character assassination—we already have in such abundance. Such practices not only coarsen our public discourse and frighten worthy people away from the ordeal of public service, but also render rational discussion of public issues almost impossible. And on a personal level, do we want to be judged and forever found wanting because of the time we posted something injudicious, or said something uncharitable, or did something impulsive or wrongheaded or exuberantly imprudent?
I fear especially in this regard for the young, who feed the elephant with unrestrained candor, and seem heedless of the possibility that they may thereby be compiling the elements of a future brief against themselves. It seems lamentable that one should have to counsel caution in these matters. Whatever happened to the prerogatives of youth? During the presidential campaign of 2000, George W. Bush deflected persistent charges relating to his reputation for youthful misbehavior with the following statement: “When I was young and foolish, I was…young and foolish.” An evasive answer, possibly, but I think, in retrospect, a good one, in that it reflects the ecology of mind we need to keep in view. Young people need to be able to be young, to live out the exuberance and callowness of youth, to make their own missteps, and to be able to count on some measure of erasure and forgiveness of all that as they move forward into adulthood. But the unsleeping digital beast they love so much may make that more difficult in the years to come.
This should not be read as a call for the wanton erasure of memory. On the contrary. Memory is the very core of our personal identity, and it is most powerful when it is purposeful, and selective. Above all, it requires that we possess stories and narratives—contexts—that link facts in ways that are both meaningful and true, rather than treat them as a mass of disaggregated data, to be exploited as we, or others, might wish. What makes for intelligent and discerning memory is not the mere capacity for massive retention, but a certain balance and order in the mental economy of remembering and forgetting. In other words, memory takes an active role in thinning out the mental trees so that the forests can be discerned. We need to retain less if we are to remember more. In so doing, we may rediscover the enduring virtues of ink on paper, of scripta that remain in one place, as the vehicle for a new kind of samizdat, one that eschews the digital grid altogether.
— Wilfred M. McClay is a senior fellow of the Ethics and Public Policy Center.