The following was published in the Washington Post on September 26, 2019.
By Benjamin A. Saltzman, assistant professor in the Department of English, Division of the Humanities, at the University of Chicago
News of privacy breaches and secret surveillance is a regular feature of the digital age. With Facebook’s announcement Friday that it has suspended far more developer apps for misusing users’ data than previously disclosed, the company revealed how little we know about the life of our data, even when we already know it has been breached.
Although we might be most immediately concerned with how our personal data is mishandled, such stories point to a much larger problem: The global data economy mines human information to predict and influence behavior in ways most of us are incapable of comprehending. We are in thoroughly new territory. Yet to better understand what this means for the future of privacy, we need to look back to a much older idea, one from the Middle Ages.
At the risk of simplifying what was a complex and dynamic social force, Christians in the early European Middle Ages, between roughly the 5th and 11th centuries, believed that God knew all human secrets, yet God’s secrets remained fundamentally unknowable to human beings. This widespread and deep-seated belief in an omniscient and mysterious being shaped institutional structures and social behavior in profound ways, as human efforts at concealment were considered futile. Today’s global data economy is the new form of mysterious omniscience, and as the reach of these technologies expands, their mystery will be one of the greatest barriers to their regulation.
The belief in an omniscient God structured almost every part of medieval life, especially around the rapidly developing legal systems of the time. Law codes named God as a constant witness in legal disputes in which human witnesses were considered deficient, and God’s judgment functioned as a compelling legal tool. Trial by ordeal, for example, revealed God’s will through a physical test, such as burning the accused’s hand with a hot iron, bandaging it and unwrapping the bandages several days later to reveal whether it had healed (innocence) or become infected (guilt). The certainty produced by such procedures relied upon an awareness of the ultimate inscrutability of God’s methods.
The belief in God’s omniscience didn’t disappear after 1200, of course, but its importance did begin to fade in social and legal institutions. The Fourth Lateran Council in 1215 barred clergy from participating in trials by ordeal. Yet even such regulations could not fully eliminate this early medieval practice, nor were authorities interested in curtailing belief in God’s omniscience itself. Forms of the ordeal would still be used in witch trials during the 17th century, and even in secular settings to this day, oaths are sworn to God.
By the 18th century, new secular forms of institutional power and surveillance emerged. Jeremy Bentham, for example, theorized the panopticon, a prison structure designed to harness a prisoner’s belief that he was always being watched to shape his behavior in favor of docility. What makes the architecture of the panopticon work is the mysterious omniscience of the prison guards, who can see from their tower into every cell without ever being seen themselves.
Today, a new form of mysterious omniscience is having a similarly widespread and unpredictable effect on human social behavior. We know that data about us is being compiled at breathtaking speeds, but most of us have little way of knowing what information is being collected, how it is being used and, crucially, how the various algorithms, clouds, networks and devices even work. This is nearer to the medieval version of God’s omniscience than it is to the panopticon to which it is sometimes compared.
Indeed, as scholar Shoshana Zuboff has written, firms actively confuse the public about the data they process so that their capabilities “remain inscrutable to all but an exclusive data priesthood.”
Speakers and other smart devices in our homes, cars, phones and tablets constantly listen to us and collect our data. And sometimes, as we now know, there are human listeners, too. But Alexa and Siri are not like other people overhearing our secrets and the humdrum of our daily lives. They are plugged into a larger structure of data and machine learning, one that we are ill-equipped to understand. A telling example is the tendency for people to think that their phones are listening to their conversations and on that basis serving them ads that correspond to products they have been discussing. In truth, those ads are driven by algorithms that have learned how we think, the ways we feel and, by extension, the kinds of things we might be interested in purchasing at any given moment.
In the Middle Ages, God’s omniscience disciplined people through a peculiar combination of fear and a belief in the benevolence of the divine. Medieval Christians believed God was essentially good but feared Him greatly. This was why early medieval legal institutions could rely on God as a reliable witness.
Where people in the early Middle Ages ascribed benevolence to their omniscient God and feared it tremendously, we have been facing, and indeed embracing with remarkably little fear, a mysteriously omniscient technology presumed to be benevolent because it makes life more convenient.
The way medieval law used God’s omniscience in cases of unreliable testimony foreshadows a future — in some ways, one already here — in which the information collected into that mysteriously omniscient entity (including data recorded by devices and retained by corporations) can be harvested and harnessed as evidence in courts of law, particularly where no other human witnesses are available to testify.
In these cases, corporations have so far resisted sharing the data with the state, with the exception of counterterrorism efforts. But this resistance also contributes to the corporate entity’s growing omniscience and mysteriousness, precisely the same kind of secret omniscience that made God’s judgment such a strong motivator of behavior in the lives of medieval believers.
The great challenge in the age of surveillance capitalism is its unprecedented nature, as Zuboff argues. But though it is unprecedented in the modern era, we find its roots in the early medieval period. If, as the philosopher Michel Foucault famously argued, the move from the late medieval gallows to the 18th-century panopticon removed the punisher while intensifying the discipline, then the move from the panopticon to this future iteration of mysterious omniscience could entail a more insidious form of discipline, one stripped of the fear of punishment and, with its godlike status, of the possibility of democratic regulation.