One of the characters in the film “The Social Network” delivers what is intended as a generation-defining line: “We lived on farms, then we lived in cities, and now we’re gonna live on the Internet.” A scary thought.
What kind of life is it that is lived on the Internet? In my post (“Becoming Faceless?”) of Oct. 31, 2009, I sketched how we are changed as persons by the technologies we use and also gave my reasons for being a recluse in the world of social media.
Most of my colleagues in the organization I work with are more “plugged in” than I am. They have strange cords and gadgets semi-permanently attached to their ears, and much of their waking time is spent on Facebook, Twitter, Skype and e-mail. With the click of a button they can send off comments and replies to a large and anonymous audience.
But I have ceased to expect a thoughtful, considered reply to any of my own e-mails. It may be that my letters disappear under the sheer mass of “information” with which my colleagues are inundated. Or, as is more likely, they simply do not have the time to switch off and think before they click the reply button.
This does raise the important question: how is human communication suffering as a result of the widespread use of the new communication media?
If anybody is tempted to dismiss my comments as the nostalgic rants of a Luddite, let him read Jaron Lanier’s recent book, “You Are Not a Gadget: A Manifesto” (Penguin, 2010). Lanier is no technophobe or ignoramus. One of the pioneers of virtual reality (indeed, he coined the term “virtual reality”), Lanier belongs to that rare breed of engineers who reflect philosophically on their work.
He is sharply critical of the reductionist tendencies prevalent in the field of computer science (for example, reducing thinking to “information processing” and prostrating oneself before machines). He points out that every software program embodies a personal philosophy. “It is impossible to work with information technology without also engaging in social engineering.”
The slightest change in something as seemingly trivial as the ease of use of a button can sometimes completely alter behavior patterns. For instance, Stanford University researcher Jeremy Bailenson has demonstrated that changing the height of one’s avatar in immersive virtual reality transforms self-esteem and social self-perception.
Lanier points out that “anti-human rhetoric” abounds in the world of computing. Kevin Kelly, founding executive editor of Wired magazine, has stated that we don’t need authors anymore, since all the ideas of the world, all the fragments that used to be assembled into coherent books by identifiable authors, can be combined into one single, global book.
“People degrade themselves in order to make machines seem smart all the time,” writes Lanier. “Before the crash, bankers believed in supposedly intelligent algorithms that could calculate credit risks before making bad loans. We ask teachers to teach to standardized tests so a student will look good to an algorithm … The attribution of intelligence to machines, crowds of fragments, or other nerd deities obscures more than it illuminates … Treating computers as intelligent, autonomous entities ends up standing the process of engineering on its head. We can’t afford to respect our own designs so much.”
His most scathing comments are directed at the developers of what has come to be called Web 2.0. “It breaks my heart when I talk to energized young people who idolize the icons of the new digital ideology, like Facebook, Twitter, Wikipedia and free/open/Creative Commons mashups.”
In the preface to the book he states: “You have to be somebody before you can share yourself.” But for Mark Zuckerberg, sharing your choices with everybody (and doing whatever they do) is being somebody. When a human being becomes a set of data on a website like Facebook, he or she shrinks. We are squeezed into “multiple-choice identities.”
Wikipedia obliterates context and personal perspective, without which information can be dangerously misleading. Yet search for almost any topic on an Internet search engine, and the first site to which you will be referred is usually Wikipedia.
Recent political and legal debates in the United States over the WikiLeaks “revelations” have perhaps obscured other serious threats to freedom – those posed by the advertising industry and credit card companies that can buy and manipulate personal data for private profit.
In an article titled “Generation Why?” the novelist Zadie Smith, who teaches at New York University and is only a decade or so older than Zuckerberg, complains that “our denuded networked selves don’t look more free, they just look more owned.”
For her, Facebook reflects a sad reality: “500 million sentient people entrapped in the recent careless thoughts of a Harvard sophomore with a Harvard sophomore’s preoccupations. What is your relationship status? (Choose one. There can be only one answer. People need to know.) Do you have a ‘life’? (Prove it. Post pictures.) Do you like the right sort of things? (Make a list. Things to like will include: movies, music, books and television, but not architecture, ideas and plants.)”
Given the huge numbers of Christians involved in the world of computers and information technology, especially in India, South Korea and the U.S., why is there so little critical and theological reflection of this nature emerging in our churches and seminaries?
Christians, of all people, should be profoundly interested in communication, given that the self-communication of God in human flesh is at the heart of the Gospel. Why has it been left to secular humanists and others to articulate the prophetic insights that we desperately need in our technology-driven environment?
Vinoth Ramachandra is secretary for dialogue and social engagement for the International Fellowship of Evangelical Students. He lives in Sri Lanka. This column previously appeared on his blog.