A tragic and disturbing story from last week involved tweeting. A young woman who was being stalked tweeted about her impending doom in the days before her murder. “So scared right now,” and “I got me an uglyass stalker” were some of her tweets. Then, closer to her death, she tweeted, “This can’t be happening…” In essence, she was broadcasting the events leading to her demise.
In the same week, a teenager in a suburb of Baltimore posted murder-suicide references on Facebook before taking a shotgun to school and wounding one of his classmates on the first day of class.
Going back a month or so, there was another horrific story about a teenager who lost an arm to an alligator in Florida while swimming with his friends. He heroically fought the reptile and managed to get away, minus one limb. But the story didn’t seem fully realized without involving social media, and apparently the victim felt the same. According to ABC News, before going into surgery to close up his wound, he asked his friend “to snap a photo of him in the trauma unit and post it on Facebook.”
The examples, of course, are endless. Yet they all suggest that man’s 21st-century response to dramatic events is not simply to interact with them, but also to record them. If communication technology was created to enhance our daily lives, something has dramatically shifted along the way: More and more, we are altering our behaviors in service of the digital world.
So many of us now have been raised on video games, cell phones and iPods. We’ve spent much of our lives in chat rooms, on Skype and posting videos to YouTube, to the extent that we’ve become news reporters and newsmakers without even making much of an effort. We announce our actions and, in some cases, our impending demise online without giving it much thought. We have been so conditioned to invest our emotional life in the virtual space that it has become second nature. What’s more, many of us have learned to split our attention, with one eye on the electronic mirror and the other on reality.
Indeed, more and more, we are beginning to believe that we do not fully exist without some sort of electronic imprint in the virtual world, a digital projection of ourselves, a validation of our existence.
Pipiatum ergo sum? I tweet, therefore I am?
A couple of years ago, Wafaa Bilal, a photography professor at NYU, went a step further and implanted a camera in the back of his head as part of an art project. The camera broadcast a live stream of images to a museum in Qatar. On his skull, the real and the digital converge, and the real is photographed for the benefit of the digital.
The trend is “self-tracking,” according to the Economist, and a market for these devices is rapidly emerging. There are wireless devices that can track people’s physical activities, while others measure brainwave activity at night to chart users’ sleep patterns online.
“People around the world are now learning how to leverage the incredible power inherent in the URL to create what is essentially a parallel universe of digital identities,” noted Robert Young, an Internet entrepreneur. But in this new industry, he observed, “the raw materials for the ‘products’ are the people… the key is to look at self-expression and social networks as a new medium and to view the audience itself as a new generation of ‘cultural products.'”
Perhaps it’s too early to tell the long-term effects of an oversaturated information age on human evolution. But according to the New York Times, scientists say the constant use of computers and cellular telephones is causing a significant evolutionary shift in our brain’s wiring.
But one of the most troubling consequences of devoting so much attention to the virtual world is the death of empathy. Clifford Nass, a communications professor at Stanford, told the New York Times that empathy is essential to the human condition. However, given the virtualization of the real world and the tendency of many to multitask, “we are at an inflection point,” he said. “A significant fraction of people’s experiences are now fragmented.”
Which may very well explain a story involving Bill Nye, popularly known on TV as “the Science Guy.” A couple of years ago, he collapsed on stage out of exhaustion as he prepared to give a lecture. But instead of rushing to the stage to help him, the LA Times and other media reported, many students in the audience took out their cell phones, snapped photos, texted and tweeted the event.
Or consider this now-famous story involving YouTube. On March 30, 2008, a group of teenagers in Florida lured one of their peers to one of the girls’ homes and videotaped her beating. With one girl behind the camera to record the episode, and two boys guarding the door, the rest beat the young woman mercilessly, leaving her with a concussion. It was for a dual purpose: to “punish” the victim for allegedly “trash talking” about them on MySpace, and to post the footage on YouTube. The most telling line during the beating, however, came when the young woman behind the camera yelled out: “There’s only 17 seconds left. Make it good.”
Seventeen seconds left, that is, in a 10-minute slot – at the time, the maximum length of a video one could post to YouTube. The time frame and the incident prompted a journalist to quip, “Well, Warhol was only off by five minutes.”
10 Minutes of Fame
What makes that incident unusual is not the violent act itself – girl fights have been well documented, after all – but that the girls’ actions were dictated not by a pure act of revenge but by a kind of exhibitionism rarely seen before. Stranger still is that, increasingly, the electronic world dictates exactly how an action should be carried out. The collective beating of the young woman, for instance, was directed to intensify as the video neared its 10-minute mark. (Did their beating lose steam, one wonders, when the camera stopped rolling?)
This modern mindset has given psychologists and anthropologists enough material to study what they call the “disinhibition effect” of the Internet. Road rage is quickly giving way to net wrath. Like actors who are trained to lose their inhibitions on stage, many now take daring risks for the virtual world – never mind that those risks might have repercussions in the real one. They show it all, or do something enormously bizarre or violent, to garner lots of hits, lots of eyeballs.
Andy Warhol may have been off by five minutes, but he was otherwise frighteningly prophetic. A future in which everyone can be famous for about 10 minutes has indeed arrived. We have all become actors, filmmakers and reporters. We begin to believe that we are not fully ourselves, that we are not viable in the new system, unless we make some sort of electronic imprint, some sort of projection of ourselves, in the virtual world. Diaries, once locked away and hidden, have now gone electronic in the form of blogs and vlogs. The real event to some may no longer be as important as its virtual image, which can be relived online.
No one doubts that communication technology has enhanced humankind in marvelous ways. But it comes with a price. Mary Shelley’s Frankenstein; or, The Modern Prometheus was a seminal work and, at the cusp of the industrial revolution, a warning that the discovery of electricity could create a monster, and that punishment awaited those who dared steal power from the gods. The Wachowski brothers’ extraordinary movie The Matrix, made at the end of the 20th century, in which humans are enslaved and permanently trapped in a simulated reality, was in a way the same warning. Man in the 21st century has transcended geographical and even biological constraints, and found the power to project himself in various media across the globe. But as a result he is seriously fragmented.
The hero of this new myth must necessarily become a prophet. For his is the arduous task of reintegrating the various fragments of the self, hearing the symphony in the cacophony, seeing the human in the digital — or else man will suffer being trapped forever in the hall of mirrors.