Sometimes, when the rate of social change is so rapid and its effects so widespread, every point along the time continuum becomes murky and unrecognizable. As is almost always the case, the future remains a mystery, but now, for many people, the present is just as incomprehensible, and how we lived in the past is hard to fathom. As Chuck Klosterman wrote a few years ago, “This is what it’s like to live in an accelerated culture.” Of course, the notion of an accelerated culture refers mostly to the recent quantum leap in communication technology, and to how its nearly unprecedented growth has radically and irrevocably transformed the way the world operates.
In an age when it often seems the entire world is instantaneously available at our fingertips, it can be hard to remember that less than two decades ago, the vast majority of the public had no concept whatsoever of the Internet, and relied almost exclusively on the 6 p.m. newscast or the morning’s local paper for news of the past 24 hours. Fifteen years ago, email was still largely a novelty; 10 years ago, Google was still a Silicon Valley start-up, toying with a keyword advertising model to help fund the creation of a “single, integrated, universal digital library.”
Today, technological innovation has dramatically changed the equation. The Internet is now such an integral component of modern living that many would be hard-pressed to pinpoint a moment in their day when they weren’t “connected” to the web by some means. And that eager start-up search engine with the funny name? It has since grown into one of the most financially successful and socially influential companies in the world, and perhaps in history. Simply put, the Internet and its various offspring — email, online databases, blogging, social media — have become so pervasive that it is now nearly impossible to imagine a world without them.
In an unorthodox move, and with a particular eye towards the sudden influx of user-generated applications and content available online, Time magazine named “You” its Person of the Year for 2006. No, not you, John Smith, or you, Jane Doe, but “You” in the broader, abstract sense — as in all of you, as in humanity. The choice stood out as a notable affront to the “Great Man” theory of history; as Time’s reporter, Lev Grossman, put it, 2006 was “a story about community and collaboration on a scale never seen before.” The advent and proliferation of online media and social networks have created an entirely new version of the Internet. According to Grossman, “The new Web is a very different thing. It’s a tool for bringing together the small contributions of millions of people and making them matter.”
Critics of the social media phenomenon, then as now, are quick to question the real-world significance of these new online platforms. Sure, you can share bar photos with your friends on Flickr.com and tweet about your favourite grocery produce; you can conveniently plan self-indulgent birthday parties on Facebook and broadcast the mundane tidbits of your daily life via YouTube, but what merit do these “small contributions” have? There are approximately 200,000 videos of “funny cats” uploaded to YouTube. How does this matter, and where is the lasting societal significance in any of it?
The truth is, it has been almost four years since social networking first made a big splash in the mainstream collective consciousness, and people are still struggling to answer that question. Initially, both Facebook and Myspace were disparaged for their role in the apparent decline of conventional modes of human interaction, and dismissed as flavours of the week — a sorry trend that would ultimately find its way onto the social junk heap. In recent months, however, two major world events have signalled the potential long-term viability of these platforms. As the events in both Iran and Haiti demonstrate, these new online communication tools may have finally found a niche of true social significance.
Much like the Tiananmen Square student demonstrations 21 years ago, the Iranian Uprising this past June was symbolized by one instantly iconic, yet previously anonymous, person. The image of the “Unknown Rebel,” grocery bags in hand, staring down a procession of Chinese tanks became one of the most widely circulated photographs of the 20th century and helped mobilize a new generation of young people to push for democracy around the world. Last June, through the lens of new media, a young Iranian woman, Neda Agha-Soltan, had a similar impact, albeit under the most tragic of circumstances.
As she peacefully observed the Tehran street demonstrations protesting suspicious national election results, Neda was inexplicably gunned down by a rooftop sniper. The atrocity would likely have joined the long lineage of unseen and unknown human horrors, except that, in this case, the lens of a single mobile phone captured the scene in its gruesome entirety. Within minutes, the video had gone viral. Neda’s dying moments were immortalized in blogs and tweets, then broadcast on YouTube for the world to witness. What was remarkable about the Uprising was that, with international media barred from the scene, Neda’s video and other first-hand citizen journalism became the sole source of information. As the world clamoured for news, the story was told exclusively through frantic tweets, grainy snapshots and, most painfully, by Neda’s blood on the street and the vacancy in her eyes.
While in Iran user-generated content exposed the most discouraging aspects of humanity in all its brutal, personal detail, the online community’s response to the Haitian earthquake demonstrated its tremendous potential for tangible social change. According to analysis compiled by the Pew Research Center’s Project for Excellence in Journalism, users on both Facebook and Twitter were active contributors to the Haitian relief effort. In the 48 hours immediately following the disaster, 83 per cent of links on Twitter referenced Haiti, many imploring followers to send aid. In fact, 2.3 million tweets featured the words “Red Cross” or “Haiti,” and nearly 200,000 mentioned “90999” or “Yele,” two popular text codes set up for Haitian donations. As of Jan. 28, activity on Twitter and other social media had generated approximately US$8 million in relief funds. Although critics often disparage them, Twitter’s celebrity users, with their massive follower bases, were also instrumental in disseminating information and advocating for aid.
Despite these recent developments, Twitter, YouTube and the rest of the social media universe remain an experiment in human interaction. Without a doubt, much remains to be explored, and the full productive capacity of user-generated content is still unknown. Still, it is becoming increasingly clear that this new form of communication is not going anywhere; if Twitter and YouTube fade, other upstarts will take their place. That being the case, users should create their content with the examples of Iran and Haiti in mind — to illuminate and expose the very worst, and to highlight and facilitate the very best.