In the angst and narcissism-fuelled internet age, what follows is likely the most angst-ridden and narcissistic post to ever make it to my website.
What began as a means of safeguarding critical information and fostering data sharing between academics has evolved into one of the most remarkable technological innovations in history. Even 20 years ago, who could really have predicted how internet communication would change the way people interact, connect the globe and even decide how elections are won?
Perhaps the most staggering element of the internet has been the pace and scale of the changes we have witnessed. This speed has begun to influence the rate of change outside of cyberspace as well. The swiftness of political change in the United Kingdom and the United States over the past several months would, in the pre-internet age, have been considered a revolution. Instead, it appears to be the new normal.
I am just old enough to remember the old internet - the geeky repository of knowledge (both useful and useless). It was a basic place: simple HTML pages which looked more like glorified Word documents than a truly discrete medium in their own right. In school, research on the internet was a real treat; Microsoft Encarta was a genuine alternative for a classroom lacking a printed encyclopedia. My first website went online in 2002 and was hosted, just like everybody else’s, by Geocities. It had a colourful background, scrolling text, and whatever other bells and whistles HTML could provide.
But things change: Yahoo! shut down Geocities in 2009 (except, it seems, in Japan). Of course, people had moved on to Blogger and WordPress by then as Web 2.0 took over, with headers and page elements instead of fixed HTML. The content of these new, more malleable pages stayed largely the same, however.
It’s often the case that the pioneers don’t get to share in the fruits of their labours. In the case of social media, MySpace got us used to the idea of forming networks and posting our lives for people to see. It wasn’t until Facebook that the idea reached maturity and marked the real shift in how the internet behaved. I never bothered with MySpace, but did dabble with Facebook during my early days at university. In time, I dropped it, deciding that I really wasn’t that interested in the lives of those around me.
Of course, this also means that I experienced Facebook before it decided to become a news provider, encouraging established outlets to post news directly to Facebook. This shift is now widely considered to have been one of the deciding factors in the election of American President Donald Trump. As someone who still relies on mainstream news providers (or, as President Trump would put it, some elite conspiracy revolving around the BBC, The Guardian, The New York Times and The Washington Post), the whole phenomenon of fake news disseminated through social media largely passed me by, as if I were living in some parallel reality.
After dropping Facebook, I took to Twitter. While I appreciated the brevity of 140 characters, I increasingly found that I was scrolling through mountains of information that did not apply to me. I could find the same news stories through mainstream outlets and I didn’t really feel the need to follow segmented rants between a few parties who could just as easily have aired their grievances privately. Twitter has, however, been an invaluable tool for journalism, giving real-time updates on events such as the sweeping (often unsuccessful) moves for change in the Middle East.
Both Twitter and Facebook share one important element which has shaped the evolution of the internet more than any other - both are free services. I once listened to a fascinating discussion with one of the original architects of the World Wide Web. He was talking about the initial decision about whether internet content should be made available free of charge or as a paid service (as Usenet was). In hindsight, he said that the paid service would have been more beneficial in the long term, freeing users from the constant burden of advertising.
It is worth dwelling on this point for a moment. Yes, a free internet is what made it so universal. Anyone, anywhere, can access more information than has been available to all of humanity to date combined. But servers are not free, nor is the infrastructure to convey their content. It used to be said that a single Google search took the same amount of energy as boiling a kettle. With subscriptions paid to internet service providers only covering some of the infrastructure cost, the internet has relied on advertising to pay the bills.
This has meant that the likes of Facebook and (to a lesser extent) Twitter are forced to find ways to monetize their content, especially since becoming accountable to shareholders. While Twitter struggles to do this, Facebook has created a science of targeted advertising metrics that is simply sublime. By mining user data, and requiring a surprising amount of personal information from its users, Facebook can provide targeted advertising to suit any marketer in the world. Do you need 25-year-old males living in Toronto who like red cars? Too vague? How about those living within a ten-block radius of Yonge Street? Facebook can do it. No more wasted campaigns.
But data mining goes far beyond social media. Virtually every online service does this to some extent. Google was a pioneer in targeted advertising through its search engine. Apple has prided itself on robust security to protect your data, while also making its software intuitive in ways that can only work by sharing your personal information with it.
The internet has grown up. It has become commercial. A recent discussion on CBC’s Ideas highlighted just how the Facebook model of the internet has created an insular online world where people are supposed to blindly accept whatever shows up in their “Newsfeed” (a clever choice of name for something that can display an ad for deodorant, a fake news story about carrots and a photo of your friend’s new puppy side by side). It is a world where you have to “like” something (you can’t simply agree - in fact, Facebook resisted allowing a “dislike” for years). Everybody you know is a “friend”, even if you only met them once for a minute.
With the parallel development of mobile technology, content is increasingly diluted to fit onto the screen of a smartphone. At the same time, the availability of the internet through phones has meant that the workday is never really over. Employers increasingly expect employees to be working, or at least reachable, at all hours of the day. While France has recently legislated against this, the global trend is towards working for longer hours without an increase in compensation. So much for the future when computers were to reduce our workweek to a mere three days!
The irony is not lost on me, but the best articulation of my concerns for the future of internet-based technology can be found through one of its winners - Netflix. The subscription-based TV and film provider also mines user data to provide recommendations for viewing (and, one would assume, to decide what new content to offer). One of its premier productions, Black Mirror, paints a vividly dystopian vision of a future in which technology has reached its logical conclusions.* Imagine the unsettling tone of the Twilight Zone meets the internet age, in which life is literally online. It’s a disturbing series of episodes, but exceptionally well done and deserving of serious thought.
So what is one to do, in this seemingly chaotic reality heading towards a world where we really are ruled by machines? In my view, there are only two things to do: think critically and be stubborn. The internet has made us believe that everything posted in cyberspace is amazing and worthy of the whole world seeing it. This has never been, and never will be, true. The internet has allowed us to share our passions with those who also share them, but in the social media age, traditional websites are effectively dying.
For those of us, myself included, who do not partake in the Facebook age, our web presence is effectively invisible. For those of us who are not comfortable sharing our lives with the whole world, it is time to regroup and let the internet work for us again. It is, after all, an incredible tool, best demonstrated by the pinnacle of the geeky internet: Wikipedia, the largest encyclopedia ever created. Thanks to dedicated editing, it remains a bastion of largely useful information, free of the commercial pressures that a for-profit entity must face.
As we rethink our relationship with technology, we must not bury our heads in the sand. The internet and its peripherals are here to stay and we need to know how our world is changing as a result. It is, however, possible to stay in the loop while remaining an observer from the sidelines. Excellent services, such as Leo Laporte’s TWiT network, offer critical assessments of developments in every facet of the high-tech arena. Meanwhile, organizations such as the Nieman Lab and Digiday monitor the implications for information in the evolving medium that is the internet.
And so I begin to make the internet work for me again. This means that my online presence also must work for me again.
- Effective immediately, my online collection of photography on flickr is being removed. My collection on railpictures.ca will remain and may be updated.
- My Twitter account, @thomasblampied, is being deleted.
- This website will remain as a relic of when the internet was a geeky place, full of useful and useless information. It will not, however, be updated anymore.
- My academic site, Pro Bono History, may be updated from time to time. For the academic world, the internet does remain an incredible resource, one which has forever transformed the way research and scholarship are conducted.
The past few months have been dramatic and likely upsetting for anybody who values critical, erudite thought, regardless of their political ideology. It’s time to take a step back, think about how we want to use technology, and make the internet a wondrous place again.
*Black Mirror was first broadcast on Britain's Channel 4, but has since moved to Netflix.