THE ARCHIVAL ARTICLE:
THE BACKGROUND STORY:
Today’s archival article was one of the first “think pieces” I wrote for Metroland, moving beyond the record and concert reviews that defined my earliest days with the newspaper. While the Internet had become available for public use as the World Wide Web in 1991, it remained primarily an academic community until late 1994 or early 1995. At that point, the emergence of the Netscape browser, Yahoo!, eBay, Amazon and their pioneering peers transformed the digital world, opening it more widely for free exploration by “regular folks” who happened to have computers and were ready to move on from the more canned experiences that America Online and CompuServe were then providing over our land lines.
A good deal of the traditional print media chatter of the time focused on “sky is falling” narratives about how the Web was going to kill books and newspapers, was never going to become a manageable and trustworthy source of dependable news, and was going to over-power our tiny human brains with more information than we could process. I was assigned to explore the latter facet of that transitional period, and this article reflected my findings.
Re-reading it now, a quarter-century later, evokes two basic reactions. First, it’s charming in its datedness. Remember Netscape? Magellan? USENET? And remember a world without Google, Twitter, Facebook, blogs, and smart phones? But on the flip side, some of the questions we were asking even then remain frighteningly relevant today, most notably “How do we distinguish good information from bad information?” I’m not sure we’re answering that one any better today than we were in 1995.
I interviewed some subject matter experts for the story and asked them what they thought the future might hold for the World Wide Web and its users. It’s interesting to see how most of them were on the right tracks with their forecasts, though the way in which those tracks were actually laid curved in some unforeseen directions, as did the language which we use to discuss them.
I tended to stay on the front-lines of web culture community-building for many years after writing this piece, and was an early adopter of more platforms and trends than I can actually recall at this point, moving from ASCII bulletin boards to LISTSERVs, from mailing lists to MMORPGs, from blogs to tweets, and from Cyber-Yugoslavia to Six Degrees to Friendster to Orkut to Xanga to MySpace to LinkedIn to Facebook to Twitter to Google+ to Ello, and God only knows what other passing fancies came, crashed, and burned without leaving any memories of note — or became so problematic for one reason or another that I abandoned them.
As I wrote just before I started this archival series, at this stage in my life, I’ve pretty much decided that a social media blackout is about the best way for me to go from a mental health and time management standpoint. I maintain this website, obviously, and I keep a LinkedIn account for professional reasons (posts here cross-post there), and I have been a registered member of the Fall Online Forum since around 2007, though my involvement there is increasingly cyclical, with longer stays away than active periods of participation. That’s about it for interactive social networking for me. Beyond that, I have identified my own trusted news sources, I consider a smart phone paired with Wikipedia to be the real-world manifestation of the Hitchhiker’s Guide to the Galaxy, I do much of my shopping online, and I appreciate having my music and movies delivered to me at home.
I’m not a futurist, so I won’t hazard any sweeping guesses as to what the next quarter-century will bring with regard to “internet information overload” — except on one point. I am all but certain that the “How can we tell real news from fake news?” issue will remain with us in 2045 and beyond, as propagandists will be just as effective as pornographers at staying one step ahead of the controls implemented to neuter them for the betterment of society on a national and global basis.
Here’s hoping that I live long enough to test this prediction. And even more, here’s hoping that I am wrong.