Dean Brooks

Did Humanity Peak in the Late ‘90s-Early 2000s?

Or is this just pedestalization of the past?

Source: Screenshot of ZubyMusic Twitter.

I’ve followed ZubyMusic on Twitter for almost three years now, at least since around 2019.

If you’re not familiar with the artist, “Zuby,” short for Nzube Olisaebuka Udezue, is a 36-year-old English rapper educated at Oxford University, with a substantial and growing audience of worldwide fans. Known mostly for his music, he’s also a strong conservative voice, often criticizing identity politics, and is a Christian. He’s self-released three albums and has a podcast and YouTube channel.

I’m not a Christian myself, nor do I listen to rap. In fact, I’ve never once even listened to Zuby’s music, as I think “Christian” and “rap” sound about as cringe as almost anything the “Christian” world tries to attach itself to in the secular realm in order to be hip and relevant. Christian comedy. Christian rock. Christian movies. Ugh. No, thanks.

Still, I like Zuby because he often makes interesting and thought-provoking tweets. Even if I don’t always agree, it’s nice to get a different or unique perspective on current events, especially on Twitter. It’s funny how conservatism is actually quite mainstream and common in everyday life, yet online it’s seen as odd and “alternative,” with liberalism and left-wing politics seen as the default. In reality, it’s much more evenly split.

Last October Zuby tweeted the above comment, which I frankly dismissed almost immediately. I think there’s a temptation to glamorize one’s youth, seeing it as some bygone golden age. Zuby, born in 1986, would have had his most formative childhood years in the ’90s, and been a teen for the first half of the ’00s. I remember Jon Stewart on The Daily Show saying something like “you don’t miss that era, you just miss being a carefree child,” in response to a pre-sex-scandal-disgraced Bill O’Reilly saying how he felt the 1950s (the decade of O’Reilly’s youth) constituted America’s best years. Politically, Stewart and I are quite opposed, though I have to admit the guy could be pretty insightful at times.

Nostalgia-gazing is something particularly characteristic of the right wing. And while it’s soothing and addictive, it’s also as pointless and counter-productive as the left’s own habit of future utopia fantasizing. Neither side seems to want to deal with the here and now, preferring to longingly await a DeLorean to whisk them away to another timeline. No wonder things remain such a mess, when both sides abdicate their responsibility in the present.

Then this morning I was reminded of Zuby’s tweet by Nick Sherwood, author of The Social Virus: Social Media’s Psychological and Social Impact on America (And What We Can Do About It). He posted a series of tweets articulating why he feels Zuby is correct.

Source: Screenshot of N. Sherwood’s tweet.

The above was followed by a long thread of reasons and supportive evidence, some of which I thought had credence. Others I found questionable. And by “others,” I mean most. And by “questionable” I mean mostly B.S.

To begin, I don’t think it’s possible to declare any particular era in human history a “peak” at all, given that so many cultures and nations around the world are undergoing vastly different experiences than others, both positive and negative.

If we’re talking strictly the Western world (America and Western Europe), one could make the argument the late ’90s to early 2000s certainly wasn’t a bad era. The Cold War had ended, and the economy and job market were strong. But that’s looking at things from the macro view. For someone working a cash register in a small town in Idaho, was their life any better or worse, or much different for that matter, than ten years prior?

Sherwood continues:

Source: Screenshot of Sherwood’s Tweet.

I agree with the first half of the second sentence, if by “progress” we’re talking technologically and socially. No doubt the ’90s was an era of progress. But so was the ’80s, the ’70s, and almost every decade before. At least in America and other places in the world. “Progress” is also subjective. No doubt Lenin and Stalin would have considered their Communist Revolution in Russia “progress.” But was it? Big doubt.

The second half of the statement is basically meaningless. How do you even measure levels of overindulgence and entitlement? These are aspects of human nature, and I don’t think humanity has evolved much, if at all, in just the past 25 years. So I’d say there’s a good chance that we’re seeing the same levels of indulgence and entitlement now that we saw a quarter-century ago. Maybe now it’s just more visible due to social media.

Moving on to his next points:

Source: Screenshot of N. Sherwood’s tweet.

Sherwood seems to posit that the late ’90s/early 2000s comprised some kind of Goldilocks “sweet spot” era in which we had just the “right amount” of technology. Not too much to where it became omnipresent, like the smartphone in everyone’s pocket, but just enough to where it acted in the background.

Again, this is highly subjective. One man’s too much technology is another man’s not enough. I can certainly remember people fixating on computers even as far back as the mid-90s, when the internet became more accessible to the mainstream.

Infrastructurally speaking, we’ve been dependent on computers probably since the 1960s. Almost all of our telecommunications, major medical equipment, civil defense systems, etc., depend on computers and microchips.

If we’re talking about how the ’90s was the beginning of computers separating people into their own bubbles as everything went digital, there’s an argument for that. I do think people were more fluid socially back then than they are now. Younger generations today can’t seem to effectively communicate unless it’s through a screen. It was Millennials, after all, who popularized “ghosting.” When people are reduced to simple online avatars, it’s much easier to dismiss their humanity and snap them out of your existence. People today shy from conflict more readily, and terms like “social anxiety” are prevalent.

Source: Screenshot of Sherwood’s tweet.

I wrote for a newspaper as a teen. Had my own column. I also worked in the printing industry for eight years as it transitioned into the digital age. Newspapers are cool, but I wouldn’t single them out as the best, or even necessarily a good, source of information. At least, no more than radio or TV. Local news hasn’t really changed in 25 years, either. Traffic on I-95. Some guy got busted for dealing drugs. A kindergarten teacher retires. New waffle restaurant just opened. The song remains the same.

It’s true we get hit way more with B.S. news alerts and app notifications. But that’s a simple fix. I either delete a misbehaving app, or don’t turn on notifications at all. The only alerts I get on my phone are from my Medium app, which is actually starting to get on my nerves.

But again, Sherwood is really making more of a case against smartphones, and by extension social media, and not so much a case for the ’90s/2000s being some golden era. You can’t just argue in the negative. Smartphones didn’t exist during the Bubonic Plague in Europe either, and I don’t think anyone would argue those were good times. Not unless they’re some hardcore “survival of the fittest” Darwinist fanatic, or something.

Source: Screenshot of Sherwood’s tweet.

What?! Has this guy not heard of the John Birch Society, which handed out leaflets and pamphlets pandering to very specific and extreme right wing beliefs WAY back in the ’50s and ‘60s?

Or The Daily Worker newspaper, published by the Communist Party USA back in the 1920s?

Or Bop Magazine, delivering steamy servings of teen heartthrobs like Jonathan Taylor Thomas, Johnny Depp, and Jonathan Brandis?

Hmmm…if the ’90s was peak anything, it was Peak Hot Guys Named John.

Source: Screenshot of Sherwood’s tweet.

No matter how many streaming or cable channel options exist, there are effectively only a small number that any one person will ever regularly watch, as there is only so much attention one can give, and limited time.

And why is the expansion of entertainment media necessarily a bad thing? You wouldn’t say the same about the millions of books that have been printed in the last few hundred years. So why would TV shows and movies be any different? There being five million Star Wars movies/shows/books/toys is annoying to me, yes, but it’s not like it ruins the quality of my life. I just ignore it, like anyone older than twelve who possesses a frontal lobe should.

Source: Screenshot of Sherwood’s tweet.

Ah, so media is only “good” if EVERYONE is watching so they can discuss it the next morning around the water cooler. Got it. That being the case, I guess the daily state broadcasts North Korea puts out to all its slaves, er, “citizens” must be of the highest excellence. I’m sure KCTV fosters something a bit more than a “semblance of monoculture.”

It’s true that much of pop culture and media is fractured amongst varying demographics and audiences. But that’s always been the case. I can remember my friends and I discussing how freaking awesome the T-1000 was around the school cafeteria the year Terminator 2: Judgment Day came out, only to get blank stares from the girls, who themselves were talking about Beauty and the Beast. Then going home and my step-dad telling me to shut up about “Turdinator” while watching a re-run of Welcome Back, Kotter. Then running to my mom to whine that her husband insulted my hero Arnold, only for her to shut the door in my face so she could watch Knots Landing.

Like that South Park videogame, it’s always been a fractured but whole, Sherwood.

Monoculture is a myth. No matter how big a movie is, it’s likely that not even three percent of the world population will see it. Take Avatar, the highest grossing movie of all time not adjusted for inflation, at almost $3 billion in global ticket sales. In 2009, the year Avatar premiered, if the average movie ticket was $7.50, then that means a maximum of 400,000,000 people saw James Cameron’s remake of FernGully in theaters, out of around 7 billion people. Except that number doesn’t count the people who went to go see the movie repeatedly. And it doesn’t count the fact that many people paid way more to see it in glorious 3D. If you cut that number in half to 200,000,000, that means only about 3% of the world population saw Avatar. Even if you double it back to 400,000,000, or roughly 6%, that’s still pitifully low in the grand scheme of things. And that’s the biggest movie ever released.
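For the curious, the back-of-the-envelope math above works out like this (a rough sketch; the ticket price, gross, and population figures are all approximations, not actual box-office data):

```python
# Rough estimate of what share of the world saw Avatar in theaters.
# All inputs are approximations from the paragraph above.

gross = 3_000_000_000        # ~$3 billion in global ticket sales
avg_ticket = 7.50            # assumed average 2009 ticket price, USD
world_pop = 7_000_000_000    # ~world population in 2009

max_admissions = gross / avg_ticket       # upper bound: 400 million tickets sold
unique_viewers = max_admissions / 2       # halved for repeat viewings and pricier 3D tickets

share = unique_viewers / world_pop
print(f"{share:.1%}")                     # prints 2.9% — "only about 3%"
```

Even treating every ticket as a unique viewer only gets you to about 5.7%, which is where the “double it to 6%” figure comes from.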

To put that in perspective, the biggest religion in the world, according to the Pew Research Center, is Christianity, and it hasn’t even cracked 1/3 of the global population with its 2.2 billion followers.

Source: Screenshot of Sherwood’s tweet.

No, Chapter 1 of the internet was “How Much Freaking Longer is This Thing Going to Take to Log On, Goddammit!” With the sub-chapter “Don’t Use the Phone I’m on AIM Right Now!” Chapter Two was “When Are We Getting Broadband, Everyone Else Has It Now!”

The internet sucked 98% of the time back in the ’90s. It wasn’t cool. It wasn’t awesome. You didn’t find anything “fresh.” It was where you IM’d your friends from school until some creep found your teen chat room and tried to have cybersex with you. There’s a reason why To Catch a Predator came out in the mid-2000s right after the supposed “golden age” of the internet. It’s because the world wide web, due to its anonymity and wild west novelty, empowered a lot of perverts in the early days.

The internet was also a place for piracy. Remember Napster, which single-handedly almost destroyed the entire music industry? “I Love the ’90s” my ass, especially if you played in a band named Metallica.

The internet was weird, distrusted, seen as a fleeting fad by some, buggy, slow, mostly useless, and the driver of the Dot Com meltdown. Saying the internet was “cool” back then before high-speed and regulation is like saying bloodletting was cool before modern medicine discovered viruses and bacteria.

Source: Screenshot of Sherwood’s tweet.

Ah yes, that wonderful period in the late ’90s and early 2000s when politicians never pandered for votes, didn’t treat those across the aisle like horrid zombies, and joined arms as fellow Americans. Back then we didn’t have contested elections, or impeachment trials, or “vast right wing conspiracies,” or third party presidential runs conducted by eccentric billionaires. Politicians didn’t lie. They never even used foul language. Certainly they didn’t have affairs with interns, or cheat on their cancer-stricken wives. Or invade countries based on false claims of weapons of mass destruction. None of that ever happened.

Source: Screenshot of Sherwood’s tweet.

If kids growing up and maturing sooner is your benchmark for the golden years, then you’d have to look way past the ’90s. Back to, say, during WWII, when kids lied about their age so they could go to war.

Take the case of Calvin Graham, for instance. Born in Canton, TX, Graham signed up for the U.S. Navy after the bombing of Pearl Harbor at 12 years old. He’d later get wounded by shrapnel at the Naval Battle of Guadalcanal, for which he’d receive the Bronze Star and the Purple Heart. Graham would eventually get booted from the Navy after attending his grandmother’s funeral without permission. Get married at age 14. Become a father a year later. Divorced at 17. Then join the Marine Corps at 17 to serve in the Korean War. Then break his back in 1951 after falling off a pier.

Look at that. Two wars. Two branches of the military. Married and divorced. Has a kid. And even gets his first case of workman’s comp. All before most kids even learn how to shave.

Sorry, kids were not free-roaming Mad Max badasses in the ’90s. They were mostly soft, squishy, sticky bags of shit. Eating Lucky Charms, Pop Tarts, and Ellio’s Pizza. Capable only of Nintendo marathons, watching Saturday morning cartoons, remembering the Konami code, and making fun of Michael Jackson’s face.

I don’t know what causes people to glamorize and pedestalize the past. Nostalgia has practically become its own genre now, with Hollywood dumping ‘80s-inspired crap like Stranger Things on us constantly like Nickelodeon slime. I remember the ’80s, man. I was a kid then, too. Well, mostly I remember watching TV and movies during the late ’80s, and not having to worry about a whole hell of a lot. What do you mean the Russkies could drop a nuke on us any moment? I don’t care, I’m watching Inspector Gadget here and drinking chocolate milk.

For sure, sometimes I miss not having any responsibility other than deciding what kind of dinosaur I want to be for Halloween. But it’s kind of ridiculous and suspect to declare any particular era “humanity’s peak” when it just so happens to coincide with your childhood. It almost sounds like indulgence and entitlement, come to think of it.

Think of it this way. Right now there’s a horrible war going on in Ukraine. It’s the worst of times for anyone who lives there now. But somewhere in Colorado, Florida, Canada, or maybe even Japan, some kid somewhere is having the time of his life. He’ll grow up thinking it never got any better than the late teens and twenties. The ’90s and early 2000s will be as foreign to him as the ’60s and ’70s are to a Millennial or Gen Z’er.

And you know what? He’ll probably be right. At least he didn’t have to deal with the Macarena.

Nostalgia
1990s
2000s
Progress
Politics