I became an adult before the year 2000. We used to regard that year as the dividing line between the past and the future. You can see it in a few surviving cultural relics, like 2001: A Space Odyssey; Prince’s “1999,” with its promise to party like it’s 1999, which was in near constant rotation in 1998 and 1999; and of course, Back to the Future.
Now, we’re long past 1999 and 2001. Today, Marty McFly would be an old man, and everything I see and do has a veneer of, “So this is what it’s like in the future.”
I don’t live too far from an amusement park called Yomiuriland. I walk a hiking trail near there sometimes and listen to the undulating screams of rollercoaster riders. There’s a ride that lifts people to the top of a tower and then drops them.
As I watch this happen, I shake my head and think to myself, “Oh, the things humans do for entertainment. This is how we turned out.”
Obviously, this isn’t really, “how we turned out,” unless we’re near the end of our run, and maybe we are.
We spent 300,000 years primarily concerned with not dying, and then advances in medicine and technology made our lives both longer and easier. The Oregon Trail only opened up in 1841, and if you played The Oregon Trail game, you know how easy it was to die back then. Penicillin was discovered in 1928.
Whenever the news gets me down, it always helps me to look at the big picture like this. I don’t say this to minimize what we’re going through today, but to highlight why it’s so difficult. We haven’t yet adapted to our strange new situation. Life used to be physically hard, but it was straightforward. Today, life is physically easy, but it’s a psychological nightmare.
I don’t know how most people felt prior to 150 years ago because I’ve only read things written by wealthy aristocrats from back then, and their lives were far less risky than everyone else’s. (Except for Edgar Allan Poe, and you know how things turned out for him.)
Today, we have many more voices and far less curation. In 2020, the amount of data created, captured, copied, and consumed in the world was 59 zettabytes. One zettabyte is 8,000,000,000,000,000,000,000 bits. If this amount of data has resulted in some sort of understanding about who we are today, I don’t know what it is.
Mostly, I only know about myself, the people I know, and the people whose work I read. Many are deeply panicked about current events and climate change, struggling with mental and physical health, or just trying to keep it together long enough to stay afloat. Many of us are wondering how we might fashion a life that serves some sort of purpose.
If the only thing that had changed in the past 150 years was that life got physically easier, we probably would’ve been fine, but we also had The Industrial Revolution. This launched an era where the advancement of technology would quickly outpace our ability to adapt to it.
Maria Popova populates her book, Figuring, with thinkers from around the beginning of The Industrial Revolution. They knew their great leaps in technology would lead to more, and questions like, “Who will we become?” and “What will we build?” were on the forefront of their minds. I was buoyed by their insight and optimism.
Today, I see people asking, “Who are we?” and “What have we built?” and answering with pessimism. I suppose that’s the result of letting these questions lie for 150 years, only to revisit them after things started looking really grim.
For example, in “AI-art isn’t art,” Erik Hoel contemplates how AI could potentially replace graphic artists. He writes:
But if the replacement does come to pass then it will exsanguinate so much meaning from our lives. For you were born into a world where most things were made by human consciousness. You may die in a world where nothing is made by human consciousness.
In 2018, The Pew Research Center reported on the future of artificial intelligence. The AI experts surveyed mostly fear the future loss of human autonomy, privacy, jobs, and survival skills, as well as an increased vulnerability to cybercrime (and cyberwarfare).
For some reason, these scary future projections always remind me of the chess grandmaster Ben Finegold. He often uses the phrase, “the truth hurts,” when he teaches. Within the context of chess, “the truth” is a set of moves that come to an inevitable conclusion, but, of course, it’s only true within the parameters of the game.
With technology, I get the sense that people feel like we’re in a game with an inevitable (not so great) conclusion. But, we invented the game. We can always change the rules.
The trouble is that we can’t change any rules without collective action, and we don’t have control over how other people behave, vote, and react. We know that, as a collective, our ethical development is still pretty primitive.
Perhaps it’s not that complicated, though. Maybe if we individually try to be more thoughtful and civic-minded when we make our personal choices (whatever that means to us), our choices will find some natural alignment, and our world will improve without any spectacular rule changes.
This might already be happening, which is why everything looks so crazy right now. We’re experiencing disorganized growth.
In the research article, “The cultural niche: Why social learning is essential for human adaptation,” the authors write:
We owe our success to our uniquely developed ability to learn from others. This capacity enables humans to gradually accumulate information across generations and develop well-adapted tools, beliefs, and practices that are too complex for any single individual to invent during their lifetime.
This is who we are. We’ll always share and accumulate information. We do this both individually and collectively. We’re doing it in a shorter amount of time than ever before because we have no choice. We must scramble to learn lessons that we’d previously learned over generations.
Sometimes, I wonder if our fear of the future is less about the reality of our situation and more about our intense disappointment with the post-2000 world. We thought we’d have solved our problems by now, but instead, our technology has merely revealed the truth of who we are.
We have deep, ugly social problems that we’ve long needed to fix, and the future has merely made our long-standing problems urgent and unignorable. Without these revelations, we could not heal, adapt, and become better.
Some things to check out:
This is appreciation for Frida Kahlo like I’ve never seen before.
I’ve been listening to the podcast Quitted almost non-stop for the past couple of weeks.
Reading this has brought to mind Alfred Korzybski’s concept of time binding. Time binding has allowed us to learn from those who came before, depend upon each other, and feel responsible for one another, and yet Joseph DeVito drew on it to help formulate the idea of static evaluation. I see more people engaging in static evaluation these days, judging people not for who they are now, but for who they once were. It is a corruption of time binding and a corruption of what it means to be human, but I think it is largely influenced by the algorithms used on social media. AI retains everything, and that is where, I think, the biggest problem is. Throughout the centuries, people have made decisions about what knowledge to retain and record, and what to discard. Static evaluation allows for neither personal growth nor the advancement of society.
I have a recurring nightmare about drowning in garbage. In some ways, humanity is experiencing this already as we drown in information that should have long ago been deleted and forgotten. I know people are frightened by AI becoming sentient, but what more people should fear is a loss of sentience among humans.
When I first saw the title of this post, I was reminded of one of the Northwest Earth Institute’s discussion courses that I wanted to take part in. It is about living simply, but is focused on a sort of mindfulness regarding technology. I took part in one of the organization’s other discussion courses, but haven’t found people in my area who are interested in this one. The link is here: https://store.ecochallenge.org/products/a-different-way
People can preview the first section from that page.
The problem is that regulation and A.I. ethics aren't keeping up with how platforms, cloud services, and A.I. are being utilized, militarized, harnessed, and developed in secret.