There are plenty of existential risks. AI, nuclear war, and engineered pandemics are the most likely, and all are pretty horrifying. I was on a run today, and I wondered exactly why I have been stressing so much about these risks. Spending years reading and writing about these topics has made me a more anxious person, hesitant to bring children into a world of mass-scale suffering. Even if there is only a 10% chance humanity gets wiped out this century, the year of that extinction could clearly be a horrible time to live through. I would die, my family would die, my kids would die, my wife would die, all of my friends would die, and everyone I've ever met would die. That is very clearly the most depressing outcome imaginable. The more I thought about it, though, the more I realized: this is the trade you make. This is what it means to be human. This is how every human has lived throughout history: in a period of extreme violence and uncertainty.
If you lived in Germany many hundreds of years ago, you were probably pretty worried that a group of barbarians would raid your cabin, steal your wife, and kill you and your children. If you lived in Egypt thousands of years ago, you were probably a slave, and your family could have been killed on a whim by a despot. In fact, in the majority of societies, the majority of people lived completely at the mercy of authoritarian rulers or warring overlords. The life of a father or mother caught in one of today's civil wars is par for the course; it is modern life in the West that is the exception. Or is it? Just because we ignore the prospect of nuclear war doesn't mean we don't constantly sit at the brink. Just because we don't read up on engineered pandemics doesn't mean a modified strain of smallpox won't tear through New York City and kill 95% of the population. Nothing has changed. Humans have always faced individual and group extinction; we are just moving the boundaries a little.
Looking at existential risks, the only truly scary one from an existential standpoint is AI. Not because it could kill us all (bioweapons and nukes can do that too), but because it could lock us into near-eternal, tremendous suffering. The good news is that this outcome is probably unlikely, and the future of AI is almost entirely uncertain. It is also the only one of the three main existential risks whose development could lead to something resembling a utopia. Regardless, looking too far toward the long-term future of humanity can distract from the present. The stresses of current life, and the personal struggle we each have with accepting death, are really nothing new. Plenty of brave men and women chose to end their lives out of circumstance, or out of fear of some circumstance (they heard the barbarians charging over the hill). We should not discount these decisions. But we should also recognize that plenty of people stuck it out, maybe from some pre-programmed survival instinct, maybe for some other reason. Not only are things pretty good in the world right now on an absolute basis (very, very few people live each day in perpetual fear and suffering), but on a relative basis things are getting better every year. The long-term trajectory of the human race is positive, perhaps exponentially so. Perhaps not. Personally, I plan to stick around long enough to find out.