Many longtermists care a lot about existential risk, called X-risk for short. Existential risk is the risk that humanity dies out (nukes, pandemics, AI). A different risk is suffering risk, called S-risk. Suffering risk is the risk that humanity gets stuck in place (an authoritarian government takes over and stops progress), or that humans are tortured forever (an AI simulates digital minds and tortures them relentlessly, or enslaves humanity and tortures us in "real life"). 80,000 Hours estimates that there are fewer than fifty people in the world thinking seriously about how to reduce S-risk. Again, it seems pretty weird that we live in a world of eight billion people where fewer than fifty are seriously concerned about the very real possibility of worldwide enslavement. For the most part, I don't see any path toward S-risk that isn't agent-driven. What I mean is that only through massive advancements in technology would this sort of grand-scale suffering be possible. We generally think of technology as good, but as Tim Urban writes: “technology is a multiplier of both good and bad. More technology means better good times, but it also means badder bad times.”
Only through advanced technology do I see such potential for mass suffering. Sure, maybe aliens descend from the sky and torture our species for millennia, but, contrary to what Independence Day would lead you to believe, we probably can't do anything about that. We can, however, massively influence the types of technology that get developed. We can put in place extremely forward-looking safeguards. S-risk is really the main reason I stick so closely to the topic of artificial intelligence. Nukes and pandemics are bad (and I'm sure some authoritarian government could use them to blackmail its citizens), but all you have to do to see S-risk in action is watch The Matrix one time. Obviously, I doubt robots will start using humans as batteries (have they not heard of nuclear power?). But in many ways, the paperclip maximizer is the least scary robot.