Friday, April 21, 2023

We Should Ignore The Environment

     Yes, climate change is a problem and humans are making it worse. But it is not going to kill us all, and it is not even going to kill most of us. Most countries are pushing for greater environmental regulations, and pretty much everyone is now aware of the issues at hand. The solutions are harder to discern, but there are now entire industries devoted to building a more sustainable future. Climate change is a problem, but there are simply more pressing existential risks (AI, nuclear war, engineered pandemics). Using investing terminology, I would say that climate change is overvalued and these others are massively undervalued. If you are passionate about the environment and plan to live your life campaigning and donating for that cause, that is perfectly acceptable. Positive impacts are positive impacts, and you deserve to follow your passion. However, those of us without that depth of interest should focus our time and effort elsewhere.

    Still, from an effective altruist standpoint, how should we handle environmental issues? In my opinion, every environmental problem is simply an energy problem. Water quality and availability are energy problems (with enough energy for reverse osmosis, you can convert bad water into good water). Carbon in the atmosphere is an energy problem (the type of energy you use puts it in, and you can use energy to take it out). The world getting warmer or colder is an energy problem (massive amounts of energy can be used to heat and cool things). Thankfully, there is a carbon-neutral way to generate massive amounts of energy with minimal waste: nuclear energy. Anyone who claims to care about sustainable energy yet campaigns against nuclear is a legitimate fraud. The existence of so many irrational and regressive people astounds me, and my sheer rage at such irrationality makes it hard for me to even discuss the issue. For context, I have a shirt that says "Go Green, Go Nuclear" and am looking forward to the day when nuclear fusion catapults us into a post-scarcity society.

    Despite my provocative title, we should still care broadly about the environment. We should clean up trash, reduce carbon emissions, and advocate against companies that pollute. Most importantly, we should strongly campaign for nuclear energy and push forward with nuclear fusion research. To me, this is by far the most undervalued aspect of climate change prevention, whereas every other aspect is overvalued. We should do all of these things because we should care about the future of the planet and the future of the human race. Still, we should focus the bulk of our time on more important things.

Challenge: Think About Nuclear War Every Minute

    In 2022, I spent a significant portion of my time worrying about nuclear war. The Russian invasion of Ukraine was scary for everybody, but I estimated the likelihood of nukes flying at 10% throughout the crisis. It is really, really hard to live a normal life if you think that there is a 10% chance that everyone you know dies that year. Was I wrong in my estimate? JFK said that he thought the probability of nuclear war during the Cuban Missile Crisis was between 33% and 50%. We now know that during that crisis it very nearly did happen; we scraped by thanks to Vasili Arkhipov. In 2022, the public simply couldn't have known about whatever close calls and near accidents were happening behind the scenes. We simply had to read the rhetoric of Vladimir Putin and hope that the actions of the USA and NATO didn't back him into too deep of a corner. I think 10% was, in retrospect, a very fair estimate. Thankfully that number has dropped quite a bit recently.

    In addition to being substantially freaked out all year, I also read a bunch of books about nuclear history and nuclear war. It is insane that we live our lives in spite of the terrifying fact that at every moment we are a button click away from near annihilation. The chapter about Hiroshima in "The Making of the Atomic Bomb" by Richard Rhodes is seriously the most horrifying thing I have ever read. Then I read about global nuclear policy and how terribly accident- and manipulation-prone our nuclear systems are, and I realized we are still balancing on the edge of a knife. There's not really a solution to this problem. Obviously, we should try to minimize the risk of nuclear accidents and try to disarm the worldwide "Doomsday machine," but what really keeps things in balance is mutually assured destruction. It sucks that decision theory combined with world-ending weaponry is the main cause of the current near-peace between global superpowers. Without these weapons I doubt we would have made it this long without a far-reaching global conflict. We'll have to see how the world progresses with new technological weapons (such as AGI) that are winner-takes-all and not beholden to mutually assured destruction.

    I have been thinking a lot less about nuclear war this year, as that fear has been replaced by AI. However, I don't think we should take our eyes completely off the ball. Nuclear risk could very well be an ex-risk, and I think that most people discount the probability purely due to survivorship bias. No self-interested nation would ever give up all of its nuclear weapons, and true world peace is not around the corner. Unfortunately, we will have this risk hanging over our heads for the duration of our lives. This sucks, so we should probably do our best with lobbying and donations to make it marginally less likely that we and our families meet horrific ends.

Sunday, April 16, 2023

Allocating Donations

     We've established in "The Investment Mechanics of Giving" that treating donations as investments is very useful. When you invest, you expect to receive either dividends (or some other cash flow) or capital appreciation (being able to sell your investment for a higher price than you bought it for). For donations, the money is no longer yours, but the impact you have is similar to that of a dividend (there are positive effects over time, perhaps forever). We also touched on the idea that you should probably have a diversified donation portfolio, with a balance between less "risky" bets (such as global poverty/health, where you can be pretty sure you are doing good) and "riskier" bets (such as ex-risk, where you will likely have no impact but there is a small chance of massive impact). Since the money is no longer yours, one benefit of a donation portfolio is that it never has to be rebalanced: the weighting you assign between the different causes only matters when you send the money to the charities. This allocation is a deeply personal decision, but there could be useful guidelines.

    I plan to have a series about each of the major Effective Altruism causes, and I will give my own opinions about how to structure your donation portfolio. Finance in the age of machines is tricky business: if you really believe AGI is near, you should probably structure your investments and your donations very differently. If you are very concerned about ex-risk and want to donate, your question is probably as follows:

              “We’re probably all going to die. So where do I invest? Where should I donate? Can I have an impact?”

    I would urge caution with this line of thinking. “Doomsayers” have existed throughout history and they have all been wrong. Yes, we are probably at the highest level of existential risk in human history (aside from the early formation of humans, when there were like 200 of us), but the timing of the world’s end is entirely unpredictable. Since we don’t know how the world will end, it’s hard to determine exactly when and where to donate and invest. So still save for retirement, and you should probably still donate to non-existential-risk charities. I would use Toby Ord’s “The Precipice” as a pretty good guide to existential risk probabilities. He estimates a 16% chance of existential catastrophe within the next 100 years, so you should probably live your life assuming that in over 80% of the scenarios you make it to retirement. He estimates that there is a 10% chance AI kills us all and a 0.1% chance climate change does, so we should probably focus more on AI. If you disagree with his numbers, that could inform your asset (donation) allocation. If you think the AI risk is actually 50%, AI should probably make up the bulk of your donations. Obviously, feasibility and capacity constraints matter. I wonder if there would be demand for model portfolios, or example donation portfolios that you could model your own donations after (here is the % Toby Ord donates to global health, nuclear risk, etc.). Unfortunately, it is probably the case that not enough people donate for this to really be of use.
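    As a toy illustration of how that kind of allocation could work mechanically, here is a minimal sketch in Python. The probabilities are Ord's headline century-scale estimates as I recall them, and the 60% global poverty floor is an arbitrary placeholder of mine, not a recommendation.

# A minimal sketch: turn existential-risk estimates into rough donation
# weights. Probabilities are Toby Ord's century-scale estimates (from
# memory); the "safe" global poverty floor is an arbitrary placeholder.
risk_estimates = {
    "ai": 0.10,          # Ord: ~1 in 10
    "pandemics": 0.033,  # Ord: ~1 in 30
    "nuclear": 0.001,    # Ord: ~1 in 1,000
    "climate": 0.001,    # Ord: ~1 in 1,000
}
POVERTY_FLOOR = 0.60     # fraction reserved for "bond-like" giving

def donation_weights(estimates, floor=POVERTY_FLOOR):
    """Split the remaining budget across ex-risks in proportion
    to how likely you think each one is."""
    total = sum(estimates.values())
    weights = {"global_poverty": floor}
    for cause, p in estimates.items():
        weights[cause] = (1 - floor) * p / total
    return weights

for cause, w in donation_weights(risk_estimates).items():
    print(f"{cause:15} {w:6.1%}")

    If you think the AI risk is 50% rather than 10%, you change one number and your allocation shifts accordingly; that is the whole appeal of treating this like asset allocation.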

Saturday, April 15, 2023

How Much Do You Owe the World?

     How do you determine how much to donate? This is a very complicated question. You should save for retirement, provide for your family, and save money for emergencies. Also, you should absolutely spend money on yourself and your happiness. Not just for some overly robotic reason such as "make sure you spend money on yourself so that you feel happy and continue to donate," but also because it is legitimately your right to spend money on yourself. It is not evil for you to dine at a nice restaurant, or spend a hundred dollars on roulette in Vegas, or buy a boat. Yes, you could probably instead donate that money to an effective charity. Yes, you could probably save someone's life with the money you spent on alcohol this year. But we need to be reasonable. You didn't cause the world's problems; you didn't create malaria, and you didn't invent death. Also, it's extremely hard to make a positive impact with donations, and there is absolutely no certainty that anything you do contributes positively to the world over the long term. Maybe the child you cure of malaria grows up to be a warlord and unleashes a pathogen on a rival community that kills thousands. Maybe the alignment research you contribute to leads to a faster timeline to ASI, when without this quickened pace humanity would have solved alignment. We need to understand how limited our information is, and we have to be comfortable making decisions under extreme uncertainty.

    While it is very unlikely that the drowning child you save grows up to be a serial killer, it is still possible. So, should you let the child drown? That would be ridiculous. It's very, very easy to live a selfish life. Maybe all the excuses you are making for walking past the drowning child (well, we don't really know what will happen, it's just natural selection preventing overpopulation, someone else will probably save him) are just that: excuses. I don't think you owe strangers your entire life. Read the book Strangers Drowning for a first-hand look at how miserable living your life entirely for others could make you. Still, we obviously owe something. I think that 10% of your income is a pretty good benchmark. We also want to make sure we are living enviable lives, and it's hard to convince others to follow you if you are living in poverty because of your donations. Look at the life of Jesus, and then look at the life of the typical Christian American. "It is easier for a camel to go through the eye of a needle than for a rich man to enter the kingdom of God". Has there ever been a more thoroughly ignored statement?

    We should be practical when designing our giving pledges. We want to leave room for humanity and avoid cold, hard calculation. Not only is a life of "give everything, even the shirt off your back, to the poor" unlikely to persuade others, it is also probably not morally required. You don't owe everything to the world, but you probably owe more than you are currently contributing. If you live your life believing this, you will probably wind up doing a lot of good.

The Investment Mechanics of Giving

     The investment mechanics of effective giving are complicated. How much to give depends mostly on your personal capacity constraints. For example, if you are saving up for medical school and are about to take out $300,000 in student loans, you should probably not donate anytime soon; wait until you have paid back your loans. If you are planning to start a company that will have a big positive impact on the world, you should probably save up to ensure your company has adequate runway. Planning for children is especially hard, and given the cost of raising a child to 18 and funding their college education, it makes sense that so many people fail to donate. Better people than me have established why giving is so obvious and important (The Life You Can Save, Doing Good Better), but I don't necessarily think that 10% of your income is realistic in all circumstances. Sometimes it will probably be better and more effective to not donate for three years and then donate 20% of your income for five years. Timing matters, and there are no hard and fast rules when it comes to what to do with your money.
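    To make that timing point concrete, here is a toy comparison with a made-up flat salary. Real life adds raises, investment returns on the deferred money, and the time value of impact, but the basic arithmetic is simple.

# A toy comparison of two giving schedules over eight years. The
# $100,000 flat salary is a made-up placeholder; the point is only
# that a deferred schedule can give more in total.
salary = 100_000
steady = [0.10] * 8                 # 10% of income every year
deferred = [0.0] * 3 + [0.20] * 5   # nothing for 3 years, then 20% for 5

total_steady = sum(salary * rate for rate in steady)
total_deferred = sum(salary * rate for rate in deferred)

print(f"Steady 10% for 8 years:        ${total_steady:,.0f}")    # $80,000
print(f"Nothing for 3, then 20% for 5: ${total_deferred:,.0f}")  # $100,000

    Under these made-up numbers the deferred schedule actually gives more in total, though it delays impact, which matters if you think the highest-leverage giving opportunities are right now.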

    We've established that you should donate, so now we should discuss what to donate to. There are two ways to structure your donation portfolio:

    1. Pick and choose a variety of causes that you would like to support

    2. Look at the broad level of donations across the board, and donate to the most underfunded cause

    Let's discuss the first option. Let's say that you care about nuclear risk, AI alignment, and global poverty. You don't really care that much about climate change, as it seems a lot of institutions and governments are aware of the issue and are working to combat it. Let's assume that you make a decent salary and donate a lot of it, and over the next 40 years you manage to donate $1 million in total. Nuclear risk and AI alignment are long shots, and we can think of them as risky investments. If you fund an AI alignment company, it is unlikely that you will have a direct impact. Given the complexity of the problem and all the unknowns associated with it (will AGI happen in our lifetime, will it rapidly progress to ASI, will additional funding make any sort of difference), your additional $1 million is unlikely to move the needle much. However, if your contribution happens to lead to some form of research that prevents unaligned AI or makes the ASI treat humanity better over the long-term future, you might have a massive impact. Nuclear war and other existential risks (ex: engineered pandemics) are probably also long shots, with a small likelihood of impact but a massive impact if they "hit." These are a bit like buying a lottery ticket, except the actual odds are completely unknowable.

    Global poverty, on the other hand, is a bit like buying a government bond. GiveWell, my favorite charity in the world, does a tremendous job of finding the most effective charities in the global poverty space. GiveWell measures impact rigorously, and you can be pretty sure that your donations are saving lives. There are probably some slightly "riskier" global poverty initiatives that aren't recommended by GiveWell because their impact is less easily measured; those I would consider similar to a slightly riskier bond. Why all the investment terminology? I want you to start thinking of your donations as an investment portfolio. All of the same considerations that you think about when planning your own investments also apply here. Instead of a 60% stock and 40% bond portfolio, you should probably be in a 60% global poverty and 40% existential risk portfolio, or something similar. This way, the 60% ensures that your "investments" are doing good, which will help you continue to donate and feel good about yourself. And with the 40% you can stay intellectually engaged in ex-risk, and I would guess learning and thinking about these sorts of topics is way more fun and stimulating.
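    Here is a toy expected-value sketch of that 60/40 blend. Every number in it is an invented placeholder: the cost-per-life figure, the long-shot odds, and the payoff. Nobody actually knows the real values, which is exactly the point.

# A toy expected-value comparison of the two arms of a 60/40 donation
# portfolio. Every number is an invented placeholder: the cost per
# life, the long-shot odds, and the payoff if the long shot "hits."
budget = 10_000               # hypothetical annual donation budget ($)

poverty_arm = 0.60 * budget   # "bond-like" arm
exrisk_arm = 0.40 * budget    # "lottery-ticket" arm

COST_PER_LIFE = 5_000         # placeholder: dollars per life saved reliably
P_HIT = 1e-6                  # placeholder: chance the marginal gift matters
LIVES_IF_HIT = 1e9            # placeholder: lives at stake if it does

ev_poverty = poverty_arm / COST_PER_LIFE   # ~1.2 expected lives
ev_exrisk = P_HIT * LIVES_IF_HIT           # assumes the arm buys one "ticket"

print(f"Global poverty arm (${poverty_arm:,.0f}): {ev_poverty:.1f} expected lives")
print(f"Ex-risk arm (${exrisk_arm:,.0f}):        {ev_exrisk:.0f} expected lives")

    On these made-up numbers the risky arm dominates on paper, but the answer is entirely hostage to a probability no one can estimate, which is why the "bond-like" 60% is there.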

    The counterpoint to this type of investing is option 2 a few paragraphs above, which states that you should really only donate to one cause. Maybe it is the case that one of the ex-risks facing humanity is extremely underfunded. Maybe the "bang for buck" of nuclear risk charities is way higher than other areas, and even a small donation could tip the scales greatly towards averting a nuclear war. Unlike traditional investing, we are all in this together: the aggregate, global portfolio is really what matters and what should determine the allocations to specific causes. In this case, you should focus all of your donations on this one "undervalued" investment. The problem is, I find it hard to believe anyone can adequately forecast how important each of these issues is relative to the others. From my conversations with alignment research companies, it seems that they are actually overfunded, and they have quite a bit of cash just sitting there doing nothing. In that case, donating to them contributes effectively nothing, whereas you could donate to global poverty and have an impact. Still, in my opinion AI alignment is by far the most pressing issue facing humanity, and it will have a direct impact on all other ex-risks and even long-term global poverty. However, I'm not quite sure where to donate, and I want to be sure I'm not just providing funding for someone to quit their job to learn about AI for 3 months and then start an unsuccessful AI alignment blog. Also, I want to be careful that I'm not actually contributing to capabilities research and making the problem worse. So, what should we do?

    I pretty much stick by some blended portfolio of causes, such as a 60/40 portfolio of global poverty and ex-risk. Diversification is very important, and it is easy to have horrendous results if you put all your eggs in one basket (imagine finding out that the single charity you have been donating 10% of your income to for the last 20 years is stealing money or spending it very ineffectively). I think there should be some short-term, easy "wins" such as GiveWell in any portfolio, so that you can be sure you are making a positive impact. Then, with the leftover cash, you can have some fun and swing for the fences, and maybe contribute very positively to humanity's long-term future. Happy investing!

Friday, April 14, 2023

Free Will

    The less free will we have, the more sympathetic we should be towards other people's situations. If free will doesn't exist, we should look at a criminal with pity. They didn't choose this life, and they didn't choose their parents. They didn't choose their upbringing, or their bad influences, or their faulty brain chemistry. Since we don't have free will, we shouldn't really judge. However, if we find ourselves judging others heavily even though free will doesn't exist, we shouldn't be too hard on ourselves. If we look at that criminal and say out loud "god, what scum," we shouldn't judge ourselves too much, even if that reaction isn't logical. We can't help that we judge them, given our upbringing, genetics, and neurochemistry. See the unfolding paradox here?

    The less free will we have, the more understanding we owe other people, but also the less understanding we owe other people. If someone else is a bigot then it is not really their fault, but that means it's not really my fault either for being a bigot. If we can't hold others accountable, we shouldn't hold ourselves accountable either. This is why I'm not really a fan of using free will as an argument for anything. It strays too close to the nihilistic "everything is permissible" boundary to be useful. Maybe everything is malignantly useless, but Pascal's Modified Wager is useful here. If there is a 50% chance that nothing matters and a 50% chance that something matters, you should probably live your life believing that something matters. If you are wrong, it doesn't matter anyway. Maybe free will doesn't exist, but then it doesn't really matter what you do or think, because you can't change your mind or affect the events that were set in motion by the big bang. So you might as well think that it does exist. Maybe it doesn't work the way tradition says it does, and we probably should be empathetic towards people's backgrounds and situations. But the stakes are high, and it is easy to create very bad incentives by leaning too far to either side.
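    As a toy formalization of that modified wager (my own construction, with arbitrary payoff numbers), the asymmetry looks like this:

# A toy payoff matrix for the modified wager above. The 50/50 prior
# and the payoff values are arbitrary placeholders; the argument only
# needs the asymmetry between the two rows.
P_MATTERS = 0.5

payoff = {
    ("live_as_if_it_matters", "matters"): 1.0,           # you were right
    ("live_as_if_it_matters", "nothing_matters"): 0.0,   # no loss possible
    ("live_as_if_nothing_matters", "matters"): -1.0,     # you wasted what mattered
    ("live_as_if_nothing_matters", "nothing_matters"): 0.0,
}

def expected_value(action):
    return (P_MATTERS * payoff[(action, "matters")]
            + (1 - P_MATTERS) * payoff[(action, "nothing_matters")])

for action in ("live_as_if_it_matters", "live_as_if_nothing_matters"):
    print(f"{action}: EV = {expected_value(action):+.2f}")

    Living as if things matter weakly dominates: it does better in one world and no worse in the other, regardless of the exact prior.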

Doing Good, or Not Doing Bad?

      Effective Altruism, as a philosophy, is very simple. Basically, the argument is that if you shouldn't do bad in the world, that me...