Friday, March 24, 2023

Taxes, and then Death

    Let's talk about taxes for a second. A lot of people don't donate because the government forcefully takes some of their money each year in taxes. These taxes go on to fund things like the military and the healthcare system, and a very small amount goes to foreign aid. Taxes are not a substitute for donating. In theory, the majority of taxes can benefit you as an individual. Taxes are a way to avoid the tragedy of the commons: you pay taxes to fund the military, and the military protects you from invaders. The same goes for the court system, the police department, and the fire department. Taxes also act as mandatory insurance. Even if you are a millionaire who could afford your own healthcare, if all of your money gets stolen, you will have a safety net. If you live under an authoritarian regime that does nothing but hurt people with your money, figuring out how to safely pay less in taxes would be a good thing. If 100% of your taxes went to the most effective ways to positively benefit humanity, paying more than your required share in taxes would be a good thing. Ultimately, no large country operates near either of these extremes.

    I would argue that the majority of the taxes you pay do nothing to positively affect humanity. Getting the government to do better things with the money it takes is good, but given the massive problems with bureaucracy and the lack of good incentives and controls, I would shy far, far away from advocating that paying more in taxes is an effective substitute for effective donations. That is not to say that countries wouldn't benefit from a better tax structure, but rather that current tax structures do not allow your marginal tax dollar to have a positive benefit on society.

    Now let's talk about the second certainty in life: death. End-of-life care is extremely expensive. One study in the Journal of General Internal Medicine in 2020 found that the average cost for the last month of life in the US was around $18,500. That's quite a bit of money, more than the average person on Earth earned in all of 2020. Personally, if I hit an age where I am extremely ill and content with the life I have lived, I may choose to hit the "big red button." Not to say that this should be required, but it is something we should all think about more.

Losing Time Effectively

     Time is money. In this regard, losing money effectively is no different than losing time effectively. I chose to name this blog "Losing Money Effectively" because I think seeing donations as an optimal way to lose money is a funny perspective. Also, I have a finance background and eventually hope to make some contributions to the wider world of effective altruism through this lens.

    The only way to get money is to spend time acquiring it, valuable time that could be spent doing something else. In this sense, donating money could mean something substantial for your time. If you are a billionaire this is not the case, but if you have rigid retirement goals and are not a billionaire, a life of donating 10% of your income per year means something. Maybe you retire later. Or maybe you spend more time on tasks you would otherwise pay for (doing housework instead of hiring a maid, washing your own car instead of paying for a car wash, providing the thousands of hours of child care it takes to raise children instead of hiring help). The less money you have, the more time an extra dollar buys back. So, why is this important?

    Making a lot of money and donating it, also called earning to give, basically makes the assumption that donating money can be much more valuable than donating time. My point above is that these are not really that distinct, as you are donating time either way. Depending on your situation, you may just be donating your time in a much more effective way. To be clear, I am not sold on utilitarianism. If you are totally sold on the EA philosophy, you could argue that retirement is immoral. If the cost of saving a life is $4,500, then by working another year at a job that pays $45,000, you could save ten lives. Sounds like a potential moral obligation to me. I think utilitarianism is a good lens through which to view the world, especially service work, rather than a hard and fast religious dogma. Instead of working at a soup kitchen in a first-world country for $35,000 a year, making $85,000 as a law clerk and donating $40,000 could be a good trade. You don't get the same pat on the back, but utilitarianism tells you that you shouldn't care about that. This way of thinking feels robotic, but it shouldn't.
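    The arithmetic behind that trade is simple enough to sketch. The $4,500 cost-per-life figure and the salary numbers are this post's hypotheticals, not vetted charity-evaluator estimates:

```python
# Earning-to-give arithmetic. COST_PER_LIFE is this post's hypothetical
# figure, not a vetted estimate from a charity evaluator.
COST_PER_LIFE = 4_500

def lives_saved(donation: float, cost_per_life: float = COST_PER_LIFE) -> float:
    """Lives saved by a donation at an assumed cost per life saved."""
    return donation / cost_per_life

# Working one more year at $45,000 and donating all of it:
print(lives_saved(45_000))  # 10.0

# The soup kitchen vs. law clerk trade: donate the $40,000 difference.
print(round(lives_saved(40_000), 1))  # 8.9
```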

    If you are like me, you have already begun to suspect that making money and doing good might be two distinct steps. Maybe you should do each separately, simply due to the difficulty of getting an optimal outcome by pursuing both at the same time. I will keep searching for a career that threads this needle, high pay and high impact, but maybe it is not the worst thing ever if that job does not exist?

My Problem with Utilitarianism

     You are in a room with Jeff Bezos, the previous CEO of Amazon. He has sold all of his Amazon stock and is now sitting on 100 billion dollars worth of cash. He is thinking of burning all of it (assume the supply of money in the US stays the same regardless). He says to you, "If you sleep with me, I will donate 100 billion dollars to the most impactful charities in the world. Otherwise, the money burns." Are you morally obligated to sleep with Jeff Bezos? I really, really hope not. But taking some utilitarian arguments to the extreme, it seems that through your inaction you may be "causing" an extreme amount of suffering. Perhaps we cure malaria five years earlier than we would otherwise; perhaps world hunger is pushed off for a decade. Thousands of people, perhaps millions, could live fulfilling lives full of health and wellness as a result of this donation. Are you making a morally terrible decision by not taking Jeff up on his offer? Are you in some sense responsible for the suffering that you could have avoided?

    To me, this is the most convincing argument against utilitarianism and against donating as a whole. If you start thinking of counterarguments that ignore my main point, you are missing my main point. Swap Bezos for a robot that wants you to cut off your arm in exchange for 100 billion, or swap this entire scenario for the big red button I mentioned in a previous post. It may be the case that personal dignity and even human rights are worthless, except insofar as they contribute to the collective well-being of humanity as a whole (over the long term). Maybe human rights are only worth something because if we treated them as worthless, even greater harm and suffering would persist over time. In a one-off situation such as the above, maybe there is no room for human dignity. But wow, is that hard to believe.

    One aspect of effective altruism that many people don't like relates heavily to the Bezos example above. The meat of the question is this: "should you bow to power?" If becoming subservient to your overlords is required in order to do good in the world, count me out. Why would I ever want to be part of a system that values my personal worth and dignity so little? Am I somehow responsible for the suffering of a world that I had no hand in creating? It is easy to look at these arguments and say, "Wow, you are right. Guess I'll go back to being selfish and never helping anyone ever." That is also missing the point. I think the water is really murky. Thousands of people have debated derivatives of the trolley problem, and I know I am not adding much flavor here. Basically, all I am asking is whether you have to throw yourself in front of the trolley to save a million, lest you be an immoral person. Maybe it's just the millions of years of survival instinct programmed into me, but I can't see how this could be the case.


Thursday, March 23, 2023

Would You Press the Button?

     There is a big red button in front of you. If you press this button, cancer will be cured. Wars stop, and so does world hunger. Humans may still have some problems, but suffering for billions of people ceases. Plus, another cause specific to your preferences is solved. Maybe factory farming ends, or climate change, or racism. Take your pick. So, this big red button is in front of you, and if you press it, a lot of really, really good things happen. Things that you could by no means accomplish in your lifetime, things so good that your decision to press the button will be the single most important and beneficial decision to humanity so far in human history. Don't overthink this and make excuses as to why it might not be a good thing (e.g., "well, if cancer weren't a thing, there would be overpopulation"). No. I am telling you that it will be really, really good for the rest of the world if you press the button. However, there is a catch. If you press the button, you die. Instantly. Now, here comes the question.

Would you press the button?

    Contemplate this for a minute. This is an interesting question. It is a personal question, one that reflects how much you truly value your life. However, I think there is a more important question.

If you don't press the button, are you a horrible person? 

    Through your inaction, your refusal to make a sacrifice, billions will suffer around the world. You will maybe only enjoy another seventy or eighty years on this planet, yet you would choose those years over the happiness and preferences of the entire world. Should we condemn you for being so selfish? Should we shun you? Should doing such a selfish thing be considered worse than any crime you have heard of?

    If I were given a choice between my life and that of a family member, the decision would be simple. If it meant one of my family members could live on, I would end my life without hesitation. Given the choice between my life and that of five strangers, I would pick my life. This is logically consistent with how I live my life, given all of the things that I fail to do for other people. Given the choice between my life and the lives of a thousand strangers, how should I choose? These questions matter. If utilitarianism is really a good moral guide, inaction is morally bad. Refusing to press the button is an objectively bad decision, and we should all be shamed for making it. It has been true for all of human history that martyrdom is noble. It is an act of heroism that should be celebrated. Giving up your life so that others may live is extremely honorable. But maybe, just maybe, it is actually required.

Monday, March 20, 2023

The Savior Complex

    When I was growing up, I was convinced that I would be important. If you had asked me at age fifteen, "are you going to change the world?" I would have leaned towards "yes" as an answer. It is easy to have delusions of grandeur when you are successful early in life. I was athletic, intelligent, and saw nothing but success in both areas for the first twenty years of my life. I thought that I had higher moral character than most of those around me. I would have denied it at the time, but I'm sure a large part of me wanted to be wealthy and powerful. Once I became an actual adult, my perspective changed. I saw, for the first time, how difficult achieving lasting impact was. Every amazing biography I read was about someone who was already forgotten, or soon to be. I would credit books and life experience equally in changing my perspective, and I began to realize that maybe it was ok to not be massively successful. Maybe distancing yourself from the rat race was a good thing; maybe making a stable income, raising a good family, and accomplishing reasonable goals was good enough. Maybe you help others along the way, sometimes volunteering your time and money, but mostly your positive impact deals with character. You treat those around you well. You are a great husband, and a great father. You are nice to the convenience store workers. You smile. And then, at some point, you die. That seemed like a good enough life. One where you don't obsess over worldly possessions, you don't obsess over money or power. Maybe not the most impressive life, but a very dignified one.

    Or maybe not. Maybe it is actually immoral to live your life in that way. Maybe seeking wealth and power is not only good, it is the only thing you should do. Maybe rising through the ranks and moving mountains is the point of life, and every materialistic urge you have can actually further your moral progress. What sort of ridiculous philosophy would encourage this? Simple: effective altruism. In a world where your moral significance is measured by the positive impact you have on others, this argument is persuasive. Raising a good family and being nice to a few people is good, but making one hundred million dollars and donating it all to fighting malaria is orders of magnitude better. Those in positions of wealth and power are in a much better position to do good, so seeking to be in that position is reasonable. It may be hard to disentangle which part of your drive to the top is selfish and which part is altruistic, but the end goal is the same. There are two steps: achieve wealth and status, and then use this wealth and status to make a positive impact. The first step may be reasonable, or it may be of the devil. Since becoming an effective altruist, I have struggled to wrap my brain around the first step. I know it won't make me happy, but how much does my happiness matter? Seeking money so strongly would generally be seen as selfish, but through this lens my desire to not seek money can be seen as selfish. My desire to not work myself to death, to enjoy my life, to have fun and relax, can be seen as doing nothing in the face of massive suffering. Silence is violence, in this sense. Longtermism takes this a step further, pushing the dreaded main character syndrome even further down the line.

    If humanity dies out, that is a massive moral loss. Trillions of potential future lives vanish, and we have no way of knowing if sentient life will reemerge in the universe. Maybe it doesn't, and everything becomes truly meaningless. What sort of person would I be if I ignored this? Maybe spending my life attempting to make even a small impact on x-risk can tip the scales. Should I even start a family? Or should I devote the next two decades to working 100 hours a week to ensure that when AGI comes it is friendly? The further this line of thinking continues, the more I become humanity's savior. Humility vanishes, and it seems clear that I know better than everyone. People are living their lives in ignorance, while I, the smartest individual in the room, know that nuclear war, bioterrorism, and AI all threaten their lives. I am working tirelessly to prevent this, and yet I don't receive any thanks. I am a true martyr, sacrificing in obscurity. This savior complex seems impossible to avoid. My guess is that many EAs have it, especially those dealing in longtermist issues (MIRI comes to mind). Since I identify with the cause, I am not convinced this savior complex is irrational. However, it is clearly sad and unrealistic. The weight of the world simply cannot fall on the shoulders of a single individual. The world's current and future problems do not depend on me, and it is likely that nothing I do will change anything. It cannot possibly be true that I am humanity's savior. I am not the main character. Only through recognizing this can character really begin to build. Only by looking at the night sky can we begin to recognize our place in the universe. Hubris used to be the enemy, and it still is. Materialism used to be the enemy, and it still is. Maybe focusing on altruistic impact makes the picture less clear, but maybe it doesn't. Maybe staying humble and shedding self-importance is the only path forward. A path towards an impactful life full of dignity and happiness.

Sunday, March 5, 2023

Asymmetric Payoffs for X-Risk

    Another idea I have been kicking around is the idea of asymmetric payoffs. There are many professions where playing offense is extremely lucrative and playing defense is valuable but not lucrative. One example is computer security. If you are a decent hacker, you can probably make millions of dollars through white-hat hacking, or you could just steal actual money. Being a great cybersecurity analyst who writes amazing firewalls is valuable, but you may only make six figures unless you can scale this skill across multiple companies. Being a great hedge fund manager is tremendously lucrative, but being an amazing head of compliance is not. In both of these cases, if everything goes perfectly well for the "defense," you are simply doing your job.

    AI safety, nuclear prevention, and bioterrorism prevention all fall under this same logic. If you prevent mass suffering through these means, it is unlikely that anyone will know or give you credit. The person who adds one ingenious subroutine that prevents unaligned AI will forever be unknown, since we will never know whether the outcome would have been different otherwise. Despite massive personal stress, the Russia-Ukraine war (I prefer calling it the Russian civil war) has not, at this point, resulted in a mass nuclear holocaust. As a result, some of my good friends have told me that they are not too worried about nuclear war in the future. This weird bias seems to plague entire institutions. Given that nuclear war hasn't yet happened, I am not quite sure who to thank. However, I am sure there have been people in positions of power who made the right decision at the right time, and as a result prevented millions or billions of deaths. If nuclear war happens, it will probably be obvious who to blame. I am not sure what to do with this information, except to thank those who have worked in "defensive fields." They have probably massively benefitted humanity, and they have not received due recognition. To Vasili Arkhipov and those like him, thank you.

Doomsday Arbitrage

    In the face of nuclear war, should you buy stocks? My argument is that you absolutely should, because of a concept I am calling Doomsday Arbitrage. I am officially coining this term, and I will explain below.

    Let's say that Russia announces that it is going to send a nuclear missile hurtling towards the United States in two hours. The market is open, and the S&P drops ten percent instantly. If you put all of your money into the stock market, and this ends up not happening, you will likely make ten percent of your money back. If the nukes actually start flying, then money is the least of your problems. USD will probably become worthless in a post-nuclear-holocaust society, and regardless, it will be locked up in some TD Ameritrade account. The servers linking this money to you may be destroyed, the TD Ameritrade buildings could be blown up, and/or every employee at TD Ameritrade might be dead. The global banking system has likely collapsed, and you are now in the midst of a post-apocalyptic society that will probably run on the barter system. You'll probably soon fall victim to some form of incredible violence. So, you should buy stocks.
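    That logic reduces to a one-line expected value, once you weight each outcome by whether money is worth anything in it. A minimal sketch with illustrative numbers (the 10% rebound and the doom probabilities are assumptions, not estimates):

```python
# Doomsday Arbitrage sketch: count only the states where money still
# has value. In doom states, cash and stocks are equally worthless,
# so those states contribute zero either way.
def value_weighted_return(rebound: float, p_doom: float) -> float:
    """Expected return on buying the dip, weighting doom states at zero."""
    return (1 - p_doom) * rebound + p_doom * 0.0

# Holding cash earns 0 in every state, so buying the dip dominates for
# any doom probability below 1 -- even at 90% odds of the nukes flying:
for p_doom in (0.1, 0.5, 0.9):
    print(p_doom, value_weighted_return(0.10, p_doom))
```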

    Basically, I am saying that selling puts on the S&P 500 at a $100 strike is free money (assuming you hedge volatility). If the S&P 500 ever goes below $100, you simply have bigger problems. Money is probably worthless, and you are in mortal danger. This line of thinking becomes more applicable when you apply it to a near-term catastrophe in your personal life. I have always been of the opinion that given current bankruptcy laws, we should be way more risk-seeking. If you can flip a coin between winning one million dollars and losing one million dollars, flip the coin. Worst case, you file for bankruptcy and start back at $0. The limited downside of financial crimes and the high upside of name recognition mean that every hedge fund trader should probably insider trade, or at least take wild speculative bets. My guess is that they probably do.

    I do not think that the concept of Doomsday Arbitrage is going to make you money. However, it could be a VERY useful idea for risk management. Is your financial advisor having money problems? Fire him. Is your fund manager near bankruptcy? Fire him. Given the asymmetric payoff for many financial professionals, you as an investor need to be very sure that you are including personal "doomsdays" in your analysis. Profiting from this sort of highly skewed asymmetric risk tolerance is difficult unless you are in a position near ruin. Let's say you are under the poverty line and are given the chance to flip a coin. If the coin lands on heads, you get one million dollars. If the coin lands on tails, you get negative ten million dollars. This is a one-time bet. If there are bankruptcy laws, flip that coin without hesitation. If you will be held to that debt for the rest of your life, whether you flip depends on how bad your life currently is. This should seem obvious, but I don't think people take full advantage of the situation. They also confuse high potential payoffs with high expected return (e.g., the lottery). You can translate this line of thinking outside of the financial sector, into everyday life. Given that the world is extremely uncertain, life is fleeting and fragile, and a variety of actual doomsday scenarios may be waiting around the corner, we should all be taking much greater risks.
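    The coin-flip math above can be sketched directly, under stated assumptions: a fair coin, and bankruptcy modeled as a hard floor at $0.

```python
# Expected value of a fair coin flip, with an optional bankruptcy
# floor that caps how far the downside can actually go.
def flip_ev(win: float, lose: float, floor: float = float("-inf")) -> float:
    """EV of a 50/50 flip; `floor` models bankruptcy capping the loss."""
    return 0.5 * win + 0.5 * max(lose, floor)

# The post's bet: +$1M on heads, -$10M on tails.
print(flip_ev(1_000_000, -10_000_000))             # -4500000.0: never flip
print(flip_ev(1_000_000, -10_000_000, floor=0.0))  # 500000.0: flip
```

Capping the downside flips the sign of the EV, which is the whole argument for risk-seeking under bankruptcy protection.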

Saturday, March 4, 2023

Losing Money Effectively

    I figured that I would start a blog about donations, impact investing, and various other finance-related ways to do good in the world. There are very few finance-inclined people involved in effective altruism, and as a result there doesn't seem to be a lot of guidance on how to structure your donations.

    Donating money to the most effective causes is the same as losing money in the most effective way. There are many discussions about how to gain capital, and there are many discussions about how to avoid losing capital, but there are not many discussions about how to best lose capital. In order to make any sort of impact on the world, you will have to part with capital. Donations are a great way to lose money, but impact investing and ESG allocations are others. Whenever you exclude potential investments from your opportunity set, you at the very least lose out on diversification. Generally, you are left with a sub-optimal portfolio with regard to risk and return. This is fairly obvious to everyone involved in empirical finance, but the marketing dollars spent by various interested groups have made it unfortunately controversial. In my opinion, it is very unlikely that a portfolio that contributes positively to the world (as the starting goal) will also end up being the optimal portfolio. That is not a problem! If doing good in the world is a goal, then losing some diversification or some expected return should not dissuade any investor. If it does, then you don't actually care about doing good in the world.
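    The diversification cost of shrinking the opportunity set is easy to illustrate. A toy sketch, assuming equal-weight portfolios of uncorrelated assets that each have 20% volatility (both numbers are illustrative, not market estimates):

```python
import math

# Toy model: equal-weight portfolio of n uncorrelated assets, each
# with the same volatility. Portfolio vol falls as vol / sqrt(n), so
# excluding assets from the opportunity set raises risk, all else equal.
def equal_weight_vol(n_assets: int, asset_vol: float = 0.20) -> float:
    """Volatility of an equal-weight portfolio of uncorrelated assets."""
    return asset_vol / math.sqrt(n_assets)

full = equal_weight_vol(3)      # full opportunity set: ~11.5% vol
screened = equal_weight_vol(2)  # one asset excluded:   ~14.1% vol
print(full < screened)  # True: the screen costs diversification
```

Real assets are correlated, so the effect is smaller in practice, but the direction of the result is the same.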

    So, what are the best ways to lose money? Hopefully, together we will find out.

Doing Good, or Not Doing Bad?

      Effective Altruism, as a philosophy, is very simple. Basically, the argument is that if you shouldn't do bad in the world, that me...