The case against effective altruism
Second order effects of thinking you're literally saving the world
Effective altruism (EA) has come into the spotlight recently after one of its best-known proponents, Sam Bankman-Fried, was exposed as a fraud.
In a nutshell, EA is about applying logic and evidence to decisions about charitable giving. Advocates of effective altruism argue that giving should optimize some objective, universal measure (e.g., the number of lives saved globally). This depersonalization of giving is considered a strength, since it strips out personal biases and global injustices.
The second-order effects include a focus on making money. An EA believes he has an effective means of converting dollars into a social good, like ending human suffering. Naturally, he may then optimize his life around making as much money as possible to serve that goal.
What would you do to save a drowning child?
If I saw a child drowning, I wouldn’t hesitate to try and save that child. I would gladly risk some personal safety or material possession for a chance to save a life. Not only that, but I would risk things that don’t belong to me. If there was a lifeboat nearby, I would commandeer it without thinking twice.
My biggest issue with EA is that its most zealous proponents believe in their models and mission so completely that they see drowning children everywhere. Stripping the humanity out of giving and treating everything as transactional can be used to justify a lot of bad behavior. It's easy to fall into a God complex when you think you're literally saving the world.
EA can also give off creepy colonial vibes. Would Americans accept a birth-control initiative targeting Indian reservations funded by a Chinese NGO? Certainly not, yet we think it's okay to meddle in other countries' local affairs because we have spreadsheets calibrated for the "optimal" birth rate.
Effective altruism and consequentialist utilitarianism
When EA is mixed with a consequentialist utilitarian philosophy and a probabilistic view of the world, it becomes particularly dangerous. Here’s SBF on the topic:
To maximize your expected value, you must aim for it and then march blindly forth, acting as if the fabulously lucky SBF of the future can reach into the other, parallel, universes and compensate the failson SBFs for their losses. It sounds crazy, or perhaps even selfish—but it’s not. It’s math. It follows from the principle of risk-neutrality.
“I think it’s hard to justify being risk-averse on your own personal impact,” SBF told me when I quizzed him about it—“unless you’re doing it for personal reasons.” In other words, it’s selfish not to go for broke—if you’re planning on giving it all away in the end anyway.
Whereas someone else might forgo some potential personal fortune if obtaining it means committing fraud, an EA has far more on the line. Remember, millions of lives! It doesn't even have to work out in this timeline; just look at the expected value over an infinite set of universes. If there's a 51% chance of saving a million people and a 49% chance of killing a million people, the math tells you that you have to take the bet. You've basically become a paperclip maximizer.
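The arithmetic behind that conclusion is trivial to write down, which is part of the problem. A sketch in Python, using the hypothetical numbers above:

```python
# Risk-neutral expected value for the hypothetical gamble in the text:
# a 51% chance of saving a million lives, a 49% chance of killing a million.
p_save, p_kill = 0.51, 0.49
lives_at_stake = 1_000_000

# Weight each outcome by its probability and sum.
expected_lives = p_save * lives_at_stake - p_kill * lives_at_stake
print(f"Expected lives saved: {expected_lives:,.0f}")  # roughly 20,000

# A strict expected-value maximizer takes any bet with positive expectation,
# regardless of how catastrophic the 49% outcome would be.
take_the_bet = expected_lives > 0
print(take_the_bet)
```

Note that nothing in this calculation accounts for the downside being irreversible, which is exactly the failure mode risk-neutrality hides.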
Ineffective Altruism
Instead of sending your money to a faceless global NGO promising to save X lives for every Y dollars it receives, consider buying your local pee-wee baseball team a new set of uniforms. Is this objectively the best way to give charitably? Certainly not. It's hyper-local, and it fulfills something relatively high on Maslow's hierarchy of needs.
But consider the workflow. You call up the coach, get the supplier's details, call the supplier, order the uniforms, and in a few weeks you go watch the kids play in the new uniforms you bought them. You can even cut out a few steps and write the coach a check directly.
Your money goes directly to what you set out to fund. It won't go to consultants, the foundation's expensive real estate, phone-bank operators, tote bags, or anything but baseball uniforms. As a bonus, if you ever come across the opportunity to commit billions of dollars' worth of fraud to fund this charitable cause, you'd probably decide it's not worth it.
Who made that guy king of EA, though?