Modern society has an uncomfortable, often contradictory relationship with “blood money”. Here, I use the term “blood money” to describe money that is either gained through violence or threats of violence (coercion), or that comes directly from an individual or organization that employs such tactics as a matter of course. For example: money from a kidnapping ransom, or money that comes from a murderous crime syndicate.
At first glance, the issue would seem to be without controversy. Most people would probably say that blood money shouldn’t be knowingly accepted by anyone who wants to keep their conscience clean. There are even laws that reflect these values, such as “Know Your Customer” and “Anti-Money Laundering” requirements, which exist to prevent criminals from using the financial system.
And yet, upon investigation it would appear that society’s aversion to blood money is more rhetoric than reality. This is perhaps due to how deeply blood money has penetrated the economy, and how thoroughly violence and coercion have permeated society’s customs and norms, to the point where, as Mike Gogulski has pointed out, for breaking the law “the penalty is always death”. Thus blood money becomes inescapable as it circulates through the economy. Nobody’s hands are completely clean.
This blog post was prompted by a tweet that came across my feed criticizing Elon Musk for suggesting that he could finance taking Tesla private again using money from Saudi Arabia.
Already well known for its public executions and stifling of dissent within the Kingdom, Saudi Arabia is currently under investigation for allegedly sending a “killing team” to Istanbul to murder a Washington Post journalist inside the Saudi Consulate.
As a primarily deontological ethicist, I sympathize with the point of view expressed by the tweet’s author towards Elon Musk. Back in 2016, I myself took a similar jab at Uber:
And yet at the same time, I also sympathize with the Elon Musks and Ubers of the world, at least when it comes to this specific issue. It’s easy to fall into the trap of coldly calculated consequentialism when you’re making big decisions that affect millions of people and involve the cooperation of thousands of others.
How discerning can you be about the moral purity of your employees, your partners, your investors when you’re dealing with numbers that big? In a world where society runs on blood money, the only choice it appears we have is how dirty we allow our hands to get. It’s impossible to completely isolate oneself from evil, given the totality of modernity. This is evidenced by the vanishingly small number of “uncontacted peoples” left on Earth.
Thus it seems that no one can be pure, and at best we can only negotiate about how bloody we allow our hands to get before we invoke the moral judgement of our peers. As much as we want the issue of blood money to be clean and simple, black or white, it would seem that all of our hands are dirtied by shades of gray.
Faced with such a reality, the best choice appears to be a mix of deontological ethics and consequentialism: commit to a limited number of specific values (for example, don’t murder, don’t steal, the “golden rule”) and then try to optimize for the best outcomes. Sometimes that may mean tolerating or even partnering with others whose actions run counter to those values, as in the case of Tesla taking money from Saudi Arabia. While Elon Musk might never murder a journalist with his own bare hands, he will tolerate taking money from someone who has in pursuit of a larger goal. For Musk, the ends would justify the means.
I’m reminded of a quote from philosophy professor Will MacAskill on the 80,000 Hours podcast. He says:
…[I]t seems like given the obvious analogy with decision making under empirical uncertainty, we should do something like expected value reasoning where we look at a probability that we assign to all sorts of different moral views, and then we look at how good or bad would this action be under all of those different moral views. Then, we take the best compromise among them, which seem to be given by the expected value under those different moral views.
Elon Musk might make the decision that, while he would prefer not to finance his company with blood money from an organization that murders people, he expects that the outcome will be a net improvement over the outcome if he didn’t take the blood money. He can’t do nothing: everyone has to act, action means decisions, decisions mean consequences, and so we must try to act in a way that leads to the best possible outcomes.
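MacAskill’s “expected value across moral views” idea can be sketched numerically. The following is a minimal, purely illustrative example; every credence and value below is invented for the sake of the sketch, not drawn from anything Musk or MacAskill actually said:

```python
# Illustrative sketch of expected-value reasoning under moral uncertainty.
# All numbers are hypothetical, chosen only to show the arithmetic.

# Credence (subjective probability) assigned to each moral view; sums to 1.
credences = {
    "deontology": 0.4,        # taking blood money is wrong in itself
    "consequentialism": 0.6,  # only the outcomes matter
}

# How good or bad the action "take the investment" looks under each view,
# in arbitrary units (negative = morally bad).
value_of_taking_money = {
    "deontology": -10,        # violates a duty regardless of outcome
    "consequentialism": 25,   # funds work judged to do net good
}

# The "best compromise" is the credence-weighted average of the values.
expected_value = sum(
    credences[view] * value_of_taking_money[view]
    for view in credences
)

print(expected_value)  # 0.4 * (-10) + 0.6 * 25 = 11.0
```

On these made-up numbers the weighted sum comes out positive, so this reasoner would take the money; shift enough credence toward the deontological view and the sign flips. The arithmetic is trivial — the hard part, as the rest of this post suggests, is assigning the credences and values in the first place.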
So Musk decides, I’m not directly harming anyone by taking the money, in fact I’m using the money to help people, and I’m not responsible for how the Saudis use the returns on their investment in Tesla, so I will take their blood money and use it to make the world a better place. (Of course, this is hypothetical; I’m not sure what Elon Musk’s real justification for taking the money would be.)
In a poll I started in response to this issue, respondents were nearly evenly split on the question of whether it is morally wrong to accept blood money in the pursuit of noble goals, with those answering “no” only narrowly coming out ahead and about a third of respondents abstaining from the question altogether:
Written responses ranged from “Yes it is morally wrong because it legitimizes bad behavior” to “No it is not wrong to take the money but it is wrong to pay it back” and finally “The ends justify the means”; essentially samples across the whole spectrum of possible answers. And I’m not sure any one of them is the “right answer”.
I ask myself if I would take the blood money. Regardless of what I’d intend to do with it, I feel certain that the answer would be “no”. But then I wonder, what about blood money two or three steps removed from the source? How faded would the blood on the money have to be for me to feel comfortable taking it?
And for that question, I don’t have a good answer.