The document discusses potential mistakes made by effective altruists, including disregarding interpersonal values, making poor life choices, and adopting unbalanced views. It argues that while the goal of doing the most good is simple, human cognitive limitations make it easy to err systematically when attempting to implement it. Various biases, short-sightedness, overconfidence in reasoning methods, and disregard for human psychology and common sense can lead effective altruists astray. Careful consideration of alternative perspectives, convergence of views, outside opinions, and moderation are recommended to avoid mistakes in doing good effectively.
3. Argument
• Effective altruism: “Doing the most good by using reason
and evidence”
• An explicit attempt to follow this simple principle can
sometimes result in systematic mistakes
4. Why?
• Methods and moral theories associated with EA, such as
expected value theory, Bayesian probability theory,
and utilitarianism
• It’s easy to make a mistake when running on corrupted
hardware (human brains)
• Paradoxical: new knowledge about rationality can in
some cases lead to worse decisions compared to no
such knowledge
Yudkowsky, E. (2007) Knowing About Biases Can Hurt People.
6. • Dismissing interpersonal virtues and social rules such as
honesty, integrity, friendship, kindness
• Lying/stealing to save money to donate
• Behaving arrogantly
→ Being perceived as unfriendly, rude, untrustworthy,
manipulative
7. Sources
• Short-sightedness (not taking into account long-term
effects and unintended consequences)
• Ignorance or disregard of human psychology in social
contexts
• Not integrating reasonable aspects of other moral views
8. Instrumental harm
• Impartial altruism and instrumental harm are psychologically dissociated
(Oxford Utilitarianism Scale; Kahane et al., 2017)
• Impartial altruism is central to EA:
willingness to give personal resources to help impartially
• Instrumental harm is not relevant for EA:
willingness to harm a few for the greater good
• Associated with psychopathic tendencies (Kahane et al., 2015)
• Signals untrustworthiness (Everett et al., 2016)
Kahane, G., Everett, J. A. C., Earp, B. D., Caviola, L., Faber, N. S., Crockett, M. J., & Savulescu, J. (2017). Beyond Sacrificial Harm: Positive and
Negative Dimensions of Everyday Utilitarian Decision-Making. Psychological Review.
Kahane, G., Everett, J. A. C., Earp, B. D., Farias, M., & Savulescu, J. (2015). ‘Utilitarian’ judgments in sacrificial moral dilemmas do not reflect
impartial concern for the greater good. Cognition, 134, 193-209.
Everett, J. A. C., Pizarro, D. A., & Crockett, M. J. (2016). Inference of trustworthiness from intuitive moral judgments. Journal of Experimental
Psychology: General, 145(6), 772.
9. Countermeasures
• Follow virtue ethical and deontological standards in
everyday life
• Be honest, trustworthy, cooperative, friendly, kind
• Don’t push people from footbridges
Schubert, S., Garfinkel, B. & Cotton-Barratt, O. (2017) Considering Considerateness: Why communities of do-gooders should be exceptionally
considerate
Christiano, P. (2016) Integrity for consequentialists.
Tomasik, B. (2013) Why honesty is a good policy.
11. Examples
• Working too much (burnout)
• Working on something you don’t enjoy (not sustainable)
• Hastily giving up your current career
• Hastily giving away all your money
• Not allowing yourself to live a normal life (hobbies,
friends, family)
• Giving up on old friendships (interpersonal virtues)
12. Sources
• Over-enthusiasm
• Dedication signalling (to others and yourself)
• Short-sightedness
• Ignorance or disregard of human psychology, its needs
and limitations
• Disregard of common-sensical lifestyle and life choices
13. Countermeasures
• Don’t be less dedicated! But don’t let
your dedication drive epistemic/
instrumental mistakes
• Make important life decisions very
carefully
• Take the outside view
• Take the needs and limitations of human
psychology seriously
• Take common sense seriously and adopt
some moderation (e.g. working hours)
15. • Unreflective endorsement of anti-commonsensical views
• Fragile conclusions
• Conclusions implying that everything else is unimportant
and everyone else is wrong
• Conclusions considered unethical by other reasonable
perspectives
• Overly strong belief oscillations
17. Sequence thinking
• Considering only one view
• Not considering plausible alternative
assumptions
• Easy to make a mistake somewhere
Karnofsky, H. (2014) Sequence thinking vs. cluster thinking.
18. Explicit expected value theory and Bayesian reasoning
• Underestimating complexity and uncertainty
• Overestimating the discrepancies between different
options’ expected value
• Hastily identifying unjustified Pascalian scenarios
• Updating too strongly on new weak evidence
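The last bullet can be made concrete with a small illustration (a hypothetical sketch, not from the source): under Bayes' rule, weak evidence, i.e. a likelihood ratio close to 1, should move a posterior only slightly, so large belief swings on such evidence indicate over-updating.

```python
def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Bayes' rule: P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|~H)P(~H)]."""
    numerator = p_e_given_h * prior
    return numerator / (numerator + p_e_given_not_h * (1 - prior))

# Weak evidence (likelihood ratio 0.6/0.5 = 1.2) barely moves a 50% prior...
print(round(posterior(0.5, 0.6, 0.5), 3))   # 0.545

# ...while strong evidence (likelihood ratio 0.9/0.1 = 9) moves it far more.
print(round(posterior(0.5, 0.9, 0.1), 3))   # 0.9
```

Updating a 50% prior to, say, 90% on the weak evidence above would be the mistake the slide describes.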
19. Signalling contrarianism
• Appearing cool and wise by holding weird beliefs
• Bias against common sense
Alexander, S. (2010) Intellectual Hipsters and Meta-Contrarianism.
20. Countermeasures
• Be epistemically modest
• Integrate many plausible views
• Consider the opposite view
• Apply some “cluster thinking”
• Identify convergences between different views
• Take common sense and the outside view seriously
Karnofsky, H. (2014) Sequence thinking vs. cluster thinking.
Beckstead, N. (2013) Common sense as a prior.
21. How can progress still occur?
• If a conclusion is extremely anti-commonsensical, a red flag should be raised
• But the conclusion can still be justified if
• the case for it is extremely strong
• it is robust, i.e. many perspectives converge towards it
• there is agreement among experts
• there is reason to assume that those who disagree (e.g. the general public) are wrong or biased
• Examples: the far future, anti-speciesism; but not steal-to-give
• Weird and anti-commonsensical ideas are important and should be encouraged
• New ideas will sometimes justifiably result in anti-commonsensical conclusions
22. Conclusion
• Doing the most good is a simple principle but difficult to
implement
• Given our limited cognitive abilities, it’s easy to make
mistakes when attempting to follow the principle
• Let’s identify and avoid these mistakes!
26. Opposite
• Unwillingness to change the status quo (e.g. career)
• e.g. due to low dedication or self-serving reasons (maybe
with rationalizations)
• There are probably more people in the latter category, but the
former is more harmful on an individual level
27. Groupthink
• Groupthink within the EA community
• “Groupthink is a psychological phenomenon that
occurs within a group of people in which the desire for
harmony or conformity in the group results in an
irrational or dysfunctional decision-making outcome.”
29. Fanaticism
An extreme version and combination of the above points
• Too extreme views
• Anti-commonsensical views and behavior
• Violation of prevailing social norms
• Behavior considered unethical by others
• Obsessive enthusiasm
• Uncritical zeal
• Little tolerance for other views
30. Fanaticism
• If a set of ideas has the potential to be interpreted the
wrong way, be careful. Historically, a lot of harm has been
caused by fanatics