=== Effective altruism ===
In 2017, ''Slate Star Codex'' ranked fourth in a survey conducted by Rethink Charity of how [[effective altruism|effective altruists]] first heard about effective altruism, after "personal contact", "''[[LessWrong]]''", and "other books, articles and blog posts", and just above "''[[80,000 Hours]]''".<ref>{{Cite web|last1=Mulcahy|first1=Anna|last2=Barnett|first2=Tee|last3=Hurford|first3=Peter|date=17 November 2017|title=EA Survey 2017 Series Part 8: How do People Get Into EA?|url=https://rtcharity.org/ea-survey-2017-part-8/|url-status=live|archive-url=https://web.archive.org/web/20190429135314/https://rtcharity.org/ea-survey-2017-part-8/|archive-date=29 April 2019|access-date=9 September 2020|website=Rethink Charity}}</ref> The blog discusses moral questions and dilemmas relevant to effective altruism, such as moral offsets (the proposition that bad acts can be cancelled out by good acts), the ethical treatment of animals, and the trade-offs for charities of pursuing systemic change.<ref>{{multiref2
| {{Cite book|last1=Chan|first1=Rebecca|url=https://www.worldcat.org/oclc/1126149885|title=Oxford Studies in Philosophy of Religion|last2=Crummett|first2=Dustin|date=29 August 2019|publisher=[[Oxford University Press]]|isbn=978-0-19-188069-8|___location=Oxford|chapter=Moral Indulgences: When Offsetting is Wrong|doi=10.1093/oso/9780198845492.003.0005|oclc=1126149885|chapter-url=https://oxford.universitypressscholarship.com/view/10.1093/oso/9780198845492.001.0001/oso-9780198845492-chapter-5|archive-url=https://web.archive.org/web/20200909014312/https://oxford.universitypressscholarship.com/view/10.1093/oso/9780198845492.001.0001/oso-9780198845492-chapter-5|archive-date=9 September 2020}}
}}</ref>

=== Artificial intelligence ===
Alexander regularly wrote about advances in [[artificial intelligence]] and emphasized the importance of [[AI safety]] research.<ref>{{cite book|last=Miller|first=James D.|chapter=Reflections on the Singularity Journey|date=2017|chapter-url=https://link.springer.com/10.1007/978-3-662-54033-6_13|title=The Technological Singularity|series=The Frontiers Collection|volume=|pages=223–228|editor-last=Callaghan|editor-first=Victor|archive-url=https://web.archive.org/web/20200909014324/https://link.springer.com/chapter/10.1007%2F978-3-662-54033-6_13|place=Berlin, Heidelberg|publisher=Springer Berlin Heidelberg|language=en|doi=10.1007/978-3-662-54033-6_13|isbn=978-3-662-54031-2|archive-date=9 September 2020|editor2-last=Miller|editor2-first=James|editor3-last=Yampolskiy|editor3-first=Roman|editor4-last=Armstrong|editor4-first=Stuart}}</ref>
In the long essay "Meditations On Moloch", he analyzes [[Game theory|game-theoretic]] scenarios of cooperation failure like the [[prisoner's dilemma]] and the [[tragedy of the commons]] that underlie many of humanity's problems and argues that AI risks should be considered in this context.<ref>{{multiref2
| {{Cite journal|last=Sotala|first=Kaj|date=2017|title=Superintelligence as a Cause or Cure for Risks of Astronomical Suffering|url=http://www.informatica.si/index.php/informatica/article/view/1877/1098|journal=Informatica|volume=41|pages=389–400|archive-url=https://web.archive.org/web/20200220215810/http://www.informatica.si/index.php/informatica/article/view/1877/1098|archive-date=20 February 2020}}
}}</ref>

=== Controversies and memes ===
===Shiri's scissor===
In the short story "Sort By Controversial", Alexander introduces the term "Shiri's scissor", or "scissor statement", to describe a statement with great destructive power because it generates wildly divergent interpretations that fuel conflict and tear people apart. The term has since been used to describe divisive topics widely discussed on social media.<ref>{{multiref2
| {{Cite news|last=Lewis|first=Helen|date=19 August 2020|title=The Mythology of Karen|work=The Atlantic|url=https://www.theatlantic.com/international/archive/2020/08/karen-meme-coronavirus/615355/|url-status=live|access-date=9 September 2020|archive-url=https://web.archive.org/web/20200830034317/https://www.theatlantic.com/international/archive/2020/08/karen-meme-coronavirus/615355/|archive-date=30 August 2020|issn=1072-7825}}
}}</ref>

=== Anti-reactionary FAQ ===