{{Short description|Rationality-focused community blog}}
{{For|the concept of choosing the least undesirable of available options|lesser evil}}
{{Use dmy dates|date=June 2022}}
{{Infobox website
| name = LessWrong
| logo = LessWrong logo.svg
| logo_size = 250px
| logocaption = <!-- LessWrong logo. -->
| screenshot = <!-- Screenshot of the home page -->
| caption = <!-- Screenshot of the LessWrong home page. -->
| collapsible = <!-- yes -->
| collapsetext = <!-- ? -->
| url = {{URL|https://www.lesswrong.com/|LessWrong.com}}
| commercial = <!-- ? -->
| type = [[Internet forum]], [[blog]]
| registration = Optional, but required to contribute content
| language = English
| num_users = <!-- The number of registered users the website has. -->
| content_license = <!-- The license of the content of the site. Works same as content_licence. -->
| owner = <!-- ? -->
| author = [[Eliezer Yudkowsky]]
| editor = <!-- The person or entity that edits the website. -->
| launch_date = {{start date and age|2009|2|1|df=no}}
| revenue = <!-- The approximate revenue of the site. -->
| ip = <!-- ? -->
| current_status = Active
| footnotes = <!-- ? -->
| programming_language = [[JavaScript]], [[Cascading Style Sheets|CSS]] (powered by [[React (JavaScript library)|React]] and [[GraphQL]])
}}
 
'''LessWrong''' (also written '''Less Wrong''') is a community [[blog]] and [[Internet forum|forum]] focused on discussion of [[cognitive bias]]es, [[philosophy]], [[psychology]], [[economics]], [[rationality]], and [[artificial intelligence]], among other topics.<ref name=faq>{{cite web|url = http://wiki.lesswrong.com/wiki/FAQ#What_is_Less_Wrong.3F|title = Less Wrong FAQ|publisher = LessWrong|access-date = 25 March 2014|archive-date = 30 April 2019|archive-url = https://web.archive.org/web/20190430134954/https://wiki.lesswrong.com/wiki/FAQ#What_is_Less_Wrong.3F|url-status = live}}</ref><ref name="businessinsider">{{cite news|url = http://www.businessinsider.com/ten-things-you-should-learn-from-lesswrongcom-2011-7|title = You Can Learn How To Become More Rational|last = Miller|first = James|website = [[Business Insider]]|date = July 28, 2011|access-date = March 25, 2014|archive-date = 10 August 2018|archive-url = https://web.archive.org/web/20180810011032/https://www.businessinsider.com/ten-things-you-should-learn-from-lesswrongcom-2011-7|url-status = live}}</ref> It is associated with the [[rationalist community]].
The site describes itself as devoted to “refining the art of human rationality.”<ref name="businessinsider" />
 
== Purpose ==
LessWrong describes itself as an online forum and community aimed at improving human reasoning, rationality, and decision-making, with the goal of helping its users hold more accurate beliefs and achieve their personal objectives.<ref>{{Cite web |date=14 June 2019 |title=Welcome to LessWrong! |url=https://www.lesswrong.com/posts/bJ2haLkcGeLtTWaD5/welcome-to-lesswrong |website=LessWrong}}</ref> The best-known posts on LessWrong are "The Sequences", a series of essays which aim to describe how to avoid the typical failure modes of human reasoning, with the goal of improving decision-making and the evaluation of evidence.<ref name="NYer2">{{cite magazine |last1=Lewis-Kraus |first1=Gideon |date=9 July 2020 |title=Slate Star Codex and Silicon Valley's War Against the Media |url=https://www.newyorker.com/culture/annals-of-inquiry/slate-star-codex-and-silicon-valleys-war-against-the-media |url-status=live |archive-url=https://web.archive.org/web/20200710020419/https://www.newyorker.com/culture/annals-of-inquiry/slate-star-codex-and-silicon-valleys-war-against-the-media |archive-date=10 July 2020 |access-date=4 August 2020 |magazine=[[The New Yorker]] |language=en-us}}</ref><ref>{{cite web|url = https://www.lesswrong.com/highlights|title = Sequences Highlights|publisher = LessWrong|access-date = 12 July 2024|archive-date = 6 July 2024|archive-url = https://web.archive.org/web/20240706170927/https://www.lesswrong.com/highlights|url-status = live}}</ref> One suggestion is the use of [[Bayes' theorem]] as a decision-making tool.<ref name="businessinsider" /> There is also a focus on psychological barriers to good decision-making, including [[fear conditioning]] and [[List of cognitive biases|cognitive biases]], which have been studied by the psychologist [[Daniel Kahneman]].<ref>{{cite news|url = https://www.theguardian.com/lifeandstyle/2012/mar/09/change-life-answer-easier-question|title = This column will change your life: asked a tricky question? Answer an easier one|last = Burkeman|first = Oliver|date = March 9, 2012|access-date = March 25, 2014|newspaper = [[The Guardian]]|archive-date = 26 March 2014|archive-url = https://web.archive.org/web/20140326013744/http://www.theguardian.com/lifeandstyle/2012/mar/09/change-life-answer-easier-question|url-status = live}}</ref> LessWrong is also concerned with artificial intelligence, [[transhumanism]], [[Existential risk|existential threats]], and the [[Technological singularity|singularity]].<ref name="Observer">{{Cite news|url=https://observer.com/2012/07/faith-hope-and-singularity-entering-the-matrix-with-new-yorks-futurist-set/ |title=Faith, Hope, and Singularity: Entering the Matrix with New York's Futurist Set |last=Tiku |first=Nitasha |date=2012-07-25 |website=[[The New York Observer|Observer]] |access-date=2019-04-12 |archive-date=12 April 2019 |archive-url=https://web.archive.org/web/20190412214151/https://observer.com/2012/07/faith-hope-and-singularity-entering-the-matrix-with-new-yorks-futurist-set/ |url-status=live }}</ref>
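The Bayesian updating that the site recommends can be sketched as follows. This is an illustrative example only, with hypothetical numbers; it is not code from LessWrong or any of its contributors:

```python
# A minimal Bayesian update of the kind LessWrong essays recommend as a
# decision-making tool. All numbers below are hypothetical.

def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Return P(H|E) via Bayes' theorem."""
    # Total probability of observing the evidence E.
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# A claim with a 1% prior; the evidence is 8x likelier if the claim is true.
posterior = bayes_update(prior=0.01, p_e_given_h=0.8, p_e_given_not_h=0.1)
print(round(posterior, 3))  # 0.075
```

Even strongly diagnostic evidence leaves the posterior low here, a consequence of the small prior that writings on the site often emphasize under the heading of [[base rate fallacy|base-rate neglect]].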
 
== History ==
[[File:Eliezer Yudkowsky, Stanford 2006 (square crop).jpg|thumb|[[Eliezer Yudkowsky]] at [[Stanford University]] in 2006]]
LessWrong developed from Overcoming Bias, an earlier group blog focused on human rationality, which began in November 2006, with artificial intelligence researcher [[Eliezer Yudkowsky]] and economist [[Robin Hanson]] as the principal contributors. In February 2009, Yudkowsky's posts were used as the seed material to create the community blog LessWrong, and Overcoming Bias became Hanson's personal blog.<ref>{{cite web |title=Where did Less Wrong come from? (LessWrong FAQ) |url=http://wiki.lesswrong.com/wiki/FAQ#Where_did_Less_Wrong_come_from.3F |url-status=live |archive-url=https://web.archive.org/web/20190430134954/https://wiki.lesswrong.com/wiki/FAQ#Where_did_Less_Wrong_come_from.3F |archive-date=30 April 2019 |access-date=March 25, 2014}}</ref> In 2013, a significant portion of the [[rationalist community]] shifted focus to Scott Alexander's [[Slate Star Codex]].<ref name="NYer2"/>
 
=== Artificial intelligence ===
 
Discussions of AI within LessWrong include [[AI alignment]], [[AI safety]],<ref name=":0" /> and [[Artificial consciousness|machine consciousness]].{{Citation needed|date=July 2024}} Articles posted on LessWrong about AI have been cited in the news media.<ref name=":0">{{Cite news |last=Chivers |first=Tom |date=Nov 22, 2023 |title=What we've learned about the robot apocalypse from the OpenAI debacle |url=https://www.semafor.com/article/11/22/2023/what-weve-learned-about-the-robot-apocalypse-from-the-openai-debacle |url-status=live |archive-url=https://web.archive.org/web/20240303221226/https://www.semafor.com/article/11/22/2023/what-weve-learned-about-the-robot-apocalypse-from-the-openai-debacle |archive-date=3 March 2024 |access-date=2024-07-14 |work=[[Semafor (website)|Semafor]] |quote=Since the late 1990s those worries have become more specific, and coalesced around Nick Bostrom's 2014 book ''Superintelligence: Paths, Dangers, Strategies'' and Eliezer Yudkowsky's blog LessWrong. }}</ref><ref>{{Cite magazine |last=Newport |first=Cal |date=2024-03-15 |title=Can an A.I. Make Plans? 
|url=https://www.newyorker.com/science/annals-of-artificial-intelligence/can-an-ai-make-plans |access-date=2024-07-14 |magazine=The New Yorker |language=en-US |issn=0028-792X}}</ref> LessWrong and its surrounding movement's work on AI are the subjects of the 2019 book ''The AI Does Not Hate You'', written by former [[BuzzFeed]] science correspondent Tom Chivers.<ref>{{cite magazine |last1=Cowdrey |first1=Katherine |date=21 September 2017 |title=W&N wins Buzzfeed science reporter's debut after auction |url=https://www.thebookseller.com/news/wn-wins-three-way-auction-buzzfeed-science-reporters-debut-642406 |url-status=live |archive-url=https://web.archive.org/web/20181127122337/https://www.thebookseller.com/news/wn-wins-three-way-auction-buzzfeed-science-reporters-debut-642406 |archive-date=27 November 2018 |access-date=2017-09-21 |website=[[The Bookseller]]}}</ref><ref>{{Cite book |last=Chivers |first=Tom |title=The AI Does Not Hate You |publisher=Weidenfeld & Nicolson |year=2019 |isbn=978-1474608770}}</ref><ref>{{Cite news |last=Marriott |first=James |date=31 May 2019 |title=The AI Does Not Hate You by Tom Chivers review — why the nerds are nervous |url=https://www.thetimes.com/business-money/technology/article/the-ai-does-not-hate-you-superintelligence-rationality-and-the-race-to-save-the-world-by-tom-chivers-review-why-the-nerds-are-nervous-klx0xd293 |url-status=live |archive-url=https://web.archive.org/web/20200423043650/https://www.thetimes.co.uk/article/the-ai-does-not-hate-you-superintelligence-rationality-and-the-race-to-save-the-world-by-tom-chivers-review-why-the-nerds-are-nervous-klx0xd293 |archive-date=23 April 2020 |access-date=2020-05-03 |work=[[The Times]] |issn=0140-0460}}</ref>
 
=== Effective altruism ===
LessWrong played a significant role in the development of the [[effective altruism]] (EA) movement,<ref>{{cite book |last1=de Lazari-Radek |first1=Katarzyna |title=Utilitarianism: A Very Short Introduction |last2=Singer |first2=Peter |date=2017-09-27 |publisher=Oxford University Press |isbn=9780198728795 |page=110 |author-link2=Peter Singer}}</ref> and the two communities are closely intertwined.<ref name=chiversEA>{{Cite book|last=Chivers|first=Tom|title=The AI Does Not Hate You|publisher=Weidenfeld & Nicolson|year=2019|isbn=978-1474608770|chapter=Chapter 38: The Effective Altruists}}</ref>{{rp|227}} In a survey of LessWrong users in 2016, 664 out of 3,060 respondents, or 21.7%, identified as "effective altruists". A separate survey of effective altruists in 2014 revealed that 31% of respondents had first heard of EA through LessWrong,<ref name=chiversEA /> though that number had fallen to 8.2% by 2020.<ref>{{Cite news|url=https://forum.effectivealtruism.org/posts/tzFcqGmCA6ePeD5wm/ea-survey-2020-how-people-get-involved-in-ea|title=EA Survey 2020: How People Get Involved in EA|date=2021-05-20|access-date=2021-07-28|website=Effective Altruism Forum|last=Moss|first=David|archive-date=28 July 2021|archive-url=https://web.archive.org/web/20210728202421/https://forum.effectivealtruism.org/posts/tzFcqGmCA6ePeD5wm/ea-survey-2020-how-people-get-involved-in-ea|url-status=live}}</ref>
 
===Roko's basilisk===
{{main|Roko's basilisk}}
In July 2010, LessWrong contributor Roko posted a [[thought experiment]] to the site in which an otherwise [[Friendly artificial intelligence|benevolent<!-- LW itself refers to it as a "friendly AI"--> future AI system]] tortures people who heard of the AI before it came into existence and failed to work tirelessly to bring it into existence, in order to incentivise said work. This idea came to be known as "[[Roko's basilisk]]", based on Roko's idea that merely hearing about the idea would give the hypothetical AI system an incentive to try such [[blackmail]].<ref name="insider">{{cite news |author=Love |first=Dylan |date=6 August 2014 |title=WARNING: Just Reading About This Thought Experiment Could Ruin Your Life |url=http://www.businessinsider.com/what-is-rokos-basilisk-2014-8 |access-date=6 December 2014 |website=[[Business Insider]] |archive-date=18 November 2018 |archive-url=https://web.archive.org/web/20181118105522/https://www.businessinsider.com/what-is-rokos-basilisk-2014-8 |url-status=live }}</ref><ref name="Slate July 2014">{{cite news |last1=Auerbach |first1=David |author-link=David Auerbach |date=17 July 2014 |title=The Most Terrifying Thought Experiment of All Time |work=[[Slate (magazine)|Slate]] |url=http://www.slate.com/articles/technology/bitwise/2014/07/roko_s_basilisk_the_most_terrifying_thought_experiment_of_all_time.single.html |access-date=18 July 2014 |archive-date=25 October 2018 |archive-url=https://web.archive.org/web/20181025091051/http://www.slate.com/articles/technology/bitwise/2014/07/roko_s_basilisk_the_most_terrifying_thought_experiment_of_all_time.single.html |url-status=live }}</ref><ref name="Observer"/>
 
===Neoreaction===
After LessWrong split from Overcoming Bias, its discussions of [[eugenics]] and [[evolutionary psychology]] attracted some readers affiliated with [[neoreaction]].<ref>{{Cite web|url=http://fusion.net/story/312592/peter-thiel-transhumanist/|title=The Strange and Conflicting World Views of Silicon Valley Billionaire Peter Thiel|last=Keep|first=Elmo|date=22 June 2016|website=Fusion|access-date=2016-10-05|quote=Thanks to LessWrong’s discussions of eugenics and evolutionary psychology, it has attracted some readers and commenters affiliated with the alt-right and neoreaction, that broad cohort of neofascist, white nationalist and misogynist trolls.|archive-date=13 February 2017|archive-url=https://web.archive.org/web/20170213070610/http://fusion.net/story/312592/peter-thiel-transhumanist/|url-status=live}}</ref> However, Yudkowsky has strongly rejected neoreaction.<ref>{{Cite journal|last=Riggio|first=Adam|date=23 September 2016|title=The Violence of Pure Reason: Neoreaction: A Basilisk|url=https://social-epistemology.com/2016/09/23/the-violence-of-pure-reason-neoreaction-a-basilisk-adam-riggio/|journal=Social Epistemology Review and Reply Collective|volume=5|issue=9|pages=34–41|issn=2471-9560|quote=Land and Yarvin are openly allies with the new reactionary movement, while Yudkowsky counts many reactionaries among his fanbase despite finding their racist politics disgusting.|access-date=5 October 2016|archive-date=5 October 2016|archive-url=https://web.archive.org/web/20161005194701/https://social-epistemology.com/2016/09/23/the-violence-of-pure-reason-neoreaction-a-basilisk-adam-riggio/|url-status=live}}</ref><ref>{{cite web|url=http://yudkowsky.tumblr.com/post/142497361345/this-isnt-going-to-work-but-for-the-record-and|title=Untitled|author=Eliezer Yudkowsky|date=8 April 2016|work=Optimize Literally Everything (blog)|access-date=7 October 2016|archive-date=26 May 2019|archive-url=https://web.archive.org/web/20190526100513/https://yudkowsky.tumblr.com/post/142497361345/this-isnt-going-to-work-but-for-the-record-and|url-status=live}}</ref> Additionally, in a 2016 survey of LessWrong users, only 28 out of 3,060 respondents (0.92%) identified as "neoreactionary".<ref name=":1">{{cite book |last1= Hermansson |first1= Patrik |last2= Lawrence |first2= David |last3= Mulhall |first3= Joe |first4= Simon |last4= Murdoch |title= The International Alt-Right. Fascism for the 21st Century? |chapter= The Dark Enlightenment: Neoreaction and Silicon Valley |___location= Abingdon-on-Thames, England, UK |publisher= Routledge |year= 2020 |chapter-url= https://books.google.com/books?id=43fNDwAAQBAJ&q=3060 |isbn= 9781138363861 |access-date= 2 October 2020 |archive-date= 13 June 2022 |archive-url= https://web.archive.org/web/20220613193540/https://books.google.com/books?id=43fNDwAAQBAJ&q=3060 |url-status= live }}</ref>
 
Ana Teixeira Pinto, writing in the journal ''[[Third Text]]'' in 2019, describes LessWrong as a component of a "new configuration of fascist ideology taking shape under the aegis of, and working in tandem with, neoliberal governance", noting that it was not only the birthplace of [[Roko's basilisk]] but also that the ethno-nationalist blog "More Right" emerged out of the LessWrong community.<ref name=pinto>{{cite journal |last1=Pinto |first1=Ana Teixeira |date=May 2019 |url=https://www.tandfonline.com/doi/abs/10.1080/09528822.2019.1625638 |title=Capitalism with a Transhuman Face: The Afterlife of Fascism and the Digital Frontier |journal=Third Text |volume=33 |issue=3 |publisher=Taylor & Francis |pages=315–336 |doi=10.1080/09528822.2019.1625638 |access-date=August 5, 2025 |url-access=subscription}}</ref>
 
== User base ==
According to the Community Survey 2023, conducted among 558 users of the forum, the user base consists of 75% [[Cisgender|cis]] [[Male|males]] and 9.6% cis [[Female|females]], with the rest describing themselves as [[Transgender|trans]] or [[Non-binary gender|non-binary]]. Users are in most cases between 20 and 35 years old. Almost half of the users are from the [[United States]] and most of the remainder are from [[Western Europe]] or [[Canada]]. The ethnic makeup was 78.9% [[Non-Hispanic whites|non-Hispanic White]], 4.9% [[East Asian people|East Asian]], 4.2% [[South Asians|South Asian]], 3.6% [[White Hispanic and Latino Americans|white Hispanic]], 2.6% [[Middle Eastern people|Middle Eastern]], 0.7% [[Black people|Black]] and 5.1% others. LessWrong users are highly educated (with the majority having at least a [[Bachelor's degree]]) and work primarily in [[Information technology|IT]], [[engineering]] or other [[STEM fields]]. A majority of 67% describe themselves as [[Atheism|atheists]] and only 3.7% as convinced [[Theism|theists]]. In terms of political orientation, the most frequently mentioned answers were [[Liberalism|liberal]] (32.3%), [[Libertarianism|libertarian]] (25.2%) and [[Social democracy|social democratic]] (22.3%).<ref>{{Cite web |date=2024-02-16 |title=2023 Survey Results |url=https://www.lesswrong.com/posts/WRaq4SzxhunLoFKCs/2023-survey-results |website=LessWrong}}</ref>
 
===Notable users===
LessWrong has been associated with several influential contributors. Founder Eliezer Yudkowsky established the platform to promote rationality and raise awareness about potential risks associated with artificial intelligence.<ref name="Miller2017">{{cite book |last=Miller |first=J.D. |title=The Technological Singularity |publisher=Springer |year=2017 |isbn=978-3-662-54033-6 |editor1-last=Callaghan |editor1-first=V. |series=The Frontiers Collection |___location=Berlin, Heidelberg |pages=225–226 |chapter=Reflections on the Singularity Journey |quote=Yudkowsky helped create the Singularity Institute (now called the Machine Intelligence Research Institute) to help mankind achieve a friendly Singularity. (Disclosure: I have contributed to the Singularity Institute.) Yudkowsky then founded the community blog <nowiki>http://LessWrong.com</nowiki>, which seeks to promote the art of rationality, to raise the sanity waterline, and to in part convince people to make considered, rational charitable donations, some of which, Yudkowsky (correctly) hoped, would go to his organization. |editor2-last=Miller |editor2-first=J. |editor3-last=Yampolskiy |editor3-first=R. |editor4-last=Armstrong |editor4-first=S.}}</ref> [[Slate Star Codex|Scott Alexander]] became one of the site's most popular writers before starting his own blog, Slate Star Codex, contributing discussions on AI safety and rationality.<ref name="Miller2017" />
 
Further notable users on LessWrong include [[Paul Christiano (researcher)|Paul Christiano]], [[Wei Dai]] and [[Zvi Mowshowitz]]. A selection of posts by these and other contributors, selected through a community review process,<ref name="Gasarch2022">{{cite journal|last=Gasarch|first=William|title=Review of "A Map that Reflects the Territory: Essays by the LessWrong Community"|journal=ACM SIGACT News|volume=53|issue=1|year=2022|pages=13–24|doi=10.1145/3532737.3532741|quote=Users wrote reviews of the best posts of 2018, and voted on them using the quadratic voting system, popularized by Glen Weyl and Vitalik Buterin. From the 2000+ posts published that year, the Review narrowed down the 44 most interesting and valuable posts.}}</ref> were published as parts of the essay collections "A Map That Reflects the Territory"<ref name="Lagerros2020">{{cite book|title=A Map That Reflects the Territory: Essays by the LessWrong Community|author1=Lagerros, J.|author2=Pace, B.|author3=LessWrong.com|isbn=9781736128503|year=2020|publisher=Center for Applied Rationality|url=https://books.google.com/books?id=czALzgEACAAJ}}</ref> and "The Engines of Cognition".<ref name="Pace2021">{{cite book|title=The Engines of Cognition: Essays by the LessWrong Community|author1=Pace, B.|author2=LessWrong|isbn=9781736128510|year=2021|publisher=Center for Applied Rationality|url=https://books.google.com/books?id=3yOkzgEACAAJ}}</ref><ref name="Gasarch2022"/><ref name="Gasarch2022b">{{cite journal|last=Gasarch|first=William|title=Review of "The Engines of Cognition: Essays by the Less Wrong Community"|journal=ACM SIGACT News|volume=53|issue=3|year=2022|pages=6–16|doi=10.1145/3561064.3561066}}</ref>
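The quadratic voting mechanism mentioned in the review above can be sketched as follows. This is an illustrative model of quadratic voting in general, not LessWrong's actual review code: casting ''n'' votes on a post costs ''n''<sup>2</sup> credits, so expressing a strong preference is disproportionately expensive.

```python
# Illustrative sketch of quadratic voting (hypothetical, not LessWrong's
# implementation): each additional vote on the same post costs more than
# the last, because n votes cost n**2 credits in total.

def quadratic_cost(votes: int) -> int:
    """Credits required to cast `votes` votes on a single post."""
    return votes ** 2

def ballot_cost(ballot: dict) -> int:
    """Total credit cost of a ballot mapping post -> number of votes."""
    return sum(quadratic_cost(v) for v in ballot.values())

# Three votes on one post and one on another: 9 + 1 = 10 credits,
# versus 4 credits for spreading the same four votes evenly (2**2 + 2**2... no:
# {"a": 2, "b": 2} costs 4 + 4 = 8 credits).
print(ballot_cost({"post_a": 3, "post_b": 1}))  # 10
print(ballot_cost({"post_a": 2, "post_b": 2}))  # 8
```

The superlinear cost is the design point: concentrating votes on a single favorite is penalized relative to spreading them, which pushes voters to reveal the intensity of their preferences honestly.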
 
Ziz LaSota, who was the leader of the [[Zizians]] (an offshoot of the rationalist community), was a LessWrong user. The group was eventually banned from LessWrong and associated meetups and conferences due to an alleged pattern of aggressive behavior.<ref name="Ratliff-2025">{{cite magazine |last1=Ratliff |first1=Evan |title=The Delirious, Violent, Impossible True Story of the Zizians |url=https://www.wired.com/story/delirious-violent-impossible-true-story-zizians/ |access-date=26 February 2025 |magazine=[[Wired (magazine)|Wired]] |date=21 February 2025 |archive-date=February 26, 2025 |archive-url=https://web.archive.org/web/20250226064839/https://www.wired.com/story/delirious-violent-impossible-true-story-zizians/ |url-status=live |url-access=subscription|quote=Their collective exile from the rationalist community was virtually complete. They were banned from LessWrong.com, along with various CFAR meetups and conferences. An anonymous rationalist launched a site, Zizians.info, branding them “the Zizians” for the first time and warning that the group was a cult.}}</ref>
 
== See also ==
{{Portal|Internet}}
* [[Center for Applied Rationality]], a rationalist nonprofit organization based in [[Berkeley, California]]
* [[TESCREAL]]
 
== References ==
{{Reflist|30em}}
 
{{LessWrong}}
{{Effective altruism}}
{{Transhumanism footer}}
 
[[Category:Internet forums]]
[[Category:Transhumanist organizations]]
[[Category:Internet properties established in 2009]]
[[Category:Effective altruism]]
[[Category:Rationalism]]
[[Category:LessWrong rationalists]]