{{Short description|none}}
{{Cleanup|date=August 2025|reason=The article contains exorbitant amounts of anti-AI info.}}
[[Artificial intelligence]] is used in [[Wikipedia]] and other [[Wikimedia projects]] for the purpose of developing those projects.<ref>{{cite web |last1=Marr |first1=Bernard |title=The Amazing Ways How Wikipedia Uses Artificial Intelligence |url=https://www.forbes.com/sites/bernardmarr/2018/08/17/the-amazing-ways-how-wikipedia-uses-artificial-intelligence/#7cbdda802b9d |website=Forbes |language=en |date=17 August 2018}}</ref><ref name="NYT-20230718">{{cite news |last=Gertner |first=Jon |title=Wikipedia's Moment of Truth - Can the online encyclopedia help teach A.I. chatbots to get their facts right — without destroying itself in the process? + comment |url=https://www.nytimes.com/2023/07/18/magazine/wikipedia-ai-chatgpt.html |date=18 July 2023 |work=[[The New York Times]] |url-status=bot: unknown |archiveurl=https://web.archive.org/web/20230718233916/https://www.nytimes.com/2023/07/18/magazine/wikipedia-ai-chatgpt.html#permid=126389255 |archivedate=18 July 2023 |accessdate=19 July 2023 }}</ref> Human and [[Internet bot|bot]] interaction in Wikimedia projects is routine and iterative.<ref>{{cite arXiv |last1=Piscopo |first1=Alessandro |title=Wikidata: A New Paradigm of Human-Bot Collaboration? |date=1 October 2018 |eprint=1810.00931|class=cs.HC }}</ref>
[[File:Example_of_AI-generated_article_getting_nominated_for_speedy_deletion.png|thumb|246x246px|An AI-generated draft article nominated for [[speedy deletion]] under criterion G15]]
Various articles on [[Wikipedia]] have been created entirely by or with the help of [[artificial intelligence]]. AI-generated content can be detrimental to Wikipedia when it is unreliable or contains fake citations.
To address the issue of low-quality AI-generated content, the [[Wikipedia community]] created a [[WikiProject]] named [[Wikipedia:WikiProject AI Cleanup|AI Cleanup]] in 2023. In August 2025, Wikipedia adopted a policy that allows editors to nominate suspected AI-generated articles for [[speedy deletion]].
== Using artificial intelligence for Wikipedia ==
=== ORES ===
In August 2018, a company called Primer reported attempting to use artificial intelligence to create Wikipedia articles about women as a way to address [[gender bias on Wikipedia]].<ref>{{Cite magazine |last1=Simonite |first1=Tom |date=3 August 2018 |title=Using Artificial Intelligence to Fix Wikipedia's Gender Problem |url=https://www.wired.com/story/using-artificial-intelligence-to-fix-wikipedias-gender-problem/ |magazine=Wired}}</ref><ref>{{cite web |last1=Verger |first1=Rob |date=7 August 2018 |title=Artificial intelligence can now help write Wikipedia pages for overlooked scientists |url=https://www.popsci.com/artificial-intelligence-scientists-wikipedia |website=Popular Science |language=en}}</ref>
=== Beginnings of generative AI ===
In 2022, the public release of [[ChatGPT]] inspired more experimentation with AI and writing Wikipedia articles. A debate was sparked about whether and to what extent such [[large language model]]s are suitable for such purposes in light of their tendency to [[Hallucination (artificial intelligence)|generate plausible-sounding misinformation]], including fake references; to generate prose that is not encyclopedic in tone; and to [[Algorithmic bias|reproduce biases]].<ref>{{Cite web |last=Harrison |first=Stephen |date=2023-01-12 |title=Should ChatGPT Be Used to Write Wikipedia Articles? |url=https://slate.com/technology/2023/01/chatgpt-wikipedia-articles.html |access-date=2023-01-13 |website=Slate Magazine |language=en}}</ref><ref name="vice">{{cite news |last1=Woodcock |first1=Claire |date=2 May 2023 |title=AI Is Tearing Wikipedia Apart |url=https://www.vice.com/en/article/ai-is-tearing-wikipedia-apart/ |work=Vice |language=en}}</ref> Since 2023, work has been done to [[Wikipedia:Artificial intelligence#Discussion timeline|draft Wikipedia policy on ChatGPT]] and similar [[large language model]]s (LLMs), e.g. at times recommending that users who are unfamiliar with LLMs should avoid using them due to the aforementioned risks, as well as noting the potential for [[libel]] or [[copyright infringement]].<ref name="vice" /> Some relevant policies are linked at [[Wikipedia:WikiProject AI Cleanup/Policies|WikiProject AI Cleanup/Policies]].
On December 6, 2022, a Wikipedia contributor named Pharos created the article "[[Artwork title]]" in his sandbox, declaring that he had used ChatGPT to experiment with it and would extensively modify it. He noted that the text needed to be toned down for neutrality. Another editor tagged the article as "[[original research]]", arguing that it was initially unsourced AI-generated content that was sourced afterwards, instead of being based on reliable sources from the outset. Another editor who experimented with this early version of ChatGPT said that its overview of the topic was decent, but that the citations were fabricated. The [[Wiki Education Foundation]] reported that some experienced editors found AI useful for starting drafts or creating new articles. It said that ChatGPT "knows" what Wikipedia articles look like and can easily generate one written in the style of Wikipedia, but it warned editors that ChatGPT had a tendency to use promotional language. Miguel García, a Wikipedia editor from Spain, said that the number of AI-generated articles on the site peaked when ChatGPT was originally launched and has since stabilized due to the community's efforts to combat it. He said that the majority of articles with no sources are deleted instantly or nominated for deletion.
In 2023, the Wikipedia community created a [[WikiProject]] named [[Wikipedia:WikiProject AI Cleanup|AI Cleanup]] to assist in the removal of poor-quality AI content from Wikipedia. In October 2024, a study by [[Princeton University]] found that about 5% of 3,000 articles newly created on [[English Wikipedia]] in August 2024 had been written using AI. The study said that some of the AI articles were on innocuous topics and that AI had likely only been used to assist in writing; in other articles, AI had been used to promote [[Business|businesses]] or political interests.<ref name=":0">{{Cite news |last=Wu |first=Daniel |date=August 8, 2025 |title=Volunteers fight to keep 'AI slop' off Wikipedia |url=https://www.washingtonpost.com/technology/2025/08/08/wikipedia-ai-generated-mistakes-editors/ |access-date= |newspaper=[[The Washington Post]] |language=en-US |issn=0190-8286}}</ref><ref>{{Cite web |last=Stokel-Walker |first=Chris |date=November 1, 2024 |title=One in 20 new Wikipedia pages seem to be written with the help of AI |url=https://www.newscientist.com/article/2454256-one-in-20-new-wikipedia-pages-seem-to-be-written-with-the-help-of-ai/ |url-access=subscription |access-date= |website=[[New Scientist]] |language=en-US}}</ref>
In August 2025, the Wikipedia community adopted a policy allowing users to nominate suspected AI-generated articles for [[speedy deletion]]. Editors usually recognize AI-generated articles by their citations, which may be fabricated or unrelated to the subject of the article. The wording of an article is also used to recognize AI writing: if it uses language that reads like an [[LLM]] response to a user, such as "Here is your Wikipedia article on" or "Up to my last training update", the article is typically tagged for speedy deletion.<ref name=":0" /><ref>{{Cite web |last=Maiberg |first=Emanuel |date=August 5, 2025 |title=Wikipedia Editors Adopt 'Speedy Deletion' Policy for AI Slop Articles |url=https://www.404media.co/wikipedia-editors-adopt-speedy-deletion-policy-for-ai-slop-articles/ |access-date= |website=[[404 Media]] |language=en}}</ref> Other signs of AI use include excessive use of [[em dashes]], overuse of the word "moreover", promotional language that describes something as "breathtaking", and formatting issues such as curly [[Quotation mark|quotation marks]] instead of straight ones. During the discussion on implementing the speedy deletion policy, one article reviewer said that he was "flooded non-stop with horrendous drafts" created using AI. Other users said that AI-generated articles contain large amounts of "lies and fake references" and that fixing the issues takes a significant amount of time.
Ilyas Lebleu, founder of WikiProject AI Cleanup, said that he and his fellow editors noticed a pattern of unnatural writing that they managed to connect to ChatGPT. He added that AI can mass-produce content that sounds real while being completely fake, leading to the creation of [[hoax]] articles on Wikipedia that he was tasked with deleting. Wikipedia has created a guide on how to spot signs of AI-generated writing.<ref>{{Cite web |last=Clair |first=Grant |date=August 20, 2025 |title=Wikipedia publishes list of AI writing tells |url=https://boingboing.net/2025/08/20/wikipedia-publishes-list-of-ai-writing-tells.html |access-date= |website=[[Boing Boing]] |language=en-US}}</ref>
=== Hoaxes and malicious AI use ===
In 2023, researchers found that ChatGPT frequently fabricates information and makes up fake articles for its users; at that time, the community deemed a ban on AI "too harsh".<ref name="vice" /><ref>{{Cite web |last=Harrison |first=Stephen |date=August 24, 2023 |title=Wikipedia Will Survive A.I. |url=https://slate.com/technology/2023/08/wikipedia-artificial-intelligence-threat.html |website=[[Slate Magazine]]}}</ref> AI has been deliberately used to create hoax articles on Wikipedia. For example, Ilyas Lebleu and his team found an in-depth 2,000-word article about an Ottoman fortress that never existed.<ref>{{Cite web |last=Durpe |first=Maggie |date=October 10, 2024 |title=Wikipedia Declares War on AI Slop |url=https://futurism.com/the-byte/wikipedia-declares-war-ai-slop |access-date= |website=[[Futurism (website)|Futurism]]}}</ref><ref>{{Cite web |last=Funaki |first=Kaiyo |date=October 25, 2024 |title=Wikipedia editors form urgent task force to combat rampant issues with recent wave of content: 'The entire thing was ... [a] hoax' |url=https://www.thecooldown.com/green-business/ai-content-wikipedia-volunteers-editing/ |website=TCD}}</ref> In another case, a user added AI-generated misinformation to the article on [[Estola albosignata|''Estola albosignata'']], a species of beetle; the paragraph seemed normal but cited an unrelated article.
AI has also been used on Wikipedia to advocate for certain political viewpoints in articles covered by [[Contentious topics on Wikipedia|contentious topic]] guidelines. In one instance, a banned editor used AI to engage in [[edit wars]] and manipulate [[Albanian history]]-related articles. In other instances, users generated articles about political movements or weapons but dedicated the majority of the content to a different subject, such as non-neutral coverage of [[JD Vance]] or [[Volodymyr Zelenskyy|Volodymyr Zelensky]]. Ilyas Lebleu said that there are many reasons why users add AI-generated content to Wikipedia, including deliberate vandalism intended to create a hoax, self-promotion, and the false belief that AI-generated content is correct.
=== Simple Article Summaries ===