Wikipedia:WikiProject Conservatism



    Welcome to WikiProject Conservatism! Whether you're a newcomer or a regular, you'll receive encouragement and recognition for your achievements with conservatism-related articles. This project does not extol any point of view, political or otherwise, other than that of a neutral documentarian. Partly for this reason, the project's scope has long been conservatism broadly construed, taking in a healthy periphery of articles (e.g., more academic ones) for contextualization.

    Major alerts

    A broad collection of discussions that could lead to significant changes to related articles

    Articles for deletion

    Redirects for discussion

    Good article nominees

    Good article reassessments

    Requests for comments

    Peer reviews

    Requested moves

    Articles to be merged

    Articles to be split

    Articles for creation

    Watchlists

    WatchAll (Excerpt)
    Excerpt from watchlist concerning all the articles in the project's scope
    Note that your own edits, minor edits, and bot edits are hidden in this tab

    List of abbreviations (help):
    D
    Edit made at Wikidata
    r
    Edit flagged by ORES
    N
    New page
    m
    Minor edit
    b
    Bot edit
    (±123)
    Page byte size change

    1 September 2025

    31 August 2025

    For a version of this watchlist about 3X in length, visit: Wikipedia:WikiProject Conservatism/All recent changes
    WatchHot (Excerpt)
    A list of 10 related articles with the most (recent) edits total
    173 edits Donald Trump
    87 edits 2025 British anti-immigration protests
    79 edits Anti-German sentiment
    71 edits William Gayley Simpson
    62 edits James Dobson
    61 edits List of Donald Trump 2024 presidential campaign non-political endorsements
    48 edits Greg J. Dixon
    47 edits Alligator Alcatraz
    41 edits Imran Khan
    38 edits US federal agencies targeted by DOGE

    These are the articles that have been edited the most within the last seven days. Last updated 27 August 2025 by HotArticlesBot.




    For a version of this watchlist about 5X in length, visit: Wikipedia:WikiProject Conservatism/Hot articles recent changes
    WatchPop (Excerpt)
    A list of 500 related articles with the most (recent) views total

    This is a list of pages in the scope of Wikipedia:WikiProject Conservatism along with pageviews.

    To report bugs, please write on the Community tech bot talk page on Meta.

    List

    Period: 2025-07-01 to 2025-07-31

    Total views: 66,358,927

    Updated: 17:56, 6 August 2025 (UTC)

    Rank Page title Views Daily average Assessment Importance
    1 Donald Trump 1,670,201 53,877 B High
    2 Elon Musk 955,237 30,814 B Low
    3 Pam Bondi 754,444 24,336 C Low
    4 Karoline Leavitt 576,085 18,583 B Low
    5 Benjamin Netanyahu 569,371 18,366 B Mid
    6 Kristi Noem 491,112 15,842 B Low
    7 Fourteen Words 443,555 14,308 Start Low
    8 JD Vance 442,215 14,265 B Mid
    9 America Party 420,974 13,579 C Low
    10 Candace Owens 393,570 12,695 B Low
    11 Theodore Roosevelt 382,080 12,325 B High
    12 Laura Loomer 381,299 12,299 C Low
    13 Charlie Kirk 377,243 12,169 B Low
    14 Dan Bongino 358,584 11,567 C Mid
    15 George W. Bush 341,692 11,022 B High
    16 Vladimir Putin 341,392 11,012 B High
    17 Winston Churchill 337,258 10,879 GA Top
    18 Rupert Murdoch 335,753 10,830 B Low
    19 Ronald Reagan 333,723 10,765 FA Top
    20 George Santos 328,366 10,592 B Low
    21 Zionism 319,045 10,291 B Low
    22 Narendra Modi 298,174 9,618 GA Top
    23 William McMahon 290,966 9,386 Start Unknown
    24 Stephen Miller (advisor) 289,919 9,352 B Low
    25 Richard Nixon 271,193 8,748 FA High
    26 George H. W. Bush 267,975 8,644 B High
    27 Mel Gibson 265,011 8,548 B Mid
    28 Nick Fuentes 259,765 8,379 B Low
    29 Republican Party (United States) 255,640 8,246 B Top
    30 Barron Trump 254,935 8,223 B Low
    31 Rishi Sunak 253,752 8,185 B High
    32 Curtis Sliwa 250,810 8,090 C Low
    33 Dwight D. Eisenhower 242,692 7,828 B High
    34 Pete Hegseth 241,922 7,803 B Low
    35 Greg Abbott 232,453 7,498 B Mid
    36 Sanseitō 222,797 7,187 Start Low
    37 Trump family 212,284 6,847 B Low
    38 Mike Johnson 210,974 6,805 C Mid
    39 Project 2025 201,226 6,491 B Mid
    40 Francisco Franco 201,049 6,485 C Mid
    41 Lara Trump 199,651 6,440 C Low
    42 Oswald Mosley 197,888 6,383 B Low
    43 John Wayne 193,904 6,254 B Low
    44 Margaret Thatcher 193,603 6,245 A Top
    45 Marco Rubio 192,480 6,209 B Mid
    46 Gerald Ford 184,852 5,962 B High
    47 French Revolution 183,821 5,929 B Top
    48 QAnon 181,157 5,843 GA Mid
    49 Thomas Massie 180,160 5,811 B Low
    50 Jon Voight 178,118 5,745 C Low
    51 Liz Truss 176,690 5,699 FA Mid
    52 American Party of the United States 175,384 5,657 Start Mid
    53 Jordan Peterson 169,311 5,461 B Low
    54 Robert Duvall 165,729 5,346 B Low
    55 Heil Hitler (song) 165,513 5,339 C Low
    56 Tucker Carlson 165,107 5,326 B High
    57 Bharatiya Janata Party 164,098 5,293 GA Top
    58 Norman Tebbit 160,013 5,161 B Mid
    59 Linda McMahon 158,319 5,107 B Low
    60 Ted Cruz 157,818 5,090 B Mid
    61 Jair Bolsonaro 156,951 5,062 B Mid
    62 John Malkovich 156,811 5,058 C Low
    63 Reform UK 156,055 5,034 C High
    64 Cold War 151,157 4,876 B Top
    65 Chuck Norris 150,954 4,869 B Low
    66 Kelsey Grammer 148,441 4,788 B Low
    67 Marjorie Taylor Greene 147,543 4,759 GA Low
    68 James Stewart 146,357 4,721 GA Low
    69 Fyodor Dostoevsky 142,412 4,593 B Low
    70 Tom Homan 140,805 4,542 C Low
    71 Lisa Murkowski 139,252 4,492 C High
    72 William McKinley 138,965 4,482 FA Low
    73 Shirley Temple 138,845 4,478 B Low
    74 Boris Johnson 138,303 4,461 B High
    75 Curtis Yarvin 136,918 4,416 C High
    76 Hun Sen 135,268 4,363 C Low
    77 Gary Sinise 134,280 4,331 C Low
    78 Megyn Kelly 130,684 4,215 B Low
    79 Clark Gable 128,046 4,130 B Low
    80 Trump–Musk feud 127,689 4,119 B Low
    81 Kemi Badenoch 126,082 4,067 B Low
    82 Nancy Mace 123,834 3,994 B Low
    83 Steve Bannon 123,693 3,990 B Mid
    84 Anna Paulina Luna 123,092 3,970 B Low
    85 Nigel Farage 122,876 3,963 B Mid
    86 Mahathir Mohamad 119,865 3,866 GA High
    87 Greg Gutfeld 119,763 3,863 C Low
    88 James Caan 118,839 3,833 C Low
    89 Imran Khan 117,183 3,780 B Low
    90 Javier Milei 116,339 3,752 B Mid
    91 Taliban 115,311 3,719 B High
    92 Karl Malone 114,677 3,699 Start Low
    93 Dick Cheney 113,611 3,664 C Mid
    94 Herbert Hoover 112,751 3,637 B Mid
    95 Shinzo Abe 112,167 3,618 B Mid
    96 Chiang Kai-shek 111,336 3,591 C Low
    97 Mark Rutte 111,216 3,587 C High
    98 Second presidency of Donald Trump 111,169 3,586 C Low
    99 John McCain 110,088 3,551 FA Mid
    100 Grover Cleveland 109,801 3,541 FA Mid
    101 Susie Wiles 109,121 3,520 B Low
    102 Matt Gaetz 108,890 3,512 B Low
    103 Ron DeSantis 108,229 3,491 B Mid
    104 Red states and blue states 106,425 3,433 C Mid
    105 Charles de Gaulle 106,410 3,432 B Mid
    106 Lauren Boebert 105,843 3,414 B Low
    107 Liberal Democratic Party (Japan) 105,773 3,412 C High
    108 Nayib Bukele 103,903 3,351 GA Low
    109 New World Order conspiracy theory 103,344 3,333 GA High
    110 Ann Coulter 103,194 3,328 C Mid
    111 Warren G. Harding 103,028 3,323 FA Low
    112 Steele dossier 102,783 3,315 B Low
    113 Ayn Rand 102,453 3,304 GA Mid
    114 Stephen Baldwin 102,268 3,298 B Low
    115 Patricia Heaton 101,967 3,289 C Low
    116 Melinda Tankard Reist 101,500 3,274 Start Unknown
    117 Riley Gaines 101,174 3,263 B Mid
    118 Constitution of the United States 100,724 3,249 B High
    119 Calvin Coolidge 100,702 3,248 FA High
    120 Rudy Giuliani 100,276 3,234 C Mid
    121 Dave Mustaine 100,103 3,229 C Low
    122 James Woods 99,648 3,214 Start Low
    123 Shigeru Ishiba 99,321 3,203 B Low
    124 Ben Shapiro 98,532 3,178 C Mid
    125 Gamergate 98,399 3,174 C Mid
    126 Nick Adams (commentator) 98,335 3,172 Start Low
    127 Bo Derek 98,147 3,166 Start Low
    128 Rashtriya Swayamsevak Sangh 97,684 3,151 B Top
    129 Mitt Romney 97,605 3,148 FA High
    130 Thom Tillis 96,769 3,121 B Low
    131 Iran–Contra affair 96,763 3,121 GA Low
    132 William Howard Taft 96,329 3,107 FA Mid
    133 Conservative Party (UK) 95,843 3,091 B High
    134 Manosphere 95,283 3,073 B Low
    135 Friedrich Merz 93,497 3,016 C Mid
    136 James A. Garfield 92,513 2,984 FA Low
    137 The Heritage Foundation 92,042 2,969 B High
    138 Chuck Grassley 90,250 2,911 C Mid
    139 Carl Schmitt 89,999 2,903 C Top
    140 Lindsey Graham 89,745 2,895 C Low
    141 The Wall Street Journal 89,119 2,874 B Mid
    142 William Barr 88,924 2,868 B Unknown
    143 Edwin Feulner 88,614 2,858 C Mid
    144 Groypers 88,412 2,852 B Low
    145 Brooke Rollins 86,914 2,803 Start Low
    146 Recep Tayyip Erdoğan 86,892 2,802 B High
    147 Barbara Stanwyck 86,265 2,782 B Low
    148 Deng Xiaoping 85,839 2,769 B Low
    149 Generation 85,537 2,759 B Mid
    150 Anders Behring Breivik 85,118 2,745 C Low
    151 Dmitry Medvedev 83,138 2,681 C High
    152 Otto von Bismarck 82,776 2,670 B High
    153 Laura Bush 82,530 2,662 B Low
    154 Laura Ingraham 82,130 2,649 C Mid
    155 Arthur Wellesley, 1st Duke of Wellington 81,380 2,625 B Low
    156 Dinesh D'Souza 80,767 2,605 B Mid
    157 Saagar Enjeti 80,176 2,586 Stub Unknown
    158 GypsyCrusader 80,124 2,584 C High
    159 Mike Pence 80,068 2,582 B Mid
    160 Department of Government Efficiency 79,456 2,563 B High
    161 David Cameron 78,388 2,528 B Top
    162 George Wallace 77,887 2,512 B Mid
    163 Edward Teller 77,398 2,496 C Low
    164 Whig Party (United States) 77,099 2,487 C Low
    165 James Cagney 76,990 2,483 B Low
    166 Ustaše 76,899 2,480 C High
    167 Sarah Palin 76,668 2,473 C Mid
    168 Phil Robertson 76,264 2,460 C Low
    169 Fox News 76,121 2,455 C Mid
    170 Mitch McConnell 75,988 2,451 B Mid
    171 Gary Cooper 75,926 2,449 FA Mid
    172 Charlton Heston 75,126 2,423 B Low
    173 Charles Lindbergh 75,012 2,419 B Low
    174 Falun Gong 74,865 2,415 B Mid
    175 Tony Hinchcliffe 74,633 2,407 B Low
    176 Alternative for Germany 74,619 2,407 C Low
    177 John Kennedy (Louisiana politician) 74,549 2,404 C Low
    178 Atal Bihari Vajpayee 74,484 2,402 GA High
    179 Thomas Sowell 74,468 2,402 C Mid
    180 1964 United States presidential election 73,684 2,376 C Mid
    181 Daily Mail 73,658 2,376 B Mid
    182 Kellyanne Conway 73,502 2,371 B Low
    183 John Major 73,485 2,370 B High
    184 Russell Vought 73,051 2,356 C Mid
    185 Pat Boone 72,878 2,350 C Low
    186 David Duke 72,546 2,340 B Mid
    187 Sean Hannity 72,464 2,337 B Mid
    188 John Ratcliffe 72,330 2,333 C Low
    189 Angie Harmon 72,309 2,332 C Low
    190 Gadsden flag 71,425 2,304 B Low
    191 Condoleezza Rice 70,496 2,274 B Mid
    192 Project Esther 70,473 2,273 B Low
    193 Trumpism 69,707 2,248 B Mid
    194 Truth Social 69,647 2,246 B Low
    195 Jesse Watters 69,633 2,246 Start Low
    196 Craig T. Nelson 69,455 2,240 Start Unknown
    197 Clarence Thomas 69,451 2,240 B Mid
    198 Chester A. Arthur 69,066 2,227 FA Low
    199 Woke 69,041 2,227 B Top
    200 Dark Enlightenment 68,841 2,220 Start Mid
    201 Marc Andreessen 68,677 2,215 C Mid
    202 Rand Paul 68,226 2,200 GA Mid
    203 Paul von Hindenburg 67,604 2,180 C Mid
    204 2025 German federal election 67,374 2,173 B High
    205 Itamar Ben-Gvir 67,257 2,169 C Mid
    206 William F. Buckley Jr. 67,180 2,167 B Top
    207 Muhammad Ali Jinnah 66,511 2,145 FA High
    208 Angela Merkel 66,383 2,141 GA High
    209 Far-right politics 65,933 2,126 B Low
    210 Asmongold 65,800 2,122 C Low
    211 Kalergi Plan 65,715 2,119 Start Mid
    212 Denis Leary 65,452 2,111 C NA
    213 False or misleading statements by Donald Trump 65,371 2,108 B Low
    214 Trump derangement syndrome 65,333 2,107 C Mid
    215 Viktor Orbán 65,213 2,103 C Mid
    216 John Roberts 65,118 2,100 B High
    217 Sohei Kamiya 65,017 2,097 C Unknown
    218 Benjamin Harrison 64,822 2,091 FA Low
    219 Lauren Southern 64,430 2,078 Start Mid
    220 Ginger Rogers 63,992 2,064 C Unknown
    221 Theresa May 63,988 2,064 B Mid
    222 Cicero 63,706 2,055 B Mid
    223 Hillbilly Elegy 63,273 2,041 B Low
    224 John Thune 63,259 2,040 C Low
    225 Stacey Dash 63,245 2,040 C Low
    226 Likud 63,104 2,035 C Low
    227 Anthony Eden 62,945 2,030 B Mid
    228 Neville Chamberlain 62,734 2,023 FA Mid
    229 Ron Paul 62,724 2,023 C Mid
    230 Komeito 62,549 2,017 C Low
    231 Bing Crosby 62,327 2,010 B Low
    232 T. S. Eliot 62,094 2,003 B Low
    233 Make America Great Again 61,886 1,996 B High
    234 Great Replacement conspiracy theory 61,760 1,992 C Top
    235 Bill O'Reilly (political commentator) 61,009 1,968 B Mid
    236 Virginia Foxx 60,965 1,966 C Unknown
    237 Elise Stefanik 60,929 1,965 B Low
    238 Japan Innovation Party 60,633 1,955 Start Mid
    239 Tim Scott 60,466 1,950 C Low
    240 Nancy Reagan 60,303 1,945 B Mid
    241 Spiro Agnew 59,813 1,929 FA Mid
    242 Éamon de Valera 59,634 1,923 B High
    243 Jeanine Pirro 59,421 1,916 B Low
    244 AI slop 59,314 1,913 B Low
    245 Rachel Campos-Duffy 59,071 1,905 Start Low
    246 Strom Thurmond 58,729 1,894 B Mid
    247 Ben Carson 58,451 1,885 C Low
    248 Mark Levin 57,856 1,866 B High
    249 Ted Nugent 56,983 1,838 C Low
    250 Kevin McCarthy 56,910 1,835 B Low
    251 Brett Cooper (commentator) 56,777 1,831 Start Low
    252 Donald Rumsfeld 56,556 1,824 B Mid
    253 Bob Hope 56,518 1,823 B Low
    254 Nikki Haley 56,474 1,821 B Low
    255 Dave Ramsey 56,470 1,821 C Unknown
    256 Rick Scott 55,820 1,800 C Low
    257 Turning Point USA 55,589 1,793 C Low
    258 Michael Farmer, Baron Farmer 55,329 1,784 C Low
    259 Tom Clancy 55,055 1,775 C Low
    260 David Mamet 54,827 1,768 C Low
    261 Rutherford B. Hayes 54,761 1,766 FA Low
    262 John Locke 54,724 1,765 B Top
    263 Mike Waltz 54,664 1,763 C Low
    264 McCarthyism 54,298 1,751 C High
    265 Roger Stone 54,093 1,744 C Low
    266 Neoliberalism 53,453 1,724 B Top
    267 L. K. Advani 53,402 1,722 B High
    268 Right-wing politics 53,057 1,711 C Top
    269 House of Bourbon 52,683 1,699 B High
    270 Paul Ryan 52,527 1,694 C Mid
    271 Milo Yiannopoulos 52,091 1,680 C Low
    272 Naftali Bennett 52,072 1,679 B Mid
    273 Jacobitism 51,649 1,666 B High
    274 Reform Party of the United States of America 51,498 1,661 C Low
    275 Kevin Hassett 51,323 1,655 Start Mid
    276 Douglas Murray (author) 51,172 1,650 C Low
    277 Nicolas Sarkozy 50,920 1,642 B High
    278 Newt Gingrich 50,471 1,628 B High
    279 António de Oliveira Salazar 50,449 1,627 B Mid
    280 Conservatism 50,258 1,621 B Top
    281 Mike Huckabee 49,795 1,606 B Mid
    282 Monica Crowley 49,717 1,603 C Low
    283 Kayleigh McEnany 49,474 1,595 C Low
    284 Lee Zeldin 49,281 1,589 B Low
    285 Jane Russell 48,938 1,578 B Low
    286 Dana Perino 48,750 1,572 C Low
    287 Lisa McClain 48,697 1,570 C Low
    288 Mullah Omar 48,630 1,568 B High
    289 Scott Baio 48,537 1,565 Start Low
    290 Sarah Huckabee Sanders 48,522 1,565 C Low
    291 United Russia 48,397 1,561 B High
    292 Charles Hurt 48,281 1,557 Stub Unknown
    293 Proud Boys 48,212 1,555 C Low
    294 Menachem Begin 47,643 1,536 B Mid
    295 Barry Goldwater 47,578 1,534 B High
    296 Patrick Bet-David 47,498 1,532 C Low
    297 Libertarianism 47,295 1,525 B High
    298 Pat Sajak 47,270 1,524 C Low
    299 Jeb Bush 47,217 1,523 B Low
    300 Deportation in the second presidency of Donald Trump 47,157 1,521 B Low
    301 Amy Coney Barrett 47,127 1,520 B Low
    302 The Times of India 47,074 1,518 C Mid
    303 List of federal judges appointed by Donald Trump 47,033 1,517 List Low
    304 Harold Macmillan 47,018 1,516 B High
    305 Shiv Sena 46,871 1,511 C Unknown
    306 Ashley Moody 46,826 1,510 C Unknown
    307 John Cornyn 46,727 1,507 B Low
    308 Dan Quayle 46,457 1,498 B Mid
    309 Melissa Joan Hart 46,351 1,495 B Low
    310 Milton Friedman 46,180 1,489 GA High
    311 Rush Limbaugh 46,076 1,486 B High
    312 Jemima Goldsmith 46,036 1,485 C Unknown
    313 Conservative Party of Japan 45,931 1,481 C Low
    314 Capitalism 45,891 1,480 C Top
    315 Katie Britt 45,693 1,473 C Low
    316 W. B. Yeats 45,615 1,471 FA Low
    317 Benjamin Disraeli 45,549 1,469 FA Top
    318 Doug Ford 45,103 1,454 B Low
    319 Michael Whatley 45,067 1,453 Start Unknown
    320 John Rocker 45,021 1,452 C Unknown
    321 New York Post 44,940 1,449 C Low
    322 Bezalel Smotrich 44,822 1,445 C Mid
    323 Tammy Bruce 44,761 1,443 Start Low
    324 Bill Gothard 44,648 1,440 B Low
    325 Anthony Scaramucci 44,626 1,439 C Low
    326 Ray Bradbury 44,530 1,436 B Low
    327 Booker T. Washington 43,984 1,418 B Low
    328 Enoch Powell 43,935 1,417 C High
    329 Bret Stephens 43,867 1,415 C Low
    330 Bob Dole 43,841 1,414 B Low
    331 Oliver North 43,811 1,413 C Mid
    332 White supremacy 43,757 1,411 B Low
    333 María Elvira Salazar 43,706 1,409 C Low
    334 Victor Davis Hanson 43,533 1,404 B Mid
    335 Dimes Square 43,051 1,388 Stub Low
    336 Edward Heath 42,982 1,386 B High
    337 White genocide conspiracy theory 42,916 1,384 B Low
    338 Terri Schiavo case 42,848 1,382 GA Low
    339 Alessandra Mussolini 42,784 1,380 B Low
    340 Loretta Young 42,271 1,363 C Low
    341 Brett Kavanaugh 42,222 1,362 B High
    342 Park Chung Hee 42,182 1,360 C Low
    343 Glenn Beck 42,167 1,360 B Mid
    344 Benny Johnson (columnist) 41,880 1,350 Start Low
    345 Martin Heidegger 41,759 1,347 C Low
    346 Jim Jordan 41,711 1,345 B Low
    347 Tea Party movement 41,334 1,333 C Mid
    348 Rick Perry 41,137 1,327 B Mid
    349 Don King 41,118 1,326 B Low
    350 Last Man Standing (American TV series) 41,049 1,324 B Low
    351 Winsome Earle-Sears 40,775 1,315 C Low
    352 Federalist Party 40,684 1,312 B Low
    353 Pat Buchanan 40,154 1,295 B Mid
    354 Gretchen Carlson 39,634 1,278 B Low
    355 Matt Walsh (political commentator) 39,405 1,271 C Low
    356 In God We Trust 39,325 1,268 GA Mid
    357 Joni Ernst 39,274 1,266 B Low
    358 Gavin McInnes 39,115 1,261 C Low
    359 Brothers of Italy 38,685 1,247 B Mid
    360 Fianna Fáil 38,642 1,246 B Low
    361 Ayaan Hirsi Ali 38,508 1,242 B Low
    362 Tomi Lahren 38,408 1,238 Start Low
    363 Aleksandr Solzhenitsyn 38,296 1,235 B Mid
    364 Right-wing populism 38,236 1,233 B High
    365 Fred Thompson 38,176 1,231 B Low
    366 Mary Matalin 38,129 1,229 C Low
    367 Critical race theory 38,047 1,227 C Low
    368 Geert Wilders 37,993 1,225 B Low
    369 Blaire White 37,909 1,222 Start Low
    370 The Fountainhead 37,705 1,216 FA Low
    371 Will Cain 37,339 1,204 C Mid
    372 Eva Vlaardingerbroek 37,170 1,199 Start Unknown
    373 Kelly Loeffler 36,978 1,192 B Low
    374 Roger Ailes 36,962 1,192 C Mid
    375 John C. Calhoun 36,905 1,190 FA Top
    376 Trey Gowdy 36,694 1,183 C Mid
    377 First presidency of Donald Trump 36,683 1,183 B Low
    378 Steve Scalise 36,583 1,180 C Mid
    379 Vinayak Damodar Savarkar 36,560 1,179 B High
    380 Tommy Tuberville 36,151 1,166 B Low
    381 List of national presidents of the Bharatiya Janata Party 36,051 1,162 FL Low
    382 Byron Donalds 35,945 1,159 C Low
    383 National Rally 35,914 1,158 C High
    384 Scott Presler 35,737 1,152 B Low
    385 Thomas Mann 35,692 1,151 C Mid
    386 Lee Hsien Loong 35,665 1,150 C Mid
    387 D. H. Lawrence 35,630 1,149 B Unknown
    388 Yeonmi Park 35,540 1,146 B Low
    389 Tudor Dixon 35,459 1,143 B Low
    390 Lord Randolph Churchill 35,446 1,143 Start Unknown
    391 Marsha Blackburn 35,411 1,142 C Low
    392 The Daily Telegraph 35,263 1,137 C Low
    393 Liz Cheney 35,221 1,136 B High
    394 Franklin Graham 35,174 1,134 B Low
    395 Tom Cotton 35,157 1,134 C Low
    396 Walter Brennan 35,125 1,133 C Low
    397 Samuel Alito 35,016 1,129 C Mid
    398 Christian Democratic Union of Germany 34,975 1,128 C High
    399 Freedom Caucus 34,919 1,126 C Low
    400 Oliver Anthony 34,895 1,125 Start Low
    401 John Layfield 34,819 1,123 B Low
    402 Steve Hilton 34,755 1,121 C Mid
    403 1924 United States presidential election 34,558 1,114 C Low
    404 Left–right political spectrum 34,503 1,113 C Top
    405 Alice Weidel 34,397 1,109 C Low
    406 Haganah 34,353 1,108 C Mid
    407 Harmeet Dhillon 34,310 1,106 Start Low
    408 Cambodian People's Party 34,247 1,104 Start Low
    409 Chip Roy 34,204 1,103 B Low
    410 Breitbart News 34,127 1,100 C Mid
    411 Mike Lindell 34,063 1,098 C Low
    412 The Epoch Times 33,929 1,094 B Low
    413 Mike Lee 33,845 1,091 C Low
    414 Neoconservatism 33,755 1,088 C Top
    415 Madison Cawthorn 33,644 1,085 C Low
    416 Lillian Gish 33,604 1,084 C Low
    417 Thrash metal 33,602 1,083 B Low
    418 Antonin Scalia 33,583 1,083 FA High
    419 Ward Bond 33,311 1,074 C Low
    420 Corey Lewandowski 33,031 1,065 C Low
    421 John Birch Society 33,002 1,064 C Low
    422 Fred MacMurray 32,870 1,060 C Low
    423 Deep state conspiracy theory in the United States 32,763 1,056 Start Low
    424 Jeff Sessions 32,612 1,052 Start Unknown
    425 UK Independence Party 32,570 1,050 B Low
    426 Bourbon Restoration in France 32,542 1,049 C High
    427 Shiv Sena (UBT) 32,380 1,044 C Mid
    428 Political appointments of the second Trump administration 32,366 1,044 List Low
    429 Otzma Yehudit 32,336 1,043 B Mid
    430 Kari Lake 32,137 1,036 C Low
    431 Jonathan Guinness, 3rd Baron Moyne 31,863 1,027 Start Unknown
    432 Donald Trump and fascism 31,854 1,027 B Mid
    433 Sheldon Adelson 31,705 1,022 C Low
    434 Edmund Burke 31,692 1,022 B Top
    435 Conservatism in the United States 31,497 1,016 B Top
    436 American Independent Party 31,317 1,010 C Low
    437 Danielle Smith 31,295 1,009 B Unknown
    438 Liberty University 31,202 1,006 B Mid
    439 Flannery O'Connor 31,124 1,004 A Low
    440 Mike Gabbard 31,098 1,003 C Low
    441 Islamophobia 31,090 1,002 C Mid
    442 Primogeniture 31,081 1,002 Start Low
    443 Meir Kahane 31,072 1,002 B High
    444 Rumble (company) 31,067 1,002 Start Low
    445 Jim Bob Duggar 31,030 1,000 C Low
    446 Christopher Luxon 30,869 995 B Unknown
    447 Jerry Falwell 30,774 992 B High
    448 Allie Beth Stuckey 30,649 988 C Low
    449 Alt-right 30,536 985 C Mid
    450 Chris Christie 30,362 979 C Low
    451 Dan Patrick (politician) 30,213 974 C Mid
    452 Views of Elon Musk 30,173 973 B Mid
    453 Stephen Harper 30,154 972 GA High
    454 Zia-ul-Haq 30,116 971 B High
    455 Blue Dog Coalition 29,946 966 C Low
    456 Redneck 29,927 965 C Low
    457 Adam Kinzinger 29,890 964 C Low
    458 Anita Bryant 29,788 960 B High
    459 Fiscal conservatism 29,714 958 B Top
    460 Friedrich Hayek 29,709 958 B Top
    461 David Koch 29,672 957 C Mid
    462 Richard Grenell 29,644 956 C Low
    463 Lawrence B. Jones 29,620 955 Start Unknown
    464 Franz von Papen 29,536 952 B Low
    465 Marine Le Pen 29,495 951 B Low
    466 Christian Zionism 29,482 951 C Mid
    467 Moshe Dayan 29,316 945 B Mid
    468 Honoré de Balzac 29,269 944 FA High
    469 Laissez-faire 29,197 941 C Top
    470 Islam in the United Kingdom 29,193 941 B Low
    471 Suella Braverman 29,119 939 C Low
    472 Nick Land 28,869 931 C Low
    473 Shas 28,799 929 C Low
    474 Richard B. Spencer 28,796 928 C Low
    475 Nawaz Sharif 28,670 924 B Unknown
    476 Justice and Development Party (Turkey) 28,646 924 B Low
    477 Tom Wolfe 28,559 921 B Low
    478 Jake Berry 28,545 920 Start Low
    479 Fine Gael 28,431 917 B High
    480 Christian nationalism 28,430 917 Start High
    481 Conservative Party of Canada 28,419 916 B High
    482 White Terror (Spain) 28,415 916 B Mid
    483 Jacob Rees-Mogg 28,299 912 C Low
    484 Johnny Ramone 28,219 910 C Low
    485 James Cleverly 28,141 907 C Low
    486 The Daily Wire 28,133 907 C Low
    487 Effects of pornography 28,096 906 C Low
    488 Vox (political party) 28,071 905 B Mid
    489 Najib Razak 28,011 903 B Mid
    490 Rose Wilder Lane 27,988 902 B Mid
    491 Dan Crenshaw 27,786 896 B Low
    492 Dennis Miller 27,637 891 Start Low
    493 Paul Dans 27,464 885 Start Low
    494 Dennis Prager 27,463 885 C Low
    495 Barack Obama citizenship conspiracy theories 27,376 883 B Low
    496 Reagan (2024 film) 27,337 881 C Low
    497 Rivers of Blood speech 27,280 880 C Low
    498 Robert Davi 27,270 879 Start Low
    499 Alec Douglas-Home 27,233 878 FA Low
    500 Stanley Baldwin 27,213 877 B High



    For a version of this watchlist about 3X in length, see: Wikipedia:WikiProject Conservatism/Recent changes
    Alternative watchlist prototypes (Excerpts)
    See also: Low-importance recent changes
    See also: Mid-importance recent changes
    See also: High-importance recent changes
    See also: Top-importance recent changes
    See also: Preconfigured recent vandalism shortlist

    Publications watchlist prototype beneath this line:



    For a watchlist potentially up to 30X in length, see Wikipedia:WikiProject Conservatism/Publications recent changes

    Watchlist of journalists, bloggers, commentators etc., beneath this line:


    For a watchlist potentially up to 30X in length, see Wikipedia:WikiProject Conservatism/Journalism recent changes

    Organizations watchlist beneath this line:



    For a watchlist potentially up to 30X in length, see Wikipedia:WikiProject Conservatism/Organizations recent changes

    Prototype political parties watchlist beneath this line:



    For a watchlist potentially up to 30X in length, see Wikipedia:WikiProject Conservatism/Political parties recent changes

    Prototype politicians watchlist beneath this line:



    For a watchlist potentially up to 30X in length, see Wikipedia:WikiProject Conservatism/Politicians recent changes

    Prototype MISC (drafts, templates etc.) watchlist beneath this line:



    For a watchlist potentially up to 30X in length, see Wikipedia:WikiProject Conservatism/MISC recent changes

    New articles

    A list of semi-related articles that were recently created

    This list was generated from these rules. Questions and feedback are always welcome! The search is being run daily with the most recent ~14 days of results. Note: Some articles may not be relevant to this project.

    Rules | Match log | Results page (for watching) | Last updated: 2025-08-31 19:55 (UTC)

    Note: The list display can now be customized by each user. See List display personalization for details.

    In The Signpost

    One of various articles to this effect
    July 2018
    DISCUSSION REPORT
    WikiProject Conservatism Comes Under Fire

    By Lionelt

    WikiProject Conservatism was a topic of discussion at the Administrators' Noticeboard/Incidents (AN/I). Objective3000 started a thread expressing concern about the number of RFC notices posted on the project's discussion page, suggesting that such notices "could result in swaying consensus by selective notification." Several editors participated in the relatively brief six-hour discussion. The assertion that the project is a "club for conservatives" was countered by editors listing examples of members who "profess no political persuasion." It was also noted that notifying WikiProjects of ongoing discussions is explicitly permitted by the WP:Canvassing guideline.

    At one point the discussion segued to feedback about The Right Stuff. Member SPECIFICO wrote: "One thing I enjoy about the Conservatism Project is the handy newsletter that members receive on our talk pages." Atsme praised the newsletter as "first-class entertainment...BIGLY...first-class...nothing even comes close...it's amazing." Some good-natured sarcasm was offered with Objective3000 observing, "Well, they got the color right" and MrX's followup, "Wow. Yellow is the new red."

    Admin Oshwah closed the thread with the result "definitely not an issue for ANI" and directed editors to the project's discussion page for any further discussion. Editor's note: the design and color of The Right Stuff were originally chosen to mimic an old paper newspaper.

    Add the Project Discussion page to your watchlist for the "latest RFCs" at WikiProject Conservatism Watch (Discuss this story)

    ARTICLES REPORT
    Margaret Thatcher Makes History Again

    By Lionelt

    Margaret Thatcher is the first article promoted at the new WikiProject Conservatism A-Class review. Congratulations to Neveselbert. A-Class is a quality rating ranked higher than GA (Good article), though its criteria are not as rigorous as those for FA (Featured article). WikiProject Conservatism is one of only two WikiProjects offering A-Class review, the other being WikiProject Military History. Nominate your article here. (Discuss this story)
    RECENT RESEARCH
    Research About AN/I

    By Lionelt

    Reprinted in part from the April 26, 2018 issue of The Signpost; written by Zarasophos

    Out of over one hundred editors questioned, only twenty-seven (27%) are happy with the way reports of conflicts between editors are handled at the Administrators' Incidents Noticeboard (AN/I), according to a recent survey. The survey found that dissatisfaction has various causes, including "defensive cliques" and biased administrators, as well as fear of a "boomerang effect" owing to the lack of a rule on the scope of AN/I reports. The survey also included an analysis of available quantitative data about AN/I. Some notable takeaways:

    • 53% avoided making a report due to fearing it would not be handled appropriately
    • "Otherwise 'popular' users often avoid heavy sanctions for issues that would get new editors banned."
    • "Discussions need to be clerked to keep them from raising more problems than they solve."

    In the wake of Zarasophos' article, editors discussed the AN/I survey at The Signpost and also at AN/I. Ironically, a portion of the AN/I thread was hatted due to "off-topic sniping." To follow up on the problems identified by the research project, the Wikimedia Foundation Anti-Harassment Tools team and Support and Safety team initiated a discussion. You can express your thoughts and ideas here.

    (Discuss this story)

    Delivered: ~~~~~



    Is Wikipedia Politically Biased? Perhaps


    A monthly overview of recent academic research about Wikipedia and other Wikimedia projects, also published as the Wikimedia Research Newsletter.


    Report by conservative think-tank presents ample quantitative evidence for "mild to moderate" "left-leaning bias" on Wikipedia

    A paper titled "Is Wikipedia Politically Biased?"[1] answers that question with a qualified yes:

    [...] this report measures the sentiment and emotion with which political terms are used in [English] Wikipedia articles, finding that Wikipedia entries are more likely to attach negative sentiment to terms associated with a right-leaning political orientation than to left-leaning terms. Moreover, terms that suggest a right-wing political stance are more frequently connected with emotions of anger and disgust than those that suggest a left-wing stance. Conversely, terms associated with left-leaning ideology are more frequently linked with the emotion of joy than are right-leaning terms.
    Our findings suggest that Wikipedia is not entirely living up to its neutral point of view policy, which aims to ensure that content is presented in an unbiased and balanced manner.

    The author (David Rozado, an associate professor at Otago Polytechnic) has published ample peer-reviewed research on related matters before, some of which was featured e.g. in The Guardian and The New York Times. In contrast, the present report is not peer-reviewed and was not posted in an academic venue, unlike most of the research we cover here. Rather, it was published (and possibly commissioned) by the Manhattan Institute, a conservative US think tank, which presumably found its results not too objectionable. (Also, some – broken – URLs in the PDF suggest that Manhattan Institute staff members were involved in the writing of the paper.) Still, the report indicates an effort to adhere to various standards of academic research publications, including some fairly detailed descriptions of the methods and data used. It is worth taking more seriously than, for example, another recent report that alleged a different form of political bias on Wikipedia, which had likewise been commissioned by an advocacy organization and authored by an academic researcher, but was met with severe criticism by the Wikimedia Foundation (who called it out for "unsubstantiated claims of bias") and volunteer editors (see prior Signpost coverage).

    That isn't to say that there can't be some questions about the validity of Rozado's results, and in particular about how to interpret them. But let's first go through the paper's methods and data sources in more detail.

    Determining the sentiment and emotion in Wikipedia's coverage

    The report's main results regarding Wikipedia are obtained as follows:

    "We first gather a set of target terms (N=1,628) with political connotations (e.g., names of recent U.S. presidents, U.S. congressmembers, U.S. Supreme Court justices, or prime ministers of Western countries) from external sources. We then identify all mentions in English-language Wikipedia articles of those terms.

    We then extract the paragraphs in which those terms occur to provide the context in which the target terms are used and feed a random sample of those text snippets to an LLM (OpenAI’s gpt-3.5-turbo), which annotates the sentiment/emotion with which the target term is used in the snippet. To our knowledge, this is the first analysis of political bias in Wikipedia content using modern LLMs for annotation of sentiment/emotion."

    The sentiment classification rates the mention of a term as negative, neutral or positive. (For the purpose of forming averages, this is converted into a quantitative scale from -1 to +1.) See the end of this review for some concrete examples from the paper's published dataset.

    The emotion classification uses "Ekman’s six basic emotions (anger, disgust, fear, joy, sadness, and surprise) plus neutral."

    The annotation method used appears to be an effort to avoid the shortcomings of popular existing sentiment analysis techniques, which often only rate the overall emotional stance of a given text without determining whether it actually applies to a specific entity mentioned in it (or in some cases even fail to handle negations, e.g. by classifying "I am not happy" as a positive emotion). Rozado justifies the "decision to use automated annotation" (which presumably rendered considerable cost savings, also by resorting to OpenAI's older GPT 3.5 model rather than the more powerful but more expensive GPT-4 API released in March 2023) by citing "recent evidence showing how top-of-the-rank LLMs outperform crowd workers for text-annotation tasks such as stance detection." This is indeed becoming a more widely used choice for text classification. But Rozado appears to have skipped the usual step of evaluating the accuracy of this automated method (and possibly improving the prompts it used) against a gold-standard sample from (human) expert raters.
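That missing validation step is straightforward to sketch: hand-label a random sample, then compare the LLM's labels against it using raw accuracy plus a chance-corrected statistic such as Cohen's kappa. A minimal, hypothetical illustration (the label sequences below are invented, not drawn from the paper's data):

```python
# Hypothetical sketch of a gold-standard check: compare automated
# (LLM) sentiment labels against human expert labels on the same snippets.
from collections import Counter

def accuracy(gold, pred):
    """Fraction of snippets where the automated label matches the gold label."""
    return sum(g == p for g, p in zip(gold, pred)) / len(gold)

def cohens_kappa(gold, pred):
    """Agreement between two label sequences, corrected for chance agreement."""
    n = len(gold)
    observed = accuracy(gold, pred)
    gc, pc = Counter(gold), Counter(pred)
    expected = sum(gc[k] * pc[k] for k in gc.keys() | pc.keys()) / (n * n)
    return (observed - expected) / (1 - expected)

# Invented example labels for eight snippets:
gold = ["neg", "neu", "pos", "neu", "neg", "pos", "neu", "neu"]
pred = ["neg", "neu", "pos", "pos", "neg", "pos", "neu", "neg"]
print(accuracy(gold, pred))            # 0.75
print(round(cohens_kappa(gold, pred), 3))  # 0.636
```

Reporting kappa alongside accuracy matters here because sentiment labels are imbalanced (most mentions are neutral), so raw accuracy alone can overstate agreement.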

    Selecting topics to examine for bias

    As for the selection of terms whose Wikipedia coverage to annotate with this classifier, Rozado does a lot of due diligence to avoid cherry-picking: "To reduce the degrees of freedom of our analysis, we mostly use external sources of terms [including Wikipedia itself, e.g. its list of members of the 11th US Congress] to conceptualize a political category into left- and right-leaning terms, as well as to choose the set of terms to include in each category." This addresses an important source of researcher bias.

    Overall, the study arrives at 12 different groups of such terms:

    • 8 of these refer to people (e.g. US presidents, US senators, UK members of parliament, US journalists).
    • Two are about organizations (US think tanks and media organizations).
    • The other two groups contain "Terms that describe political orientation", i.e. expressions that carry a left-leaning or right-leaning meaning themselves:
      • 18 "political leanings" (where "Rightists" receives the lowest average sentiment and "Left winger" the highest), and
      • 21 "extreme political ideologies" (where "Ultraconservative" scores lowest and "radical-left" has the highest – but still slightly negative – average sentiment)

    What is "left-leaning" and "right-leaning"?

    As discussed, Rozado's methods for generating these lists of people and organizations seem reasonably transparent and objective. It gets a bit murkier when it comes to splitting them into "left-leaning" and "right-leaning", where the chosen methods remain unclear and/or questionable in some cases. Of course there is a natural choice available for US Congress members, where the confines of the US two-party system mean that the left-right spectrum can be mapped easily to Democrats vs. Republicans (disregarding a small number of independents or libertarians).

    In other cases, Rozado was able to use external data about political leanings, e.g. "a list of politically aligned U.S.-based journalists" from Politico. There may be questions about construct validity here (e.g. it classifies Glenn Greenwald or Andrew Sullivan as "journalists with the left"), but at least this data is transparent and determined by a source not invested in the present paper's findings.

    But the list of UK MPs used, for example, contains politicians from 14 different parties (plus independents). Even if one were to confine the left vs. right labels to the two largest groups in the UK House of Commons (Tories vs. Labour and Co-operative Party, which appears to have been the author's choice judging from Figure 5), the presence of a substantial number of parliamentarians from other parties to the left or right of those would make the validity of this binary score more questionable than in the US case. Rozado appears to acknowledge a related potential issue in a side remark when trying to offer an explanation for one of the paper's negative results (no bias) in this case: "The disparity of sentiment associations in Wikipedia articles between U.S. Congressmembers and U.K. MPs based on their political affiliation may be due in part to the higher level of polarization in the U.S. compared to the U.K."

     
    Most negative sentiment among Western leaders: Former Australian PM Tony Abbott
     
    Most positive sentiment among Western leaders: Former Australian PM Scott Morrison

    This kind of question becomes even more complicated for the "Leaders of Western Countries" list (where Tony Abbott scored the most negative average sentiment, and José Luis Rodríguez Zapatero and Scott Morrison appear to be in a tie for the most positive average sentiment). Most of these countries do not have a two-party system either. Sure, their leaders usually (as in the UK case) hail from one of the two largest parties, one more to the left and the other more to the right. But it certainly seems to matter for the purpose of Rozado's research question whether that major party is more moderate (center-left or center-right, with other parties between it and the far left or far right) or more radical (i.e. extending all the way to the far-left or far-right spectrum of elected politicians).

    What's more, the analysis for this last group compares political orientations across multiple countries. Which brings us to a problem that Wikipedia's Jimmy Wales had already pointed to back in 2006, in response to a conservative US blogger who had argued that there was "a liberal bias in many hot-button topic entries" on English Wikipedia:

    "The Wikipedia community is very diverse, from liberal to conservative to libertarian and beyond. If averages mattered, and due to the nature of the wiki software (no voting) they almost certainly don't, I would say that the Wikipedia community is slightly more liberal than the U.S. population on average, because we are global and the international community of English speakers is slightly more liberal than the U.S. population. ... The idea that neutrality can only be achieved if we have some exact demographic matchup to [the] United States of America is preposterous."

    We already discussed this issue in our earlier reviews of a notable series of papers by Greenstein and Zhu (see e.g.: "Language analysis finds Wikipedia's political bias moving from left to right", 2012), which had relied on a US-centric method of defining left-leaning and right-leaning (namely, a corpus derived from the US Congressional Record). Those studies form a large part of what Rozado cites as "[a] substantial body of literature [that]—albeit with some exceptions—has highlighted a perceived bias in Wikipedia content in favor of left-leaning perspectives." (The cited exception is a paper[2] that had found "a small to medium size coverage bias against [members of parliament] from the center-left parties in Germany and in France", and identified patterns of "partisan contributions" as a plausible cause.)

    Similarly, 8 out of the 10 groups of people and organizations analyzed in Rozado's study are from the US (the two exceptions being the aforementioned lists of UK MPs and leaders of Western countries).

    In other words, one potential reason for the disparities found by Rozado might simply be that he is measuring an international encyclopedia with a (largely) national yardstick of fairness. This shouldn't let us dismiss his findings too easily. But it is a bit disappointing that this possibility is nowhere addressed in the paper, even though Rozado diligently discusses some other potential limitations of the results. E.g. he notes that "some research has suggested that conservatives themselves are more prone to negative emotions and more sensitive to threats than liberals", but points out that the general validity of those research results remains doubtful.

    Another limitation is that a simple binary left vs. right classification might be hiding factors that can shed further light on bias findings. Even in the US with its two-party system, political scientists and analysts have long moved to less simplistic measures of political orientations. A widely used one is the NOMINATE method which assigns members of the US Congress continuous scores based on their detailed voting record, one of which corresponds to the left-right spectrum as traditionally understood. One finding based on that measure that seems relevant in context of the present study is the (widely discussed but itself controversial) asymmetric polarization thesis, which argues that "Polarization among U.S. legislators is asymmetric, as it has primarily been driven by a substantial rightward shift among congressional Republicans since the 1970s, alongside a much smaller leftward shift among congressional Democrats" (as summarized in the linked Wikipedia article). If, for example, higher polarization was associated with negative sentiments, this could be a potential explanation for Rozado's results. Again, this has to remain speculative, but it seems another notable omission in the paper's discussion of limitations.

    What does "bias" mean here?

    A fundamental problem of this study – one that, to be fair, it shares with much fairness and bias research (in particular on Wikipedia's gender gap, where many studies similarly focus on binary comparisons that appeal to an intuitive sense of fairness) – consists of justifying its answers to the following two basic questions:

    1. What would be a perfectly fair baseline, a result that makes us confident to call Wikipedia unbiased?
    2. If there are deviations from that baseline (often labeled disparities, gaps or biases), what are the reasons for that – can we confidently assume they were caused by Wikipedia itself (e.g. demographic imbalances in Wikipedia's editorship), or are they more plausibly attributed to external factors?

    Regarding 1 (defining a baseline of unbiasedness), Rozado simply assumes that this should imply statistically indistinguishable levels of average sentiment between left- and right-leaning terms. However, as one leading scholar on quantitative measures of bias has cautioned, "the 'one true fairness definition' is a wild goose chase" – there are often multiple different definitions available that can all be justified on ethical grounds, and that often contradict one another. Above, we already alluded to two potentially diverging notions of political unbiasedness for Wikipedia (using an international instead of a US metric for left vs. right leaning, and taking into account polarization levels for politicians).
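For what it's worth, the "statistically indistinguishable" baseline can itself be sketched in a few lines: compare the mean sentiment scores of the two groups and ask whether the observed difference could plausibly be chance. A toy illustration with invented per-term scores, using Welch's t-statistic (which does not assume the two groups have equal variances):

```python
# Toy sketch (invented data, not the paper's) of testing whether two
# groups of per-term average sentiment scores are distinguishable.
import math

def welch_t(a, b):
    """Welch's t-statistic for the difference of two sample means."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)  # sample variance
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

# Hypothetical average sentiments for left- and right-leaning term groups:
left = [0.2, 0.4, 0.0, 0.2]
right = [-0.2, 0.0, -0.4, -0.2]
print(round(welch_t(left, right), 4))  # 3.4641
```

A large absolute t-value would count as evidence against the "unbiased" baseline under this definition; the point of the surrounding discussion is that the definition itself, not the arithmetic, is where the contested judgment calls live.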

    But yet another question, highly relevant for Wikipedians interested in addressing the potential problems reported in this paper, is how much its definition lines up with Wikipedia's own definition of neutrality. Rozado clearly thinks that it does:

    Wikipedia’s neutral point of view (NPOV) policy aims for articles in Wikipedia to be written in an impartial and unbiased tone. Our results suggest that Wikipedia’s NPOV policy is not achieving its stated goal of political-viewpoint neutrality in Wikipedia articles.

    WP:NPOV indeed calls for avoiding subjective language and expressing judgments and opinions in Wikipedia's own voice, and Rozado's findings about the presence of non-neutral sentiments and emotions in Wikipedia articles are of some concern in that regard. However, that is not the core definition of NPOV. Rather, it refers to "representing fairly, proportionately, and, as far as possible, without editorial bias, all the significant views that have been published by reliable sources on a topic." What if the coverage of the terms examined by Rozado (politicians, etc.) in those reliable sources, in their aggregate, were also biased in the sense of Rozado's definition? US progressives might be inclined to invoke the snarky dictum "reality has a liberal bias" by comedian Stephen Colbert. Of course, conservatives might object that Wikipedia's definition of reliable sources (having "a reputation for fact-checking and accuracy") is itself biased, or applied in a biased way by Wikipedians. For some of these conservatives (at least those that are not also conservative feminists) it may be instructive to compare examinations of Wikipedia's gender gaps, which frequently focus on specific groups of notable people like in Rozado's study. And like him, they often implicitly assume a baseline of unbiasedness that implies perfect symmetry in Wikipedia's coverage – i.e. the absence of gaps or disparities. Wikipedians often object that this is in tension with the aforementioned requirement to reflect coverage in reliable sources. For example, Wikipedia's list of Fields medalists (the "Nobel prize of Mathematics") is 97% male – not because of Wikipedia editors' biases against women, but because of a severe gender imbalance in the field of mathematics that is only changing slowly, i.e. factors outside Wikipedia's influence.

    All this brings us to question 2. above (causality). While Rozado uses carefully couched language in this regard ("suggests" etc, e.g. "These trends constitute suggestive evidence of political bias embedded in Wikipedia articles"), such qualifications are unsurprisingly absent in much of the media coverage of this study (see also this issue's In the media). For example, the conservative magazine The American Spectator titled its article about the paper "Now We've Got Proof that Wikipedia is Biased."

    Commendably, the paper is accompanied by a published dataset, consisting of the analyzed Wikipedia text snippets together with the mentioned term and the sentiment or emotion identified by the automated annotation. For illustration, below are the sentiment ratings for mentions of the Yankee Institute for Public Policy (the last term in the dataset, as a non-cherry-picked example), with the term bolded:

    Dataset excerpt: Wikipedia paragraphs with sentiment for "Yankee Institute for Public Policy"
    positive "Carol Platt Liebau is president of the Yankee Institute for Public Policy.Liebau named new president of Yankee Institute She is also an attorney, political analyst, and conservative commentator. Her book Prude: How the Sex-Obsessed Culture Damages Girls (and America, Too!) was published in 2007."
    neutral "Affiliates

    Regular members are described as ""full-service think tanks"" operating independently within their respective states.

    Alabama: Alabama Policy Institute
    Alaska: Alaska Policy Forum
    [...]
    Connecticut: Yankee Institute for Public Policy
    [...]
    Wisconsin: MacIver Institute for Public Policy, Badger Institute, Wisconsin Institute for Law and Liberty, Institute for Reforming Government
    Wyoming: Wyoming Liberty Group"
    positive "The Yankee Institute for Public Policy is a free market, limited government American think tank based in Hartford, Connecticut, that researches Connecticut public policy questions. Organized as a 501(c)(3), the group's stated mission is to ""develop and advocate for free market, limited government public policy solutions in Connecticut."" Yankee was founded in 1984 by Bernard Zimmern, a French entrepreneur who was living in Norwalk, Connecticut, and Professor Gerald Gunderson of Trinity College. The organization is a member of the State Policy Network."
    neutral "He is formerly Chairman of the Yankee Institute for Public Policy. On November 3, 2015, he was elected First Selectman in his hometown of Stonington, Connecticut, which he once represented in Congress. He defeated the incumbent, George Crouse. Simmons did not seek reelection in 2019."
    negative "In Connecticut the union is closely identified with liberal Democratic politicians such as Governor Dannel Malloy and has clashed frequently with fiscally conservative Republicans such as former Governor John G. Rowland as well as the Yankee Institute for Public Policy, a free-market think tank."
    positive "In 2021, after leaving elective office, she was named a Board Director of several organizations. One is the Center for Workforce Inclusion, a national nonprofit in Washington, DC, that works to provide meaningful employment opportunities for older individuals. Another is the William F. Buckley Program at Yale, which aims to promote intellectual diversity, expand political discourse on campus, and expose students to often-unvoiced views at Yale University. She also serves on the Board of the Helicon Foundation, which explores chamber music in its historical context by presenting and producing period performances, including an annual subscription series of four Symposiums in New York featuring both performance and discussion of chamber music. She is also a Board Director of the American Hospital of Paris Foundation, which provides funding support for the operations of the American Hospital of Paris and functions as the link between the Hospital and the United States, funding many collaborative and exchange programs with New York-Presbyterian Hospital. She is also a Fellow of the Yankee Institute for Public Policy, a research and citizen education organization that focuses on free markets and limited government, as well as issues of transparency and good governance."
    positive "He was later elected chairman of the New Hampshire Republican State Committee, a position he held from 2007 to 2008. When he was elected he was 34 years old, making him the youngest state party chairman in the history of the United States at the time. His term as chairman included the 2008 New Hampshire primary, the first primary in the 2008 United States presidential election. He later served as the executive director of the Yankee Institute for Public Policy for five years, beginning in 2009. He is the author of a book about the New Hampshire primary, entitled Granite Steps, and the founder of the immigration reform advocacy group Americans By Choice."

    Briefly


    Other recent publications

    Other recent publications that could not be covered in time for this issue include the items listed below. Contributions, whether reviewing or summarizing newly published research, are always welcome.

    How English Wikipedia mediates East Asian historical disputes with Habermasian communicative rationality

    From the abstract: [3]

    "We compare the portrayals of Balhae, an ancient kingdom with contested contexts between [South Korea and China]. By comparing Chinese, Korean, and English Wikipedia entries on Balhae, we identify differences in narrative construction and framing. Employing Habermas’s typology of human action, we scrutinize related talk pages on English Wikipedia to examine the strategic actions multinational contributors employ to shape historical representation. This exploration reveals the dual role of online platforms in both amplifying and mediating historical disputes. While Wikipedia’s policies promote rational discourse, our findings indicate that contributors often vacillate between strategic and communicative actions. Nonetheless, the resulting article approximates Habermasian ideals of communicative rationality."

    From the paper:

    "The English Wikipedia presents Balhae as a multi-ethnic kingdom, refraining from emphasizing the dominance of a single tribe. In comparison to the two aforementioned excerpts [from Chinese and Korean Wikipedia], the lead section of the English Wikipedia concentrates more on factual aspects of history, thus excluding descriptions that might entail divergent interpretations. In other words, this account of Balhae has thus far proven acceptable to a majority of Wikipedians from diverse backgrounds. [...] Compared to other language versions, the English Wikipedia forthrightly acknowledges the potential disputes regarding Balhae's origin, ethnic makeup, and territorial boundaries, paving the way for an open and transparent exploration of these contested historical subjects. The separate 'Balhae controversies' entry is dedicated to unpacking the contentious issues. In essence, the English article adopts a more encyclopedic tone, aligning closely with Wikipedia's mission of providing information without imposing a certain perspective."

    (See also excerpts)

    Facebook/Meta's "No Language Left Behind" translation model used on Wikipedia

    From the abstract of this publication by a large group of researchers (most of them affiliated with Meta AI):[4]

    "Focusing on improving the translation qualities of a relatively small group of high-resource languages comes at the expense of directing research attention to low-resource languages, exacerbating digital inequities in the long run. To break this pattern, here we introduce No Language Left Behind—a single massively multilingual model that leverages transfer learning across languages. [...] Compared with the previous state-of-the-art models, our model achieves an average of 44% improvement in translation quality as measured by BLEU. By demonstrating how to scale NMT [neural machine translation] to 200 languages and making all contributions in this effort freely available for non-commercial use, our work lays important groundwork for the development of a universal translation system."

    "Four months after the launch of NLLB-200 [in 2022], Wikimedia reported that our model was the third most used machine translation engine used by Wikipedia editors (accounting for 3.8% of all published translations) (https://web.archive.org/web/20221107181300/https://nbviewer.org/github/wikimedia-research/machine-translation-service-analysis-2022/blob/main/mt_service_comparison_Sept2022_update.ipynb). Compared with other machine translation services and across all languages, articles translated with NLLB-200 has the lowest percentage of deletion (0.13%) and highest percentage of translation modification kept under 10%."

    "Which Nigerian-Pidgin does Generative AI speak?" – only the BBC's, not Wikipedia's

    From the abstract:[5]

    "Naija is the Nigerian-Pidgin spoken by approx. 120M speakers in Nigeria [...]. Although it has mainly been a spoken language until recently, there are currently two written genres (BBC and Wikipedia) in Naija. Through statistical analyses and Machine Translation experiments, we prove that these two genres do not represent each other (i.e., there are linguistic differences in word order and vocabulary) and Generative AI operates only based on Naija written in the BBC genre. In other words, Naija written in Wikipedia genre is not represented in Generative AI."

    The paper's findings are consistent with an analysis by the Wikimedia Foundation's research department that compared the number of Wikipedia articles to the number of speakers for the top 20 most-spoken languages, where Naija stood out as one of the most underrepresented.

    "[A] surprising tension between Wikipedia's principle of safeguarding against self-promotion and the scholarly norm of 'due credit'"

    From the abstract:[6]

    Although Wikipedia offers guidelines for determining when a scientist qualifies for their own article, it currently lacks guidance regarding whether a scientist should be acknowledged in articles related to the innovation processes to which they have contributed. To explore how Wikipedia addresses this issue of scientific "micro-notability", we introduce a digital method called Name Edit Analysis, enabling us to quantitatively and qualitatively trace mentions of scientists within Wikipedia's articles. We study two CRISPR-related Wikipedia articles and find dynamic negotiations of micro-notability as well as a surprising tension between Wikipedia’s principle of safeguarding against self-promotion and the scholarly norm of “due credit.” To reconcile this tension, we propose that Wikipedians and scientists collaborate to establish specific micro-notability guidelines that acknowledge scientific contributions while preventing excessive self-promotion.

    See also coverage of a different paper that likewise analyzed Wikipedia's coverage of CRISPR: "Wikipedia as a tool for contemporary history of science: A case study on CRISPR"

    "How article category in Wikipedia determines the heterogeneity of its editors"

    From the abstract:[7]

    " [...] the quality of Wikipedia articles rises with the number of editors per article as well as a greater diversity among them. Here, we address a not yet documented potential threat to those preconditions: self-selection of Wikipedia editors to articles. Specifically, we expected articles with a clear-cut link to a specific country (e.g., about its highest mountain, "national" article category) to attract a larger proportion of editors of that nationality when compared to articles without any specific link to that country (e.g., "gravity", "universal" article category), whereas articles with a link to several countries (e.g., "United Nations", "international" article category) should fall in between. Across several language versions, hundreds of different articles, and hundreds of thousands of editors, we find the expected effect [...]"

    "What do they make us see:" The "cultural bias" of GLAMs is worse on Wikidata

    From the abstract:[8]

    "Large cultural heritage datasets from museum collections tend to be biased and demonstrate omissions that result from a series of decisions at various stages of the collection construction. The purpose of this study is to apply a set of ethical criteria to compare the level of bias of six online databases produced by two major art museums, identifying the most biased and the least biased databases. [...] For most variables the online system database is more balanced and ethical than the API dataset and Wikidata item collection of the two museums."

    References

    1. ^ Rozado, David (June 2024). "Is Wikipedia Politically Biased?". Manhattan Institute. Dataset: https://doi.org/10.5281/zenodo.10775984
    2. ^ Kerkhof, Anna; Münster, Johannes (2019-10-02). "Detecting coverage bias in user-generated content". Journal of Media Economics. 32 (3–4): 99–130. doi:10.1080/08997764.2021.1903168. ISSN 0899-7764.
    3. ^ Jee, Jonghyun; Kim, Byungjun; Jun, Bong Gwan (2024). "The role of English Wikipedia in mediating East Asian historical disputes: the case of Balhae". Asian Journal of Communication: 1–20. doi:10.1080/01292986.2024.2342822. ISSN 0129-2986.   (access for Wikipedia Library users)
    4. ^ Costa-jussà, Marta R.; Cross, James; Çelebi, Onur; Elbayad, Maha; Heafield, Kenneth; Heffernan, Kevin; Kalbassi, Elahe; Lam, Janice; Licht, Daniel; Maillard, Jean; Sun, Anna; Wang, Skyler; Wenzek, Guillaume; Youngblood, Al; Akula, Bapi; Barrault, Loic; Gonzalez, Gabriel Mejia; Hansanti, Prangthip; Hoffman, John; Jarrett, Semarley; Sadagopan, Kaushik Ram; Rowe, Dirk; Spruit, Shannon; Tran, Chau; Andrews, Pierre; Ayan, Necip Fazil; Bhosale, Shruti; Edunov, Sergey; Fan, Angela; Gao, Cynthia; Goswami, Vedanuj; Guzmán, Francisco; Koehn, Philipp; Mourachko, Alexandre; Ropers, Christophe; Saleem, Safiyyah; Schwenk, Holger; Wang, Jeff; NLLB Team (June 2024). "Scaling neural machine translation to 200 languages". Nature. 630 (8018): 841–846. Bibcode:2024Natur.630..841N. doi:10.1038/s41586-024-07335-x. ISSN 1476-4687. PMC 11208141. PMID 38839963.
    5. ^ Adelani, David Ifeoluwa; Doğruöz, A. Seza; Shode, Iyanuoluwa; Aremu, Anuoluwapo (2024-04-30). "Which Nigerian-Pidgin does Generative AI speak?: Issues about Representativeness and Bias for Multilingual and Low Resource Languages". arXiv:2404.19442 [cs.CL].
    6. ^ Simons, Arno; Kircheis, Wolfgang; Schmidt, Marion; Potthast, Martin; Stein, Benno (2024-02-28). "Who are the "Heroes of CRISPR"? Public science communication on Wikipedia and the challenge of micro-notability". Public Understanding of Science. doi:10.1177/09636625241229923. ISSN 0963-6625. PMID 38419208. blog post
    7. ^ Oeberst, Aileen; Ridderbecks, Till (2024-01-07). "How article category in Wikipedia determines the heterogeneity of its editors". Scientific Reports. 14 (1): 740. Bibcode:2024NatSR..14..740O. doi:10.1038/s41598-023-50448-y. ISSN 2045-2322. PMC 10772120. PMID 38185716.
    8. ^ Zhitomirsky-Geffet, Maayan; Kizhner, Inna; Minster, Sara (2022-01-01). "What do they make us see: a comparative study of cultural bias in online databases of two large museums". Journal of Documentation. 79 (2): 320–340. doi:10.1108/JD-02-2022-0047. ISSN 0022-0418.   / freely accessible version


    ToDo List

    Miscellaneous tasks

    Categories to look through

    (See also this much larger list of relevant articles without a lead image)

    Translation ToDo

    A list of related articles particularly good and notable enough to be worthy of a solid translation effort

    Requested articles (in general)

    1. ^ Backman, J. (2022). Radical conservatism and the Heideggerian right : Heidegger, de Benoist, Dugin. Frontiers in Political Science, 4, Article 941799. https://doi.org/10.3389/fpos.2022.941799

    Merging ToDo

    A list of related articles that may have resulted from a WP:POVFORK or may, at least, look like the functional equivalents of one
    Note that the exact target of a potential merge need not be provided here, and that multiple options (e.g. generous use of Template:Excerpt) might accomplish the same result