For years, Wikipedia has been one of the informational backbones of the internet. With over seven million articles in the English version, Wikipedia is—by far—the largest repository of human knowledge ever collected. Since its launch in 2001, Wikipedia has become a one-stop shop to learn the basics of any topic in the world, from ancient history to quantum physics to foreign films.
Over the course of its history, Wikipedia has seen many challengers to its dominance, often motivated by its perceived inadequacies. As documented in a 2023 Quillette article, sites like Citizendium, Everipedia, Conservapedia, Scholarpedia, and Justapedia, and niche projects like Ballotpedia have sprung up as alternatives to Wikipedia. I would not be surprised if you have not heard of some of these. Most are now maintained by just a handful of enthusiasts. Citizendium, which Wikipedia co-founder Larry Sanger said in 2007 “will (probably) succeed,” now has just eight active editors (defined as registered users who have made at least one edit to an article within the past thirty days). Scholarpedia, a peer-reviewed encyclopaedia written by scholars, has one active editor. Justapedia (also hailed by Sanger) has only forty registered users who have performed an action on the site within the last ninety days. These encyclopaedias are digital ghost towns. The English Wikipedia, on the other hand, has 291,274 active editors.
Other Wikipedia alternatives morphed into something completely different. Everipedia is now iq.wiki, an encyclopaedia dedicated to blockchain and cryptocurrency. Conservapedia, originally a site to counter Wikipedia’s perceived left-wing bias, is now a caricature of an encyclopaedia where right-wing conspiracies and fantasies prevail over facts. (It currently has just twenty active editors.)
The most successful niche encyclopaedia project is Fandom, launched by Wikipedia co-founder Jimmy Wales and Angela Beesley Starling in October 2004. This site is a family of wikis, most of which are devoted to cataloging the minutiae of games, TV shows, and other media. But even their passionate fan bases don’t result in a strong community of editors. The Harry Potter Wiki only has 158 active editors, Star Trek’s Memory Alpha wiki has 293, and the Star Wars Wookieepedia can boast 502 active editors. If passionate fans can only muster a few hundred people per month to update their encyclopaedias, catching up with Wikipedia’s hundreds of thousands of active editors is a daunting task.
Wikipedia’s challengers have failed because Wikipedia had a head start of several years, during which it built up name recognition and an invested community of editors and became society’s default encyclopaedia. With each passing year, it becomes harder for a new competitor to make a dent in Wikipedia’s market share because its first-mover advantage becomes more entrenched.
But in tech, first movers can be dethroned—as proof, just look at Yahoo! or MySpace. When Google debuted as a search engine, Yahoo! was still indexing pages manually and organising them into directories (like the files in the folders on a hard drive). Google’s automatic indexing and page rankings were revolutionary and helped users more efficiently find what they were looking for on the internet—the most basic function of a search engine.
The reason Wikipedia’s challengers have been so feeble is that none of them have offered a new model that can accomplish Wikipedia’s purpose of delivering information to its users. All the competitors function the same way as Wikipedia: as a volunteer-driven community of users who update and expand the encyclopaedia through crowdsourcing. Without any major advantages in fulfilling the basic function of an encyclopaedia, editors and readers have little reason to defect from Wikipedia, and those who do defect have little reason to stay away.
Enter Grokipedia. Elon Musk launched this project through his xAI company on 27 October 2025, less than a month after he publicly announced the idea. According to Musk, Wikipedia is “extremely left-biased” and “controlled by far-left activists.” Correcting Wikipedia’s bias seems to be Musk’s most important motivation for creating Grokipedia, though not the only one.
We are building Grokipedia @xAI.
Will be a massive improvement over Wikipedia.
Frankly, it is a necessary step towards the xAI goal of understanding the Universe. https://t.co/xvSeWkpALy
— Elon Musk (@elonmusk) September 30, 2025
Musk is not the first to criticise Wikipedia for its perceived left-wing bias. Conservapedia launched in 2006, and centrist and right-leaning authors have published articles on the leftist worldview of Wikipedia. That bias manifests in a variety of ways, one of the most egregious being the practice of declaring right-leaning media sources too unreliable to cite while giving similar left-leaning outlets a pass. For example, the UK’s Daily Mail is “deprecated,” which means it cannot be cited on Wikipedia except “for uncontroversial self-descriptions, although reliable secondary sources are still preferred.” The Guardian, on the other hand, can be cited. MS NOW (formerly MSNBC) is “generally reliable,” but Fox News has been declared unreliable for science and politics, with other topics in dispute. The New York Post is considered “generally unreliable” on Wikipedia, which probably explains why it took ten months for the encyclopaedia to include information about Hunter Biden’s laptop (a news story that the Post broke).
Some vindictive maniac who has a grudge with the rationalist movement has effectively banned @Quillette, @thefp, @freebeacon, from being used as sources on Wikipedia — whereas TeenVogue is characterised as ‘solid’ 😑https://t.co/YKZdmTqm11
— Claire Lehmann (@clairlemon) July 10, 2024
Absurdly, Wikipedia even considers Al Jazeera to be a “generally reliable” news source for almost all topics, despite the funding and editorial direction it receives from the Qatari government. Wikipedia appears to have some reasonable standards, though: Tasnim News, an outlet affiliated with Iran’s Islamic Revolutionary Guard Corps, is a banned news source. But somehow that hasn’t stopped it from being cited over 29,000 times on Wikipedia, according to a recent report. Other media outlets affiliated with Hamas, Hezbollah, and the Muslim Brotherhood are cited over 9,000 times combined and are not proscribed by Wikipedia.
The problem of biased information sources extends beyond the popular media: even scientific journals are verboten if they occasionally publish peer-reviewed articles that violate sacred leftist values. When the allowable sources of information are disproportionately left-wing (and sometimes Islamist), it should come as no surprise that the articles based on them reflect a similar worldview.
Controlling the input of information into Wikipedia is not the only tool for controlling its content. Wikipedia’s decision-making processes, while facially neutral, contribute to its left-leaning makeup. Wikipedia’s all-volunteer structure means that power is in the hands of the people with the most time and energy to devote to a topic—a setup that favours activists. The consequences of this are most apparent when resolving disputes, which usually occurs via majority vote. Anyone can flag content on any Wikipedia article, and if an argument ensues, it is not hard for an activist to rally enough voters to tip the scales towards a preferred outcome. This is especially true since Democratic editors outnumber Republican editors by about three-to-one.
The organic solution of encouraging more conservatives, centrists, and Republicans to edit is ineffective. Wikipedia touts itself as “a free online encyclopedia that anyone can edit,” and in the 2000s, that was true. That time was a golden age of knowledge accumulation, when an energetic and knowledgeable person could easily contribute to Wikipedia and expect reasonable edits to be retained. Since then, the community has created steep barriers to entry for anyone who wants to make more than perfunctory edits. Today, the site has accumulated a staggering number of policies and guidelines that users must follow. Moreover, nearly every page on the encyclopaedia is on at least one editor’s “watchlist,” meaning they receive a notification whenever an article is edited. Consequently, even the most sincere newcomer will run afoul of a rule after a few edits, especially if they are editing popular, well-known, or controversial topics. When this occurs, a more experienced editor quickly reverts the change, often quoting a policy or a previous vote outcome of which a newcomer is unlikely to be aware. Learning the rules and how to win disputes is not easy, and only a highly motivated editor will find that to be a good use of their time.
Musk’s philosophy is that any human judgment will show biases and that removing human judgment as much as possible from the editing process will make an encyclopaedia more accurate. If he is correct, that solves the concern of bias. But—more importantly—it also provides the mechanism to undermine Wikipedia’s monopoly as an online encyclopaedia.
Unlike previous challengers, Grokipedia is not created through crowdsourcing knowledge and research. Instead, an AI program crawls the internet, looking for information to add to the site. This AI-driven foundation is what makes Grokipedia the first online encyclopaedia that represents a serious challenge to Wikipedia. By removing humans from the equation, Grokipedia can efficiently update articles, search for new sources of information, and synthesise different sources of information into a coherent whole. The encyclopaedia is also not reliant on human time and effort spent to resolve disputes or enforce rules. If Grokipedia can harness AI to create a better encyclopaedia than the hive mind of nearly 300,000 humans, then it may yet displace Wikipedia’s position as the internet’s pre-eminent encyclopaedia.
In its current iteration (version 0.2), Grokipedia is an inferior product to Wikipedia. It has fewer articles (six million, compared to the English Wikipedia’s 7.1 million), and it is updated less often: approximately 3,000 times per day, compared to 8,200 times per day on Wikipedia. Grokipedia lacks basic features that have been part of Wikipedia for over two decades: images and multimedia, infoboxes, a high-quality featured article spotlighted each day, and more.
Because of these obvious shortcomings, early reactions to Grokipedia from the Wikipedia establishment were dismissive. Wales has challenged the neutrality of an encyclopaedia helmed by Elon Musk (while brushing off criticisms of Wikipedia’s biases) and argued that AI programs’ propensity to produce confident-but-inaccurate responses is a fatal flaw in the Grokipedia model—though Wales did not provide an example of Grokipedia doing so. Writing in The Verge, technology reporter Jay Peters emphasised that much of Grokipedia’s content in version 0.1 was cloned from corresponding Wikipedia pages. The reuse of Wikipedia content on Grokipedia led Annie Rauwerda, who runs the popular Depths of Wikipedia social media accounts, to label Grokipedia “another Wikipedia rival built on Wikipedia’s content which mostly ends up reinforcing its influence.”
On 10 November 2025, the Wikimedia Foundation (the parent organisation of Wikipedia) released a statement asserting that “Human-created knowledge isn’t replaceable” by AI. The statement does not mention Grokipedia, but the implication is clear. However, the Foundation ignores the fact that Wikipedia does not produce new knowledge. Indeed, Wikipedia bans editors from adding original research to articles. The site is a knowledge aggregator, and there is no reason to expect that AI cannot aggregate knowledge as well as—or better than—humans.
On the same day that the Wikimedia Foundation released its statement, Wikipedia’s newsletter, the Signpost, called Grokipedia “extremely flawed, perhaps fatally so, because it’s controlled by one biased person with extreme views and because it lacks human understanding and the human touch.” This article reported the views of six Wikipedia editors, all of whom compared Grokipedia’s supposed bias to Wikipedia’s supposed neutrality. (Every interviewee was silent on the legitimate examples of left-wing bias in Wikipedia.) An opinion piece posted in the Signpost on the same day made the same point. To paraphrase Shakespeare, the Wikipedians doth protest too much, methinks.
The dismissive happy talk from the Wikipedia establishment shows that it does not recognise the potential threat that Grokipedia poses to Wikipedia’s monopoly. Its criticisms mostly focus on the obvious flaws and shortcomings of Grokipedia 0.1, rather than on the potential advantages that AI curation offers over a human hive mind. For example, editors and decision-makers at Wikipedia spend a great deal of time and effort resolving disputes. Some discussions take months (and occasionally years), and voting on issues remains open for days or weeks. AI can weigh the evidence it finds and reach a decision within minutes.
Even if the Wikipedia establishment were to perceive the threat posed by Grokipedia, Wikipedia is poorly equipped to respond to it. After 25 years of operation, Wikipedia has become an incredibly bureaucratic organisation, which hampers its ability to improve operations or react to changing circumstances. Currently on Wikipedia, bots can only be used for routine tasks, like linking articles to one another, cleaning up vandalism, adding metadata to articles, and correcting grammar. Activating a new bot requires approval from a committee, and Wikipedia’s policy governing bots bans them from generating substantive new content.
The Wikimedia Foundation acknowledges that AI has the potential to improve Wikipedia, but the organisation’s AI strategy is not capable of meeting the challenges of an AI-powered competitor. Instead, the Wikimedia Foundation’s vision for AI is limited to supporting human editors and freeing them up for discussions and editing. This is not a fundamental transformation of how Wikipedia operates; it just grafts AI onto existing practices. This strategy can only create marginal gains in efficiency, not revolutionary improvements.
Furthermore, the strategy document itself is emblematic of the procedural and bureaucratic challenges that make change difficult and slow at Wikipedia. It took a committee ten months to craft, and nearly a year after its release, there is little to show for it. Decision-making power is extremely diffuse at Wikipedia because the Wikimedia Foundation’s leaders have not given themselves the power to mandate policies or change practices. Instead, change at Wikipedia is enacted through discussions that build consensus among users over time. This weak governance is in keeping with Wikipedia’s efforts to democratise knowledge, but it makes substantive changes difficult, especially at a rapid pace.
Fundamentally, the major challenge to adopting AI is that the Wikimedia Foundation’s AI strategy is touted as being “a human-centred approach.” That human-centred approach is slow and prevents Wikipedia from reaping the full benefits of AI. In time, the Wikipedia vs. Grokipedia competition may become a 21st-century version of the John Henry story, with man and machine in a contest of their abilities.
This face-off may have already begun. Grokipedia’s AI can match human performance (and even improve upon it) for some tasks, such as editing articles. As a trial, I suggested a minor edit to remove a few words from a Grokipedia article to reduce redundancy. Within a minute, Grokipedia made the suggested edit and provided a 229-word note on the article’s edit history to explain why the edit was made. A human on Wikipedia could make the same edit just as quickly, but it is unlikely that they would write such an extensive explanation to validate the decision. I suggested another edit in a different article to Grok and provided a scholarly source that I wanted it to cite. Within minutes an edit had been made—but not the one I suggested. The accompanying explanation acknowledged that I was correct to request an edit, but added that Grokipedia had found other sources with better information. After reading the explanation, I was persuaded that the revision was indeed better than the one I had suggested. On Wikipedia, I might have been able to edit the article, but months or years might have passed before another human thought to improve upon my change.
A more important test of Wikipedia and Grokipedia is their ability to handle controversy. Standard operating procedure at Wikipedia is to handle a controversy through discussion (and often a vote) on a talk page in order to hammer out a consensus position that determines the article’s viewpoint. Grokipedia takes a different approach: on controversial topics, Grokipedia’s articles air all sides of a debate, along with evidence for or against each viewpoint. This does not mean it gives equal space to crackpot ideas or conspiracy theories, though. For example, in both the Wikipedia and Grokipedia articles on the “Shakespeare authorship question,” the dominant view is that William Shakespeare really wrote the plays ascribed to him and that theories of other authors are not supported by documentary evidence or scholarly opinion. Grokipedia goes further and explores why people might believe the idea that Shakespeare’s plays were written under a pseudonym.
An example of divergence can be found in my area of expertise: intelligence testing. Grokipedia’s article on the topic is factual and straightforward. Where there are controversies (such as regarding race and sex differences in IQ scores), Grokipedia reports all sides of the controversy. Wikipedia’s article, on the other hand, editorialises the entire topic and impugns the science of IQ testing by implying that American psychologists working on intelligence tests were inspired by the Nazi eugenics movement. (In reality, Nazi psychologists rejected the theories underlying IQ testing.) When Wikipedia addresses controversy related to IQ, it only puts forward one perspective (inevitably a leftist viewpoint) and only mentions dissenting views to dismiss them. On the other hand, Grokipedia presents all sides of an issue and why people might have different perspectives. When facing a controversy, Grokipedia trusts its readers to come to their own conclusion, whereas Wikipedia’s editors reach a conclusion and then write an article that conforms to it.
There are already indications that Grokipedia is harder for activists to hijack. In an article about the 2025 Chilean general elections, a user repeatedly suggested edits that were favourable towards left-wing candidates and unflattering towards right-wing candidates. Grokipedia’s AI rejected most of these edits for being insufficiently sourced, irrelevant, or violating the article’s neutral tone. In the article on international differences in IQ, a user attempted several times to get Grokipedia to cite his self-published writings on the topic. Grokipedia rejected all of these suggestions. (Clicking the small counterclockwise clock icon in the top right-hand corner of any article reveals its edit history.)
Grokipedia’s and Wikipedia’s citation and sourcing practices also differ noticeably. Grokipedia casts a wider net, though there are categories of sources that it misses and that Wikipedia cites. Grokipedia is limited to sources that it can access online, which means that print books and sources behind paywalls are rarely cited. Wikipedia picks up information from these sources if human editors choose to cite them, but it systematically ignores other useful sources. For example, Wikipedia labels almost all primary sources as uncitable, which leads to distortions. On Wikipedia’s article for Quillette, only three of the 28 sources cited are from Quillette itself, which means that the article about the magazine is almost completely defined by what Quillette’s competitors have written about it. On Grokipedia, 31 of 43 sources are published by Quillette. Although primary sources should not be the only source of information about a topic, it seems reasonable that Quillette would be the best resource about Quillette.
Likewise, Wikipedia articles about social groups (such as religious groups and political parties) are dominated by outside sources claiming to know what their members do and think. Some of these sources are correct, but a policy that restricts primary sources prevents these groups from having a voice in how the world perceives them. This is particularly detrimental to small groups that may have escaped the notice of traditional media and scholarly sources. On the other hand, Grokipedia often cites sources that are generally unacceptable in academic or factual writing: blog posts, self-published or unvetted sources, and even tweets and discussions on Reddit and Quora. A middle-school teacher would find most of these sources unacceptable in a student paper, and yet they pass muster with Grokipedia. While I am flattered that Grokipedia cites my professional blog in several articles (for example, here and here), more authoritative sources for the same information are available.
Moreover, Grokipedia’s method for evaluating sources is largely a black box. Descriptions of how sources are chosen and evaluated by Grokipedia’s AI system are vague. In contrast, Wikipedia’s source-vetting process is public on articles’ talk pages. Even if a reader disagrees with the decision, at least the process at Wikipedia is transparent about how the decision was made.
The issues surrounding its citation behaviours are emblematic of Grokipedia’s largest problem: its opaque algorithms. Putting AI in charge of writing and editing an encyclopaedia does not remove humans’ influence from Grokipedia. Humans still created the algorithms that fuel Grokipedia, and that means that potential biases are baked into Grokipedia’s code. Currently, none of Grokipedia’s code is available for external scrutiny, and the logic behind some of its output is not clear. This is probably Grokipedia’s most challenging hurdle to acceptance, and its critics are right to express concerns about the encyclopaedia’s bias.
In its current form, Grokipedia is inferior to Wikipedia. The smaller number of features, the motley mix of sources, and Grokipedia’s lack of transparency are major problems that Musk’s team must deal with. But Wikipedia’s defenders are shortsighted in only reacting to the current version of Grokipedia. Indeed, some of the criticisms of version 0.1 (such as wholesale cloning of Wikipedia’s content) no longer apply to version 0.2. More improvements are in the works. There are already plans to add multimedia and other features that have long been part of Wikipedia’s interface. Comparisons of articles on the same topic show that Grokipedia’s are often longer and more detailed. The Grokipedia article on the precuneus is 5,253 words long and has 113 references. Wikipedia’s article is 1,295 words long and has 28 references (though Wikipedia’s does include media that illustrate the neuroanatomy better). Grokipedia performs well in many head-to-head comparisons like this.
When Grokipedia is good enough (long way to go), we will change the name to Encyclopedia Galactica.
It will be an open source distillation of all knowledge, including audio, images and video.
Join @xAI to help build the sci-fi version of the Library of Alexandria!
— Elon Musk (@elonmusk) November 13, 2025
But Grokipedia is a long way from dethroning Wikipedia. Its organic web traffic is less than one percent of Wikipedia’s traffic each month. It also frequently conflates people or things with identical names that would not confuse a human. For a while, it mixed up political scientist Charles Murray with a deceased American vaudeville actor of the same name. Moreover, Grokipedia’s attempts to provide an exhaustive entry often result in a wordy and repetitive writing style. The detailed but unnatural style that afflicts many LLMs is sometimes apparent in Grokipedia articles. Yet Grokipedia has the potential to be a serious competitor to Wikipedia. Its AI can already match or surpass humans’ ability to edit articles and handle controversies, and its limitations are fixable. Moreover, it already surpasses Wikipedia in important ways, especially in being much harder for activists to hijack.
Despite its advantages, I do not believe that Grokipedia will ever kill Wikipedia entirely. The human-powered encyclopaedia is a proven model, and it has succeeded in building a massive repository of human knowledge with a devoted user base. The most likely outcome is that Grokipedia becomes a legitimate competitor to Wikipedia and breaks the latter’s monopoly. If that happens, then society will benefit from having two free encyclopaedias, each containing vast amounts of knowledge. That can only be a positive development.