Nine Based Takes on Grokipedia
What's the point? Plus Jimmy Wales's book, Larry Sanger's Nine Theses
Yesterday’s episode of “Between the Brackets.” YouTube and Audio version.
When Elon Musk launched Grokipedia on October 28, he did so with his usual combative style. The new AI-generated encyclopedia would be a “massive improvement over Wikipedia,” he promised, delivering “several orders of magnitude” more breadth, depth, and accuracy than the website he often derides as too woke.
Since then, my inbox has been full of emails from reporters asking whether Musk’s latest project is just a marketing ploy or represents an actual paradigm shift for internet knowledge. While I don’t yet have a unified theory of Grokipedia, I do have some initial impressions shaped by years of reporting on Wikipedia as a journalist. I’m offering those nine observations here in case others find them interesting, controversial, or useful.
Grokipedia borrows extensively from Wikipedia. For a site presented as a dramatic improvement over Wikipedia, Grokipedia sure copies a lot from Wikipedia. A Cornell Tech study found that 56% of Grokipedia entries are wholesale copies of their Wikipedia counterparts, as shown by a disclosure at the bottom of articles indicating that the content was “adapted from Wikipedia” and referring to the Creative Commons license. (Only pages without that footer, like the article for Joseph Stalin, appear to be Grokipedia originals.)
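For readers curious how that kind of classification works in practice, here is a minimal sketch of the heuristic the study describes: check whether a Grokipedia page carries the Wikipedia-adaptation disclosure in its footer. The URL pattern and the exact footer wording below are assumptions for illustration, not documented Grokipedia behavior.
```python
# Minimal sketch: classify a Grokipedia article as "adapted from Wikipedia" or
# "apparently original" by checking for the footer disclosure described above.
# The URL pattern and exact footer wording are assumptions for illustration only.
import requests

GROKIPEDIA_URL = "https://grokipedia.com/page/{title}"  # assumed URL format
FOOTER_MARKER = "adapted from wikipedia"                # assumed disclosure text

def looks_adapted_from_wikipedia(title: str) -> bool:
    """Return True if the article's HTML contains the Wikipedia-adaptation footer."""
    response = requests.get(GROKIPEDIA_URL.format(title=title), timeout=10)
    response.raise_for_status()
    return FOOTER_MARKER in response.text.lower()

if __name__ == "__main__":
    for title in ["Joseph_Stalin", "Nobel_Prize_in_Physics"]:
        label = "adapted" if looks_adapted_from_wikipedia(title) else "apparently original"
        print(f"{title}: {label}")
```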
Musk fans may say that they don’t have a problem with his copying from Wikipedia because the site is openly licensed and free to use. Still, there’s something jarring about Musk railing that Wikipedia is “insane” and “trash,” only to roll out a new encyclopedia that copies directly from Wikipedia more than half the time.
I chatted about this briefly for The Washington Post (gift link).
When Grokipedia doesn’t copy Wikipedia, things get weird fast. I spent some time comparing entries from the two websites side-by-side, noticing instances where Grokipedia differed from the Wikipedia version. My biggest concern has been sentences with no citation at all. It seems likely that Grok’s AI is hallucinating, and without a reference, there is no way for the reader to check.
PolitiFact covered one of these hallucinations in Grokipedia’s entry on the Nobel Prize for Physics, where the AI encyclopedia asserted that physics is “traditionally the first award presented” at the Nobel ceremony. There was no citation for this sentence on Grokipedia, and, as it turns out, it’s not true. The Nobel Prize for Medicine is traditionally awarded first. My hypothesis is that Grok’s AI is scanning the internet, noticing a bigger spike in press coverage for the Physics award, and then inferring that Physics must be the first prize awarded. But of course, just because something seems statistically plausible to an AI system doesn’t mean that it belongs in an encyclopedia, presented to the reader as fact.
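As a rough illustration of that concern, here is a sketch of how one might flag unsupported sentences in an article’s extracted text. The bracketed “[n]” reference style is an assumption about how citations appear; real article markup would need a proper parser.
```python
# Rough sketch: flag sentences in an article's extracted text that carry no
# citation marker at all. Assumes references appear as bracketed numbers like [12].
import re

def uncited_sentences(article_text: str) -> list[str]:
    """Return sentences that contain no bracketed citation marker."""
    sentences = re.split(r"(?<=[.!?])\s+", article_text)
    return [s for s in sentences if s and not re.search(r"\[\d+\]", s)]

sample = (
    "Physics is traditionally the first award presented. "
    "The prize was established by Alfred Nobel's 1895 will.[1]"
)
for sentence in uncited_sentences(sample):
    print("No citation:", sentence)
```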
The sourcing problem doesn’t stop there. Grokipedia also links to outlets that Wikipedia would not consider acceptable. According to the Cornell Tech study, Grokipedia includes dozens of citations to Stormfront, a neo-Nazi forum, and Infowars, the conspiracy site run by Sandy Hook denier Alex Jones. Opening the door to alternative media may sound like a free speech philosophical stance, but in practice it means Grokipedia’s algorithm is categorizing extremist propaganda as “authoritative.”
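To make that kind of finding concrete, here is a small sketch of the citation audit the Cornell Tech researchers describe: tally an article’s cited domains against a list of sources Wikipedia’s community treats as off-limits. The domain list is an illustrative subset, and the citation URLs would come from whatever scraper you point at the article.
```python
# Small sketch: count citations whose domain appears on a list of sources that
# Wikipedia's community deprecates. The list here is an illustrative subset only.
from collections import Counter
from urllib.parse import urlparse

DEPRECATED_DOMAINS = {"stormfront.org", "infowars.com"}

def audit_citations(citation_urls: list[str]) -> Counter:
    """Tally citations to deprecated domains, keyed by domain."""
    hits: Counter = Counter()
    for url in citation_urls:
        domain = urlparse(url).netloc.lower()
        if domain.startswith("www."):
            domain = domain[len("www."):]
        if domain in DEPRECATED_DOMAINS:
            hits[domain] += 1
    return hits

print(audit_citations([
    "https://www.infowars.com/example-post/",
    "https://example.com/some-news-article",
]))
```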
Grokipedia prefers first-person sources like LinkedIn. Even on X, where Musk’s most diehard fans are amplified, plenty of users have noted that Grokipedia directly mirrors a person’s LinkedIn page.
Consider the example of Steven Pruitt, the most prolific editor on English Wikipedia by edit count and someone that I have reported on in the past as a notable Wikipedian. Pruitt said that his Grokipedia entry contains several errors that do not appear on his Wikipedia page. For example, his Grokipedia entry lists him as working at Science Applications International Corporation, a job he left more than a year ago. Why the outdated information? Because his LinkedIn profile still says he works at SAIC, and Pruitt freely admits he’s “notoriously lazy about updating it.”
The LinkedIn issue raises the bigger question: What types of sources should an encyclopedia consider reliable? Grokipedia tends toward first-person, self-authored platforms like LinkedIn. Wikipedia, by contrast, favors third-party reporting such as newspapers, books, and academic sources. The theory among Wikipedia editors is that third-party reporting is more likely to be neutral, or at least that it avoids the inherent conflict of interest of the first-person perspective, where the subject is tempted to self-promote.
And if LinkedIn makes the cut for Grokipedia, does that mean all first-person sources are acceptable? Grokipedia already cites Facebook, Quora, Reddit, and X posts. In the case of Pruitt’s Grokipedia page, it didn’t cite any of the articles I’ve written about him for The Washington Post or Slate. Instead, it cited the versions that I reposted on Medium. In my case, the Medium pieces are syndicated versions of the original stories that went through a fact-checking process. But for the most part, Medium is a blogging platform, meaning it’s the kind of first-person source that Wikipedia generally treats as unreliable.
Whether Grokipedia continues to privilege first-person sources over independent reporting will say a lot about what kind of encyclopedia Musk is actually trying to build.
Grokipedia says very positive things about Elon Musk. Reporting for Slate, Mary Harris noted that “Grok portrays the Tesla CEO through stubbornly rose-colored lenses.” That’s not an exaggeration. Musk’s Grokipedia entry includes a glowing quote from a former SpaceX employee: “Elon was the best mentor I ever had… He’s so sharp, he just picks it up.” The line reads less like a neutral encyclopedia entry than like the recommendations section on LinkedIn.
Musk has long tried to control the text of encyclopedic summaries. Back in 2022, I wrote about how Musk was originally a big fan of Wikipedia but got angry with the site because the lead sentence included the word investor. He wanted that word gone, repeatedly tweeting at editors to remove it, despite the fact that he had invested in companies like SpaceX, Tesla, and Twitter. Musk wanted to be portrayed as an entrepreneur and visionary, not someone who purchased companies founded by others.
Given that history, it’s not surprising that Grokipedia gives Musk significantly more control over how he’s described. Ryan McGrady, a researcher at UMass Amherst, argued in Tech Policy Press that Grokipedia marks a return to top-down control of knowledge. On Wikipedia, neutrality is hashed out through messy, bottom-up debate among volunteers. On Grokipedia, neutrality is whatever Musk (and his AI) decide it is.
As McGrady puts it: “Wikipedia, for all its many flaws, has always aimed to ‘set knowledge free’ … In contrast, Grokipedia’s defining feature as an encyclopedic project is the use of technological power to re-exert top-down authority over information and knowledge.”
Despite Grokipedia’s flaws, the Wikipedia community should not be complacent about AI-generated encyclopedias. In a recent interview series in The Signpost, Wikipedia’s long-running community newspaper, several editors shrugged off Musk’s new internet encyclopedia as simply the latest challenger. Wikipedia has seen plenty of those over the years: Citizendium, Conservapedia, Everipedia… So far, none have had any real staying power.
While I appreciate the “keep calm and carry on” attitude, I would urge Wikipedians not to be overly chill about Grokipedia. None of those earlier rivals had the backing of the world’s richest man. More importantly, the arrival of an AI-generated encyclopedia represents a potentially significant shift. Going forward, Wikipedia editors and supporters will have to make the case for human effort.
In my view, there are still strong reasons to prefer a human-curated encyclopedia. Selena Deckelmann at the Wikimedia Foundation has laid out some of the arguments previously. But this debate is likely to last decades. As AI gets better at mimicking human processes, Wikipedians will have to explain, over and over, why the human-led, messy, deliberative, consensus-building model works better.
My novel The Editors predicted Grokipedia in some ways, but not in others. Once Grokipedia launched in October, I began to get DMs along the lines of: “I read your novel and loved every page of it. And boy oh boy did you foresee where things were going!”
Spoiler alert for anyone who has not read the book. The Editors features a billionaire named Pierce Briggs who creates a for-profit encyclopedia designed to challenge Infopendium, the novel’s stand-in for Wikipedia. Briggs launches Infoveritas to better control his self-image and disrupt the influence of the nonprofit website. When I started writing the book in January 2020, the idea of a billionaire attacking and ultimately challenging a volunteer-run encyclopedia still sounded like an unusual plot twist.
(The Editors is on sale at Amazon and Bookshop.org if you’ve been waiting to read a fictional dramatization of internet fact disputes.)
Despite some similarities, there are a few key differences between the real-life and fictional versions. Musk’s Grokipedia is AI-written, and artificial intelligence is not a major part of the book, which is set in 2019 and 2020. Another key difference is who helps launch the rival encyclopedia. In the novel, Infoveritas arises from a partnership between Briggs and a dissident volunteer editor of Infopendium named Ed Shelton. Grokipedia, at least so far, looks less like a collaboration with Wikipedia editors and more like a Musk solo project.
Which brings me to my next point.
Grokipedia shows very little evidence of being influenced by Larry Sanger’s “Nine Theses” on Wikipedia. Sanger is the co-founder of Wikipedia who played an influential role during its first fourteen months in 2001 and 2002. He has spent much of the past two decades trying to create rival encyclopedias such as Citizendium and Everipedia. He has also become one of Wikipedia’s loudest critics.
In September 2025, Sanger published his “Nine Theses” on both Wikipedia and his personal site. The imagery harkens back to Martin Luther nailing his 95 theses to the Wittenberg church door. Sanger frames the document as a protest against what he sees as the institutional decline of Wikipedia, together with concrete proposals for reform. The nine theses drew significant media attention for Sanger, including interviews with Tucker Carlson, the Free Press, and The Washington Post (gift link).
Despite the timing, it doesn’t appear that Sanger’s ideas had much of an effect on Musk’s Grokipedia. Sanger’s proposals focus on how to improve the human processes behind Wikipedia, whereas Grokipedia is famously curated and written by AI.
If Musk were to hire Sanger as editor in chief of Grokipedia and recruit a cadre of human editors to further develop the project, that would look more like the plot of my novel. For now, Grokipedia appears to be Musk’s own, in-house AI project. When Sanger offered early criticism of Grokipedia, Musk responded on X that he should “calm down” because the site was still at version 0.1.
Sanger’s Nine Theses deserve serious consideration, and many Wikipedians are already debating them. It would be understandable if some editors approached Sanger’s proposals with skepticism. After all, Sanger has launched at least three rival encyclopedia projects since leaving Wikipedia in 2002, and he has spent years criticizing the site on cable news. Volunteer editors might reasonably wonder whether his latest announcement is motivated by a sincere desire to fix Wikipedia or by a wish to raise his own profile.
Even so, plenty of Wikipedia editors are taking him at his word. Following the maxim of assume good faith, many Wikipedians are responding to the theses in detail.
One of them even ranked the theses in an essay on his Substack. Sanger himself has emphasized the work he put into the Nine Theses, saying on X that he spent nine months writing them.
Here’s my quick assessment of three of the theses.
Agree. In thesis five, Sanger argues that Wikipedia should retire one of its infamous rules: “Ignore all rules.” Sanger says he originally wrote IAR as a humorous idea to welcome newcomers, but that it now functions as a shield to avoid responsibility. After years of trying to explain Wikipedia’s byzantine rules to outsiders, I find it logically inconsistent that one of the rules instructs editors to ignore the rules. Wikipedia could still encourage newcomers to be bold, because that’s more of an attitude than carte blanche. But IAR is not a true policy in practice, and I agree with Sanger that it should be retired.
Disagree. Thesis seven proposes allowing the general public to rate Wikipedia articles. For me, this immediately calls to mind Reddit’s system of upvotes and downvotes. While this idea may work for gauging public sentiment, it seems like a disastrous method for determining reliability. People would very likely organize campaigns designed to downvote articles with inconvenient truths. As Yaron Koren put it on the Between the Brackets podcast, this approach would lead to the Redditization of Wikipedia.
Conflicted. Thesis six calls for Wikipedia to reveal the real identities of its most powerful volunteers, including CheckUsers, Bureaucrats, and members of the Arbitration Committee, which functions as Wikipedia’s version of a Supreme Court. To be clear, these users do hold significant authority. Their decisions influence Wikipedia, and through Wikipedia they influence Siri, search engines, and LLMs.
On paper, transparency sounds reasonable, but anonymity is important for practical reasons. Many of the editors in these leadership roles receive harassment, threats, and sometimes worse. I have interviewed several of them over the years, and have seen some of the death threats they receive online. Sanger himself acknowledges the safety issue and suggests that the Wikimedia Foundation should offer stronger support to targeted volunteers. Even so, I suspect that most editors in these positions would rather keep their anonymity than rely on uncertain protection from a nonprofit organization.
At the same time, total anonymity has real drawbacks. I have written about how anonymity can enable conflicts of interest and how easily it can be abused. More recently, location data from X revealed that several major political accounts were operated not by the patriotic Americans they claimed to be, but by troll farms in Nigeria and India. Anonymity creates opportunities for manipulation. The real challenge is finding the right balance between transparency and safety. My guess is that online platforms will continue moving toward a real ID framework. It is an open question whether Wikipedia will follow that trend.
One other note about the Nine Theses. Midway through thesis two, Sanger uses an acronym to describe what he sees as Wikipedia’s dominant worldview: GASP, which stands for globalist, academic, secular, and progressive. I spoke about this GASP acronym recently on the Between the Brackets podcast.
For now, I want to set aside the “P” because whether Wikipedia is progressive is contested, and there are processes on the site for raising concerns about political bias. To me, the other three letters are more interesting.
Should Wikipedia be globalist? The project is written by editors around the world and read by people in virtually every country. It is not meant to mirror a single nation’s perspective in the way that the original Encyclopedia Britannica reflected British sensibilities. The word globalist can be divisive, but it may be a natural consequence of Wikipedia’s mission: to provide everybody in the world with access to free knowledge.
Should Wikipedia be academic? The site relies on the most dependable sources available, and in many fields that means scholarly or peer-reviewed materials. In this sense, Wikipedia is conservative with a small “c.” It tends to weigh academic sources more heavily than the informal, first-person sources discussed above.
Should Wikipedia be secular? Most readers would probably expect it to be. If you look up a religious topic, you expect Wikipedia to describe beliefs rather than endorse them. A faith-based encyclopedia would present doctrinal claims as settled truths. By contrast, Wikipedia presents them as the views of those specific religious communities.
Sanger mentions GASP only briefly, but the framework highlights the fault lines in the discussion. Each of these dimensions raises questions about what an internet encyclopedia should aim to be.
Musk does not share the optimistic worldview of Wikipedia founder Jimmy Wales. Unlike Sanger, Wales has remained involved with Wikipedia and the nonprofit that supports it since 2001. His new book, The Seven Rules of Trust, uses lessons from Wikipedia to offer advice on how to build institutions that last. For instance, he argues that organizations should regularly take a “trust inventory” and hold workshops that bring together people of opposing political views.
Wales calls himself a pathological optimist, and the book is suffused with that spirit. He writes about how Annie Rauwerda, the founder of Depths of Wikipedia, shares the joy of quirky things she’s found on the site. He covers the genuine community that forms among Wikipedia editors, both online and in real-life meetups.
A consistent theme of the book is bipartisanship. Wales reminds readers that, “People on the other side of the fence may be opponents sometimes. They are never enemies.” Twenty years ago, that line may have sounded earnest to a fault. Today it feels worth stating and restating.
The contrast between Wales and Musk is striking. Musk recently attacked the beloved author Joyce Carol Oates on X, calling her a “lazy liar” and an “abuser of semicolons,” and even saying that she’s “not a good human.” Her offense, apparently, was suggesting that she knew of an unnamed wealthy man who rarely posts about nature, books, or movies he admires.
Wales, by comparison, is far more likely to treat criticism as an opportunity for reflection rather than a justification for personal attacks. “Most people are still very decent and do care about other people for very good reasons,” Wales writes in his book. “The moment you think, ‘We’re great, we’re fine, anyone who criticizes us is a lunatic,’ you’ve lost it.”