So you want to use ChatGPT to create a Wikipedia page because you believe AI is the answer. After all, large language models have advanced tremendously when it comes to creating content.
A large language model (LLM) is an AI system trained on large datasets to generate human-like text. Examples include ChatGPT, Gemini, Claude, and Grok.
Bottom Line: Yes, you can use ChatGPT and other LLMs to create a Wikipedia page, but you shouldn’t copy and paste directly. Wikipedia editors are aggressively detecting and deleting AI-generated content. The smart move? Use AI to research, organize, and outline. Then write the article yourself using proper sources. If you skip this step, expect your page to get flagged, tagged, or deleted.
I highly advise you to soak in this article prior to attempting to create a Wikipedia page using AI.
Wikipedia is at war with ChatGPT and other large language models. As with anything new, people abuse it and editors get annoyed with it. Because of this, there has been an effort to ban all AI-generated content on Wikipedia.
Key Stat: By 2024, five percent of all new Wikipedia pages contained AI-written content. As of October 2025, nearly 3,000 articles have been tagged for suspected AI content.
If you are here, you likely want to know the following:
- Can you use AI to create a Wikipedia article?
- If so, should you use AI to create a Wikipedia article?
- What are the guidelines for using AI on Wikipedia?
- What is the best practice for using AI to edit Wikipedia?
Well, I wish I had easy answers for you, but I will tell you this…
…yes, you can use AI to create a Wikipedia article, but I highly advise against it. However, since I know you aren’t going to listen to me, I will show you the rules for doing so and give you best practices you can use to be successful.
But first, a little history of the use of large language models on Wikipedia.
How Long Has AI Been Used on Wikipedia?
How long has AI been used on Wikipedia? The answer may surprise you.
Most of Wikipedia’s policies related to the use of AI only go back to 2022. However, the use of AI goes back to 2010.
The first known AI tool on Wikipedia was ClueBot, an anti-vandalism bot that uses machine learning to detect and automatically revert vandalism.

As you can see from the image above, ClueBot is quick to read an edit, determine whether it’s vandalism, and revert it. In this case, the vandalism was reverted less than a minute after it was made.
In 2018, a company called Primer.ai created a software program called Quicksilver to create Wikipedia draft pages for female scientists who didn’t already have entries.
Primer.ai posted the drafts on its Quicksilver website, not Wikipedia, allowing users to transfer them. As of 2025, only 34 of the 100 entries had been transferred to Wikipedia.
With the launch of ChatGPT and generative AI in 2022, there was an onslaught of people using it to edit and create new articles on Wikipedia. The quality of the writing and sourcing was so poor that it caused Wikipedia editors to fight back.
In 2022, Wikipedia editors began drafting an essay on large language models and their use in Wikipedia. By 2023, WikiProject AI Cleanup was formed to help “combat the increasing problem of unsourced, poorly written AI-generated content on Wikipedia.”
By 2024, five percent of all new Wikipedia pages contained AI-written content.
The WikiProject AI Cleanup has led to some guidance on using AI, but there are so many people abusing AI programs within Wikipedia that the site is now under full attack.
Can You Cite ChatGPT as a Source on Wikipedia?
The short answer is, “no.” ChatGPT, Gemini, and other generative AI programs can never be cited as sources on Wikipedia.
Wikipedia’s guideline on reliable sources gives two main reasons:
- Large language models scrape text from Wikipedia, press releases, and official websites. Wikipedia does not consider these sources reliable. So, any information from AI programs is also considered unreliable.
- Large language models tend to fill in gaps with information that sounds plausible. The issue is that the information is not factual. So, you may end up with misleading or false information when using an LLM. This concept is referred to as hallucination, and as of 2025 it continues to be an issue.
In addition, AI programs are not currently capable of evaluating what constitutes a reliable source under Wikipedia guidelines.
Important Note: Large language models cannot be cited as sources on Wikipedia. They hallucinate information, scrape unreliable data, and cannot evaluate what qualifies as a reliable source under Wikipedia guidelines.
Many sources need to be looked at with a human eye to determine if they qualify. Sometimes a publication is reliable, but the piece in question is a paid advertisement. Sometimes the author is reliable, but the publication is not.
So many factors go into determining whether a specific source is reliable, and ChatGPT cannot tell the difference. As such, you may wind up with an AI-generated Wikipedia page full of sources that do not meet Wikipedia guidelines.
At this point, I know what you are thinking. Can I just take a chance and use ChatGPT to create a Wikipedia page?
Is It Allowed to Use AI for Wikipedia Articles?
The short answer is…yes.
Short Answer: Yes, you can use AI to create a Wikipedia page. As of 2025, there is no policy against it, only an essay with guidance. But just because you “can” doesn’t mean you “should.”
Technically, there is no rule against using a large language model to create a Wikipedia page. In fact, the guidance for using AI on Wikipedia is only an essay, not a policy or guideline. So, even though the guidance says you shouldn’t, you are free to do so.
But now the question is, “should you?”
Should You Use AI to Create Wikipedia Pages?
Do NOT use ChatGPT, Gemini, or any other large language model to create a Wikipedia article.
Hell, even ChatGPT will tell you not to use ChatGPT to create a Wikipedia article.
No joke!

It’s not wrong. Using a large language model to create a Wikipedia page can cause you serious issues. More on that in the next section.

Even Grok advises against it.
Grok’s breakdown basically states that it’s great to use for brainstorming ideas, but that the content should be manually written based on the information it summarizes.
Don’t worry, though. I will show you how to use AI without disrupting Wikipedia. Be patient, I’m getting there.
What Can Happen If You’re Caught Using AI?
Well, that depends.
If it’s minor, you may just have your edit reverted. If it’s major, the entire article may be deleted.
Yes, I said deleted.
If you create a Wikipedia page using AI, Wikipedia has a policy as of 2024 that allows that page to be deleted.
A number of other things could also happen:
- Your page may be tagged for containing AI content.
- An editor may be nice (not likely) and rewrite the AI content.
My point here is simple. Being lazy is not worth the risk. Something will likely happen if you don’t follow the best practices for creating Wikipedia content with an LLM.
Warning: If you create a Wikipedia article using AI-generated content, it can be deleted for that reason alone. Your account may also be blocked from editing in the future, and the title you created may be protected against future creation.
And yes, you will be caught if you use AI to write your Wikipedia page.
You Will Be Caught Using AI Programs on Wikipedia
Of course, you can do anything you want on Wikipedia. But, if it goes against the guidelines and policies, your edits will be reverted, and you could get blocked.
So, I highly advise you against creating anything in ChatGPT and posting it word-for-word on Wikipedia.
Wikipedia has a category for articles that contain suspected AI-generated text. You can see from the image below that in May 2024 there were only eight pages. But in September 2025, that number jumped to 1,286 (in that month alone).
In October 2025, I checked Wikipedia and found there were 2,926 articles tagged as having AI content.

Now comes the big question: how are these articles found?
How Do Wikipedia Editors Spot AI Writing?
Wikipedia editors have become hypervigilant at detecting AI content. This is because of how many people have used it to introduce poorly written content into the encyclopedia over the last few years.
Editors are now encouraged to use programs that detect large language model writing. Such programs include ZeroGPT and GPTZero.
However, Wikipedia also has a guide for how to detect AI writing. And yes, there is a lot of AI content that these programs do not detect.
Examples of what to look for include:
- Editorializing (e.g., “important to note,” “it’s worth mentioning”)
- Summarizing words (e.g., “in summary,” “to conclude”)
- 404 sourcing (references that lead to 404 error pages; see the link-check sketch after the red-flags note below)
- Excessive use of boldface
- Em dashes (a telltale sign of AI-generated content)
- Curly quotes
These are easy to spot, and Wikipedia editors are getting used to what to look for. So, posting your content directly from Gemini or other LLMs is likely to get your edit flagged.
Red Flags Wikipedia Editors Look For: Editorializing phrases (“important to note”), summarizing words (“in summary”), excessive boldface, em dashes, curly quotes, and broken reference links (404s).
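Before you submit anything, it’s worth checking your own references for dead links, since broken citations are one of the red flags above. Here is a minimal sketch in Python, assuming the third-party requests library is installed; the URLs are placeholders for your own citations.

```python
# Minimal dead-link check for a draft's references.
# Assumes the third-party "requests" library is installed (pip install requests);
# the URLs below are placeholders for your own citations.
import requests

references = [
    "https://en.wikipedia.org/wiki/Wikipedia:Reliable_sources",
    "https://example.com/some-citation-that-may-not-exist",
]

for url in references:
    try:
        # HEAD is cheap; fall back to GET if the site doesn't handle HEAD well.
        resp = requests.head(url, allow_redirects=True, timeout=10)
        if resp.status_code >= 400:
            resp = requests.get(url, allow_redirects=True, timeout=10)
        status = resp.status_code
    except requests.RequestException as exc:
        status = f"error ({exc.__class__.__name__})"
    print(f"{status}\t{url}")
```

Anything that comes back as a 404 (or an error) needs a replacement source or an archived link before your draft goes anywhere near Wikipedia.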
What about bots to detect AI writing?
As of the date of this post, there is no bot that detects AI content on Wikipedia. However, by the time you read this, there very likely will be.
Wikipedia’s Guidelines on Using AI:
As of 2025, there is no official policy or guideline on using AI on Wikipedia. However, WikiProject AI Cleanup is working on it. There is currently an essay with guidance on using AI.
I will tell you now that despite it not being an official policy, you need to follow it as if it were. Editors are taking an “ignore all rules” approach and enforcing the essay as if it’s law. They are even deleting pages created solely with AI-generated content.
Some of the guidance Wikipedia provides includes the following:
- You should be competent at creating Wikipedia articles without AI before attempting to use AI to help you create them.
- You should disclose any edits you make that incorporate AI-generated content.
- Never use AI to generate comments for discussions on Wikipedia (including your talk page or article talk pages).
So, how do you use AI to create a Wikipedia page?
How to Use AI Tools the Right Way for Wikipedia:
Okay, so you are still going to use AI to write a Wikipedia page.
Guess what?
I encourage you to do it!!
I know I say you shouldn’t, but what I am saying is you shouldn’t just copy and paste from an LLM. Use it as a tool, not as the final content creator.
Here is what I mean.
Large language models are great at collecting, organizing, and outlining information. So, use them for that.
The video below shows how I used Grok to research and organize information for a Wikipedia page. I will also show you how I used the information from Grok to properly format a Wikipedia page.
The content that Grok provided me would have been flagged and deleted by Wikipedia.
The content I wrote (based on the information collected by Grok) is a well-written, properly formatted and cited Wikipedia page.
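If you want to see what “use it as a tool, not the final writer” looks like in practice, here is a minimal sketch using the OpenAI Python SDK. The prompt deliberately asks for an outline and a list of source types to verify, not finished prose. The model name and the subject are placeholders you would swap for your own; this assumes the openai package is installed and an OPENAI_API_KEY is set.

```python
# Sketch: ask an LLM for an outline and sources to verify, NOT finished article text.
# Assumes the "openai" Python SDK is installed and OPENAI_API_KEY is set in your
# environment; the model name and subject are placeholders.
from openai import OpenAI

client = OpenAI()

subject = "Jane Example, marine biologist"  # hypothetical subject

prompt = (
    f"Draft a section-by-section outline for a Wikipedia article about {subject}. "
    "Do NOT write article prose. For each section, list the key facts to cover and "
    "the types of independent, reliable sources I should look for to verify them."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; use whatever model you have access to
    messages=[{"role": "user", "content": prompt}],
)

# You still track down the real sources and write the article yourself.
print(response.choices[0].message.content)
```

The point of the prompt is that nothing the model returns reaches Wikipedia until you have verified it against real sources and written the prose in your own words.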
Disclosure: This article was edited with the assistance of Claude, an AI assistant by Anthropic. The irony is not lost on me. And yes, I reviewed each recommendation prior to making any updates to the content…the very thing I suggest you do if using an LLM to create a draft Wikipedia page.
This is the same advice given by WikiProject AI Cleanup.
“If using an LLM as a writing advisor, i.e. asking for outlines, how to improve paragraphs, criticism of text, etc., editors should remain aware that the information it gives is unreliable. If using an LLM for copyediting, summarization, and paraphrasing, editors should remain aware that it may not properly detect grammatical errors, interpret syntactic ambiguities, or keep key information intact. It is possible to ask the LLM to correct deficiencies in its own output, such as missing information in a summary or an unencyclopedic, e.g., promotional, tone, and while these could be worthwhile attempts, they should not be relied on in place of manual corrections. The output may need to be heavily edited or scrapped. Due diligence and common sense are required when choosing whether to incorporate the suggestions and changes.”
Should You Use the Articles for Creation Process?
If you are unsure if your article will be deleted because of AI use, I would suggest not submitting it directly in the mainspace. Wikipedia has an “article wizard” that allows you to create and submit a draft for review.
The draft is reviewed at the “Articles for Creation” project and either approved or declined.
You should always use the Articles for Creation process if using AI tools.
If you post to the mainspace and do not use the AfC process, the following can happen:
- If AI content is detected, the page could get deleted and placed on a watchlist for future creations.
- You could get reported to administrators and your account blocked if using AI in the mainspace.
If you do use the AfC process, the following can happen:
- If no AI content is detected and the draft meets other guidelines, an editor will move it to the mainspace on your behalf.
- If AI content is detected, you may receive a warning but will be given a chance to fix it. No harm, no foul!
And no, the AfC process does not necessarily delay the creation of a page. As long as it meets guidelines, it will be approved rather quickly. If it’s not, I guarantee it has to do with your writing, not the process itself.
Pro Tip: If you’re unsure whether your content sounds AI-generated, run it through detection tools like ZeroGPT or GPTZero before posting. Wikipedia editors certainly will.
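If you would rather script that check than paste text into a web form, GPTZero offers an HTTP API. The endpoint, header, and request fields in this sketch are assumptions based on GPTZero’s public API documentation as I recall it and may have changed, so verify them (and your API key) against the current docs before relying on it.

```python
# Sketch: run draft text through an AI-detection API before posting.
# The endpoint, header, and request fields below are assumptions based on
# GPTZero's public API documentation and may differ; check the current docs.
# Requires the third-party "requests" library and a GPTZero API key.
import os
import requests

API_KEY = os.environ["GPTZERO_API_KEY"]  # hypothetical environment variable name
DRAFT_TEXT = "Paste the draft paragraph you plan to add to Wikipedia here."

resp = requests.post(
    "https://api.gptzero.me/v2/predict/text",  # assumed endpoint
    headers={"x-api-key": API_KEY},
    json={"document": DRAFT_TEXT},
    timeout=30,
)
resp.raise_for_status()

# Field names in the response vary by provider; inspect the raw JSON yourself.
print(resp.json())
```

Treat the score as one signal, not a verdict: detectors miss plenty of AI text and flag plenty of human text, which is exactly why the rewrite-it-yourself step matters.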
Final Thoughts:
Please, heed what I tell you about using ChatGPT, Gemini, and other language model programs to create Wikipedia articles. Yes, you are free to use them, but it will cause you more trouble than it’s worth.
Wikipedia editors can recognize AI-generated content and will immediately delete it. It could also lead to blocks on future attempts to create content on the same topic.
Use it wisely. AI is a tool to gather information for you to use, but the ultimate job of creating a Wikipedia page should be done by a human.
If you’d rather leave it to the pros, check out my Wikipedia page creation services FAQ.
