Wikipedia’s “WikiProject AI Cleanup” is a coordination hub for dealing with the growing amount of AI-generated (or AI-assisted) content landing in articles. Instead of treating this as a one-off moderation issue, the project frames it like any other cross-cutting quality effort on Wikipedia: a shared place to document patterns, agree on norms, and build a repeatable workflow for review and cleanup.
The page is useful even if you don’t edit Wikipedia regularly, because it shows what “AI cleanup” looks like in practice. It collects guidelines for flagging suspicious edits, a backlog of pages and categories that need review, and links to tools and processes editors can use to track and triage the work. The underlying point is simple: generative text can be fluent while still being wrong, unsourced, or subtly biased, so the community needs ways to find and repair the damage without burning out the humans who maintain the corpus.