What to know

  • The Wikimedia Foundation announced a new three-year AI strategy for Wikipedia on April 30, 2025, emphasizing that AI won't replace human editors and volunteers.
  • The Wikimedia Foundation plans to implement AI specifically for automating tedious tasks, improving information discovery, facilitating translations, and supporting new volunteer onboarding.
  • The organization prioritizes using AI for content integrity rather than content generation, as new encyclopedic knowledge can only be added at a rate that editors can handle.

The Wikimedia Foundation, which operates Wikipedia, unveiled its new artificial intelligence strategy on April 30, 2025. The plan outlines how the online encyclopedia will implement AI over the next three years while maintaining its commitment to human editors.

"The community of volunteers behind Wikipedia is the most important and unique element of Wikipedia's success," the non-profit organization stated. For nearly 25 years, Wikipedia's success has relied on human editors who research, deliberate, and build consensus to create the world's largest encyclopedia.

The new AI tools will focus on making editors' work easier by handling technical aspects. This approach allows volunteers to concentrate on what they want to accomplish rather than how to technically achieve it.

Chris Albon, Director of Machine Learning at the Wikimedia Foundation, emphasized that the organization will be guided by long-standing values and principles. "We will focus on a human-centric approach, favoring open-source AI. We will be committed to transparency, and we will also take into account the complexity of multilingualism," Albon said.

The AI implementation will target several specific areas. It will automate routine and repetitive tasks that don't require human judgment. The technology will also improve information discoverability on Wikipedia, giving editors more time for deliberation and consensus-building.

Translation capabilities will be enhanced through AI, helping editors share local perspectives and context. Additionally, the Foundation plans to use generative AI to improve the onboarding process for new Wikipedia volunteers through guided mentorship.

The Wikimedia Foundation has decided to prioritize using AI for content integrity rather than content generation. This decision stems from the understanding that new encyclopedic knowledge can only be added at the rate human editors can review and integrate it; investing too heavily in content generation could overwhelm that capacity.

The organization noted that maintaining Wikipedia's knowledge base becomes increasingly important in a world where generative AI can make mistakes and fabricate answers. The Foundation plans to review its AI strategy annually and make adjustments as needed.

This move comes as Wikipedia faces increasing strain on its resources from bots scraping its content to train AI models. These crawlers have driven up the cost of maintaining the site's infrastructure, potentially limiting access to content.

While the balance between moderation and generation might shift over time depending on needs, the Foundation's current strategy clearly puts humans first in its approach to artificial intelligence.

Via: TechCrunch