Wikipedia Creates AI System To Filter Out Bad Edits

An anonymous reader writes: Wikipedia has developed a new artificial intelligence system aimed at improving the quality of its entries by detecting both honest mistakes and damaging edits made to its articles. The technology is named the Objective Revision Evaluation Service (ORES). The Wikimedia blog explains that the system highlights potentially problematic edits, allowing editors to filter them out of the “torrent” of new changes and scrutinize their credibility. The entire service and process is open: Wikipedia makes revision scoring transparent and auditable by publishing the source code, performance statistics, and project documentation under open licenses.
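As a rough illustration of how a tool might consume such revision scores, here is a minimal Python sketch. The endpoint shape, JSON layout, model name (`damaging`), and the 0.8 threshold are assumptions modeled on the public ORES v3 API (ores.wikimedia.org), not details taken from the article:

```python
# Hypothetical sketch of consuming ORES-style revision scores.
# URL format, JSON structure, and threshold are assumptions, not
# details from the article itself.

def build_score_url(wiki, rev_ids, model="damaging"):
    """Build an ORES v3 scoring URL for the given revision IDs."""
    revids = "|".join(str(r) for r in rev_ids)
    return (f"https://ores.wikimedia.org/v3/scores/{wiki}/"
            f"?models={model}&revids={revids}")

def flag_damaging(response_json, wiki, threshold=0.8):
    """Return revision IDs whose 'damaging' probability meets the threshold."""
    scores = response_json[wiki]["scores"]
    flagged = []
    for rev_id, models in scores.items():
        prob = models["damaging"]["score"]["probability"]["true"]
        if prob >= threshold:
            flagged.append(rev_id)
    return flagged

# A canned response in the assumed v3 shape, for illustration only:
sample = {
    "enwiki": {
        "scores": {
            "12345": {"damaging": {"score": {
                "prediction": True,
                "probability": {"true": 0.92, "false": 0.08}}}},
            "67890": {"damaging": {"score": {
                "prediction": False,
                "probability": {"true": 0.07, "false": 0.93}}}},
        }
    }
}

print(build_score_url("enwiki", [12345, 67890]))
print(flag_damaging(sample, "enwiki"))  # only the high-probability edit is flagged
```

An editor-facing tool could use such flags to surface likely vandalism in the recent-changes feed while letting low-probability edits pass through unreviewed.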


Read more of this story at Slashdot.
