An anonymous reader writes: Wikipedia has developed a new artificial intelligence system aimed at improving the quality of its entries and detecting both mistakes and damaging edits made to its articles. The technology is named the Objective Revision Evaluation Service. The Wikimedia blog explains that the system is able to highlight likely bad edits, allowing editors to filter them out from the "torrent" of new amendments and scrutinize their credibility. The entire service and process are open – Wikipedia makes revision scoring transparent and auditable by publishing the source code, performance statistics, and project documentation publicly under open licenses.
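The story doesn't describe the service's actual interface, but the core idea – scoring each incoming edit for likely damage and letting editors filter the stream by that score – can be sketched roughly as a threshold filter. All names, fields, and numbers below are hypothetical illustrations, not Wikipedia's real API:

```python
# Hypothetical sketch: reduce a "torrent" of new edits to those a
# damage-scoring model flags for human review. The edit records and
# the 0.8 threshold are invented for illustration; the real service's
# interface and calibration differ.

def flag_suspect_edits(edits, threshold=0.8):
    """Return edits whose damage probability meets the threshold,
    most suspicious first, so editors can triage the worst cases."""
    flagged = [e for e in edits if e["damage_prob"] >= threshold]
    return sorted(flagged, key=lambda e: e["damage_prob"], reverse=True)

incoming = [
    {"rev_id": 101, "damage_prob": 0.03},  # likely a good-faith edit
    {"rev_id": 102, "damage_prob": 0.95},  # likely vandalism
    {"rev_id": 103, "damage_prob": 0.81},  # borderline, worth a look
]

for edit in flag_suspect_edits(incoming):
    print(edit["rev_id"], edit["damage_prob"])
```

The point of publishing the scores rather than auto-reverting is visible even in this sketch: the model only ranks edits for attention, and humans make the final call.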
Read more of this story at Slashdot.
Original URL: http://rss.slashdot.org/~r/Slashdot/slashdot/~3/XRVJpskX4Gw/wikipedia-creates-ai-system-to-filter-out-bad-edits