New research coordinated by the European Broadcasting Union (EBU) and led by the BBC has found that AI assistants – already a daily information gateway for millions of people – routinely misrepresent news content, regardless of the language, territory, or AI platform tested.

The international study, unprecedented in scope and scale, was launched at the EBU News Assembly in Naples. Involving 22 public service media (PSM) organizations in 18 countries working in 14 languages, it identified multiple systemic issues across four leading AI tools. Professional journalists from participating PSM evaluated more than 3,000 responses from ChatGPT, Copilot, Gemini, and Perplexity against key criteria, including accuracy, sourcing, distinguishing opinion from fact, and providing context.

Key findings:

- 45% of all AI answers had at least one significant issue.
- 31% of responses showed serious sourcing problems – missing, misleading, or incorrect attributions.
- 20% contained major accuracy issues, including hallucinated details and outdated information.
- Gemini performed worst, with significant issues in 76% of responses, more than double the rate of the other assistants, largely due to its poor sourcing performance.
- A comparison between the BBC's results from earlier this year and this study shows some improvement, but error levels remain high.

Why this distortion matters

AI assistants are already replacing search engines for many users. According to the Reuters Institute's Digital News Report 2025, 7% of all online news consumers use AI assistants to get their news, rising to 15% of under-25s.

'This research conclusively shows that these failings are not isolated incidents,' says EBU Media Director and Deputy Director General Jean Philip De Tender. 'They are systemic, cross-border, and multilingual, and we believe this endangers public trust. When people don't know what to trust, they end up trusting nothing at all, and that can deter democratic participation.'

Peter Archer, BBC Programme...