New Study Shows AI Cannot Be Trusted for News as It Lacks Accuracy


According to a new study by the BBC (British Broadcasting Corporation), AI assistants often serve users inaccurate and misleading news, which can have serious consequences. BBC journalists asked AI chatbots including Microsoft Copilot, ChatGPT, Perplexity, and Google Gemini 100 questions about current news and asked them to cite BBC articles as their sources. The results showed that 51% of the AI responses had significant issues, while 91% had at least minor ones. Of the responses that cited BBC content, 19% contained factual errors such as incorrect statistics and dates, and 13% of the quotes attributed to BBC articles were fabricated or altered. The assistants also struggled to distinguish fact from opinion and often failed to provide context.

This shows that AI assistants should not be relied on for news: their hallucination and misinformation problems can mislead audiences. In one response, Google Gemini stated that the NHS advises people not to start vaping, when the article it cited actually advised people to take up vaping if they want to quit smoking. Other responses gave inaccurate information about political leaders and TV presenters.


This study matters because people need to be able to trust the news they receive, regardless of its source, including AI assistants. Some respondents prefer human-led journalism over AI, while others said they only partly trust news from AI. In other words, accuracy matters most to audiences, and human review remains essential even when AI is involved. Because AI often lacks context, it can be misleading and problematic when used for news.



Mohamed Elarby

