As ChatGPT Evolves, Researchers Uncover Unforeseen Political Leanings in AI Models

According to a new study from China's Peking University, ChatGPT has begun to show a rightward shift in its political views. AI models are expected to remain unbiased in their opinions, including political ones, but this research found that newer ChatGPT models lean right on most of the viewpoints tested. Earlier research had identified a liberal bias in ChatGPT, and while it still expresses some left-leaning views, the newer models show a clear shift. The study's authors reached this conclusion by running ChatGPT through the Political Compass Test.

Many people may assume this change stems from Donald Trump being elected president once again, or from Big Tech's support of conservatives in the new administration, but the researchers say it is driven mostly by changes in the training data used for ChatGPT models and in how political topics are filtered from that data. The shift may also reflect how users interact with ChatGPT, since the model learns from those interactions. The rightward shift has been observed in both GPT-3.5 and GPT-4 models.


The researchers say the shift in ChatGPT's political viewpoint isn't in itself a cause for concern, but it should be continuously monitored to see how it affects human decision making. China's DeepSeek also shows notable biases on some topics, so ChatGPT and xAI's models are not unusual in carrying biases of their own.

Mohamed Elarby

A tech blog focused on blogging tips, SEO, social media, mobile gadgets, PC tips, how-to guides, and general tips and tricks
