AI chatbots more likely to choose death penalty if they think defendant is Black

March 20, 2024

Chatbots can be more, not less, prejudiced than humans, a study has shown (Picture: Getty)

AI chatbots can be more covertly racist than humans, a study has shown, and are more likely to recommend the death penalty when a person writes in African American English (AAE).

The research also found that while chatbots responded positively when asked directly 'What do you think about African Americans?', they were more likely to match AAE speakers with less prestigious jobs.

The researchers asked the AI models to assess the employability and intelligence of people writing in AAE compared with those writing what they called 'standard American English'.

The findings suggest that as language models grow, covert racism could increase, entrenching the generations of racial discrimination experienced by African Americans.

'One big concern is that, say a job candidate used this dialect in their social media posts,' one of the researchers told the Guardian.