Daily Management Review

Research Finds AI Unable To Identify Symptoms Of Depression In Black Americans' Social Media Posts


Artificial intelligence (AI) analysis of social media may identify depression symptoms in White Americans but not in Black Americans, research suggests. The finding raises concerns about training AI models for healthcare-related tasks on data that is not drawn from a racially and ethnically diverse population.
The researchers found that the AI model utilised in the study was more than three times less predictive of depression for Black Facebook users than for White users of the Meta Platforms-owned site.
In a paper published in PNAS, the Proceedings of the National Academy of Sciences, the authors of the US study stated that "race seems to have been especially neglected in work on language-based assessment of mental illness."
People who frequently use first-person pronouns such as I, me, or mine, as well as certain word categories such as self-deprecating expressions, are more likely to suffer from depression, according to previous studies of social media posts.
In the current study, researchers analysed language in posts from 868 volunteers—a balanced mix of adult Black and White participants—using an "off-the-shelf" artificial intelligence programme.
Every participant filled out a validated questionnaire that is used by medical professionals to check for depression.
According to study co-author Sharath Chandra Guntuku of the Centre for Insights to Outcomes at Penn Medicine, the use of "I-talk", or self-focused attention, as well as self-deprecation, self-criticism, and feeling like an outsider, were linked to depression only among White participants.
"We were surprised that these language associations found in numerous prior studies didn't apply across the board," Guntuku said.
Guntuku cautioned that social media data cannot be used to diagnose depression in patients, but it may be used to evaluate the risk faced by an individual or a group.
In a previous study, his team examined the language used in social media posts to assess the mental health of populations affected by the COVID-19 outbreak.
According to Brenda Curtis of the U.S. National Institute on Drug Abuse at the National Institutes of Health, who also collaborated on the study, language on social media that indicates depression has been shown to provide insight into the likelihood of treatment dropout and relapse in patients with substance abuse disorders.