
#JamesDonaldson on #MentalHealth – #SuicideRisk Prediction Models Could Perpetuate #Racial Disparities


Two #suicide risk prediction models are less accurate for some #minority groups, which could exacerbate #ethnic and #racial disparities.


By Jessica Kent

April 29, 2021 – #Suiciderisk prediction models that perform well in the general population may not be as accurate for #Black, #AmericanIndian, and #AlaskaNative people, potentially worsening #ethnic and #racial disparities.

In a study published in JAMA Psychiatry, researchers from #KaiserPermanente found that two #suiciderisk prediction models don’t perform as well in these #racial and #ethnic groups. The team believes this is the first study to examine how the latest statistical methods to assess #suicide risk perform when tested specifically in different #racial and #ethnic groups.

Researchers noted that in the US, more than 47,500 people died from #suicide in 2019 – an increase of 33 percent since 1999.

Several leading organizations, including the #VeteransHealthAdministration and HealthPartners, have started using #suicide risk prediction models to help guide care. Leaders are looking to leverage records of #mentalhealth visits, diagnoses, and other data points to identify #patients at high risk of #suicide.

“With enthusiasm growing for #suicide prediction modeling, we must be sure that such efforts consider health equity,” said Yates Coley, PhD, the study’s first author and an assistant investigator at #KaiserPermanente Washington Health Research Institute.

“Current methods maximize predictive performance across the entire population, with predictive accuracy in the largest subgroup—white #patients—eclipsing the performance for less-prevalent #racial and #ethnic subgroups.”

Researchers gathered EHRs for nearly 14 million outpatient #mentalhealth visits over a seven-year period from seven healthcare systems. Using these health records, the team developed two models to predict #suicide deaths within 90 days of a #mentalhealth visit: a standard logistic regression approach and a random forest machine learning algorithm.

The models each used demographic characteristics, comorbidities, #mentalhealth and substance use diagnoses, dispensed psychiatric medications, prior #suicide attempts, prior #mentalhealth encounters, and responses to the Patient Health Questionnaire-9 (PHQ-9), which is routinely completed at #mentalhealth visits.
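For readers who want a concrete picture, here is a minimal sketch in Python of what fitting these two model types might look like, assuming a scikit-learn workflow. The file name, column names, and feature encoding are hypothetical stand-ins for illustration, not the study's actual implementation.

```python
# Minimal sketch (not the study's code): fitting the two model types
# described above on a hypothetical visit-level dataset. File name,
# column names, and workflow are assumptions for illustration.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical predictors mirroring the feature groups listed above:
# demographics, comorbidities, diagnoses, medications, prior attempts,
# prior encounters, and PHQ-9 responses (assumed numeric/pre-encoded).
FEATURES = [
    "age", "sex_encoded", "comorbidity_count", "mh_diagnosis",
    "sud_diagnosis", "psych_med_dispensed", "prior_attempt",
    "prior_mh_visits", "phq9_score",
]
OUTCOME = "suicide_death_90d"  # suicide death within 90 days of the visit

visits = pd.read_csv("mental_health_visits.csv")  # hypothetical file
X_train, X_test, y_train, y_test = train_test_split(
    visits[FEATURES], visits[OUTCOME], test_size=0.2, random_state=0
)

# Model 1: standard logistic regression.
logit = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Model 2: random forest machine learning algorithm.
forest = RandomForestClassifier(n_estimators=500, random_state=0).fit(
    X_train, y_train
)
```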

The results showed that the models accurately identified suicides and avoided false positives across the entire sample, and for #white, #Hispanic, and #Asian #patients. However, the models performed far worse for #Black, #AmericanIndian, and #AlaskaNative #patients, as well as #patients without #race or #ethnicity recorded.

For example, the area under the curve (AUC) for the logistic regression model was 0.828 for #white #patients, compared with 0.640 for #patients with unrecorded #race/#ethnicity and 0.599 for #AmericanIndian/#AlaskaNative #patients.

For random forest models, the AUC for #white #patients was 0.812 compared with 0.676 for #patients with unrecorded #race/#ethnicity and 0.642 for #AmericanIndian and #AlaskaNative #patients.
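AUC here is the probability that the model scores a randomly chosen patient who died by suicide higher than a randomly chosen patient who did not, so 1.0 is perfect discrimination and 0.5 is no better than chance; the drop from 0.828 to 0.599 is therefore a substantial loss of predictive value. Below is a hedged sketch of how such a subgroup audit could be computed, continuing the example above; the race/ethnicity column name is hypothetical.

```python
# Sketch of a subgroup AUC audit (an assumed workflow, not the study's
# code): compute AUC separately for each race/ethnicity group instead of
# only on the pooled test set. Continues the sketch above; `visits`,
# `logit`, `X_test`, and `y_test` are defined there.
import pandas as pd
from sklearn.metrics import roc_auc_score

race = visits.loc[X_test.index, "race_ethnicity"]  # hypothetical column
scores = pd.Series(logit.predict_proba(X_test)[:, 1], index=X_test.index)

print(f"Pooled AUC: {roc_auc_score(y_test, scores):.3f}")
for group, y_group in y_test.groupby(race):
    if y_group.nunique() < 2:  # AUC is undefined without both outcomes
        continue
    auc = roc_auc_score(y_group, scores.loc[y_group.index])
    print(f"{group}: AUC = {auc:.3f}")
```

A pooled AUC can look strong while masking exactly the subgroup gaps the study reports, which is why the audit disaggregates the metric by group.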

#JamesDonaldson notes:

Welcome to the “next chapter” of my life… being a voice and an advocate for #mentalhealthawarenessandsuicideprevention, especially pertaining to our younger generation of students and student-athletes.

Getting men to speak up and reach out for help and assistance is one of my passions. We men need to stop suffering in silence, drowning our sorrows in alcohol, hanging out at bars and strip joints, or getting involved with drug use.

Having gone through a recent bout of #depression and #suicidalthoughts myself, I realize now that I can make a huge difference in the lives of so many by sharing my story, and by sharing various resources I come across as I work in this space. http://bit.ly/JamesMentalHealthArticle

Researchers cited several possible reasons for these differences in prediction accuracy, including biases embedded in the data. #Black, #AmericanIndian, and #AlaskaNative people face barriers to accessing #mentalhealthservices, which means there is less data on their #suicide risk factors, making accurate predictions harder.

Additionally, the team noted that even when these populations do have access to #mentalhealthservices, they are less likely to be diagnosed and treated for #mentalhealthconditions. The clinical data may not accurately reflect risk, which can impact the models’ #suicide predictions.

Finally, suicides in these populations may be incorrectly recorded as unintentional or accidental deaths, further complicating prediction.

The group pointed out that the two models examined in the study are not the same as the ones now being implemented in health systems. This study evaluated models that predict #suicide deaths, while models used in clinical care at #KaiserPermanente predict self-harm or #suicide attempts.

Researchers believe that audits like the one used in this study may be needed at other healthcare organizations using #suicide prediction models.

“Before we implement any prediction model in our clinics, we must test for disparities in accuracy and think about possible negative consequences,” said Gregory Simon, MD, MPH, a study co-author and senior investigator at #KaiserPermanente Washington Health Research Institute.

“Some prediction models pass that test and some don’t, but we only know by doing research like this.”

In healthcare, the potential for #racial bias and the exacerbation of disparities is a serious concern with data analytics algorithms. Instead of helping #patients, models built with biased or inaccurate information can worsen any existing #health inequities.

In 2019, a study identified #racial bias in a common algorithm the industry uses to identify eligibility for care management programs.

“There’s growing concern around AI, machine learning, data science, and the risk of automation reinforcing existing biases through the use of algorithms. It was a confluence of what we know is a potential concern,” said Brian Powers, MD, MBA, physician and researcher at Brigham and Women’s Hospital and lead author of the study. 

“There’s absolutely a place for algorithms. What this study showed us is these types of tools are really widespread and have become essentially ubiquitous without enough attention to potential downsides.”

As the use of AI, predictive modeling, and other data analytics technologies continues to rise, the healthcare industry will need to ensure these tools improve care for every #patient population.
