Lawmakers Look to Make VA Overhaul Suicide Prevention Algorithm That Favored Men

U.S. Air Force airman hands out suicide prevention information cards at the McChord main gate at Joint Base Lewis-McChord, Washington, Sept. 1, 2022. (U.S. Air Force photo by Airman 1st Class Colleen Anthony)

The Department of Veterans Affairs would be forced to overhaul an artificial intelligence program that helps direct suicide prevention outreach under a bill introduced late last month by Sen. John Tester.

Tester, a Montana Democrat who chairs the Senate Veterans Affairs Committee, introduced the bill after an investigation by The Fuller Project and Military.com found the department's algorithm prioritized white, male veterans. It also gave preference to veterans who are "divorced and male" and "widowed and male," but not to any group of female veterans.

Military sexual trauma and intimate partner violence -- both linked to elevated suicide risk among female veterans -- were not taken into account. Tester's legislation would require those factors to be incorporated within 60 days of the bill becoming law.

The most recent government data show a 24% rise in the suicide rate among female veterans between 2020 and 2021 -- four times the increase among male veterans during that one-year period. It was also nearly 10 times the 2.6% increase among women who never served in the military.

"It is critically important that VA takes into account additional risk factors faced by women veterans," Tester said in a statement. Tester, who is locked in a tight reelection fight that could determine control of the Senate, has touted his commitment to veterans over the course of his campaign.

The VA says that, independent of the proposed legislation, it is working to update the algorithm to include risk factors that disproportionately affect women. Agency press secretary Terrence Hayes said in an email Wednesday that the agency is considering including pregnancy, endometriosis, ovarian cysts, intimate partner violence and military sexual trauma, among other factors.

"VA continuously works to improve our programs," Hayes added. "As we update the model, it will be evaluated for performance and bias before it is deployed."

The VA said it hoped to update the algorithm in early 2025.

VA officials have previously defended prioritizing white, male veterans for outreach. The suicide rate for female veterans may be rising faster, they've said, but the suicide rate for male veterans remains considerably higher. In an interview in May, Matthew Miller, the agency's executive director for suicide prevention, said histories of military sexual assault and intimate partner violence were not among the 61 variables used in the algorithm because they were not among "the most powerful for us to be able to predict suicide risk."
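
The article does not say how the VA measures a variable's predictive power. One common way to gauge a feature's contribution to a risk model is permutation importance, sketched below on synthetic data with hypothetical feature names; this does not reflect the VA's actual model, its 61 variables, or its selection process.

```python
# Illustrative sketch only -- synthetic data and hypothetical features;
# this is NOT the VA's model or its variable-selection process.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000

# Hypothetical binary predictors, purely for demonstration.
X = rng.integers(0, 2, size=(n, 3)).astype(float)
feature_names = ["prior_attempt", "recent_hospitalization", "chronic_pain"]

# Synthetic outcome: the first feature carries most of the signal.
logits = -3.0 + 2.0 * X[:, 0] + 0.8 * X[:, 1] + 0.1 * X[:, 2]
y = rng.binomial(1, 1 / (1 + np.exp(-logits)))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# Permutation importance: how much does shuffling each feature
# degrade the model's ranking ability (AUC) on held-out data?
result = permutation_importance(model, X_test, y_test,
                                scoring="roc_auc", n_repeats=20,
                                random_state=0)
for name, score in zip(feature_names, result.importances_mean):
    print(f"{name}: mean AUC drop = {score:.3f}")
```

A variable that produces little or no drop in AUC when shuffled contributes little predictive power by this measure, which is one way a modeler might decide it is not "among the most powerful" predictors.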

Veterans groups, which have pushed for the VA to update the algorithm, welcomed Tester's legislation and said that, though the agency has engaged with them, it needs to move faster. "We have seen promises," said Naomi Mathis, assistant legislative director of Disabled American Veterans, which has made improved care for female veterans a priority.

Mathis, a former Air Force staff sergeant who deployed to Iraq, noted that in surveys, a third of female veterans tell the VA that while in uniform, they endured sexual activity against their will.

"You're not seeing them," she said.

The issue of algorithmic bias has gained traction in recent years, with researchers finding that many AI systems systematically favor white men. Both Presidents Donald Trump and Joe Biden issued executive orders to promote transparency and accountability for AI products, a task made harder by researchers' increasing reliance on systems that ostensibly teach themselves and create their own processes that may not be explainable. The VA has identified more than 100 programs covered by those presidential directives.

Veterans and service members experiencing a mental health emergency can reach the Veterans Crisis Line by dialing 988 and pressing 1. Help is also available by text at 838255 and via chat at VeteransCrisisLine.net.

Aaron Glantz is a fellow at the Center for the Advanced Study of the Behavioral Sciences at Stanford University, where he is incubating an initiative to promote resilience among investigative journalists.

Editor's Note: This article has been updated to include information provided by the VA after initial publication on the timeframe for changes to the suicide prevention algorithm.
