Algorithms Help Spot Possible Suicidal Intent Among Veterans’ Social Posts
Roughly 17 military veterans die by suicide each day.
A social media platform designed for America’s military community is now equipped with a custom machine learning model that, those involved say, can rapidly review public posts and flag those showing signs of potential self-harm.
With support from the Veterans Affairs Department and Harvard University’s Nock Lab, Amazon Web Services partnered with the existing RallyPoint military social media platform to produce a tool that can quickly surface sensitive public posts and bolster online suicide intervention.
“Historically, the heavy lifting of mental health support on RallyPoint has been shouldered by RallyPoint members stepping up to help each other when they come across people sharing their challenges on our site,” RallyPoint CEO Dave Gowel recently told Nextgov. “Now, through our work with the VA, AWS and mental health experts from Harvard, we are more proactive in reinforcing our members’ good work by offering helpful resources when we are alerted about public posts showing signs of risk.”
Launched in 2012, RallyPoint enables nearly 2 million service members, veterans, and their families to connect, share stories and information, ask questions and ultimately chat on topics that accompany military and veteran life. At times, members of the platform will post about experiences of self-injurious thoughts and behaviors, which veterans are starkly more likely to endure. VA estimates roughly 17 military veterans die by suicide each day—a rate 50% higher than that of non-veteran American adults.
The military-focused media platform signed a memorandum of understanding with VA through its Veterans Experience Office in 2018 to collaboratively improve veterans’ interactions in their communities. Through the work, a range of VA subject matter experts have answered RallyPoint members’ questions on the platform, and officials from the company have also traveled to a variety of locations around the U.S., as Gowel put it, “to support VA and VA partner programs and incorporate the learnings from such programs into our member experience.” Through one such effort, RallyPoint officials participated in person in the VA’s Economic Investment Initiative in both San Juan and Puget Sound, where they heard from participants about a crucial goal: helping normalize discussions about mental health. The company ultimately produced and promoted a piece with VA partner Cohen Veterans Network encouraging veterans to speak up about why they may not yet have received the mental health support they need.
“It was the raw and honest sharing of mental health challenges on RallyPoint by our members participating in discussions like these—and the counsel of mental health experts in the VA’s Suicide Prevention Program, the VA/White House [President's Roadmap to End a National Tragedy of Suicide, or PREVENTS] Task Force to end veteran suicide and Harvard’s Nock Lab—that fueled our interest and understanding of how we could better serve our members through this pilot, though the VA was not directly involved in its development,” Gowel said. He added that VA continues to bring experts onto RallyPoint to answer members’ questions about mental health challenges.
“Suicide is a public health emergency facing our nation. Sadly, our veterans are at high risk—often because of the sacrifices they have made to protect our freedoms,” Dr. Barbara Van Dahlen, executive director of the PREVENTS Task Force, said in a statement regarding the production of the machine learning model. “Because suicide is not just a concern for our veterans, it’s critical that leading veteran and non-veteran organizations partner in order to ensure that everyone is protected against this very real threat. This type of whole of nation approach exemplifies the strategy the White House and [VA] believe is necessary to accomplish our shared mission to prevent suicide.”
People experiencing suicidal feelings don’t always rush to get the mental health care they need, but those involved in this work believe that machine learning technology—intended to identify potential indications of distress in posts in real time, far faster than human review alone—can change the game for suicide prevention.
“Developing a way to quickly and accurately sift through a large number of public posts of RallyPoint users to identify the small number indicative of potential self-harm was a challenge, and that’s where we saw an opportunity for machine learning to help,” Michelle K. Lee, vice president of the Amazon Machine Learning Solutions Lab, told Nextgov.
The lab, which pairs clients with Amazon’s expert data scientists to spot and implement high-value use cases, worked closely with RallyPoint to build a machine learning model using Amazon SageMaker—a fully managed service for creating and deploying machine learning models—trained on anonymized public posts from the RallyPoint system. From there, mental health experts at Harvard University’s Nock Lab helped to train the model by annotating additional posts using an Amazon data labeling service, with the aim of continuously improving the accuracy of the model’s predictions.
In this “human-in-the-loop” system, human judgment is folded back into the application’s review process, enabling higher-quality predictions over time.
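The article does not describe RallyPoint’s actual model or code, so the following is only a minimal, hypothetical sketch of the human-in-the-loop pattern described above: a tiny bag-of-words scorer flags posts, a reviewer confirms the flags, and the confirmed labels are folded back into the training set for the next pass. All posts, labels and thresholds here are invented for illustration.

```python
# Hypothetical human-in-the-loop sketch (NOT the RallyPoint/AWS implementation).
from collections import Counter

def tokenize(text):
    return text.lower().split()

class TinyClassifier:
    """Naive word-frequency scorer: higher score = more likely 'at risk'."""
    def __init__(self):
        self.risk_counts = Counter()
        self.safe_counts = Counter()

    def train(self, labeled_posts):
        for text, at_risk in labeled_posts:
            target = self.risk_counts if at_risk else self.safe_counts
            target.update(tokenize(text))

    def score(self, text):
        # Smoothed ratio of risk-word hits to total hits.
        risk = sum(self.risk_counts[w] for w in tokenize(text)) + 1
        safe = sum(self.safe_counts[w] for w in tokenize(text)) + 1
        return risk / (risk + safe)

# Seed training data (entirely made up for illustration).
training = [
    ("feeling hopeless and alone tonight", True),
    ("cannot see a way forward anymore", True),
    ("great reunion with my old unit this weekend", False),
    ("question about GI Bill paperwork", False),
]
model = TinyClassifier()
model.train(training)

# Step 1: the model flags incoming posts above a threshold.
new_posts = [
    "hopeless again, no way forward",
    "anyone else at the reunion?",
]
flagged = [p for p in new_posts if model.score(p) > 0.5]

# Step 2 (the human in the loop): a reviewer confirms each flag, and the
# confirmed labels are appended to the training set for the next retraining.
reviewer_confirms = {post: True for post in flagged}  # stand-in for manual review
training.extend(reviewer_confirms.items())
model = TinyClassifier()
model.train(training)
```

A production system would use a trained language model rather than word counts, but the loop is the same: model surfaces candidates, humans label them, labels improve the next model.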
“We are encouraged by the early results,” Lee noted. “The machine learning model is helping to quickly surface sensitive public posts to the RallyPoint and Harvard teams, while reducing the amount of manual review needed to enable a potentially life-saving intervention.”
In the coming months, RallyPoint and Harvard intend to further refine the model, as well as the content—such as mental health programs, hotlines and support groups—surfaced to users. The social media platform is also gathering feedback from its members to inform the overall process.