Artificial intelligence is one of the hottest trends in government and industry. This column aims to shine a spotlight on where the most innovative uses of AI are happening and explore the legal, ethical and economic issues organizations face.
Artificial intelligence has gained some notoriety for its use in medical applications, such as diagnosis, drug development and telemedicine. But researchers are also exploring its use in mental health treatment, sometimes in connection with government projects. Here are three examples of how research teams are looking to use AI's ability to detect indicators and analyze data to improve emotional and mental health.
Looking Under the Hood
The National Institute of Mental Health is investigating one aspect of what could be a burgeoning field — eXplainable Artificial Intelligence — to comprehend human behavior by better understanding how the components of brain activity work together.
As its name suggests, Explainable AI, or XAI, in a general sense seeks to allow an AI system to explain its actions. As the Defense Advanced Research Projects Agency says in describing its own research in the field, AI promises systems that can assess situations, draw on the lessons of similar events, make a decision and then act on it. But what those systems can't currently do is debrief the humans involved: explain, in human terms, why they made a decision. DARPA wants to develop more explainable models that allow human users to understand and trust a system's decisions, something that would be essential to the efficacy of automated defense and other systems.
NIMH wants to take its research deeper into the realm of mental health, looking into the brain to identify the physiological markers associated with brain functions and using AI to help identify the “causal links between brain activity and complex behaviors.”
In announcing its program, NIMH noted that the physiological markers that can indicate brain activity are currently somewhat siloed, to use a term common in the information technology field. Information gathered from genes, neural circuits or a person's behavior each offers only a small piece of the puzzle. An integrated system that gathers and analyzes data from various sources — both invasive and noninvasive brain manipulations — could go a long way toward presenting the bigger picture.
The NIMH researchers plan to incorporate other new technologies as well, from eye trackers, GPS apps and accelerometers to human-computer interfaces that could explain the information gathered. XAI would be applied to “explain the link between neural activity and behaviors,” by understanding how what happens at the circuit level of the brain can determine complex individual behaviors, according to NIMH. While they’re at it, researchers say the project also could produce breakthroughs in multimodal data analytics.
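NIMH hasn't published an architecture for this data integration, but the first step — collapsing readings from eye trackers, GPS traces and accelerometers into features a model (and an XAI layer) could reason over — might look something like the sketch below. The class, feature names and summary statistics here are illustrative assumptions, not the agency's actual design.

```python
from dataclasses import dataclass

@dataclass
class SensorWindow:
    """One time window of readings from the modalities NIMH mentions.
    (A hypothetical container; field names are illustrative.)"""
    accel_magnitudes: list[float]   # accelerometer samples
    gaze_fixations: int             # eye-tracker fixation count
    km_traveled: float              # distance derived from GPS traces

def fuse_features(window: SensorWindow) -> dict[str, float]:
    """Collapse each modality into simple named summary features, so a
    downstream model's decisions can be traced back to readable inputs."""
    n = len(window.accel_magnitudes)
    mean_accel = sum(window.accel_magnitudes) / n if n else 0.0
    return {
        "mean_accel": mean_accel,
        "fixations": float(window.gaze_fixations),
        "mobility_km": window.km_traveled,
    }

window = SensorWindow([1.0, 2.0, 3.0], gaze_fixations=42, km_traveled=2.5)
print(fuse_features(window)["mean_accel"])  # 2.0
```

Keeping each feature named and human-readable is one small way an integrated system stays explainable: a model trained on these inputs can report which modality drove a prediction.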
Post-traumatic stress disorder is a common condition for those who have experienced harrowing events, particularly among veterans returning from war. The Veterans Affairs Department says the rates of PTSD diagnosis vary somewhat depending on the conflict, but generally fall between 10 percent and 20 percent for veterans. The condition is treatable, but although VA says recovery rates in its treatment program are as high as 80 percent, fewer than 10 percent of veterans with PTSD complete the program within a year.
IBM and the digital health company Tiatros are looking to increase the success rate with an AI-based secure social network that connects veterans suffering from PTSD with their peers while delivering personalized cognitive behavioral therapy. Called Tiatros Post Traumatic Growth for Veterans, the program taps into IBM’s cognitive supercomputer Watson, using its Personality Insights and Tone Analyzer APIs to measure participants’ responses and state of mind.
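The general shape of what a tone-analysis API returns — tone labels paired with scores for a piece of participant text — can be illustrated with a toy scorer like the one below. The word lists and scoring rule are deliberately crude stand-ins, not Watson's actual model or API.

```python
# Toy tone scorer: illustrates the label-plus-score output shape of
# tone-analysis services. Lexicon and scoring are illustrative only.
TONE_LEXICON = {
    "sadness": {"hopeless", "alone", "tired", "numb"},
    "joy": {"grateful", "hopeful", "proud", "better"},
}

def score_tones(text: str) -> dict[str, float]:
    """Return a score per tone: fraction of words matching that tone's lexicon."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    total = len(words) or 1
    return {
        tone: sum(w in vocab for w in words) / total
        for tone, vocab in TONE_LEXICON.items()
    }

scores = score_tones("I feel hopeful and grateful today")
# scores["joy"] exceeds scores["sadness"] for this sentence
```

A program like Tiatros' could track scores like these across a participant's session posts to flag shifts in state of mind for clinicians.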
By connecting patients and using AI tools to support Tiatros' behavioral health analytics, the companies said their programs have achieved a 73 percent completion rate among veterans who begin the PTSD sessions.
A Fitbit for the Mind
Fitness trackers are catching on as a way to help people get healthier through more activity, better eating habits and more beneficial sleep. Several tech companies are looking to take the next step, incorporating AI into wrist-worn and other monitoring devices to identify subtle vocal cues of stress, depression or other conditions, in some cases working in concert with the natural language processing in virtual assistants such as Siri and Alexa.
A startup called Sentio Solutions, for instance, has developed the Feel wristband, which tracks the wearer’s emotions and stress levels throughout the day. Another company, Empatica, has developed the Embrace, which tracks activity and sleep but also collects data that can warn epilepsy patients of impending seizures.
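At their core, alerting wearables like these compare a physiological signal against the wearer's baseline and flag abnormal spikes. The sketch below shows that idea in miniature; the signal name, the 3x ratio and the logic are illustrative assumptions, not Empatica's or Sentio's actual algorithms.

```python
def flag_alerts(samples: list[float], baseline: float, ratio: float = 3.0) -> list[int]:
    """Return indices where a physiological reading spikes well above the
    wearer's baseline. The 3x threshold is an arbitrary illustration."""
    return [i for i, value in enumerate(samples) if value > ratio * baseline]

# Four readings from a (hypothetical) skin-conductance sensor; the third spikes.
readings = [0.2, 0.25, 0.9, 0.22]
print(flag_alerts(readings, baseline=0.2))  # [2]
```

Real devices layer trained models and false-alarm suppression on top of a check like this, but the baseline-versus-spike comparison is the intuition.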
Government and military organizations have been exploring wearable tech in recent years, from devices that can track the health of first responders to a variety of body sensors for soldiers in the field. Adding emotional indicators to those sensors is a likely next step, considering the emphasis military organizations put on emotional intelligence.