While the art of conversation in machines is limited, there are improvements with every iteration. As machines are developed to navigate complex conversations, there will be technical and ethical challenges in how they detect and respond to sensitive human issues.
Our work involves building chatbots for a range of uses in health care. Our system, which incorporates multiple algorithms used in artificial intelligence (AI) and natural language processing, has been in development at the Australian e-Health Research Centre since 2014.
The system has generated several chatbot apps which are being trialled among selected individuals, usually those with an underlying medical condition or a need for reliable health-related information.
They include HARLIE for Parkinson’s disease and Autism Spectrum Disorder, Edna for people undergoing genetic counselling, Dolores for people living with chronic pain, and Quin for people who want to quit smoking.
Research has shown that people with certain underlying medical conditions are more likely to think about #suicide than the general public. We have to make sure our chatbots take this into account.
We believe the safest approach to understanding the language patterns of people with suicidal thoughts is to study their messages. The choice and arrangement of their words, the sentiment and the rationale all offer insight into the author’s thoughts.
For our recent work we examined more than 100 #suicide notes from various texts and identified four relevant language patterns: negative sentiment, constrictive thinking, idioms and logical fallacies.
Read more: Introducing Edna: the chatbot trained to help patients make a difficult medical decision
#JamesDonaldson notes:
Welcome to the “next chapter” of my life… being a voice and an advocate for #mentalhealthawarenessandsuicideprevention, especially pertaining to our younger generation of students and student-athletes.
Getting men to speak up and reach out for help and assistance is one of my passions. We men need to not suffer in silence or drown our sorrows in alcohol, hang out at bars and strip joints, or get involved with drug use.
Having gone through a recent bout of #depression and #suicidalthoughts myself, I realize now, that I can make a huge difference in the lives of so many by sharing my story, and by sharing various resources I come across as I work in this space. http://bit.ly/JamesMentalHealthArticle
Negative sentiment and constrictive thinking
As one would expect, many phrases in the notes we analysed expressed negative sentiment such as:
…just this heavy, overwhelming despair…
There was also language that pointed to constrictive thinking. For example:
I will never escape the darkness or misery…
The phenomenon of constrictive thoughts and language is well documented. Constrictive thinking frames a prolonged source of distress in absolute terms.
For the author in question, there is no compromise. The language that manifests as a result often contains terms such as either/or, always, never, forever, nothing, totally, all and only.
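To give a sense of how simple a first pass can be, absolutist terms like these can be flagged with a basic lexicon scan. The sketch below is illustrative only; the word list (taken from the terms above) and the function name are ours for this example, not part of our production system:

```python
import re

# Absolutist terms associated with constrictive thinking (from the list above).
ABSOLUTIST_TERMS = {
    "either", "or", "always", "never", "forever",
    "nothing", "totally", "all", "only",
}

def constrictive_score(text: str) -> float:
    """Return the fraction of words in `text` that are absolutist terms."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in ABSOLUTIST_TERMS)
    return hits / len(words)
```

A score above some tuned threshold would then prompt deeper analysis rather than an immediate conclusion, since words like "all" and "only" are also common in neutral speech.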
Language idioms
Idioms such as “the grass is greener on the other side” were also common — although not directly linked to #suicidalideation. Idioms are often colloquial and culturally derived, with the real meaning being vastly different from the literal interpretation.
Such idioms are problematic for chatbots to understand. Unless a bot has been programmed with the intended meaning, it will operate under the assumption of a literal meaning.
Chatbots can make some disastrous mistakes if they're not encoded with knowledge of the real meaning behind certain idioms. In one such example, a more suitable response from Siri would have been to redirect the user to a crisis hotline.
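One simple safeguard is to check utterances against a table of known idioms before attempting a literal interpretation. The table and function below are a minimal illustrative sketch, not our actual system:

```python
# Hypothetical idiom table: surface form -> intended (non-literal) meaning.
IDIOM_TABLE = {
    "the grass is greener on the other side":
        "other situations seem better than one's own",
    "at the end of my rope":
        "out of patience or endurance",
}

def resolve_idioms(utterance: str) -> list[str]:
    """Return the intended meanings of any known idioms in the utterance."""
    lowered = utterance.lower()
    return [meaning for idiom, meaning in IDIOM_TABLE.items()
            if idiom in lowered]
```

In practice the table would be far larger and culturally localised, and a match would steer the bot away from responding to the literal reading.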
The fallacies in reasoning
Words such as therefore, ought and their various synonyms require special attention from chatbots. That’s because these are often bridge words between a thought and action. Behind them is some logic consisting of a premise that reaches a conclusion, such as:
If I were dead, she would go on living, laughing, trying her luck. But she has thrown me over and still does all those things. Therefore, I am as dead.
This closely resembles a common fallacy (an example of faulty reasoning) called affirming the consequent. Below is a more pathological example of this, which has been called catastrophic logic:
I have failed at everything. If I do this, I will succeed.
This is an example of a semantic fallacy (and constrictive thinking) concerning the meaning of "I", which changes between the two clauses that make up the second sentence.
This fallacy occurs when the author expresses that they will experience feelings such as happiness or success after completing #suicide (which is what "this" refers to in the note above). This kind of "autopilot" mode was often described by people recounting their psychological state in interviews after attempting #suicide.
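Flagging bridge words such as "therefore" and "ought" is a natural first step before any deeper logical analysis. The sketch below simply marks sentences containing such connectives for closer inspection; the word list and function are illustrative only:

```python
import re

# Connectives that often bridge a premise and a conclusion.
BRIDGE_WORDS = {"therefore", "ought", "thus", "hence", "should", "must"}

def sentences_needing_attention(text: str) -> list[str]:
    """Return sentences containing a bridge word, for deeper analysis."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    flagged = []
    for s in sentences:
        words = set(re.findall(r"[a-z]+", s.lower()))
        if words & BRIDGE_WORDS:
            flagged.append(s)
    return flagged
```

Checking whether the premise-conclusion structure behind a flagged sentence is actually sound is the much harder step discussed below.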
Preparing future chatbots
The good news is detecting negative sentiment and constrictive language can be achieved with off-the-shelf algorithms and publicly available data. Chatbot developers can (and should) implement these algorithms.
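As a self-contained illustration, the simplest form of negative-sentiment detection is a lexicon lookup; off-the-shelf tools generalise this idea with weighted scores and trained models. The tiny lexicon and function below are a toy example, not a production detector:

```python
import re

# Tiny illustrative lexicon; a real system would use an off-the-shelf
# sentiment tool or a classifier trained on much larger data.
NEGATIVE_TERMS = {"despair", "misery", "darkness", "overwhelming",
                  "hopeless", "worthless"}

def negative_sentiment(text: str) -> bool:
    """Return True when any negative-lexicon term appears in the text."""
    words = set(re.findall(r"[a-z]+", text.lower()))
    return bool(words & NEGATIVE_TERMS)
```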
Generally speaking, the bot’s performance and detection accuracy will depend on the quality and size of the training data. As such, there should never be just one algorithm involved in detecting language related to poor #mentalhealth.
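The multiple-algorithm principle can be sketched as a simple ensemble vote, where a flag is raised only when several independent detectors agree. All names below are illustrative:

```python
from typing import Callable

# A detector takes an utterance and returns True when it flags it.
Detector = Callable[[str], bool]

def ensemble_flag(text: str, detectors: list[Detector],
                  min_votes: int = 2) -> bool:
    """Flag text only when at least `min_votes` detectors agree."""
    votes = sum(1 for d in detectors if d(text))
    return votes >= min_votes
```

Requiring agreement between, say, a sentiment detector and a constrictive-language detector reduces the impact of any one model's weaknesses in training data.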
Detecting logic reasoning styles is a new and promising area of research. Formal logic is well established in mathematics and computer science, but to establish a machine logic for commonsense reasoning that would detect these fallacies is no small feat.
Here’s an example of our system reasoning about a brief conversation that included the semantic fallacy mentioned earlier. Notice it first hypothesises what "this" could refer to, based on its interactions with the user.
Although this technology still requires further research and development, it provides machines with a necessary — albeit primitive — understanding of how words can relate to complex real-world scenarios (which is basically what semantics is about).
And machines will need this capability if they are to ultimately address sensitive human affairs — first by detecting warning signs, and then delivering the appropriate response.
Read more: The future of chatbots is more than just small-talk
If you or someone you know needs support, you can call Lifeline at any time on 13 11 14. If someone’s life is in danger, call 000 immediately.
James Donaldson is a Washington State University graduate (’79). After an outstanding basketball career with WSU, he went on to play professional basketball in the NBA with the Seattle Supersonics, San Diego/L.A. Clippers, Dallas Mavericks, New York Knicks, and Utah Jazz. He also played for several teams in the European Leagues in Spain, Italy, and Greece, and he toured with The Harlem Globetrotters to wrap up his career. James was an NBA All-Star in 1988 while playing center for the Dallas Mavericks. In 2006, James was inducted into the Pac-10 Sports Hall of Fame and also the Washington State University Athletic Hall of Fame. In 2010, James was elected as a board member for the NBA Retired Players Association.
James frequently conducts speaking engagements (motivational, inspirational, educational) for organizations, schools, and youth groups.
In 2010, James was the recipient of the NBA Legends of Basketball ABC Award, awarded for outstanding contributions in Athletics–Business–Community.
He believes in being a role model for success and professionalism to the scores of young people to whom he devotes so much of his time. He currently serves on several boards and committees and is a member of many organizations.
James believes in developing relationships that create a “Win-Win” environment for everyone involved, and in being the best he can be!
For more information about James Donaldson or to request he speak at your event, contact him at:
www.StandingAboveTheCrowd.com
[email protected]
1-800-745-3161 (voicemail & fax)
James Donaldson is the author of “Standing Above The Crowd” and “Celebrating Your Gift of Life” and founder of the Your Gift of Life Foundation which focuses on mental health awareness and suicide prevention, especially pertaining to our school-aged children and men.
If you’re interested in having James come and speak to your group of young adults, business entrepreneurs, aspiring political and community leaders, and athletic teams, please contact him at [email protected] and/or leave a personal message for him at 1-800-745-3161. Keep up with him and read about how he is reaching out and making a difference in the lives of so many around the world at www.yourgiftoflife.org