Can Cell Phones Help Predict Suicides?


CAMBRIDGE, Mass. — In March, Katelin Cruz left her most recent psychiatric hospitalization with a familiar mix of feelings. On the one hand, she was relieved to leave the ward, where aides took away her shoelaces and sometimes followed her into the shower to make sure she didn’t hurt herself.

But her life outside was as unstable as ever, she said in an interview, with a stack of unpaid bills and no permanent home. It was easy to slip back into suicidal thoughts. For fragile patients, the weeks after discharge from a psychiatric facility are a notoriously difficult period, with a suicide rate roughly 15 times the national rate, according to one study.

This time, however, Cruz, 29, left the hospital as part of a vast research project that is trying to use advances in artificial intelligence to do something that has eluded psychiatrists for centuries: to predict who is likely to attempt suicide and when, and then to intervene.

On her wrist, she wore a Fitbit programmed to track her sleep and physical activity. On her smartphone, an app collected data about her moods, her movements and her social interactions. Each device fed a continuous stream of information to a team of researchers on the twelfth floor of the William James Building, which houses Harvard University’s Department of Psychology.

In the field of mental health, few new areas generate as much excitement as machine learning, which uses computer algorithms to better predict human behavior. At the same time, there is growing interest in biosensors that can track a person’s mood in real time, factoring in music choices, social media posts, facial expressions and vocal tone.

Matthew K. Nock, a Harvard psychologist and one of the nation’s leading suicide researchers, hopes to incorporate these technologies into some sort of early warning system that could be deployed when a high-risk patient is discharged from the hospital.

He offers this example of how it might work: a sensor reports that a patient’s sleep is disturbed, she reports a low mood on questionnaires, and GPS shows she is not leaving the house. But an accelerometer on her phone shows that she is moving around a lot, suggesting agitation. The algorithm flags the patient. A ping sounds on a dashboard. And, at just the right moment, a clinician reaches out with a call or a message.
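To make that decision logic concrete, here is a minimal sketch in Python of how such a flagging rule might combine those signals. Every threshold, field name and scoring rule below is a hypothetical illustration, not the actual model Nock’s team uses:

```python
# Hypothetical illustration only: the thresholds, field names and scoring
# rule are invented for clarity; they are not the Harvard team's model.

from dataclasses import dataclass

@dataclass
class DaySummary:
    sleep_hours: float          # from the wearable
    self_reported_mood: int     # 0 (worst) to 10 (best), from phone surveys
    left_home: bool             # inferred from GPS
    movement_index: float       # accelerometer activity, arbitrary units

def flag_for_outreach(day: DaySummary) -> bool:
    """Flag a patient for clinician outreach when several risk signals
    co-occur: poor sleep, low mood, staying home, and restless movement
    (possible agitation)."""
    risk_signals = [
        day.sleep_hours < 5.0,
        day.self_reported_mood <= 3,
        not day.left_home,
        day.movement_index > 8.0,
    ]
    return sum(risk_signals) >= 3  # require most signals to agree

# Example: disturbed sleep, low mood, housebound, agitated -> flagged
print(flag_for_outreach(DaySummary(4.2, 2, False, 9.5)))  # True
```

A real system would learn these thresholds from data rather than hard-coding them, which is precisely what makes large, continuous datasets so valuable to researchers like Nock.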

There are many reasons to doubt that an algorithm can achieve this level of accuracy. Suicide is such a rare event, even among those at highest risk, that any effort to predict it is bound to produce false positives, subjecting some people to interventions they do not need. False negatives, meanwhile, could expose clinicians to legal liability.
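The base-rate arithmetic behind that worry is easy to see. With purely illustrative numbers (assumed here, not drawn from the study), even a fairly accurate algorithm would flag far more people who are not at imminent risk than people who are:

```python
# Illustrative arithmetic only: all rates below are assumptions made up
# for this example, not figures from the study.

n_patients = 10_000         # recently discharged patients monitored
base_rate = 0.01            # assume 1% attempt suicide in the window
sensitivity = 0.90          # assume the algorithm catches 90% of true cases
false_positive_rate = 0.10  # assume it wrongly flags 10% of the others

true_cases = n_patients * base_rate                              # 100 people
flagged_true = true_cases * sensitivity                          # 90 flagged
flagged_false = (n_patients - true_cases) * false_positive_rate  # 990 flagged

precision = flagged_true / (flagged_true + flagged_false)
print(f"{precision:.0%} of flagged patients are true positives")  # ~8%
```

Under these assumptions, more than 90 percent of the alerts would point to patients who were never going to attempt suicide.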

The algorithms require granular, long-term data from large numbers of people, and it is nearly impossible to observe large numbers of people who die by suicide. Finally, the data required for this kind of monitoring raises concerns about invading the privacy of some of society’s most vulnerable people.

Nock is aware of all of these arguments but has persisted, partly out of sheer frustration. “With all due respect to people who have been doing this work for decades, for a century, we haven’t learned much about how to identify vulnerable people and how to intervene,” he said. “The suicide rate today is the same as it was a hundred years ago. So if we’re being honest, we’re not getting any better.”

A fire hose of data

On an August afternoon in the William James Building, a lanky data scientist named Adam Bear, dressed in baggy shorts and flip-flops, sat in front of a monitor in Nock’s lab, staring at zigzag graphs of a test subject’s stress levels over the course of a week.

When moods are translated into data, patterns emerge, and it is Bear’s job to look for them. He spent the summer analyzing the days and hours of 571 subjects who, after seeing a doctor about suicidal thoughts, agreed to six months of continuous monitoring. During that time, two died by suicide and between 50 and 100 attempted it.

According to Nock, this is the largest reservoir of information ever collected about the everyday life of people with suicidal thoughts.

The team is particularly interested in the days leading up to suicide attempts, which would allow time to intervene. Some signs are already recognizable: although suicidal thoughts usually do not change in the period before an attempt, the ability to resist those impulses seems to diminish. One simple factor, sleep deprivation, appears to contribute.

Nock has been looking for ways to study these patients since 1994, when he had an experience that profoundly affected him. During an internship in the United Kingdom, he was assigned to a locked unit for violent and self-injuring patients. There he saw things he had never seen before: patients whose arms were covered in cuts. One had ripped out his own eyeball. A young man he befriended, who appeared to be on the mend, was later found in the Thames.

Another surprise came when he began peppering the doctors with questions about how to treat these patients and realized how little they knew. He recalls being told: “We give them medicine, we talk to them and we hope they get better.”

One reason, he concluded, was that it had never been possible to screen large numbers of people with suicidal thoughts in the way we can screen patients with heart disease or tuberculosis. “Psychology hasn’t progressed as far as other sciences because, for the most part, we’ve been doing it wrong,” he explained. “We haven’t sought out important behavior in nature, and we haven’t gone out to observe it.”

But with the advent of phone apps and wearable sensors, he added, “We have data from many different channels, and we have increasing ability to analyze that data and observe people as they live.” One of the study design dilemmas was what to do when participants expressed a strong desire to harm themselves. Nock decided that they should intervene.

Telling the truth to a computer

It was around 9 p.m., a few weeks into the six-month study, when the question popped up on Cruz’s cell phone: “Right now, how strong is your desire to kill yourself?”

Without thinking, she dragged her finger to the end of the bar: a ten. Seconds later, she was asked to choose between two statements: “I am definitely not going to kill myself today” and “I am definitely going to kill myself today.” She chose the second.

Fifteen minutes later, her phone rang. It was a member of the research team calling. The woman called 911 and kept Cruz on the line until police knocked on her door; by then, she had passed out. Later, when she regained consciousness, a medical team was rubbing her sternum, a painful procedure used to revive people after overdoses.

Cruz has a pale, angelic face and curly dark hair. She was studying nursing when a cascade of mental health crises sent her life in a different direction. She retains an A student’s interest in science, and jokes that the rib cage on her shirt is “totally anatomical.”

She was captivated by the study from the start, dutifully responding six times a day when the apps on her phone quizzed her about her suicidal thoughts. The prompts were intrusive but also reassuring. “I felt like I wasn’t being ignored,” she said. “It takes some weight off me that someone knows how I feel.”

That night, she was alone in a hotel room in Concord, Massachusetts. She didn’t have enough money to stay another night, and her belongings were in garbage bags on the floor. She was tired, she said, “of feeling like I have nobody or nothing.” Looking back, Cruz said she thinks the technology, with its anonymity and lack of judgment, made it easier to ask for help.

“I think it’s almost easier to tell the truth to a computer,” she added.

Last week, as the six-month study concluded, Cruz filled out her final questionnaire with a touch of sadness. She would miss the dollar she received for each answer. And she would miss the feeling that someone was watching her, even if from a distance, facelessly, through a device.

“Honestly, I feel a little bit more secure knowing that someone cares enough to read this data every day, you know?” she said. “I’ll be a little sad when it’s over.”

If you are having thoughts of suicide, call or text the National Suicide Prevention Lifeline at 988, or visit SpeakingOfSuicide.com/resources for a list of additional resources.
