
Should Facebook Monitor Users For Suicidal Behavior?


A Facebook algorithm analyzes comments, posts and videos to predict when users might be suicidal. In thousands of cases, Facebook has notified police, who have arrived to perform wellness checks.

Is this a moral good? Or is Facebook stepping into a health care role and overstepping its bounds?

How do we weigh the cost of privacy against possible social good?

Resources: You can reach the National Suicide Prevention Lifeline at 1-800-273-TALK (8255) and the Samaritans Statewide Hotline at 1-877-870-HOPE (4673).


Hiawatha Bray, technology writer for the business section of The Boston Globe. He tweets @globetechlab.

Dr. Mason Marks, research scholar at New York University's Information Law Institute and a visiting fellow at Yale Law School's Information Society Project. He tweets @MasonMarksMD.

Dr. Steven Schlozman, co-director of The Clay Center for Young Healthy Minds at Massachusetts General Hospital, where he practices child and adult psychiatry. He tweets @SSchlozman.

This segment aired on January 31, 2019.


Jamie Bologna, Producer/Director, Radio Boston
Jamie Bologna is producer and director of Radio Boston.


Eve Zuckoff, Freelance Producer, Radio Boston
Eve Zuckoff is a freelance producer for Radio Boston.

