Should Facebook Monitor Users For Suicidal Behavior?


A Facebook algorithm analyzes comments, posts and videos to predict when users might be suicidal. In thousands of cases, Facebook has notified police, who have arrived to perform wellness checks.

Is this a moral good? Or is Facebook stepping into a health care role and overstepping its bounds?

How do we weigh the cost of privacy against possible social good?

Resources: You can reach the National Suicide Prevention Lifeline at 1-800-273-TALK (8255) and the Samaritans Statewide Hotline at 1-877-870-HOPE (4673).


Guests:

Hiawatha Bray, technology writer for the business section of The Boston Globe. He tweets @globetechlab.

Dr. Mason Marks, research scholar at New York University's Information Law Institute and a visiting fellow at Yale Law School's Information Society Project. He tweets @MasonMarksMD.

Dr. Steven Schlozman, co-director of The Clay Center for Young Healthy Minds at Massachusetts General Hospital, where he practices child and adult psychiatry. He tweets @SSchlozman.

This segment aired on January 31, 2019.


Jamie Bologna was senior producer and director of Radio Boston.


Eve Zuckoff was a freelance producer for Radio Boston.


