
Should Facebook Monitor Users For Suicidal Behavior?


A Facebook algorithm analyzes comments, posts and videos to predict when users might be suicidal. In thousands of cases, Facebook has notified police, who have arrived to perform wellness checks.

Is this a moral good? Or is Facebook stepping into a health care role and overstepping its bounds?

How do we weigh the cost of privacy against possible social good?

Resources: You can reach the National Suicide Prevention Lifeline at 1-800-273-TALK (8255) and the Samaritans Statewide Hotline at 1-877-870-HOPE (4673).

Guests

Hiawatha Bray, technology writer for the business section of The Boston Globe. He tweets @globetechlab.

Dr. Mason Marks, research scholar at New York University's Information Law Institute and a visiting fellow at Yale Law School's Information Society Project. He tweets @MasonMarksMD.

Dr. Steven Schlozman, co-director of The Clay Center for Young Healthy Minds at Massachusetts General Hospital, where he practices child and adult psychiatry. He tweets @SSchlozman.

This segment aired on January 31, 2019.


Jamie Bologna, Senior Producer/Director, Radio Boston
Jamie Bologna was senior producer and director of Radio Boston.

Eve Zuckoff, Freelance Producer, Radio Boston
Eve Zuckoff was a freelance producer for Radio Boston.
