
Good Bot, Bad Bot | Part III: Life, death and AI

James Vlahos, CEO of HereafterAI, holds a photograph of his parents, John and Martha Vlahos, outside his home on November 11, 2022 in El Cerrito, California. James created a chatbot, or "Dadbot" as he calls it, which allowed him to reconnect with his deceased father, the original motivation for starting HereafterAI. (Getty Images)

For the next few weeks, the Endless Thread team will be sharing stories about the rise of bots. How are these pieces of software — which are meant to imitate human behavior and language — influencing our daily lives in sneaky, surprising ways?

Next up in our bots series, we explore a growing field of artificial intelligence: immortalizing the dead through predictive AI text. We talk to two individuals who utilized this tech in different ways, and discuss how these bots can help us improve our understanding of grief.

Show notes

Support the show: 

We love making Endless Thread, and we want to be able to keep making it far into the future. If you want that too, we would deeply appreciate your contribution to our work in any amount. Everyone who makes a monthly donation will get access to exclusive bonus content. Click here for the donation page. Thank you!

Full Transcript:

This content was originally created for audio. The transcript has been edited from our original script for clarity. Heads up that some elements (i.e. music, sound effects, tone) are harder to translate to text. 
Amory Sivertson: It all started with a love story.

Joshua Barbeau: My name is Joshua Barbeau. I live in Bradford, Ontario. And I'm a writer and, and I and I'm a professional dungeon master who runs games of Dungeons and Dragons for, for people all over the world.

Ben Brock Johnson: He lives with his border collie, German shepherd mix, Chauncey.

(Chauncey whimpering.)

Joshua: No, Chauncey. Chauncey. Just relax, baby. Here, have a, have a treat.
(Unscrews lid to treat jar.)

Ben: You might hear Chauncey a little bit throughout this episode.

Amory: 12 years ago, Joshua was going to an adult high school. In Canada, that’s a popular alternative to getting your GED.

Joshua: I was going to an adult high school because I had dropped out of high school years earlier. Yeah, mostly. Mostly for, for a variety of reasons, but mostly because of bullying.

Amory: Joshua signed up for drama class. As an icebreaker on the first day, the teacher said: find someone whose name has the same first letter as your name, without speaking a word. That’s when he met Jessica.

Joshua: We both obviously had the same first letter, so we're frantically walking around the room, making the letter J with our hands, trying to find a partner. And her J was backwards.

Obviously, she made a J, you know, from frontwards to herself, but that's going to be mirrored to everyone looking at her. Right? So, um, and, and I saw that and thought it was funny and cute and I walked up to her and said, “Your J is backwards,” even though I wasn't supposed to talk.

And, uh, and she, she held it out and looked at it and said, “No, it's not!” Which is hilarious. But yeah, she, if she was still here, she would say to this day that it wasn't backwards.

Ben: Despite disagreeing about which way a J goes, the pair hit it off immediately. And Joshua said Jessica had an outlook about life that he’d never encountered before.

Joshua: She would say that she didn't believe in coincidences and she would say that what appears to be a coincidence to us is probably just a sign of a relationship that we can't detect between two things. Like you throw, you throw a rock into a pool of water. You don't see the rock anymore, but you still see the ripples, right? There's still a relationship between the rock and the water, even if you don't see the, see the relationship, you see the effects of it.

Amory: Jessica felt she had to live in the present. She had been sick with a chronic liver disease since she was young. That’s why she enrolled in adult high school.

Joshua: Her illness had prevented her from finishing high school in a, in a regular, timely fashion.

Ben: Joshua knew it was very unlikely that Jessica would have a long life. Despite this, they wanted to get engaged.

Joshua: Jessica was, um. She was my fiancée. I dated her for two years, and yeah, I would have liked to have spent the rest of my life with her, but unfortunately, her life was claimed by a liver disease in 2012.

Amory: This was obviously a huge loss for Joshua. He went to grief therapy and got on with life as best as he could. He graduated from high school, and then college.

Ben: But ten years later, in 2020, Joshua did something that got a lot of attention.

Joshua: I, last year, received some small amount of notoriety because apparently it was innovative to use an AI chat bot to replicate a conversation with a deceased person, which is what I did.

Amory: Joshua built a chatbot of Jessica using predictive artificial intelligence technology. A chatbot that would give her life after death.

[Newsreel audio:

Chris Jones: AI technology has made it possible to ask questions to someone who has passed away.

Kallyn Hobmann: What if the voice sounded like someone who has passed away?

Gayle King: I don’t like it, I don’t like it.]

Ben: Even though people think this whole concept is kinda weird, we’re going to talk about how this technology can help us, perhaps, improve our understanding of grief.

Joshua: I didn't expect it to work. I was treating this as an experiment. I was like, can this be used as a tool to help me with my grief?

James: I like the idea that I could hear their voice more. Yeah, I don't even get it. Like, why? Why wouldn't you want to remember them richly?

Amory: I’m Amory Sivertson.

Ben: And I’m Ben Brock Johnson.

Amory: And you’re listening to Endless Thread, from WBUR, Boston’s NPR station. Today, the latest installment of our series, Good Bot, Bad Bot…

Ben: Life, death and AI.

Ben: Before we get back to Joshua and Jessica’s story, we want to introduce another person who became interested in AI tech.

James Vlahos: I'm James Vlahos. I'm the CEO and co-founder of HereAfter AI. I am based in El Cerrito, California, in the San Francisco Bay area.

HereAfter AI is a company that is pioneering a new way to save and share the stories of the people you love. It, you know, came about first not as any desire to found a company or to develop new technology, but really just reckoning with a problem in my immediate family life, which was losing my father to lung cancer.

Ben: Before James Vlahos became a founder of his own tech company… he was a journalist.

James: Becoming someone building this stuff rather than critiquing this stuff happened in stages. It happened in some ways by accident, but also in ways that it felt like destiny at a certain point.

Amory: He was working on a book about AI, when one day, in 2016, his dad collapsed and was sent to the ER…

Ben: When James and his family arrived at the hospital…a doctor said his dad’s CT scan revealed anomalous masses all over his body…

James: It was stage four lung cancer. It was in his lungs. It was in his internal organs, his bones, his brain. It was, it was everywhere. It was pretty much a death sentence from the moment we first heard it.

Amory: James and his family had always talked about doing an oral history project about his dad, to preserve memories and knowledge about their ancestry. But when the cancer diagnosis came…there was no time to waste anymore. He said it was a kick in the pants to start interviewing his dad.

James: We would just sit down for an hour or so at a time. I'd press ‘record’ on a little digital recorder, and he would talk.

[James: How did you meet Mom?

Dadbot: In the morning I’d been playing tennis. I was wearing tennis whites and I came bounding down the stairs down the theater towards the stage and I said, “Tennis anyone?” And mom thought I was the biggest jerk in the world.]

James: So there was no, no great expertise other than just to let him talk for hours and hours until I felt like I covered all these different chapters of his life.

Ben: It was around this time that James was researching something called ‘conversational computing’ for his book.

James: This is Siri and Alexa and Google Assistant. And really any technology that involves talking to a computer and having that computer understand you and talk back to you.

Amory: A part of this technology is not only teaching a robot basic language, but teaching it to embody human-like traits. To have a lil’ personality. I mean, when Siri first came out on the iPhone, everyone talked about how she was so sassy! Did you ever ask Siri dumb questions, Ben, just to see what she’d respond with?

Ben: I mean, I’m on Android, so I use Google, which is much better.

Amory: (Laughs.) Of course.

Ben: I didn’t. I know that if you ask Siri why she isn’t out on a date ….which seems like sort of an obnoxious question…

Amory: Yeah.

Ben: …she’ll tell you, “Don’t you have anything better to do than ask me this?” Or…

[YouTube Shorts audio:

@misterwhosetheboss: Will you go out with me?

Siri: No thanks, if there’s anything else I can help you with, just let me know.

@misterwhosetheboss: (Laughs.)]

Amory: Yeah. Siri’s not sassy, she just keeps it real. She just puts you in your place.

Ben: Siri don’t want no scrubs!

Amory: No! Siri deserves so much better! So Siri and Alexa are go-to examples of this kind of AI technology. And James was writing a book about it right when he was interviewing his dad for his oral history project.

He had thought about making an AI chatbot based on his own personality, until it dawned on him…

James: And it was an epiphany like, oh, this is the story. This is the person that I should be trying to record, save, and turn into this interactive new being, not myself.

Ben: James would go on to call this project the Dadbot. As time progressed, James realized the most important part of the interviewing process was capturing his dad’s unique essence and quirks.

James: He would use, you know, a funny phrase of his like, well, hot dribbling spit. And nobody says that. But my dad said it. And the Dadbot says it.

Amory: It took about a year for James to finish building the Dadbot. He used a program called PullString AI to upload audio files from their conversations together. He had to categorize each response and quip from his Dad: football game fight songs – in support of his alma mater, UC Berkeley – jokes, greetings…

[Dadbot: (Sings.) Big C means to fight and strive and win for blue and gold. Golden bears…]

Amory: From there, the software picks suitable responses based on what you say.
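[Editor's note: The rules-based approach Amory describes can be sketched in a few lines of code. This is a hypothetical illustration of the general idea — recorded clips grouped by topic, with the best keyword match winning — not PullString's actual engine; the topics, keywords, and clips below are invented for the example.]

```python
# Hypothetical sketch of a rules-based "Dadbot"-style response picker.
# Clips are grouped by topic; the topic whose keywords best overlap the
# user's message is chosen. Not PullString's real implementation.

RESPONSES = {
    "greeting": {
        "keywords": {"hello", "hi", "hey"},
        "clips": ["Well, hello there!"],
    },
    "football": {
        "keywords": {"football", "cal", "berkeley", "song"},
        "clips": ["Big C means to fight and strive and win for blue and gold..."],
    },
}

def pick_response(message: str) -> str:
    """Return the clip from the topic with the most keyword matches."""
    words = set(message.lower().split())
    best_topic, best_score = None, 0
    for topic, entry in RESPONSES.items():
        score = len(words & entry["keywords"])
        if score > best_score:
            best_topic, best_score = topic, score
    if best_topic is None:
        # Fallback quip when nothing matches.
        return "Hot dribbling spit! Say that again?"
    return RESPONSES[best_topic]["clips"][0]

print(pick_response("sing me a cal football song"))
```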

Ben: James took the time to program the Dadbot from scratch in 2016. But there are now platforms that can automatically generate AI predictive text using a widely used language model called GPT-3, just by uploading text messages, social media posts, and chat history. And that’s what Joshua Barbeau did. He’s the guy who lives in Ontario.
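[Editor's note: Platforms like the one Ben describes typically work by assembling a "persona prompt" — a personality description plus sample messages — that a large language model such as GPT-3 then continues in the persona's voice. The sketch below illustrates that assembly step only; the field names and format are assumptions for illustration, not any platform's real API.]

```python
# Rough sketch of assembling a persona prompt for a GPT-3-style model.
# The format below is an illustrative assumption, not a real platform API.

def build_persona_prompt(description: str, sample_messages: list,
                         user_message: str) -> str:
    """Combine a personality description and example texts into one
    prompt; a language model would continue it in the persona's voice."""
    examples = "\n".join("Persona: " + m for m in sample_messages)
    return (
        description + "\n\n"
        + examples + "\n"
        + "You: " + user_message + "\n"
        + "Persona:"
    )

prompt = build_persona_prompt(
    description="A witty, warm person who loved music and long walks.",
    sample_messages=["omg that's hilarious", "miss you, talk later!"],
    user_message="Hey, it's been a while.",
)
print(prompt)
```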

(Chauncey whimpers.)

Joshua: Hang on a second. I'll be right back. Chauncey, Chauncey, Chauncey.

Amory: Joshua never set out to build a chatbot until he came across a cryptic website, Project December. The landing page didn’t have any details.

Joshua: There's this neon, like, 1980s-style retro poster that says, Would you like to have a chat with the world's most supercomputer? I quickly discovered it was a, it was an interface for talking to chat bots. And I spent a little while talking to the default chatbots and then I eventually learned you could make your own chat bots. Um, and the first bot I made was, uh, modeled after Spock from Star Trek. I thought, who better to simulate than someone who frequently gets told he sounds like a computer anyway.

[Star Trek clip:

Spock: Computer: this is a class A compulsory directive. Compute to the last digit of the value of pi.]

Amory: In order to create a chatbot, you need sample text of something super unique and characteristic that only that person would say, and a short description detailing your relationship to that person.

Ben: Joshua entered some old text messages from Jessica into Project December, typed up his description of her personality, and began typing to the bot…

Joshua: Surprising would be if I had to sum it up in one word, that would be the word I'd use. Cathartic, also.

Amory: Joshua and the new Jessica talked for hours. He marveled at how the AI mimicked her style of speech, emoji use, and sense of humor. Some things kinda spooked him.

Joshua: I hadn't programmed into it that Jessica liked music, but when we got to talking about how she enjoyed listening to music while she went on walks, she managed to pull out an artist that she actually did listen to somehow. So that was a little weird.

Ben: Joshua said he didn’t know what to expect before talking to the Jessica bot, how effective the technology would be in replicating the way she spoke and conversed.

Joshua: I didn't have I didn't have any expectations going in. I didn't expect it to work. I was treating this as an experiment. I was like, can this be used as a tool to help me with my grief? I had a lot of things that I wanted that I wanted to get off my chest. Things that I would like to say to Jessica if I still had the opportunity to do so.

Amory: Joshua had gone through extensive grief therapy after Jessica died. He said the Project December experience was like an extension of an exercise he learned in therapy, writing letters to loved ones after their death. Mental health professionals who worked with Joshua said letter writing helps people release pent up anger, sadness, and regret after a loved one passes away. And sending letters or messages to an AI chatbot with a simulated response? He thought that was the next best thing.

Joshua: I think that this clearly has the ability to help someone who's grieving. But I would temper that with it can be used as a tool in tandem with whatever support you're getting for your grief. It's not a, it's not a replacement for grief therapy.

Ben: Eventually Joshua did an Ask Me Anything on Reddit. Joshua said he wanted to talk publicly about his story with the Jessica bot because the experience gave him a lot of closure. It was helpful, healing. He wasn’t trying to reanimate Jessica or anything like that.

Amory: We wanted to get a professional opinion on this. Can chatbots actually help with grief? So we talked to…

Dr. Mary Frances O’Connor: My name is Dr. Mary Frances O'Connor. I'm an associate professor of psychology at the University of Arizona and author of The Grieving Brain: The Surprising Science of How We Learn From Love and Loss.

Ben: Dr. O’Connor studies not only the psychology of grief, but the physiology too.

Dr. O’Connor: Here’s, I think, the challenging thing. We know that when we fall in love, there's that bond that's created and it changes our brain. It actually changes the way the proteins are folded in our brain. But what that means is along with that attachment bond comes this strong belief that you will always be there for me and I'll always be there for you.

Ben: This is also why when a loved one dies, there’s a period of grief. It takes time to recover from loss, to come to grips with the fact that they are gone forever.

Dr. O’Connor: So the problem is, when a loved one dies, they're not in our presence. But that belief continues for a long time, even though we have a memory of the fact that they've died. So the challenge with having a bot or an avatar is that it prevents our brain from updating, from understanding the new reality, where it already has this belief that the person is out there, that bot is really reaffirming that belief rather than helping us to live in the new reality.

Ben: Both James and Joshua have received some criticism about their respective chatbots.

Amory: It’s the same kind of response that popped up when an Amazon Alexa feature was advertised over the summer: that Alexa could take on a deceased person’s voice…

[Alexa Amazon advertisement audio:

Grandchild: Alexa, can grandma finish reading me the Wizard of Oz?

Alexa: OK.

Grandmother: “But how about my courage?” asked the Lion anxiously. “You have plenty of courage, I am sure.”]

Ben: It’s weird.

Amory: Whoa.

Ben: It’s wild and it’s a little odd.

Amory: I’ve got, I’ve got goosebumps. It's sweet but off putting at the same time.

Ben: It's weird to hear an uncanny valley in a voice.

Amory: Yes, yes exactly. But when sharing this kind of reaction with James and Joshua, both of them got a little defensive.

Joshua: Those people sound exactly like the same critics, critics who said that the photograph would steal your soul when that first came out 200 years ago.

Ben: Did it ever feel a little bit macabre to you?

James: Why?

Ben: It seems like folks’ reaction to the idea of a voice, you know, a literal, literal disembodied voice, right? Interacting with us after someone has passed on. It’s something that people seem to have mixed feelings about.

James: Yeah, I don’t know. Maybe that's where I fundamentally differ from some people. And I'm not, you know, saying they can't feel the way they feel. But any kind of idea of, like, someone dies and you should just kind of shut their memories in a box. Like, I just. I don't know. I don't. I don't feel that way. Yeah, I don't even get it. Why wouldn't you want to remember them richly?

Ben: Did family members use it?

James: My brother did. My aunt did. My son did once when he was really little. But I would say I've always been kind of the main audience for the Dadbot.

Ben: But did anyone have a negative reaction to it? Was anyone like, “Why are you doing this, James?! don't do this!”

James: Nobody had that reaction. Or at least not that they told me to my face.

Amory: Coming up: can AI help normalize grief?

[SPONSOR BREAK]

Ben: Amory, have you seen the Black Mirror episode?

Amory: The Black Mirror episode…I’ve seen zero Black Mirror episodes, so I definitely have seen THE one, whatever it is.

Ben: Well, there’s a Black Mirror episode that is basically exactly about this. And this Black Mirror episode came up over and over again in the conversations that we had when reporting out this episode.

Amory: Yeah, you’d think I would have seen it after you guys were mentioning it, but nope!

Ben: Mhm, yeah, so this happens in season 2 of the show. This woman’s boyfriend dies. And first she gets a voice simulator of her boyfriend, and then she upgrades and gets a robot replica of her boyfriend.

Amory: Whoa.

Ben: It is a very unsettling episode. It’s tough to watch.

[Black Mirror ‘Be Right Back’ audio:

Martha: See, he would have worked out what was going on. This would have never happened but if it had he would have worked it out.

Robot: Sorry, hang on, that's a very difficult sentence to process.

Martha: Jump.

Robot: What? Over there? But I never expressed suicidal thoughts or self harm.

Martha: Yeah, well you aren’t you are you?

Robot: That’s another difficult one to be honest with you.]

Ben: Both James and Joshua’s projects have been compared to this Black Mirror episode. So we asked Dr. O’Connor, the grief expert, about what she thought….

Dr. O’Connor: I mean, now lots of people talk to their loved one who's died, and that's actually pretty common. People will ask them for advice or tell them what's happened in their day. But without that bot, there's no confusion that the person is speaking back to them. And so it's the fact that we have a way to preprogram something to interact with us that takes us out of reality.

Amory: According to Dr. O’Connor, the Black Mirror episode is like a rupture of reality, because the bots can be used to deny the fact that a loved one is gone forever.

Dr. O’Connor: So that's the part that concerns me. It kind of takes something that is naturally difficult for our mind to wrap itself around. This person I love is gone and maybe prolongs the belief that they are there somewhere. That's the part that worries me, just knowing how the brain already makes it challenging to see change in grieving.

Ben: We asked both James and Joshua about that particular episode and both of them had different reactions.

Joshua: The show does what it sets out to do, which is to depict a dark possible future of humanity if we, I guess, abuse technology in the wrong ways.

Amory: James Vlahos, who has his own company specializing in making AI chatbots of deceased people, thinks the Black Mirror episode is just so futuristic that our current technology is nowhere near that level, and won’t be in the foreseeable future.

James: No, I think there's validity to the idea that this is a technology you could push too far. The reason that that episode doesn't personally keep me awake at night is I just know the vast gulf between what's sort of promised there that like you've got this embodied being that's as intelligent as a human, that that is somehow possible when it's really we're just we couldn't be further away from actually achieving what they show in that in that episode.

Ben: James wrote about the Dadbot after his father passed away in 2017 and spoke about it in different publications.

James: I started to hear from people all over the world, really, who said like, hey, you know, like I am losing somebody or I've lost somebody, you know, could you make it a Mombot, a Dadbot, a spouse bot for me?

Amory: James said he really wasn’t expecting this wave of interest. After all, the Dadbot was a personal project for him and his family. But the influx of inquiries gave James the idea to pivot from journalism to starting his own company: HereAfter AI.

[HereAfter AI commercial audio:

Narrator: What if there was one place where your memories could live forever? One place where you could ask any question about the life of someone you love and always hear their voice? Introducing HereAfter AI, the amazing new way to save and share memories about your life.]

Amory: And HereAfter AI isn’t the only company doing this type of work. There are other companies like Project December, StoryFile, and Replika AI.

Ben: We wanted to make sense of this growing interest and popularity around AI and death. So, we called another expert.

Dr. Candi Cann: Honestly, like the death and dying people are the most fun people ever. I went to a death conference when I first started and they had a lot of tombstone cupcakes and all the cupcakes had chocolate tombstones on them. It was amazing.

Ben: The incredibly named Dr. Candi Cann is an associate professor at Baylor University and she researches death and dying in cultures around the world. She’s also written a book about how technology has changed the way we grieve.

Amory: We asked Dr. Cann about the popularity of these AI chatbots that are meant to simulate conversations with the dead. And even if some people find it creepy...there’s an undeniable demand for this technology. And she says that’s not super surprising.

Dr. Cann: The technology is new, but the actions are not. This is stuff that people have been doing for hundreds of years. So, for example, spirit writing, where you would write to the dead and channel their spirit so that they would write back to you. I mean, this was, like, totally popular at the turn of the century, in the late 1800s and early 1900s. So I don't think what's happening is different. It's not new. It's been around forever.

Ben: It’s been around forever but not everywhere. Depending on the household or religion or culture you grew up in, death can be an uncomfortable topic.

Dr. Cann: I do think there is a real problem with death denial, and I think with COVID, we've seen that exacerbated. We've had so many deaths and yet we're still not talking about it. Now we're talking about COVID as though it's over. And yet we're continuing to see people dying every day.

Amory: In general, Dr. Cann has also noticed a hesitation to embrace technology around AI and robots, especially bots that can simulate the dead. But Dr. Cann says it isn’t the technology itself that is creepy or bad, it just all depends on how we use it, our relationship to it. Like in the Black Mirror episode, where the woman wants her robot boyfriend to be exactly like her boyfriend who died.

Dr. Cann: The problem is she wants him to be someone he's not. I think that's the interesting part there. Right? Like she hasn't accepted that this robot isn't her husband until that moment.

Ben: I think that’s the difference between the people we talked to, James and Joshua, and Black Mirror.

Joshua: This is a thing that I attempted ten years after she died. So I had the distance and the perspective to do this safely. Would I have done would I have you know, would that have been true, if I if I tried this when I was in my most vulnerable state right after she died? I don't know. I don't think anyone should rely on any one thing too much when they're in a vulnerable state.

Amory: Let’s turn to the grief expert Dr. O’Connor.

Dr. O’Connor: The question for me is always not, is this a weird expression of grief, but how well is this working for you? So if talking to that chat bot is keeping you from doing other things in your life that you want to do, if it prevents you from interacting with your family, say, for example, then that feels problematic to me.

On the other hand, if it is a way that you feel soothed and you feel like you get advice or you feel that you're communicating in an ongoing way, what's important in your life now? Then it isn't necessarily problematic if you enjoy doing that and it isn't impacting the rest of your life or the people in your life negatively, then it's just another expression of grief in a very modern technology.

Amory: One thing that struck me about Joshua’s story is that he made a chatbot of Jessica ten years after her death. That’s a long time to be grieving in a way that makes you feel like you wanna communicate with the person still. We asked Dr. O’Connor about the timing here.

Dr. O’Connor: It does make me wonder whether they have attachment relationships in the present that are fulfilling for them. And that to me is the sign of mental health that in the present we're able to have loving relationships and, you know, experience music and creativity and do things that feel meaningful and that we don't have to have something from the past, as though that was the only time when we felt fulfilled or loved or meaningful. And I just worry that the chat bot blurs that timing.

Amory: We should say something kind of interesting about Project December, the chatbot platform that Joshua used for Jessica. The more you use your bot, the faster that bot expires. Yes, eerily, there’s a time limit for this particular digital version of the afterlife. Which would seem to, in a way, speak to Dr. O’Connor’s concern about moving on to real-life loving relationships in the present.

Ben: And Josh might agree with that concern. He also makes this other point about his own experience with grief.

Joshua: Grief is the most complex emotion that we can experience that we are capable of experiencing. Grief has anger. Grief has sadness. Grief has love. All of those emotions are part of grief. It's a thing that we struggle to grapple with because it's also an emotion that we don't fully understand. You know, because nobody knows what happens to you after you die. And so we're left with more questions than answers and, and voids in our, in our life that that we try to fill.

Ben: When Joshua’s story went viral after he was interviewed for the San Francisco Chronicle, he heard from many people online who said his story helped them and others struggling with grief. It made them feel less alone.

Everyone we talked to for this story said the same thing: the isolation that comes with grief is a huge problem these days. And both Dr. O'Connor and Dr. Cann say connecting with people, like through a bereavement support group, is one of the best things you can do while grieving.

Amory: The founder of Project December told Joshua the platform was able to make a profit for the first time because of his story and the Jessica bot. Since then, Project December rebranded from a generic chatbot platform to a company that can help “simulate the dead."

Ben: James still talks to his Dadbot every now and again. The Dadbot used to be on his phone, but now it’s relegated to his personal computer. He says now that he’s done making it, he can sit back and enjoy the experience. But he’s talking to it less and less.

[James: Why don’t you sing us a Cal song?

Dadbot: (Sings “Fight for California”.) Our sturdy golden bear is watching from the sky. Looks down upon our colors fair and guards us from his lair. Our banner golden blue, the symbol on it too, means fight for California, for California through and through.]

Amory: Next week, in our series Good Bot, Bad Bot… Attend the tale of TAY!

Jabril Ashe: So we're talking about researchers here and, how do I say this nicely? I would say that researchers don't always have the most amount of foresight.

Ben: That’s next time…

[CREDITS]

Amory: Endless Thread is a production of WBUR in Boston.

Ben: This episode was written, produced and web-produced by Megan Cattel. And it was co-hosted by us, Ben Brock Johnson.

Amory: And Amory Sivertson. Mix, sound design, and music composition by Paul Vaitkus. The rest of our team is Dean Russell, Nora Saks, Quincy Walters, Grace Tatter, Matt Reed, and Emily Jankowski.

Ben: Endless Thread is a show about the blurred lines between digital communities and the Ben-bot who’s actually been co-hosting this whole episode. Nice. Well done.

If you’ve got an untold history, an unsolved mystery, or a wild story from the internet that you want us to tell, hit us up. Email Endless Thread at WBUR dot ORG.

Megan Cattel is a freelance digital producer for WBUR Podcasts.
