
How To Teach Ethics To Algorithms

A Bloomberg News analysis of Amazon Prime's same-day delivery showed predominantly black neighborhoods across the country were being left out. (Paul Sakuma/AP)

Our world is increasingly run by algorithms, but what happens when we sacrifice the human touch for pure machine efficiency?

One answer is Amazon's same-day delivery debacle. Last month, an analysis by Bloomberg found that Amazon's same-day delivery service was available to everyone in Greater Boston except residents of three ZIP codes in the predominantly black neighborhood of Roxbury. On a map of Amazon's delivery coverage, Roxbury stands alone, completely surrounded by areas that receive same-day packages.

A week after the report was released, Amazon confirmed it would begin rolling out same-day delivery to Roxbury.

How do we avoid similar dilemmas in the future?

Guest

Michael Luca, assistant professor of business administration at Harvard Business School. He tweets @Mike_Luca.

More

Bloomberg: Amazon Doesn’t Consider the Race of Its Customers. Should It?

  • "In six major same-day delivery cities, however, the service area excludes predominantly black ZIP codes to varying degrees, according to a Bloomberg analysis that compared Amazon same-day delivery areas with U.S. Census Bureau data. In Atlanta, Chicago, Dallas, and Washington, cities still struggling to overcome generations of racial segregation and economic inequality, black citizens are about half as likely to live in neighborhoods with access to Amazon same-day delivery as white residents."

The Boston Globe: Why Doesn’t Amazon Offer Same-Day Delivery In Roxbury?

  • "Roxbury residents on Thursday expressed dismay, but not shock. They said Amazon’s decision to leave the neighborhood out is just another example of the second-class services they’ve come to expect from government and businesses."

Harvard Business Review: Algorithms Need Managers Too

  • "Myopia is also the underlying weakness of programs that produce low-quality content by seeking to maximize click-throughs. The algorithms are optimizing for a goal that can be measured in the moment—whether a user clicks on a link—without regard to the longer-range and more important goal of keeping users satisfied with their experience on the site."

The New York Times: Discrimination by Airbnb Hosts Is Widespread, Report Says

  • "A working paper by three Harvard researchers found 'widespread discrimination' by hosts against people with black-sounding names seeking rentals. Fictional guests set up by the researchers with names like Lakisha or Rasheed were roughly 16 percent less likely to be accepted than identical guests with names like Brent or Kristen."

The Upshot: When Algorithms Discriminate

  • "Research from the University of Washington found that a Google Images search for 'C.E.O.' produced 11 percent women, even though 27 percent of United States chief executives are women. (On a recent search, the first picture of a woman to appear, on the second page, was the C.E.O. Barbie doll.) Image search results determined 7 percent of viewers’ subsequent opinions about how many men or women worked in a field, it found."

The Atlantic: When Discrimination Is Baked Into Algorithms

  • "A recent ProPublica analysis of The Princeton Review’s prices for online SAT tutoring shows that customers in areas with a high density of Asian residents are often charged more. When presented with this finding, The Princeton Review called it an 'incidental' result of its geographic pricing scheme. The case illustrates how even a seemingly neutral price model could potentially lead to inadvertent bias—bias that’s hard for consumers to detect and even harder to challenge or prove."

This segment aired on May 11, 2016.
