
Why Massachusetts Should Pass The Facial Recognition Moratorium Act

A giant KIA video screen advertises facial recognition in prototype vehicles as patrons walk past at the Consumer Electronics Show International Jan. 9, 2019, in Las Vegas. (Ross D. Franklin/AP)

Massachusetts recently marked the 150th anniversary of one of the most damaging errors in scientific history. In 1868, a French scientist, Étienne Léopold Trouvelot, imported a new breed of caterpillar to the state. He knew that American silk-spinning caterpillars were susceptible to disease, so he hoped to hybridize them with new imports. In actuality, he imported gypsy moths. The moths promptly escaped, spread and became an environmental scourge. Now, they defoliate 1 million acres of American forest a year, costing $868 million.

As a professor focused on teaching law students how to deploy new technologies, I have seen the digital equivalent of gypsy moths unleashed at a scale unimaginable in the 19th century. The proliferation of face and biometric recognition technologies is particularly concerning. These tools secretly record us when we’re in public, then store our information in databases to make us instantly recognizable by our voice, retinas, face or gait. Police in Massachusetts are currently using biometric recognition technology — scanning photos in the Registry of Motor Vehicles' database to search for suspects in criminal investigations, for instance — without any legislative approval or judicial oversight.

Biometric recognition tools are not only in use here. They have also been used in China, where the government deploys them to efficiently round up religious minorities and police petty crimes like jaywalking. If you think that sort of abuse is unimaginable in America, consider the recent revelation that federal authorities distributed a secret list of activists, lawyers and reporters to stop for added scrutiny at the border because of their criticism of the current administration’s policies.

Commuters walk by surveillance cameras installed near a subway station in Beijing, Feb. 26, 2019. The Chinese government has used facial recognition and other technology to tighten control over society. (Andy Wong/AP)

In addition to raising privacy concerns, these tools can also be inaccurate. In one recent test, Amazon’s facial recognition tool falsely identified 28 members of Congress. It also disproportionately misidentified people of color, tagging them as people who had been arrested for a crime. The tools not only steal our anonymity — they may tell police we are someone we are not.

Even tech companies that stand to profit are sounding the alarm. Google recently announced it would not release a general face surveillance product “before working through the important technology and policy questions.” Microsoft’s president published a blog post calling for the government to step in and regulate the technology.

We should heed these warnings, before the situation spirals out of control. Thankfully, lawmakers in Massachusetts have introduced a bill, known as the Face Surveillance Moratorium Act, that recognizes the dangers unregulated biometric surveillance poses to our basic rights and freedoms. The bill says that, before we use these tools, we need to debate how, when and why they’re to be used, and decide who will have oversight to prevent abuse. Notably, the act doesn’t permanently ban the technology. Instead, it follows the path recommended by the tech giants who created them: Consider their use carefully, and legislate accordingly.

The proper balance between authority and privacy is personal for me, not only because I care about democracy, but also because of my own background in law enforcement.

In this July 10, 2018 photo, a camera with facial recognition gets installed at Lockport High School in Lockport, N.Y. The school district is adding technology that can be programmed to look for expelled students, sex offenders or weapons and alert officials. (Carolyn Thompson/AP)

As a young man, I served as an Operations Support Technician in the U.S. Secret Service, spanning the period before and after 9/11. Protecting our highest officials and supporting criminal investigations was among the highest honors of my life. But more than a decade after returning my badge and gun, I received an alarming letter from the federal government. It said the government had been hacked by foreign agents, and I was one of millions of federal employees whose security forms had been stolen. A foreign, hostile government had gotten our complete files, including dozens of pages detailing employees’ backgrounds, beliefs, family and friendships, and financial information.

My file was supposedly kept in a “secure” computer database. Reality proved otherwise. In the end, the Chinese government got my secrets and Uncle Sam gave me five years of free credit monitoring for my trouble. I wish our government had paused to analyze the safeguards that were supposed to have kept my confidential data safe.

From gypsy moths to privacy-invading technologies, the butterfly effect can morph a tiny ripple into a hugely destructive force. In the case of face recognition software, we still have time to pause, and we should do so by passing the Face Surveillance Moratorium Act.


Gabe Teninbaum Cognoscenti contributor
Gabe Teninbaum is a professor at Suffolk University Law School, where he serves as the Director of the Institute on Legal Innovation & Technology.

