The ACLU of Maryland has called for safeguards to be built into a statewide policy regulating the use of facial recognition by law enforcement, arguing that its use poses “significant risks” to the public.
In a letter sent last week, civil rights groups warned that the technology, which uses artificial intelligence to identify people, could lead to misidentifications and wrongful arrests, increasing the risks to people of color and women. The letter also warned that the tool could enable “mass surveillance” when used with video footage.
The best solution to those risks is a total ban, said Nate Freed Wessler, deputy director of the ACLU's Speech, Privacy & Technology Project and a co-author of the ACLU of Maryland letter. But in places like Maryland, where lawmakers have chosen to regulate the technology rather than ban it, police can take “real, serious steps” to minimize the risks.
“The Maryland Legislature has made a good-faith effort to address some of the harms of this technology,” Wessler said. “The state police now have a chance to be among the leaders in the country in serious safeguards against misuse. … The ball is in their court.”
Earlier this year, state lawmakers enacted restrictions on law enforcement's use of the technology, including providing that facial recognition results cannot be the sole basis for probable cause, banning its use for less serious crimes and requiring the state to disclose during discovery in criminal cases when the technology has been used.
The law requires the Maryland State Police to adopt and publish a model policy on facial recognition that will serve as a guide for whether local agencies can use the technology in criminal investigations.
Maryland State Police spokeswoman Elena Russo said in an email that the department's model policy is expected to be completed by Oct. 1.
“The Department is committed to ensuring that our policies reflect the values and expectations of our communities while protecting the constitutional rights of the people we serve and protect,” Russo said in a statement.
The ACLU of Maryland said in its letter to state police, sent Thursday, that it expects the model policy to include three “minimum protections”:
- Clarify that a “face lineup” produced using facial recognition technology cannot serve as the independent evidence required for probable cause or a positive identification;
- Prohibit the use of facial recognition to identify or track people appearing in recorded video footage;
- Prohibit contracting with companies whose facial recognition databases use images collected without individuals' consent.
The ACLU said the state policy should further spell out what additional evidence is needed to meet the probable cause standard, because a false facial recognition match could “bias” subsequent identification procedures, such as witnesses reviewing photo lineups.
“When facial recognition algorithms get it wrong, they spit out images that look like the suspect — basically a doppelganger, a look-alike,” Wessler said. “So what they present to witnesses is an image of someone who looks like the suspect, often someone who looks a lot like the suspect, but is not actually the suspect. And, of course, people make mistakes.”
That was the case for Porcha Woodruff, a Detroit woman who was arrested for carjacking after the victim identified her mugshot in a photo lineup, even though she was eight months pregnant at the time. The charges against her were later dropped, and she sued last year. The city subsequently adopted new rules for how its police department can use the technology as part of a settlement reached in a separate lawsuit.
One of Detroit's new restrictions bars police from putting images of people identified through facial recognition technology in front of witnesses unless there is other evidence linking those people to the crime.
The ACLU of Maryland pointed to that policy in its letter, saying it wants Maryland's policy to clarify that arrest warrants or citations must be supported by “additional and independently obtained evidence” beyond facial recognition results. The group also wants the policy to require supervisors to determine whether there is “independent evidence” that a person may be connected to a crime before facial recognition results are used in other identification procedures.
Wessler, who worked on the lawsuit that led to Detroit's policy change, said the proposal is one of the most important things police can do to prevent false matches from misleading them.
Additionally, the ACLU wants Maryland to further restrict the use of facial recognition technology on recorded video. The recently passed state law prohibits the use of facial recognition for “live or real-time identification.” But the group wants the model policy to also bar the analysis of recorded footage, including from police body cameras, because that could lead to “automated tracking or identification of an individual's movements, activities, or relationships over time.”
Finally, the ACLU wants Maryland's policy to ensure that law enforcement agencies do not contract with or purchase facial recognition technology that collects images in violation of federal or state law or without consent.
State law allows police to use the state driver's license database, law enforcement mugshot databases and other matching databases, but only if law enforcement's contracts with those databases contain provisions “governing the manner in which images in the databases are collected.”
Facial recognition technology has played a role in at least one wrongful arrest in Maryland, first reported by Wired and The New Yorker. Baltimore County State's Attorney Scott Shellenberger said last year that the technology was used to identify a suspect in an assault case. When a photo of one of the possible matches was shown to that man's probation officer, the officer misidentified him as the suspect. The man was detained until his wife convinced police that he wasn't the culprit.
Shellenberger helped craft the bill, a process that involved working groups hashing out differences and reaching compromises. On Monday, he called it “comprehensive legislation” that balances the needs of law enforcement with the concerns of people worried about privacy, noting that it includes checks on the technology's use and a requirement that it cannot be the sole evidence in a case.
He said the tool is a modern way for police to identify suspects in dangerous crimes.
“Twenty or 30 years ago, you would sit at a police station and look at a mugshot album,” he said. “Now you can do that on a computer, you can look at driver's licenses, and you're not limited to the state of Maryland.”
Shellenberger said he hadn't seen the ACLU letter but maintained that everyone worked closely together on the final product and no further changes were needed.
“I don't think we necessarily have to add anything at this point,” he said. “All the restrictions are already in the law. There's no need to add to that. (The model policy) should follow the letter of the law.”
Privacy, surveillance and technology policy experts praised the safeguards proposed by the ACLU of Maryland.
Jake Laperruque, deputy director of the Security and Surveillance Project at the Center for Democracy and Technology in Washington, D.C., said the group's proposals would “promote responsible use and better protect public safety.” He called facial recognition technology a double-edged sword, carrying the risk of mistakes that lead to wrongful arrests as well as the danger that it could be used to pinpoint individuals and track or retaliate against them.
Laperruque suggested a further step could be to bar police from using facial recognition unless they have a warrant, as some other states have mandated.
Jeramie Scott, director of the Surveillance Oversight Project at the Electronic Privacy Information Center (EPIC), said the ACLU's recommendations appear designed to reduce the likelihood of “worst-case scenarios.”
Scott said he supported a total ban but that such safeguards were important for as long as the technology remains in use, and he also called for “extensive training” for those analyzing search results.