ICE Uses Facial Recognition To Go Through Driver's Licenses, Researchers Say

STEVE INSKEEP, HOST:

There is a logic behind a newly revealed use of data by federal immigration authorities. Many states allow people who are in the U.S. without legal status to obtain a driver's license. Now researchers have found that Immigration and Customs Enforcement, along with the FBI, have been running facial recognition searches against databases of driver's license photos, looking for immigrants of interest. Jake Laperruque is here to talk about this. He is senior counsel at the Project on Government Oversight, an independent group that focuses on corruption and abuse of power.

Welcome to the program.

JAKE LAPERRUQUE: Hi. Thanks so much for having me.

INSKEEP: How did these things come to light?

LAPERRUQUE: Well, this most recent revelation came from FOIA requests through the Georgetown Center on Privacy and Technology. Facial recognition and the government's use of it is something that we've been steadily uncovering over the past several years, and what we've been learning is this is not some distant sci-fi tech; it's happening now. It's something that affects hundreds of millions of Americans. Over half of all American adults are enrolled in some law enforcement facial recognition database already.

INSKEEP: FOIA request - Freedom of Information Act - so this is government documentation. There's no doubt this is happening. Is it legal that the FBI and that ICE would be doing this with presumably millions of driver's license photos?

LAPERRUQUE: By current law, it's legal. Unfortunately, right now it really is a Wild West in terms of facial recognition. There are only a couple of states that have any sort of limits on law enforcement or other agencies' use of this technology, and there are really no limits at the federal or state level on the process for conducting facial recognition searches or on how this information can be used. So we have a very powerful technology, a technology that's very prone to error, but there really are no rules. It's a bit of a free-for-all on what the government can do.

INSKEEP: You just said an important thing - very prone to error. Do you mean that someone might be identified as in the U.S. illegally, and maybe their status is fine - it's a totally different person?

LAPERRUQUE: Yes, that's exactly right. Facial recognition - there's a high degree of misidentification that can occur. This goes up based on the settings, and a lot of law enforcement entities do not receive proper training on what limits they should put on this or on the specific, narrow scenarios in which they should be using it. There are also a number of studies that have found that facial recognition is much more likely to misidentify women and people of color, especially people with darker skin.

So when you're conducting these searches, you know, even if you're a law-abiding citizen, even if there's no reason the government should be looking at you for anything, there's a chance that, as long as you're in these databases, you're going to come up and become the target of an investigation because of a glitchy computer system.

INSKEEP: Now, let's grant that what you said is factual. There have been studies of this. It's been documented that facial recognition software doesn't always work. But let's look at the case where it works, where someone is properly identified. If Immigration and Customs Enforcement goes through a database, and they locate someone, and they find out they're living in upstate New York or wherever, and they are able to identify them as someone who's in the U.S. illegally and needs to be deported, what's wrong with that? They did violate the law.

LAPERRUQUE: Well, there's a few things. I mean, I think, first of all, there is always that risk of misidentification. While you might get it right sometimes, I think we have to ask, is the cost of this being wrong so often worth those occasional correct hits? I would say no. And then additionally, I mean, there are many instances throughout society where we say that getting some results is just not worth the cost to privacy and also the chilling effects that we see.

INSKEEP: Is this moving us toward a China-like system, where they have this social credit system and they're keeping track of all kinds of data points about citizens?

LAPERRUQUE: Unfortunately, yeah. Absolutely. If we look at what's probably the closest model to the United States in terms of facial recognition, the answer is probably China. Obviously, they don't have a lot of the general civil rights and civil liberties protections that we do, which hopefully will kick in here at some point, but China has no rules or restrictions on this, just like the U.S., and they have a dragnet system. That seems to be where we're headed if we don't put in rules.

INSKEEP: Mr. Laperruque, thanks so much.

LAPERRUQUE: Thank you.

INSKEEP: Jake Laperruque, senior counsel at the Project on Government Oversight. Transcript provided by NPR, Copyright NPR.
