December 12, 2003

Face Recognition and False Positives

An Arizona school will install cameras and face-recognition software to look for "sex offenders, missing children and alleged abductors", according to an AP story by Michelle Rushlo.

Two cameras, which are expected to be operational next week, will scan faces of people who enter the office at Royal Palm Middle School. They are linked to state and national databases of sex offenders, missing children and alleged abductors.

An officer will be dispatched to the school in the event of a possible match, said Maricopa County Sheriff Joe Arpaio.

This looks like a classic security mistake: ignoring the false positive problem.

Claims vary, but it seems safe to assume that this kind of face-recognition technology, deployed in this kind of environment, will have a false positive rate of at least 1%. In other words, for every 100 completely innocent people who pass in front of the camera, on average at least one person will be misidentified as a target (i.e., a sex offender or abductor).

Now let's do the math. If 200 people enter the school office each day, that means two false positives every day -- two pointless trips to the school by police officers. And how often does a sex offender or child abductor enter the office of that particular school? Almost never, I would guess, especially if they know the cameras are there.
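The arithmetic above can be sketched in a few lines. The 1% false positive rate and 200 visitors per day come from the post; the prevalence figure (1 target per 100,000 visitors) and the perfect detection rate are hypothetical assumptions chosen to be generous to the system:

```python
# Back-of-the-envelope false-positive arithmetic for the school camera system.

false_positive_rate = 0.01   # assumed misidentification rate per innocent visitor (from the post)
visitors_per_day = 200       # assumed daily office visitors (from the post)

# Expected false alarms per day: each innocent visitor trips the alarm
# with probability 1%, so on average 2 pointless police trips per day.
false_alarms_per_day = false_positive_rate * visitors_per_day
print(false_alarms_per_day)  # 2.0

# Hypothetical prevalence: suppose 1 in 100,000 visitors really is a target,
# and (generously) the system never misses a true target.
prevalence = 1 / 100_000
true_alarm = prevalence * 1.0
false_alarm = (1 - prevalence) * false_positive_rate

# Probability that a person who sets off the alarm is actually a target:
ppv = true_alarm / (true_alarm + false_alarm)
print(f"{ppv:.4%}")  # roughly 0.1% -- about 999 of every 1,000 alarms are false
```

Even with a perfect detection rate, the rarity of real targets means essentially every alarm is a false one, which is the base-rate problem the post is describing.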

So the vast, overwhelming majority of people singled out by this system will be innocent. Is the school really going to call the police every time? Are the police really going to conduct an investigation each time, to tell whether the person is lying when they deny being a sex offender? I doubt they will.

Responding to these calls, and investigating them thoroughly, can't be the best way for police officers to spend their time. Sometimes false positives are the most expensive aspect of a security technology.

Posted by Ed Felten at December 12, 2003 12:53 PM