A Clean Scan on Facial Recognition

Is the Oakland Airport's new security software the high-tech equivalent of unlawful search and seizure?

When a PR firm for Imagis Technologies announced that the Oakland Police Department soon would be installing facial recognition technology as a security measure at the Oakland International Airport, alarm bells started ringing. News stories based on the company’s press release, fueled by concern stemming from the September 11 attacks, concluded that the airport was on the verge of using the software to scan passengers’ faces to look for terrorists. Worried civil liberties activists, plus local officials such as Alameda County Supervisor Keith Carson, went on the record wondering if such software might encourage racial profiling or unlawful searches and seizures.

The press release from the Canadian biometric software company, which was fed to thousands of business page editors via PR Newswire, claimed that the company’s ID-2000 software “can be installed at a number of points within an airport — from check-in to take-off — providing unobtrusive opportunities for airport law-enforcement officials to quickly and effectively identify and apprehend suspicious individuals.” The Oakland Tribune attributed to OPD Sergeant Mark Schmid (whom it referred to as “Larry”) the notion that the airport was exploring the “idea of installing cameras to sweep the crowds … looking for any match with known criminals in a federal database of thousands.” Even papers as far away as Australia’s Hobart Mercury latched onto the story, writing that “every passenger will be screened to make sure their face matches up to names on picture IDs such as passports and driver’s licenses.” ABC television’s 20/20 called startled airport officials for details, but apparently never pursued the story. It was, after all, wrong.

Since the erroneous press release went out, the Oakland Police Department has rushed to assure the public that the rumors were overblown. “The indications from some media outlets that we’ll be scanning everyone who comes into the airport is totally wrong,” said OPD Deputy Chief Patrick Haw. “We have no intention of doing that.” Instead, Haw said the department plans to use the software as a booking tool, adding that the only people whose pictures will be taken are those who are arrested and removed from the terminal to the OPD’s airport substation.

Buying the Imagis software was not a knee-jerk reaction to antiterrorist sentiment, Haw said. “This is technology we already have in place in the city jail,” he said. “We’ve been using it for about a year now, and all it’s for is to identify people after they’ve been arrested.” In fact, he added, the OPD had been planning to bolster its presence at the airport well before the September 11 attacks. And the technology, part of a $2.65 million contract between Imagis and the Alameda County Sheriff’s Department, probably won’t be installed for another six months. But there’s no denying that security is tighter now at Oakland International, especially in the wake of unflattering media reports about guards found asleep at their posts and some very public wrangling over who should take over curbside security — one of two private companies or the Sheriff’s Department. The airport already has boosted its lower-tech security measures, including temporarily discontinuing curbside baggage check-in, closing off parking space within a 300-foot perimeter of the building, limiting the number of bags people can take on planes, and instituting an additional “pre-check” of passengers’ tickets and photos.

Telephone calls about facial recognition technology, however, caught airport officials by surprise. “When I started getting the phone calls, I was like ‘What?’” said airport spokesperson Cyndy Johnson, who attributes the misunderstanding to the work of an overzealous PR firm. “They thought, ‘OPD in the news over security? Let’s get in there!’”

Nor were the news outlets or civil liberties groups that questioned the use of facial recognition technology totally overreacting to the possibility of crowd sweeps. After all, it’s happened before. At this year’s Super Bowl in Tampa, Florida, facial recognition technology was deployed to scan the football fans, resulting in nineteen detentions. Ybor City, a Tampa entertainment district, currently uses 36 mounted cameras equipped with similar software to match the faces of passersby with those in a law enforcement databank. But if the Imagis system is installed, it would make Oakland International the first airport in the US to use such technology, and only the second in North America, after Toronto’s Pearson Airport. Officials at Boston’s Logan Airport and Providence, Rhode Island’s T.F. Green Airport also have announced plans to install similar technology.

Facial scanning is a hot issue, one not likely to cool off anytime soon. The American Civil Liberties Union has backed a high-profile campaign against the technology’s more clandestine uses, along the way gaining a powerful if unlikely ally in Representative Dick Armey (R-TX). The group opposes the secret surveillance of people in public places, saying it puts unsuspecting citizens in a virtual lineup. “There’s no such thing as bad technology, just technology that has potentially intrusive uses and that needs to be deployed very, very carefully,” said ACLU public education coordinator Jay Stanley. What makes the ACLU queasy about facial recognition software is that — unlike other systems requiring subjects’ cooperation by scanning their hands or eyes, or taking voice samples — facial scans can be done in secret. “Facial recognition is so unreliable compared to something like thumbprints that you have to wonder why it’s used,” Stanley said. “Really there’s only one advantage, and that is that the subjects don’t know they’re being subjected to it. It’s a passive technology that can be done from hundreds of feet away.”

Imagis execs frankly acknowledge that facial recognition doesn’t require the subject’s cooperation. “You don’t have to say, ‘Put your finger here,’ or ‘Put your eye here so we can read your iris,’” said Imagis CEO and President Iain Drummond. This remote nature is the technology’s strength, he added, because it allows officials to identify a dangerous criminal with no more than a photograph.

There’s also no reason to believe that most citizens won’t support giving law enforcement another high-tech tool to use, especially if it keeps the airports safe. A recent Harris Poll found that 86 percent of Americans said they’d support the use of facial recognition software in airports. “My view on that is if you’d asked the population of the States before September 11, you’d get a ten percent acceptance rate,” said Drummond. “We’ve had no complaints in the airport environment because everyone is very keen that when they get on the airplane, terrorists don’t get on there with them.” And certainly, business has been good at Imagis lately. “The phones are just ringing off the hook,” Drummond said.

While the ACLU’s main grievance hinges on the technology’s potential to violate citizens’ Fourth Amendment right to be free from “unreasonable search and seizure,” right now the group also is focused on a more practical issue: Does facial recognition really work? The answers vary depending on who is asked. “It just doesn’t work very well right now,” Stanley said. “Error rates are very high in terms of flagging bad guys and recognizing good guys.” The ACLU points to two recent studies, one by the Department of Defense that documented a significant problem with systems making incorrect matches, and another by the National Institute of Standards and Technology that found that 43 percent of the time, photos of the same person taken eighteen months apart resulted in missed matches. Critics also believe the cameras can be tricked by simple disguises — changes in facial hair, Groucho glasses, even cotton stuffed into one’s mouth to change one’s jawline. And beyond failing to recognize criminals, they say, it could flag innocent people. “What if your photo gets in the database by accident and you start getting stopped everyplace, or even if your photo wasn’t there and you just looked like someone?” asked Stanley. “It could make your life hell.”

Officials with Imagis maintain that their software, which compares relationships between more than two hundred different facial points, is not so easily fooled. “From our point of view, our technology is extremely reliable,” Drummond said. “We ignore things that can be changed — hair color and length, beards and mustaches. We look at the central face from eyebrows to upper lip, the curvature of the eye socket, the hollows of the cheek, things you can’t realistically change.” Drummond said the software isn’t even tricked by the effects of aging; tested on photographs of children as they aged from three to ten years old — dramatic growth spurt years — it was able to make the correct matches.
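Imagis has not published the inner workings of ID-2000, but the general approach Drummond describes, reducing a face to a set of geometric measurements and then comparing those measurements between photographs, can be sketched in a few lines of code. The landmark names, distance pairs, and threshold below are illustrative assumptions for demonstration, not the company’s actual method:

    # Illustrative sketch only: a toy version of landmark-based matching.
    # Landmark names, distance pairs, and the threshold are assumptions
    # for demonstration, not Imagis' published ID-2000 algorithm.
    import math

    def signature(landmarks):
        """Reduce a face to scale-free ratios of central-face distances."""
        def dist(a, b):
            (x1, y1), (x2, y2) = landmarks[a], landmarks[b]
            return math.hypot(x2 - x1, y2 - y1)
        # Normalize by inter-eye distance so overall image size drops out.
        scale = dist("left_eye", "right_eye")
        pairs = [("left_eye", "nose_tip"), ("right_eye", "nose_tip"),
                 ("nose_tip", "upper_lip"), ("left_eye", "upper_lip")]
        return [dist(a, b) / scale for a, b in pairs]

    def match_score(sig_a, sig_b):
        """0.0 means identical geometry; larger means less alike."""
        return math.dist(sig_a, sig_b)

    THRESHOLD = 0.05  # assumed value; real systems tune this against error rates

    booking = {"left_eye": (100, 120), "right_eye": (160, 120),
               "nose_tip": (130, 160), "upper_lip": (130, 185)}
    # The same face photographed at twice the size: the ratios are unchanged.
    new_photo = {k: (2 * x, 2 * y) for k, (x, y) in booking.items()}

    if match_score(signature(booking), signature(new_photo)) < THRESHOLD:
        print("possible match -- refer to an officer for review")
    else:
        print("no match")

The threshold is where the policy debate lives: set it too loose and the software flags innocent look-alikes; set it too strict and it misses disguised criminals. That is the very trade-off the Department of Defense and National Institute of Standards and Technology studies measured.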

There is, however, one group of people the system definitely can’t pick out — those who aren’t in the database because they have no prior criminal record. And, as Stanley pointed out, there are no serial suicide hijackers. So will facial recognition software really help thwart violent criminals like terrorists? Take what happened with the nineteen people detained at the Super Bowl. Some were false alarms and were released; most of the rest turned out to be petty criminals such as pickpockets and ticket scalpers, hardly threats to national security. “Is this a technology to make the world safe against ticket scalping?” Stanley asked. “Policy makers and security professionals should be honest about the reason this system is being put in place; they should stop pretending this system is going to catch Osama bin Laden if it’s not.”
