
    National Journal: Privacy Groups Sound the Alarm Over FBI’s Facial-Recognition Technology

    The National Journal reports that privacy organizations are asking the Department of Justice to consider the privacy implications of the FBI’s controversial facial-recognition database, which will soon be fully operational:

    More than 30 privacy and civil-liberties groups are asking the Justice Department to complete a long-promised audit of the FBI’s facial-recognition database.

    The groups argue the database, which the FBI says it uses to identify targets, could pose privacy risks to every American citizen because it has not been properly vetted, possesses dubious accuracy benchmarks, and may sweep up images of ordinary people not suspected of wrongdoing.

    In a joint letter sent Tuesday to Attorney General Eric Holder, the American Civil Liberties Union, the Electronic Frontier Foundation, and others warn that an FBI facial-recognition program “has undergone a radical transformation” since its last privacy review six years ago. That lack of recent oversight “raises serious privacy and civil-liberty concerns,” the groups contend. […]

    The Next Generation Identification program—a biometric database that includes iris scans and palm prints along with facial recognition—is scheduled to become fully operational later this year and has not undergone a rigorous privacy litmus test—known as a Privacy Impact Assessment—since 2008, despite pledges from government officials.

    “One of the risks here, without assessing the privacy considerations, is the prospect of mission creep with the use of biometric identifiers,” said Jeramie Scott, national security counsel with the Electronic Privacy Information Center, another of the letter’s signatories. “It’s been almost two years since the FBI said they were going to do an updated privacy assessment, and nothing has occurred.” […]

    A 2010 government report made public last year through a Freedom of Information Act request filed by the Electronic Privacy Information Center stated that the agency’s facial-recognition technology could fail up to 20 percent of the time. When used against a searchable repository, that failure rate could be as high as 15 percent.

    But even those numbers are misleading, privacy groups contend, because a search can be considered a success if the correct suspect is listed within the top 50 candidates. Such an “overwhelming number” of false matches could lead to “greater racial profiling by law enforcement by shifting the burden of identification onto certain ethnicities.”
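    The “top 50 candidates” criterion is worth spelling out. The sketch below is a hypothetical illustration (the function names, candidate IDs, and cutoff handling are my own, not the FBI’s actual evaluation code) of how a rank-based hit criterion can count a search as a “success” while still presenting dozens of non-matching people to investigators on every query.

        # Hypothetical sketch of a rank-based "success" criterion like the one
        # described above: a search counts as a hit if the true subject appears
        # anywhere in the top-k candidate list the system returns.

        def is_hit(candidate_list, true_subject_id, k=50):
            """A search is 'successful' if the true subject is among the top k candidates."""
            return true_subject_id in candidate_list[:k]

        def false_candidates(candidate_list, true_subject_id, k=50):
            """Everyone else in the top-k list is a non-matching person shown to an analyst."""
            return [c for c in candidate_list[:k] if c != true_subject_id]

        # Toy example: the system ranks 50 candidates for one probe image.
        ranked = [f"person_{i}" for i in range(50)]
        ranked[37] = "true_subject"   # correct match buried at rank 38

        print(is_hit(ranked, "true_subject"))                  # True -> counted as a success
        print(len(false_candidates(ranked, "true_subject")))   # 49 innocent candidates surfaced

    Under this kind of metric, the reported success rate says little about how often the person at the top of the list is actually the right one, which is the gap the privacy groups are pointing to.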
