


New York Times: The Human Voice, as Game Changer

The New York Times reports on the increasing use of voice-recognition technology, but there are privacy questions about this biometric technology:

    Mr. Sejnoha is sitting in what looks like a living room but is, in fact, a sort of laboratory inside Nuance Communications, the leading force in voice technology, and the speech-recognition engine behind Siri, the virtual personal assistant on the Apple iPhone 4S.

    Here, Mr. Sejnoha, the company’s chief technology officer, and other executives are plotting a voice-enabled future where human speech brings responses from not only smartphones and televisions, cars and computers, but also coffee makers, refrigerators, thermostats, alarm systems and other smart devices and appliances. [...]

    Like many new technologies, sophisticated voice systems have potential drawbacks. Some experts worry about privacy invasions, others about our ever-deepening attachment to devices like smartphones. [...]

    Some privacy advocates worry that it adds an audio track to the digital trail that people leave behind when they use the Web or apps, potentially exposing them to more data mining.

    Voice recognition software works by sending speech to processors that break down spoken words into sound waves and use algorithms to identify the most likely words formed by the sounds. The system typically records and stores speech so it can teach itself to become more accurate over time. Nuance, for example, believes that, aside from the federal government, it has amassed the largest archive of recorded speech in the United States.

    Nuance says it is impossible to identify consumers from the recordings, because the company’s system recognizes people’s voices only by unique codes on their devices, rather than by their names. The company’s privacy policy says it uses the voice data of consumers only to improve its own internal systems. [...]

    Such assurances aside, voice recognition software could conceivably pose enough of a risk to people’s privacy that regulators in Washington are watching.

    “Just as we are concerned about the possible applications of facial recognition, there are other forms of biometric identification, like voice, that pose the same kind of problems,” says David C. Vladeck, the director of the Bureau of Consumer Protection at the Federal Trade Commission. He was speaking about voice technology in general, not about Nuance in particular.
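As a rough illustration of the decoding step the article describes (speech broken down into sounds, then algorithms picking the most likely words), here is a minimal sketch in Python. The candidate words, the scores, and the way they are combined are all invented for this example; this is not Nuance's method or any real recognizer's internals.

    # Toy sketch (illustrative only): a recognizer might pick the most likely
    # word for a chunk of audio by combining an acoustic score (how well the
    # sound matches each candidate word) with a language-model score (how
    # likely that word is in context). All scores below are made up.

    import math

    # Hypothetical acoustic scores: log-probability that the audio matches each word.
    acoustic_scores = {"wreck": -1.2, "recognize": -0.9, "wrecking": -1.5}

    # Hypothetical language-model scores: log-probability of each word in context.
    # Context is what separates similar-sounding candidates like these.
    language_model_scores = {"wreck": -4.0, "recognize": -1.1, "wrecking": -3.5}

    def most_likely_word(acoustic, language_model):
        """Return the candidate word with the highest combined log score."""
        combined = {
            word: acoustic[word] + language_model.get(word, -math.inf)
            for word in acoustic
        }
        return max(combined, key=combined.get)

    print(most_likely_word(acoustic_scores, language_model_scores))  # -> "recognize"

In this toy picture, the article's point about systems storing speech so they can "teach themselves to become more accurate" amounts to refining those acoustic scores from real users' recordings, which is exactly the audio trail that the privacy advocates quoted above are concerned about.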
