    Archive for the ‘First Amendment’ Category

    License-plate-reader Technology Is Proliferating, And Questions Remain

    Wednesday, June 28th, 2017

A couple of years ago, we discussed the increasing use of license-plate-recognition camera technology and the possible privacy, civil liberty, and security implications of surveillance tech used to gather and record information on drivers’ movements. At the time, we noted that license-plate-reader technology (also called automated license plate readers, or ALPRs), like other surveillance systems, can be used to create a profile of an individual from personal, possibly sensitive data. Now, the technology is in even more jurisdictions nationwide, and the privacy questions remain.

    Two examples of the proliferation of the license-plate-reader technology are in Rhode Island and Tennessee. In Rhode Island, state legislators are considering HB 5531, “An Act Relating to Motor and Other Vehicles — Electronic Confirmation and Compliance System,” which would create a state-wide license-plate-reader network to identify and fine uninsured drivers. The chief sponsor is Rep. Robert Jacquard (D), who “said he has made a number of changes to address fears of growing state surveillance and concerns the cameras could be used to expand highway tolling,” reports the Providence Journal.

    The ACLU of Rhode Island testified (pdf) against the bill, noting “this legislation would nevertheless facilitate the capture and storage of real time location information on every Rhode Islander on the road, with no guidance as to how this information is to be used, at the benefit of a third-party corporation.” ACLU-RI wants the state to “implement clear and specific restrictions on the use of this technology, particularly by law enforcement” and notes such restrictions are included in HB 5989, whose chief sponsor is Rep. John G. Edwards (D). Read more »

    Be aware of privacy issues as your A.I. assistant learns more about you

    Friday, May 26th, 2017

    Update on June 6, 2017: Apple has introduced its own A.I. assistant device, the HomePod. Notably, the company says the device will only collect data after the wake command. Also, the data will be encrypted when sent to Apple’s servers. However, privacy questions remain, as with other A.I. assistants. 

Artificial intelligence assistants, such as Amazon’s Echo or Google’s Home devices (or Apple’s Siri or Microsoft’s Cortana services), have been proliferating, and they can gather a lot of personal information about the individuals or families who use them. A.I. assistants are part of the “Internet of Things,” a computerized network of physical objects. In IoT, sensors and data-storage devices embedded in objects interact with Web services.

    I’ve discussed the privacy issues associated with IoT generally (relatedly, the Government Accountability Office recently released a report on the privacy and security problems that can arise in IoT devices), but I want to look closer at the questions raised by A.I. assistants. The personal data retained or transmitted on these A.I. services and devices could include email, photos, sensitive medical or other information, financial data, and more.

    And law enforcement officials could access this personal data. Earlier this year, there was a controversy concerning the data possibly collected by an Amazon Echo. The Washington Post explained, “The Echo is equipped with seven microphones and responds to a ‘wake word,’ most commonly ‘Alexa.’ When it detects the wake word, it begins streaming audio to the cloud, including a fraction of a second of audio before the wake word, according to the Amazon website. A recording and transcription of the audio is logged and stored in the Amazon Alexa app and must be manually deleted later.”  Read more »

    New Year? Time for a New Assessment of Your Privacy Setup.

    Tuesday, January 17th, 2017

People use a lot of services and devices to transmit and retain sensitive personal information. In a single day, a person could use: a work computer, a personal computer, multiple email addresses, a work cellphone, a personal cellphone, an e-reader or tablet, a fitness tracker or smart watch, and an Artificial Intelligence assistant (Amazon’s Echo, Apple’s Siri, Google’s Assistant, or Microsoft’s Cortana). The data retained or transmitted on these services and devices could include sensitive medical or other information, personal photos, financial data, and more.

    There’s also the issue of the collection of information that could lead to other data being learned. For example, I wrote recently about health-app data and the surprising results of scrutinizing it. A man was alarmed by his wife’s heart rate data, as collected by her Fitbit, and asked others for assistance analyzing it. One theory: She could be pregnant. Did you know that heart-rate changes could signal a pregnancy?

There’s an ongoing controversy concerning the data possibly collected by an Amazon Echo. The Washington Post explains, “The Echo is equipped with seven microphones and responds to a ‘wake word,’ most commonly ‘Alexa.’ When it detects the wake word, it begins streaming audio to the cloud, including a fraction of a second of audio before the wake word, according to the Amazon website. A recording and transcription of the audio is logged and stored in the Amazon Alexa app and must be manually deleted later.” Arkansas police have served a warrant to Amazon, as they seek information recorded by a suspect’s Echo. Amazon has refused to comply with the warrant. Read more »

    As Our Devices Increasingly Talk to Others, Privacy Questions Arise

    Thursday, December 17th, 2015

    As technology continues to evolve and become integrated into our lives, there are significant questions about privacy and security. We’ve discussed before the “Internet of Things,” which is a computerized network of physical objects. In IoT, sensors and data-storage devices embedded in objects interact with Web services. Such connected televisions, refrigerators and other devices can raise privacy and security questions.

    For example, consider the “smart” or “connected” car. People buy such vehicles for the benefits of integrating technology into something where they can be for hours at a time. Your car or truck knows where you go and when. It knows how fast you drive and how quickly or slowly you brake. Your car knows if you’re wearing a seatbelt.

Privacy experts have noted that unclear or vague privacy or usage policies could allow companies that collect drivers’ sensitive data to share or sell that information, creating databases that may invade consumers’ privacy. For example, the locations individuals drive to could reveal deeply personal information. Do you go to a church or mosque at the same time every week? Have you visited an adoption or fertility organization? Did you join a protest or demonstration? Did you recently start going to a building that includes the offices of several psychotherapists or one that houses a drug addiction clinic?

    One privacy issue recently arose with connected automobiles — and it caught many people off-guard. ABC25 in West Palm Beach, Fla., reported that a Ford car with opt-in 911 Assist allegedly ratted out a hit-and-run driver in Florida. Read more »

    Libraries Fight to Protect Users’ Rights to Privacy

    Friday, October 23rd, 2015

    A recent case in New Hampshire illustrates how libraries continue to be battlegrounds for privacy rights. The Kilton Public Library in Lebanon, N.H., a town of about 13,000 people, decided to join Tor, an anonymization network for online activities. It was a pilot for a bigger Tor relay system envisioned by the Library Freedom Project. According to Ars Technica, the Library Freedom Project seeks to set up Tor exit relays in libraries throughout the country. “As of now, only about 1,000 exit relays exist worldwide. If this plan is successful, it could vastly increase the scope and speed of the famed anonymizing network.”

The Department of Homeland Security learned of the pilot, ProPublica reported: “Soon after state authorities received an email about it from an agent at the Department of Homeland Security. [...] After a meeting at which local police and city officials discussed how Tor could be exploited by criminals, the library pulled the plug on the project.”

After much criticism of the DHS and local law enforcement interference and petitions to reinstate the pilot project (including one from the Electronic Frontier Foundation), the Kilton library’s board voted a few weeks later to reinstate the project. “Alison Macrina, the founder of the Library Freedom Project which brought Tor to Kilton Public Library, said the risk of criminal activity taking place on Tor is not a sufficient reason to suspend its use. For comparison, she said, the city is not going to shut down its roads simply because some people choose to drive drunk,” the Valley News reported. Read more »

    When Software Can Read Your Emotions as You Walk Down the Street

    Wednesday, April 22nd, 2015

I’ve written before about the increasing use of “digital signage.” What is “digital signage”? Most people have heard of the term in connection with billboards or other screens that have cameras (and facial-recognition technology) to watch people watching ads in order to target advertising toward individuals. The data-gathering and surveillance practices raise substantial privacy questions.

    The Los Angeles Times reported on the expansion of these digital billboards and their use of facial-recognition biometric technology in casinos, Chicago-area bars and more. USA Today and the New York Times have detailed safety problems that can arise from these digital billboards. BBC News has reported on the use of digital billboards in the United Kingdom. The Wall Street Journal has reported on digital signage use in Japan.

Now, Wired reports on the more widespread use of software from the artificial intelligence startup Affectiva that “will read your emotional reactions” in real time. “Already, CBS has used it to determine how new shows might go down with viewers. And during the 2012 Presidential election, [Affectiva’s chief science officer Rana el Kaliouby’s] team experimented with using it to track a sample of voters during a debate.” Read more »