
    Archive for the ‘Anonymity’ Category

    It’s Becoming Easier to Have Detailed Secret Surveillance from a Distance

    Wednesday, November 23rd, 2016

    The idea of secret surveillance from a distance isn’t new. For centuries, there have been undercover agents. Subsequently, there came hidden cameras and microphones. But there were limitations to this secret surveillance — such as the physical constraints of a human or camera located far from the person being watched. As surveillance technology has become more sophisticated, however, it is becoming easier to identify, watch, listen to, and judge people from a distance.

The judgment portion is, in part, based on biometric facial-recognition technology that incorporates expression recognition. For the unseen eyes, it’s no longer just about identifying a person, but also about watching their emotional responses. This type of facial-recognition tech gained attention a few years ago when Microsoft filed a patent for technology that would track individuals’ emotions and target advertising and marketing based upon a person’s mood.

    “Degrees of emotion can vary — a user can be ‘very angry’ or ‘slightly angry’ — as well as the duration of the mood. Advertisers can target people ‘happy for one hour’ or ‘happy for 24 hours,’” the Toronto Star reported in 2012. Four years later, the mood-identification technology can be bought off the shelf, as NBC News explains in a story about “a new immersive experience for moviegoers.” Read more »

    Criminalizing the Reidentification of ‘Anonymized’ Data Won’t Solve the Privacy Issue

    Monday, October 17th, 2016

    For years, companies and institutions have been using “anonymization” or “deidentification” techniques and processes to release data concerning individuals, saying that the techniques will protect personal privacy and preclude the sensitive information from being linked back to an individual. Yet we have seen time and again that these processes haven’t worked.

For almost two decades, researchers have told us that anonymization of private information has significant problems, and individuals can be re-identified and have their privacy breached. (I wrote a blog post last year detailing some of the research concerning re-identification of anonymized data sets.)
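The researchers’ basic point is easy to demonstrate. Below is a minimal, entirely hypothetical sketch in Python of a “linkage attack”: a “de-identified” dataset has names removed but keeps quasi-identifiers (ZIP code, birth date, sex), and an attacker joins it against a public record, such as a voter roll, that pairs those same fields with names. (Latanya Sweeney famously showed that ZIP code, birth date, and sex alone uniquely identify most Americans.) All records and names here are fabricated for illustration.

```python
# A "de-identified" medical dataset: names stripped, but quasi-identifiers left in.
deidentified = [
    {"zip": "03766", "birth": "1961-07-31", "sex": "F", "diagnosis": "hypertension"},
    {"zip": "03766", "birth": "1988-02-14", "sex": "M", "diagnosis": "asthma"},
]

# A public dataset (e.g., a voter roll) with the same quasi-identifiers plus names.
voter_roll = [
    {"name": "Jane Doe", "zip": "03766", "birth": "1961-07-31", "sex": "F"},
    {"name": "John Roe", "zip": "03766", "birth": "1988-02-14", "sex": "M"},
]

def reidentify(anon_rows, public_rows, keys=("zip", "birth", "sex")):
    """Join the two datasets on shared quasi-identifiers, attaching a
    name to each supposedly anonymous record that matches uniquely."""
    matches = []
    for anon in anon_rows:
        for pub in public_rows:
            if all(anon[k] == pub[k] for k in keys):
                matches.append({"name": pub["name"], "diagnosis": anon["diagnosis"]})
    return matches

for match in reidentify(deidentified, voter_roll):
    print(match["name"], "->", match["diagnosis"])
```

Nothing here is sophisticated, which is the point: removing names alone does not anonymize data when the remaining fields can be matched against other available datasets.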

    Recently, Australian Attorney General George Brandis announced that he would seek to amend the country’s Privacy Act to “create a new criminal offence of re-identifying de-identified government data. It will also be an offence to counsel, procure, facilitate, or encourage anyone to do this, and to publish or communicate any re-identified dataset.”

    According to the Guardian, the “impetus” for this announcement was a recent privacy problem with deidentified Medicare data, a problem uncovered by researchers. “A copy of an article published by the researchers outlines how every single Medicare data code was able to be reidentified by linking the dataset with other available information,” the Guardian reported. Read more »

    As Our Devices Increasingly Talk to Others, Privacy Questions Arise

    Thursday, December 17th, 2015

As technology continues to evolve and become integrated into our lives, there are significant questions about privacy and security. We’ve discussed before the “Internet of Things,” which is a computerized network of physical objects. In IoT, sensors and data-storage devices embedded in objects interact with Web services. Connected devices such as televisions and refrigerators can raise privacy and security questions.

For example, consider the “smart” or “connected” car. People buy such vehicles for the benefits of integrating technology into something they may occupy for hours at a time. Your car or truck knows where you go and when. It knows how fast you drive and how quickly or slowly you brake. Your car knows if you’re wearing a seatbelt.

Privacy experts have noted that unclear or vague privacy or usage policies could allow companies that collect drivers’ sensitive data to share or sell that information with others, creating databases that may invade the privacy of consumers. For example, the locations individuals drive to could reveal deeply personal information. Do you go to a church or mosque at the same time every week? Have you visited an adoption or fertility organization? Did you join a protest or demonstration? Did you recently start going to a building that includes the offices of several psychotherapists or one that houses a drug addiction clinic?

    One privacy issue recently arose with connected automobiles — and it caught many people off-guard. ABC25 in West Palm Beach, Fla., reported that a Ford car with opt-in 911 Assist allegedly ratted out a hit-and-run driver in Florida. Read more »

    Libraries Fight to Protect Users’ Rights to Privacy

    Friday, October 23rd, 2015

    A recent case in New Hampshire illustrates how libraries continue to be battlegrounds for privacy rights. The Kilton Public Library in Lebanon, N.H., a town of about 13,000 people, decided to join Tor, an anonymization network for online activities. It was a pilot for a bigger Tor relay system envisioned by the Library Freedom Project. According to Ars Technica, the Library Freedom Project seeks to set up Tor exit relays in libraries throughout the country. “As of now, only about 1,000 exit relays exist worldwide. If this plan is successful, it could vastly increase the scope and speed of the famed anonymizing network.”

The Department of Homeland Security learned of the pilot, ProPublica reported: “Soon after, state authorities received an email about it from an agent at the Department of Homeland Security. [...] After a meeting at which local police and city officials discussed how Tor could be exploited by criminals, the library pulled the plug on the project.”

After much criticism of the DHS and local law enforcement interference and petitions to reinstate the pilot project (including one from the Electronic Frontier Foundation), the Kilton library’s board voted a few weeks later to reinstate the project. “Alison Macrina, the founder of the Library Freedom Project which brought Tor to Kilton Public Library, said the risk of criminal activity taking place on Tor is not a sufficient reason to suspend its use. For comparison, she said, the city is not going to shut down its roads simply because some people choose to drive drunk,” the Valley News reported. Read more »

    Targeted Behavioral Advertising and What It Can Mean for Privacy

    Tuesday, September 8th, 2015

Targeted behavioral advertising is where a user’s online activity is tracked so that ads can be served based on the user’s behavior. What began as online data gathering has expanded; now there is online and offline collection and tracking of the habits of consumers. There have been numerous news stories about this privacy and surveillance issue.

    There is a fundamental issue about targeted behavioral advertising that divides industry and consumer advocates: opt-in or opt-out. Opt-in, the choice of consumer advocates, puts the burden on companies to have strong privacy protections and use limitations so consumers will choose to share their data. Opt-out, the choice of the majority of ad industry players, puts the burden on consumers to learn what the privacy policies are, whether they protect consumer data, whom the data is shared with and for what purpose, and how to opt out of this data collection, use and sharing.

    Companies can also buy information on individuals from data collectors. At times, the information can be wrong, causing problems for individuals. Read a previous post for more about data brokers.

What happens when data is gathered as a person browses the Internet? It can lead to innocuous advertisements for cars when you’re searching for a new vehicle or boots when you’re considering replacing ones that wore out last winter. Or it can lead to a more difficult situation when you’re faced with ads for strollers and car seats showing up on Web sites that you visit even though you had a miscarriage a month ago. It’s easy for advertisers to connect the dots when someone starts searching for infant safety gear or reading parenting Web sites and the person is unable to opt out of targeted behavioral advertising. Read more »

    When Software Can Read Your Emotions as You Walk Down the Street

    Wednesday, April 22nd, 2015

    I’ve written before about the increasing use of “digital signage.” What is “digital signage”? Most people have heard of the term connected with billboards or other screens that have cameras (and facial-recognition technology) to watch people watching ads in order to target advertising toward individuals. The data-gathering and surveillance practices raise substantial privacy questions.

    The Los Angeles Times reported on the expansion of these digital billboards and their use of facial-recognition biometric technology in casinos, Chicago-area bars and more. USA Today and the New York Times have detailed safety problems that can arise from these digital billboards. BBC News has reported on the use of digital billboards in the United Kingdom. The Wall Street Journal has reported on digital signage use in Japan.

Now, Wired reports on the more widespread use of software from the artificial intelligence startup Affectiva that “will read your emotional reactions” in real time. “Already, CBS has used it to determine how new shows might go down with viewers. And during the 2012 Presidential election, [Affectiva’s chief science officer Rana el Kaliouby’s] team experimented with using it to track a sample of voters during a debate.” Read more »