    Archive for the ‘Anonymity’ Category

    Happy International Privacy Day 2018

    Monday, January 22nd, 2018

    International Data Privacy Day is Sunday. There are a variety of events occurring this week to celebrate. Visit the official site to find events near your area. Take the time to think about how privacy is important in your life and how you can protect your rights from being infringed upon. Please also donate to any number of organizations out there trying to protect your privacy rights.

    In China, a Steady March Toward Complete Surveillance of Its Citizenry

    Friday, December 22nd, 2017

    Decades ago, China began a system of online surveillance and censorship that was nicknamed “the Great Firewall of China.” Now that firewall is growing stronger, surveillance of the broader public is expanding, and the monitoring is becoming focused enough to target particular individuals.

    China has long had a vast camera surveillance, or CCTV, system throughout the country, and it includes face-recognition technology. In June, the Wall Street Journal reported that industry researcher IHS Markit estimated “China has 176 million surveillance cameras in public and private hands, and it forecasts the nation will install about 450 million new ones by 2020. The U.S., by comparison, has about 50 million.” And the Chinese government is pairing the CCTV surveillance systems with biometric technology “on streets, in subway stations, at airports and at border crossings in a vast experiment in social engineering. Their goal: to influence behavior and identify lawbreakers.”

    The system is powerful. BBC News recently reported that, in a test, it took China’s surveillance system seven minutes to locate and apprehend one of its reporters. Notably, China’s CCTV system isn’t the only one to integrate face-recognition technology in order to better target individuals.  Read more »

    What If the Rules About Newborn Blood Screenings Changed?

    Thursday, October 26th, 2017

    There has been an ongoing privacy and ethics debate over the unauthorized or undisclosed use of newborns’ blood samples for purposes other than the standard disease-screening, which includes about 30 conditions. Now, there’s a trial study, called BabySeq, from Brigham and Women’s Hospital that “uses genomic sequencing to screen for about 1,800 conditions, including some cancers,” CBS Miami reports.

    The privacy questions are clear: What happens to the DNA data — who keeps it, in what form, for how long — and who has access to it? The participants in the study have chosen to take part with, presumably, complete knowledge of the answers to these questions. But consider what would happen if screening for 1,800 conditions, rather than the current 30, became the legal standard. That is a significant amount of highly personal information, and it raises substantial privacy issues.

    BabySeq co-director Dr. Robert Green has raised some of these issues. “We can’t predict what kind of discrimination is going to be occurring by the time your child grows up,” Green said. “We can’t predict whether there’s some sort of privacy breaches, this information gets out and is used against your child in some sort of future scenario. And we, most importantly, we can’t predict the information’s accurate.” Read more »

    Digital Advertisers Continue to Battle Private-Browsing Technology

    Monday, September 18th, 2017

    We have discussed the privacy issues connected with targeted behavioral advertising before. In this type of advertising, a user’s online activity is tracked so that ads can be served based on the user’s behavior. What began as online data gathering has expanded: companies now collect and track consumers’ habits both online and offline. For example, Google announced earlier this year that it “has begun using billions of credit-card transaction records” to try to connect individuals’ “digital trails to real-world purchase records in a far more extensive way than was possible before,” the Washington Post reported.

    Some people are uncomfortable with the tracking and targeting by companies and attempt to opt out. (Opt-out puts the burden on consumers to learn what the privacy policies are, whether they protect consumer data, with whom the data is shared and for what purpose, and how to opt out of this data collection, use, and sharing. Consumer advocates support opt-in policies, under which companies have an incentive to create strong privacy protections and use limitations so consumers will choose to share their data.) In response, people have installed ad-blocker technology to avoid seeing ads. However, there is online-tracking technology that can be difficult to block, such as “canvas fingerprinting.”
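    Canvas fingerprinting illustrates why some tracking is hard to block. Rather than storing a cookie, a script draws text to an invisible canvas element, reads the rendered pixels back, and hashes them into an identifier that varies with the device’s graphics hardware, fonts, and software stack. The sketch below shows the general idea only; it is not any particular tracker’s code, and the drawn text and hash function are arbitrary choices for illustration.

        // Illustrative sketch of canvas fingerprinting (not any specific tracker's code).
        // The same drawing renders slightly differently across GPUs, fonts, and OS
        // versions, so the resulting hash acts as a quasi-stable device identifier.
        function canvasFingerprint(): string {
          const canvas = document.createElement("canvas"); // off-screen, never added to the page
          canvas.width = 220;
          canvas.height = 30;
          const ctx = canvas.getContext("2d");
          if (!ctx) return "no-canvas";

          // Draw text and shapes whose exact pixel output depends on the local rendering stack.
          ctx.textBaseline = "top";
          ctx.font = "14px 'Arial'";
          ctx.fillStyle = "#f60";
          ctx.fillRect(100, 1, 62, 20);
          ctx.fillStyle = "#069";
          ctx.fillText("example fingerprint text", 2, 15);

          // Serialize the rendered pixels and hash them into a short identifier (FNV-1a).
          const data = canvas.toDataURL();
          let hash = 0x811c9dc5;
          for (let i = 0; i < data.length; i++) {
            hash ^= data.charCodeAt(i);
            hash = Math.imul(hash, 0x01000193) >>> 0;
          }
          return hash.toString(16);
        }

    Because the identifier comes from how the device renders graphics rather than from anything stored on it, clearing cookies does not remove it, and blocking it generally requires the browser itself to intervene.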

    People also have joined the Do Not Track movement. This can take the form of opting out of being tracked by e-mail address, or of having your Web browser send an opt-out signal to a company as you conduct your online activity. Federal lawmakers have also tried to pass Do Not Track legislation to protect kids.
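    For context on how the browser-side signal works: a browser with Do Not Track enabled simply adds the header “DNT: 1” to every request it sends, and whether a site honors that signal is entirely up to the site’s own code. The following sketch, written against Node’s built-in http module, shows roughly what honoring it could look like on the server; the tracking function and port number are hypothetical placeholders, not anyone’s actual implementation.

        import { createServer, IncomingMessage, ServerResponse } from "http";

        // Hypothetical placeholder for whatever analytics or ad-targeting a site might run.
        function recordVisitForAdTargeting(req: IncomingMessage): void {
          /* ... */
        }

        const server = createServer((req: IncomingMessage, res: ServerResponse) => {
          // Browsers with Do Not Track enabled send the header "DNT: 1" on every request.
          const dnt = req.headers["dnt"];
          if (dnt === "1") {
            // Honoring the signal is voluntary; a site that respects it simply skips tracking.
          } else {
            recordVisitForAdTargeting(req);
          }
          res.end("ok");
        });

        server.listen(8080); // port chosen arbitrarily for the example

    Since nothing forces a site to take the branch that skips tracking, the signal is only as strong as the recipient’s willingness to respect it, which is exactly the problem described below.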

    The battle between advertisers and privacy tools is ongoing. Apple’s Safari browser and Mozilla’s Firefox browser, for example, have included anti-tracking technology for years, yet some companies choose not to respect the Do Not Track signals sent by Web browsers. Read more »

    Be aware of privacy issues as your A.I. assistant learns more about you

    Friday, May 26th, 2017

    Update on June 6, 2017: Apple has introduced its own A.I. assistant device, the HomePod. Notably, the company says the device will only collect data after the wake command. Also, the data will be encrypted when sent to Apple’s servers. However, privacy questions remain, as with other A.I. assistants. 

    Artificial intelligence assistants, such as Amazon’s Echo or Google’s Home devices (or Apple’s Siri or Microsoft’s Cortana services), have been proliferating, and they can gather a lot of personal information on the individuals or families who use them. A.I. assistants are part of the “Internet of Things,” a computerized network of physical objects. In IoT, sensors and data-storage devices embedded in objects interact with Web services.

    I’ve discussed the privacy issues associated with IoT generally (relatedly, the Government Accountability Office recently released a report on the privacy and security problems that can arise in IoT devices), but I want to look closer at the questions raised by A.I. assistants. The personal data retained or transmitted on these A.I. services and devices could include email, photos, sensitive medical or other information, financial data, and more.

    And law enforcement officials could access this personal data. Earlier this year, there was a controversy concerning the data possibly collected by an Amazon Echo. The Washington Post explained, “The Echo is equipped with seven microphones and responds to a ‘wake word,’ most commonly ‘Alexa.’ When it detects the wake word, it begins streaming audio to the cloud, including a fraction of a second of audio before the wake word, according to the Amazon website. A recording and transcription of the audio is logged and stored in the Amazon Alexa app and must be manually deleted later.”  Read more »
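    The “fraction of a second of audio before the wake word” is possible because such devices keep a short rolling buffer of recent audio in local memory and only transmit it, along with what follows, once the wake-word detector fires. The sketch below illustrates that buffering pattern in general terms; it is not Amazon’s implementation, and the chunk type, detector, and upload function are invented stand-ins for the example.

        // Illustrative sketch of wake-word gating with a short pre-roll buffer.
        // Not Amazon's code; the detector and uploader are hypothetical stand-ins.
        type AudioChunk = Float32Array;

        class WakeWordGate {
          private preRoll: AudioChunk[] = [];          // rolling buffer of recent audio
          private streaming = false;

          constructor(
            private readonly preRollChunks: number,    // how much "before the wake word" to keep
            private readonly detectWakeWord: (chunk: AudioChunk) => boolean,
            private readonly uploadToCloud: (chunk: AudioChunk) => void,
          ) {}

          onAudio(chunk: AudioChunk): void {
            if (this.streaming) {
              this.uploadToCloud(chunk);               // everything after the wake word goes out
              return;
            }
            // Keep only the last few chunks locally until the wake word is heard.
            this.preRoll.push(chunk);
            if (this.preRoll.length > this.preRollChunks) this.preRoll.shift();

            if (this.detectWakeWord(chunk)) {
              this.streaming = true;
              // Flush the buffered pre-roll first, so the stream includes audio from
              // just before the wake word was spoken.
              for (const buffered of this.preRoll) this.uploadToCloud(buffered);
              this.preRoll = [];
            }
          }
        }

    The privacy question then concerns everything past that gate: once audio is streamed, the recordings and transcripts remain in the vendor’s app and on its servers until someone deletes them.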

    New Year? Time for a New Assessment of Your Privacy Setup.

    Tuesday, January 17th, 2017

    People use a lot of services and devices to transmit and retain sensitive personal information. A person could use daily: a work computer, a personal computer, multiple email addresses, a work cellphone, a personal cellphone, an e-reader or tablet, a fitness tracker or smart watch, and an Artificial Intelligence assistant (Amazon’s Echo, Apple’s Siri, Google’s Assistant, or Microsoft’s Cortana). The data retained or transmitted on these services and devices could include sensitive medical or other information, personal photos, financial data, and more.

    There’s also the issue of collected data revealing other information that was never explicitly shared. For example, I wrote recently about health-app data and the surprising results of scrutinizing it. A man was alarmed by his wife’s heart-rate data, as collected by her Fitbit, and asked others for assistance analyzing it. One theory: She could be pregnant. Did you know that heart-rate changes could signal a pregnancy?

    There’s currently a controversy concerning the data possibly collected by an Amazon Echo. The Washington Post explains, “The Echo is equipped with seven microphones and responds to a ‘wake word,’ most commonly ‘Alexa.’ When it detects the wake word, it begins streaming audio to the cloud, including a fraction of a second of audio before the wake word, according to the Amazon website. A recording and transcription of the audio is logged and stored in the Amazon Alexa app and must be manually deleted later.” Arkansas police have served a warrant to Amazon, as they seek information recorded by a suspect’s Echo. Amazon has refused to comply with the warrant. Read more »