
    Archive for the ‘Anonymity’ Category

    As COPPA Turns 20, What’s Next for Children’s Privacy?

    Monday, October 29th, 2018

    The Children’s Online Privacy Protection Act became law in October 1998, and the Federal Trade Commission issued its implementing COPPA Rule, which took effect in 2000. It has been 20 years of ups and downs for privacy protection for children’s data, and parents seeking to safeguard their children’s personal information continue to face numerous challenges.

    As soon as they are born and are issued identification numbers, children face the risk of identity theft. Such thefts can go undetected for years, until a young adult has reason to use her Social Security number for a loan or credit card. Schools track children (and college students) with camera surveillance systems and with RFID-enabled school uniforms or ID cards. Some schools have started using biometric ID systems for students to pay for their lunches. And there are concerns about tracking apps such as ClassDojo, which teachers and parents can use to monitor students’ progress.

    The FTC marked the 20th anniversary by noting it has made changes to its Rule over the years: “by amending the Rule to address innovations that affect children’s privacy – social networking, online access via smartphone, and the availability of geolocation information, to name just a few. After hosting a national workshop and considering public comments, we announced changes to the Rule in 2013 that expanded the types of COPPA-covered information to include photos, video, or audio files that contain a child’s image or voice.”

    The Speed of Tech Advances Can Be a Hindrance to, But Also Can Help, Privacy Rights

    Tuesday, June 5th, 2018

    There has been an ongoing discussion about how privacy rights can be eroded because laws do not anticipate changing technology. The most prominent example is the Electronic Communications Privacy Act, which was passed in 1986 and remains mired in the technology of that time, which did not include cloud computing, location tracking via always-on mobile devices and other current technology that can reveal our most personal information. (The World Wide Web was invented three years later, in 1989.)

    While ECPA includes protection for email and voicemail communications, the 180-day rule is archaic as applied to how the technology is used today. (The rule is: If the email or voicemail message is unopened and has been in storage for 180 days or less, the government must obtain a search warrant. If the message is opened or has been stored unopened for more than 180 days, the government can access your message via a special court order or subpoena.) Thirty-two years ago, people had to download their email to their computers, and the download would trigger automatic deletion of the content from the provider’s servers. In 1986, the government could not subpoena an Internet Service Provider (ISP) for your email because the ISP no longer had a copy. Now, copies of your private email remain stored in the cloud for years by third-party service providers (Google, Facebook, Dropbox, etc.).
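    The two branches of the 180-day rule amount to a simple decision procedure, sketched below in Python (an illustrative simplification of the statute, not legal advice; the function name and return labels are invented for this example):

```python
def legal_process_required(opened: bool, days_in_storage: int) -> str:
    """Sketch of ECPA's 180-day rule for messages held by a provider."""
    if not opened and days_in_storage <= 180:
        # Unopened and stored 180 days or less: full warrant protection.
        return "search warrant"
    # Opened, or stored unopened for more than 180 days:
    # reachable with a special court order or subpoena.
    return "court order or subpoena"
```

    Note the asymmetry the sketch makes visible: under the rule as written, a years-old message you never opened receives less protection than one that arrived last week.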

    Privacy and civil liberty advocates have been trying for years to update ECPA. Last year, the U.S. House passed the Email Privacy Act, which would codify the rule set out in the Sixth Circuit’s 2010 decision in United States v. Warshak: The government must obtain a warrant before it can compel an ISP or other service provider to hand over a person’s private messages. This year, the Email Privacy Act is part of the House version of the National Defense Authorization Act, a must-pass bill. But the Senate has its own version of the NDAA, and it is unknown whether the privacy legislation will be part of it.

    Fitness Apps Can Be Fun, But Who Else Is Seeing Your Personal Data?

    Wednesday, March 28th, 2018

    Recently, an Australian student publicized that Strava, a fitness app, had published online a Global Heat Map that “uses satellite information to map the locations and movements of subscribers to the company’s fitness service over a two-year period, by illuminating areas of activity,” according to the Washington Post. Strava “allows millions of users to time and map their workouts and to post them online for friends to see, and it can track their movements at other times,” the New York Times reports.

    The data, culled from Strava’s 27 million users (owners of Fitbits and other wearable fitness devices), is not updated in real time. Yet the map still raised privacy and security questions for Strava’s users.

    A similar case in 2011, concerning the wearable device Fitbit, also raised privacy questions about searchable fitness data. There was an uproar over Fitbit’s privacy settings when people who were logging their sexual activity as a form of exercise learned that the data was showing up in Google searches. And in 2014, Jawbone faced criticism after it published data about how many people wearing its fitness tracker woke up during an earthquake in Northern California. People questioned whether Jawbone’s privacy and data-sharing policies had disclosed such use of their health data.

    Fitness devices, including smartwatches, and mobile health or wellness apps are used by tens of millions of people worldwide, and many such apps are available in Apple’s and Google’s app stores. The data gathered can reveal a great deal of personal information about individuals; in Strava’s case, two years’ worth of data made it possible to track patterns of activity.

    Happy International Privacy Day 2018

    Monday, January 22nd, 2018

    International Data Privacy Day is this Sunday, January 28. A variety of events are occurring this week to celebrate; visit the official site to find events in your area. Take the time to think about how privacy matters in your life and how you can protect your rights from being infringed upon. Please also consider donating to one of the many organizations working to protect your privacy rights.

    In China, a Steady March Toward Complete Surveillance of Its Citizenry

    Friday, December 22nd, 2017

    Decades ago, China began a system of online surveillance and censorship nicknamed “the Great Firewall of China.” Now that firewall is getting stronger, surveillance of the general public is broadening, and the surveillance is becoming focused enough that a particular individual can be targeted.

    China has long had a vast camera surveillance, or CCTV, system throughout the country, and it includes face-recognition technology. In June, the Wall Street Journal reported that industry researcher IHS Markit estimated “China has 176 million surveillance cameras in public and private hands, and it forecasts the nation will install about 450 million new ones by 2020. The U.S., by comparison, has about 50 million.” And the Chinese government is pairing the CCTV surveillance systems with biometric technology “on streets, in subway stations, at airports and at border crossings in a vast experiment in social engineering. Their goal: to influence behavior and identify lawbreakers.”

    The system is powerful. BBC News recently reported that, in a test, it took China’s surveillance system seven minutes to locate and apprehend one of its reporters. Notably, China’s CCTV system isn’t the only one to integrate face-recognition technology in order to better target individuals.

    What If the Rules About Newborn Blood Screenings Changed?

    Thursday, October 26th, 2017

    There has been an ongoing privacy and ethics debate over the unauthorized or undisclosed use of newborns’ blood samples for purposes other than the standard disease-screening, which includes about 30 conditions. Now, there’s a trial study, called BabySeq, from Brigham and Women’s Hospital that “uses genomic sequencing to screen for about 1,800 conditions, including some cancers,” CBS Miami reports.

    The privacy questions are clear: What happens to the DNA data — who keeps it, in what form, for how long — and who has access to it? The participants in the study have chosen to participate with, presumably, complete knowledge of the answers to these questions. But consider if the screening of 1,800 conditions, rather than the current 30, became the legal standard. This is a significant amount of highly personal information and there are substantial privacy issues.

    BabySeq co-director Dr. Robert Green has raised some of these issues. “We can’t predict what kind of discrimination is going to be occurring by the time your child grows up,” Green said. “We can’t predict whether there’s some sort of privacy breaches, this information gets out and is used against your child in some sort of future scenario. And we, most importantly, we can’t predict [whether] the information’s accurate.”