“The FTC’s commitment to enforcing privacy laws across smart devices and apps is fantastic news for consumers,” Tony Pepper, CEO of security vendor Egress, told Lifewire via email. “Any company found in violation can expect to face the consequences set out in the law they’ve broken, such as the threat of injunction and financial penalties.”
For the People
In the letter, Kristin Cohen, the FTC’s Acting Associate Director of Privacy & Identity Protection, explained that a person’s precise location and information about their health are two of the most sensitive categories of data often collected by connected devices, including smartphones, smart cars, and wearables. Even on its own, such data poses an “incalculable risk” to a person’s privacy, Cohen reasoned, adding that when it is combined for the purposes of making money, the risk balloons into “unprecedented intrusion.”

“While many consumers may happily offer their location data in exchange for real-time crowd-sourced advice on the fastest route home, they likely think differently about having their thinly-disguised online identity associated with the frequency of their visits to a therapist or cancer doctor,” explained Cohen, illustrating the kind of misuse the FTC is talking about.

Training her guns on data aggregators and brokers that collate information from multiple sources to sell to the highest bidder, Cohen pointed to the FTC’s 2014 study, which found that data brokers could use such data to make sensitive inferences, such as categorizing a consumer as an “Expectant Parent.”

Gil Dabah, co-founder and CEO of Piiano, a company that safeguards customers’ personally identifiable information (PII) by helping developers comply with evolving privacy regulations, believes holding organizations’ feet to the fire for privacy protection is the right approach. “Does anyone who isn’t a lawyer think people will read privacy disclosures and weigh their risks over quick access to an app?” Dabah asked Lifewire rhetorically. “As if they could even understand the risks.”

More importantly, Dabah argues that properly securing such sensitive data is challenging and applauds the FTC for pointing out that ‘anonymizing’ alone isn’t enough to protect people’s PII.
People Power
Adding some context, Pepper pointed out that changes to data privacy laws in recent years have put consumers in the driving seat. “Recognizing the value and commoditization of personal data, new and updated laws put consumers back in control of their personal data,” noted Pepper, “through aspects such as informed consent about what data is being collected, greater transparency of how data is used and shared, and rights for data to be anonymized, amended, and erased.”

Referring to Cohen’s note, he added that, unfortunately, not every company plays by these rules.

Explaining the FTC’s concerns, Pepper said that, for starters, the commission is after apps that collect ‘too much’ data about their users, for example by tracking an individual’s location even when they aren’t actively using the app, in violation of the permissions they’ve set. Next come the companies that reidentify individuals for financial gain, such as health or fitness providers that combine geolocation data with health app data to target specific individuals with local services or offers.

“This new FTC notice helps enforcement for when data is knowingly shared but violates privacy laws,” Lior Yaari, CEO and co-founder of Grip Security, told Lifewire over email. “However, an even bigger problem is when companies unknowingly share or mishandle data that violates consumer privacy rights.”

Building on that, Dimitri Shelest, founder and CEO of OneRep, an online privacy company that helps people remove their sensitive information from the internet, argued that what’s urgently needed are laws regulating social media and big tech that guide how tech providers manage people’s privacy.

“Naturally, these companies are being guided by commercial interests, and our task is to install legislation to manage socially important issues like protecting consumer privacy and preventing information manipulation that influences public attitudes,” opined Shelest. “Any kind [of action] that will help advocate for consumers is a strong step in the right direction.”