‘Creepy’ data collection: Here’s how it hurts you

More than ever, businesses are talking about customer data: collecting it, using it for insights and maximizing profits. But what’s happening to yours — and how is it affecting the choices you have and the prices you’re offered?

Most people know that companies have at least some access to their data, but data privacy experts warn that no one really knows how many companies are gathering it, where it’s ending up, or how it could hurt them.

Last week, ProPublica reported that medical devices are gathering more and more data — heart rates, sleep patterns, the number of steps taken in a day — and patients have been shocked at the ways that information is being used.

According to the investigation, the data that’s constantly collected by step counters, blood glucose monitors, medication alerts and trackers, in-home cameras, heart monitors and CPAP breathing machines is being packaged and sold for advertising, anonymized and used by IT companies, and even shared with health insurers, who can use it to deny reimbursement.

We know what you bought last summer

But it turns out people who use health or lifestyle-monitoring devices aren’t the only ones who should be holding their breath.

Your credit card use, loyalty cards, phone location data, race, education level, TV habits, marital status, net worth, social media posts, bill payment history and browser use are all being collected. All day, every day, companies are sucking up vast amounts of your personal information from countless sources — then manipulating it, packaging it and selling it.

“There’s a whole universe of data brokerage, data transfers and profiling which most consumers know nothing about,” said Katharine Kemp, an expert in data privacy and consumer protection in digital financial services at the University of New South Wales in Sydney, Australia. “It’s invisible to us. And you won’t find companies explaining what goes on here.

“The Federal Trade Commission conducted an investigation into the data brokerage industry several years ago which would be a big eye-opener for most people, but still likely just shows the tip of the iceberg.”

Data brokerage firms collect your personal information from various companies you’ve dealt with and combine it with information they buy from each other — as well as some publicly available information — to build profiles on you made up of thousands of data points.

“Then they put you in certain lists based on what they’ve discovered and sell those lists to companies who want to know about you,” Kemp said. “You don’t know which lists you’re on. It could be ‘Diabetes Interest,’ ‘Cholesterol Focus,’ ‘Financially Challenged’ or ‘Urban Scramble.’

“Based on your list, you might be targeted with certain products, or charged higher prices or interest rates, or excluded from certain offers.”

The decisions companies make about you based on these handy parcels of your personal details are not harmless.

‘Unverified, error-prone’

Another investigation published by ProPublica in July said health insurers are using “unverified, error-prone ‘lifestyle’ data to make medical assumptions” that could lead them to improperly price plans.

“With little public scrutiny, the health insurance industry has joined forces with data brokers to vacuum up personal details about hundreds of millions of Americans …,” Marshall Allen wrote in the report. “The companies are tracking your race, education level, TV habits, marital status, net worth. They’re collecting what you post on social media, whether you’re behind on your bills, what you order online. Then they feed this information into complicated computer algorithms that spit out predictions about how much your health care could cost them.”

The potential harm goes well beyond product pricing and health care costs. Life- and business-changing opportunities like loans can be offered — or not — based on data we don’t even know is being collected.

‘Immune from scrutiny’

In testimony last year before the U.S. House of Representatives Subcommittee on Digital Commerce and Consumer Protection, University of Maryland law professor Frank Pasquale said the algorithms used by search engines, credit raters and major banks to convert our data into scores and rankings “are all too often immune from scrutiny,” and have critical consequences.

“The personal reputation business is exploding,” said Pasquale, who specializes in issues related to machine learning and algorithms. “Having eroded privacy for decades, shady, poorly regulated data miners, brokers and resellers have now taken creepy classification to a whole new level.

“They have created lists of victims of sexual assault, and lists of people with sexually transmitted diseases. Lists of people who have Alzheimer’s, dementia and AIDS. Lists of the impotent and the depressed. There are lists of ‘impulse buyers.’ Lists of suckers: gullible consumers who have shown that they are susceptible to ‘vulnerability-based marketing.’”

If that’s not disturbing enough, Pasquale notes that some companies sold the mailing addresses and medication lists of depressed people and cancer patients, and one firm reportedly combined credit scores and a person’s specific ailments into one report.

Even the Federal Trade Commission has so far been unable to get a handle on the extent of these practices or on the stores of data involved, which can easily be encrypted.

Encryption is not just the enemy of hackers; it’s also the enemy of transparency.

But I’m not diabetic

This leads to another surprise: The details used to build these pictures might not even be correct. In the lists and scores traded among companies, misinterpretations and errors are just as indelible as facts.

“In the kind of consumer profiles that we’ve been talking about, there may be mistakes made about whether that information is actually your information, or whether it supports the conclusion that that profile suggests,” Kemp said. “But given that most people have no way of finding out that there is this profile or score that relates to them, they also have no say in whether that profile or score can be amended. So it can go on having an effect on that person without their knowledge for many years — or for life.”

According to Pasquale, this kind of runaway data “can lead to cascading disadvantages,” in which the assumptions built in the digital world play out over and over in real life.

“Once one piece of software has inferred that a person is a bad credit risk, a shirking worker, or a marginal consumer, that attribute may appear with decision-making clout in other systems all over the economy,” he said.

Who reads privacy policies?

It’s tempting to say, ‘Simple — don’t share your data in the first place.’

Not so fast. There are good reasons the vast majority of consumers don’t read privacy policies for apps, web browsers, loyalty programs and online forms, Kemp said.

“Research has shown it would take the average person six working weeks a year to read all the privacy policies that apply to them,” she said, “so you have a physical impossibility for most people right there.

“On top of that, many of the policies are drafted in misleading and obscure ways: ‘We care about your privacy, we never sell your data, we use your data to enhance our services for you … Oh, by the way, clause 46, we may share your information with our affiliates for statistical and marketing purposes.’ Clause 46 is actually the most important clause in that policy.

“Even if you read and understood the policy, there’s still the question of whether you’re actually in a position to bargain for better terms. Are there competing services in this area that offer any better terms? Do you have to have that app to receive your child’s school report?”

It’s not real consent if you have no alternative.

‘Lobby your congressman’

In a recent investigation by Money, cybersecurity experts agreed that anyone who wants to protect their personal data from these uses is “pretty much out of luck.”

Can we prevent our data from being harvested without our knowledge in the first place? Security technologist Bruce Schneier told Money the answer is no.

“You can’t do anything,” he said. “That’s the fundamental problem with this.

“You live in the United States and the United States doesn’t regulate surveillance capitalism,” he added. “Your data can be bought and sold without your knowledge and consent. That’s the way it works.

“If you don’t like that, lobby your congressman. That is your only option.”

Note: Learn more about data use and data privacy in the Business Journal’s two-part series in the Nov. 16 and Nov. 23 editions.

Disclosure: CSBJ Digital Editor Helen Robinson and Katharine Kemp, who is cited in this article, are related.