Editor’s note: This is the second in a two-part series about the impacts of the collection, distribution and commercial use of consumer data. Part I was published in the Nov. 16 edition of the Business Journal.
All day, every day, companies are sucking up vast amounts of your personal information, manipulating it, packaging it and selling it.
Your data is the lifeblood of an unfathomably large industry that, at least in the United States, is opaque and almost wholly unregulated. And what companies decide about you based on these handy parcels of your personal details can have startling impacts on your life.
It’s not harmless.
“This is not simply a matter of receiving mildly annoying ads that show that someone was ‘looking over your shoulder’ when you were browsing for gym shorts. There is a vast and complex ecosystem of companies that profit from our personal information, almost entirely without our knowledge,” said Katharine Kemp, an expert in data privacy and consumer protection in digital financial services at the University of New South Wales in Sydney, Australia. “Your credit card transactions, your store loyalty card history and a number of lifestyle factors can be used by these companies to make conclusions about your health, the likelihood that you suffer from a certain disease, your reliability, your weaknesses; and seemingly harmless information like the make of your car or the fact that you purchased plus-size clothing or buy red meat or alcohol or milk, can all take on meaning in this context.”
These conclusions are used to create consumer “scores” and consumer profiles with real consequences, Kemp said.
“They might determine whether you’ll have access to certain credit or health insurance, or whether you’ll be charged higher interest rates or premiums, or higher prices for other services,” she said. “Even without a score or profile, a company might extract certain information about you by using cookies to find out, for example, that you have a more expensive brand of mobile phone or that you’ve searched for exactly the same day and destination of flight multiple times, so charge you higher prices for your airfares.”
‘Hypersensitive and hyper-detailed’
Lina Khan, a Legal Fellow at the Federal Trade Commission who researches antitrust law and competition policy, outlined similar concerns last week on NPR’s 1A.
“The dangers … are all the more stark and severe when you have companies … that are critically involved in collecting hypersensitive and hyper-detailed information about us that will enable them potentially to discriminate,” she said. “… [W]e’re in a world where each company — especially digital companies — would be able to target us with specific prices, so we could all be seeing a different price for the same good or service based on what this company happens to think we’d be willing or able to pay. That’s a dramatically different kind of power than even Walmart enjoyed as the monopolist of the last century. …
“What I’m describing is known as first-degree price discrimination, which means that every individual person will be charged a different price, based on what this company knows about your willingness to pay, about how often you visit the website, what time of day, how often you keep items in your shopping cart without clicking on them. … You and your neighbor could pay totally different prices for the same good.”
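The mechanism Khan describes can be sketched in a few lines of code. To be clear, the signals, weights, and function below are entirely hypothetical, invented only to illustrate how a retailer could fold tracked behavior into a per-shopper price; no company’s actual model is shown or implied:

```python
# Hypothetical illustration of first-degree price discrimination.
# Signals mirror the ones Khan lists: visit frequency, time of day,
# and how long an item has sat unclicked in the shopping cart.
# All weights are invented for the sketch.

def personalized_price(base_price, visits, hour, cart_dwell_days, premium_phone):
    """Return a per-shopper price by nudging the base price up or down
    according to inferred willingness to pay."""
    multiplier = 1.0
    if visits >= 5:            # repeat visits suggest strong interest
        multiplier += 0.10
    if 9 <= hour <= 17:        # daytime browsing used as a proxy signal
        multiplier += 0.05
    if cart_dwell_days >= 3:   # a lingering cart suggests hesitation,
        multiplier -= 0.08     # so a small discount may close the sale
    if premium_phone:          # pricier device as a rough wealth proxy
        multiplier += 0.07
    return round(base_price * multiplier, 2)

# Two "neighbors" see different prices for the same $200 fare:
print(personalized_price(200.00, visits=6, hour=11, cart_dwell_days=0, premium_phone=True))
print(personalized_price(200.00, visits=1, hour=22, cart_dwell_days=4, premium_phone=False))
```

The point of the sketch is that none of these inputs is the product itself; each is a behavioral or demographic inference of exactly the kind Kemp and Khan describe, and two shoppers buying the identical item can end up at very different prices.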
Kemp emphasized that the kinds of consumer profiles companies trade in are different from the credit scores we’re familiar with, and that they are applied in unexpected areas of our lives.
“Some health insurance companies collect personal information from wearable devices which they provide to their customers,” she said. “They’ll sometimes say they don’t use this information to charge customers higher premiums but only to decide what discount they’ll offer you on your premiums. It doesn’t take a rocket scientist to realize you’re still paying more, whatever label you put on it.”
An investigation published by ProPublica in July found that health insurers are using “unverified, error-prone ‘lifestyle’ data to make medical assumptions” that could lead them to improperly price plans.
“With little public scrutiny, the health insurance industry has joined forces with data brokers to vacuum up personal details about hundreds of millions of Americans …,” Marshall Allen wrote in the report. “The companies are tracking your race, education level, TV habits, marital status, net worth. They’re collecting what you post on social media, whether you’re behind on your bills, what you order online. Then they feed this information into complicated computer algorithms that spit out predictions about how much your health care could cost them.
“Are you a woman who recently changed your name? You could be newly married and have a pricey pregnancy pending. Or maybe you’re stressed and anxious from a recent divorce. That, too, the computer models predict, may run up your medical bills. …”
The potential harm goes well beyond product pricing and health care costs. Life- and business-changing opportunities like loans can be offered — or not — based on data we don’t even know is being collected.
‘Immune from scrutiny’
In testimony last year before the U.S. House of Representatives Subcommittee on Digital Commerce and Consumer Protection, University of Maryland law professor Frank Pasquale said the algorithms used by search engines, credit raters and major banks to convert our data into scores and rankings “are all too often immune from scrutiny,” and have critical consequences.
“The personal reputation business is exploding,” said Pasquale, who specializes in issues related to machine learning and algorithms. “Having eroded privacy for decades, shady, poorly regulated data miners, brokers and resellers have now taken creepy classification to a whole new level.
“They have created lists of victims of sexual assault, and lists of people with sexually transmitted diseases. Lists of people who have Alzheimer’s, dementia and AIDS. Lists of the impotent and the depressed. There are lists of ‘impulse buyers.’ Lists of suckers: gullible consumers who have shown that they are susceptible to ‘vulnerability-based marketing.’”
If that’s not disturbing enough, Pasquale noted that some companies have assembled and sold the mailing addresses and medication lists of depressed people and cancer patients, and one firm reportedly combined credit scores and a person’s specific ailments into a single report. Even the Federal Trade Commission has so far been unable to get a handle on the extent of these practices and the stores of data behind them, which are easily encrypted. Encryption is not just the enemy of hackers; it’s also the enemy of transparency.
But I’m not diabetic
This leads to another surprise: The details used to build these pictures might not even be correct. In the lists and “scores” that are assembled and traded among companies, misinterpretations and errors are just as indelible as facts.
“In the kind of consumer profiles that we’ve been talking about, there may be mistakes made about whether that information is actually your information, or whether it supports the conclusion that that profile suggests,” Kemp said. “But given that most people have no way of finding out that there is this profile or score that relates to them, they also have no say in whether that profile or score can be amended. So it can go on having an effect on that person without their knowledge for many years — or for life.”
According to Pasquale, this kind of runaway data “can lead to cascading disadvantages,” in which the assumptions built in the digital world play out over and over in real life.
“Once one piece of software has inferred that a person is a bad credit risk, a shirking worker, or a marginal consumer, that attribute may appear with decision-making clout in other systems all over the economy,” he said.
There’s little in the law to stop companies from selling their profiles of you, Pasquale noted, and information can be layered, combined, swapped and recombined by numerous data brokers, including credit bureaus, analytics firms, catalog co-ops, direct marketers, list brokers and affiliates.
“If you looked at a diagram of how these various companies interact,” Kemp said, “you would be stunned at exactly how many companies there are in that ecosystem and the complex web of information sharing between those companies and the companies you deal with directly.”
Your life in whose hands?
And if identity theft strikes fear in your heart, think of the gold mine these data stores represent to cybercriminals.
“Even huge and sophisticated companies can be hacked…,” Pasquale said in his testimony. “In at least one case, an established U.S. data broker accidentally sold Social Security and driver’s license numbers — as well as bank account and credit card data on millions of Americans — to [identity] thieves.”
“Every time your personal details, and more of them, are shared with another company, you become more vulnerable to fraud and identity theft,” Kemp said. “The reality is that all companies will be hacked at some point. That doesn’t mean we need to avoid all disclosure, but it does mean we should only disclose our information to the extent that it provides some real benefits for us.
“There’s no point increasing our risk for little or no benefit. But again, it’s a question of whether the data privacy regulation in your jurisdiction really permits you to make those choices.”
Disclosure: CSBJ Digital Editor Helen Robinson and Katharine Kemp, who is cited in this article, are related.