Here's the Real Score: Big Data Knows Everything About You!

Your life is an open book.

Unless you live off the grid, big data companies know all about you. They constantly collect information about the things you buy and the websites you visit, plus thousands of other bits of personal information gathered from public records and your social media activity.

This data is run through sophisticated algorithms that make it possible for retailers, utilities and financial institutions to predict how you will respond to various marketing offers. These programs can predict whether you will pay your bills on time or whether you are sick. They can even predict that a woman is pregnant.

The consumer scores generated from this "predictive analysis" determine what ads you see when you go online, the offers for products and services you get in the mail and the coupons that are sent to you via email.

"This predictive world is on its way and I'm not sure we're fully ready for it," said Pam Dixon, executive director of the World Privacy Forum.

You can get your credit score, but there is no way to find out about these other consumer scores that have an ever-increasing impact on your life. Credit bureaus are highly regulated. Big data is not.

In its new report, "The Scoring of America," the World Privacy Forum outlines how these secret consumer scores can threaten privacy and fairness. And it calls on federal regulators to police this relatively new industry of predictive scoring.

"Not all scores are bad, but any score that's secret is a problem because it's a simple matter of fairness not to have a secret score," Dixon said. "I'm really concerned about potential discrimination lurking in scores that we don't see and don't know what goes into them."

You will never know when a predictive score was used or how it affected the offers you did or did not receive. And even if you did find out, you can't change that score.

Robert Gellman, a privacy and information policy consultant who worked on the report, said these scores can be based on thousands of factors, including race, religion, age, gender, household income, ZIP code, medical conditions and purchase history.

"There could be a discriminatory effect here—some kind of red-lining that isn't visible on the surface," he said.

And that's the rub. Whether you get a discount coupon isn't a big deal, but if consumer scoring keeps you from receiving credit card offers or makes you a target for sub-prime loans—because of your ZIP code, household income or race—well that's another matter.

The companies that create and market this ever-expanding treasure-trove of personal information don't see a problem or the need for any additional regulation. They say the marketing information they gather and analyze is used to offer people relevant ads and money-saving deals.

"Marketers want the most accurate information possible," said Rachel Nyswander Thomas, executive director of the Direct Marketing Association's Data-Driven Marketing Institute. "But at the end of the day, if the data is wrong and the predictive analytics is wrong, the worst thing that can happen to a consumer is that the ad or offer they get is not relevant."

Jennifer Barrett Glasgow, chief privacy officer at Acxiom, one of the country's biggest data brokers, explained that a marketing score really isn't all that personal.

"It's a mathematical computation that puts a group of individuals into a defined audience for a marketing campaign," she said. "And these scores aren't static in the same way that credit scores are. Their lifespan may be milliseconds."
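Glasgow's description of a marketing score, a transient computation that sorts people into an audience for a single campaign, can be sketched in a few lines. The signals, weights and cutoff below are invented purely for illustration; real data brokers' models are proprietary and draw on thousands of attributes:

```python
# Toy illustration only (not any broker's actual model): a marketing
# "score" as a weighted sum of behavioral signals, computed once to
# place shoppers into a campaign audience and then discarded.

def propensity_score(profile, weights):
    """Weighted sum of whatever signals the marketer has on file."""
    return sum(weights[k] * profile.get(k, 0) for k in weights)

def build_audience(profiles, weights, threshold):
    """Keep only the IDs whose score clears the campaign's cutoff."""
    return [pid for pid, profile in profiles.items()
            if propensity_score(profile, weights) >= threshold]

# Hypothetical signals: product-page visits, past purchases, and
# whether the household opened the last marketing email.
weights = {"page_visits": 0.5, "past_purchases": 2.0, "opened_email": 1.0}

profiles = {
    "shopper_a": {"page_visits": 4, "past_purchases": 1, "opened_email": 1},
    "shopper_b": {"page_visits": 0, "past_purchases": 0, "opened_email": 0},
}

# shopper_a scores 5.0 and makes the cut; shopper_b scores 0.0 and does not.
audience = build_audience(profiles, weights, threshold=3.0)
print(audience)  # ['shopper_a']
```

The point of the sketch is the one Glasgow makes: nothing is stored about you individually. The score exists only long enough to decide which bucket you land in, which is also why, as the report notes, there is nothing for a consumer to inspect or correct afterward.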

Credit scores are based on information in your credit file. Federal law aims to prevent discrimination by prohibiting certain information from being included in those files, such as race, national origin, religion, gender, marital status and sexual orientation.

The Fair Credit Reporting Act gives you the right to check your files and correct errors. If this information was used to deny your application for credit, you must be told.

The marketing databases used to determine consumer marketing scores are virtually unregulated, so they can legally contain all sorts of sensitive information that cannot be included in your credit files—including health and medical information gleaned from Web searches, purchases and public postings.

The World Privacy Forum report estimates there were fewer than 25 consumer scores in 2007. Today, there are hundreds and probably thousands. A few examples cited in the report:

  • Job security score: Predicts future income and capacity to pay.
  • Churn score: Predicts when customers will move their business or account to another merchant.
  • Brand name medicine propensity score: Predicts if you will buy generics or brand name medications.
  • Fraud score: Predicts if a customer is not who they claim to be or may be up to some mischief.

Privacy advocates would like to see federal regulators establish some rules for the use of consumer scores to make sure they are not being used unfairly or to discriminate.

They believe the companies that collect this data should be required to take steps to ensure that it is accurate. They also want companies to disclose that a predictive score was used whenever that score adversely affects someone's employment, credit, insurance or any other significant marketplace opportunity.

Even if the decision is made to regulate this industry—and that is far from certain—it won't be easy to do. The World Privacy Forum estimates that there are now more than 4,000 databases collecting and analyzing every bit of information they can gather on us.

There are ways to stop some of this data collection. You can use Web browsers that don't track where you go, or you can shop with cash. You could even stop sharing all of your personal information on social media—fair game for the data collectors.

But there are so many sources of personal information beyond your control that it's really a losing battle. Your data are now a commodity, bought and sold, whether you like it or not.