
That 'Free' App Isn't Free: How Self-Improvement Became a Market

The next time a goal-setting app notification pops up, remember: The people who make your technology have their own agenda.
Image: We are starting to live in a world where the technology knows us better than we know ourselves, and it doesn't always have our best interests at heart. (Photo illustration by NBC News / Getty Images)

Birthday wishes sent to your best friend from high school? Done.

Walked ten thousand steps today? 10,481 to be exact...and counted 1,500 calories too.

What about buying more dog food? Same-day delivery is on its way.

Phew! Nothing beats tapping a checkmark next to the last item on your to-do list app at the end of the day. There are thousands of platforms, trackers, and schedulers available to keep us barreling towards better — being better parents and better friends with better bodies and better home furnishings. As long as we chart the course laid out for us by our digital tools, we may even get to best. Gold star for you.

But the next time a notification pops up prompting you to drink 8 ounces of water or leave the office for yoga class, remember: the people who make your technology have their own agenda. And you reaching your goal isn’t it. It’s to mine every possible bit of data about you — and make you like it.

Our digital tools depend on a constant feed of our personal information, which gets sold to data brokers or used to nudge us to spend more money. You get “monetized” in return for that free app or service. Don’t get me wrong, profit isn’t the only goal. There are apps like the period tracker Clue that anonymize users’ data for reproductive health studies. One of my listeners, Bria, told me she gladly shares the specifics of her period in order to contribute to its menstrual research. In fact, in a survey of 2,000 Note to Self listeners, 75 percent agreed they would willingly share their personal data if they knew it was being used for the greater good of society.

But there’s almost no way to know if the app-makers really are using your data to make the world a better place. Few consistent standards are in place to ensure apps actually anonymize us, and fewer still to ensure we are told exactly what information is being collected and to whom it’s being sold. Uber, for example, just announced it would share the information it collects from billions of rides with city planners. But we also recently learned that Uber employees tracked the trips of celebrities like Beyoncé and used ride data to stalk their exes.

Image: Manoush Zomorodi. (Amy Pearl)

You probably know already that our apps collect location data, search terms and purchase histories. You’ve probably also heard of “cookies,” the small tracking files that follow you from page to page online, making it possible for marketers to show you ads for the towels you thought about buying a month ago. But that’s so 2012.

New online trackers use digital “fingerprinting,” which lets companies follow you by combining your computer’s IP address, the fonts you have installed, the browser you use and dozens of other data points to figure out who you are.
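To see how little it takes, here is a minimal, hypothetical sketch (not any real tracker’s code) of the kind of script a page could run. It reads a handful of signals the browser hands over freely and hashes them into one identifier that tends to stay the same every time you come back:

// Hypothetical browser-fingerprinting sketch, for illustration only.
// It combines signals any page's JavaScript can read and collapses them
// into a single identifier that is fairly stable across visits.

function fnv1aHash(input: string): string {
  // Simple non-cryptographic hash (FNV-1a) to reduce the signals to one ID.
  let hash = 0x811c9dc5;
  for (let i = 0; i < input.length; i++) {
    hash ^= input.charCodeAt(i);
    hash = Math.imul(hash, 0x01000193) >>> 0;
  }
  return hash.toString(16);
}

function buildFingerprint(): string {
  const signals = [
    navigator.userAgent,                                      // browser and operating system
    navigator.language,                                       // preferred language
    `${screen.width}x${screen.height}x${screen.colorDepth}`,  // display setup
    Intl.DateTimeFormat().resolvedOptions().timeZone,         // time zone
    String(navigator.hardwareConcurrency ?? ""),              // number of CPU cores
  ];
  // Real trackers add dozens more signals (installed fonts, canvas and audio
  // quirks, plugins) plus the IP address the server already sees.
  return fnv1aHash(signals.join("||"));
}

console.log(buildFingerprint()); // prints something like "7f3a9c21", the same value on repeat visits

Add enough extra signals (installed fonts, canvas quirks, the IP address the server already sees) and the combination becomes distinctive enough to recognize you without a single cookie.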

Then there’s the parsing of your behavior: what kind of punctuation you use in emails, how hard and often you press the touchscreen, or whether you show an “affinity” for a specific ethnic group. Yes, they know all that. Liking Joe Lieberman, for example, could lead Facebook to decide you have an “ethnic affinity” with Jewish-Americans and sell that information to marketers who may want to sell you Manischewitz.

There are services that don’t even study you — they analyze the people you’re in touch with. Crystal analyzes everything your recipient has written online and then coaches you to use language that appeals to them. As the company founder, Drew D’Agostino, told me, “I realized you can develop much more healthy and productive relationships with people.” Do we care that Drew may not actually feel that empathy, but just writes emails as though he does?

The confusing part about these services is that being watched and tracked can have a clear upside. Maybe we get an amazing coupon or never have to run out of toilet paper because Amazon is on it. If the algorithms have decided you’re a good customer, because you pay for app upgrades or use those coupons and spend more money in the process, you can feel like a winner.

But when we take a moment to think about how we won, it can be deeply unsettling. Being culled, collected, and sold is dehumanizing, and in the wrong hands it can be downright discriminatory. Based on who they are, someone may not be offered a discount, a free service, or even a job. Algorithms don’t necessarily have basic human decency programmed into them.

Less sinister but still noteworthy is how digital design can lead people to unexpected behaviors. I’ve heard from listeners who jumpstarted their diet with fitness trackers and deleted the app after losing ten pounds. But I’ve also heard from people who find themselves racking up steps by compulsively walking around their dining room table at night. One man told me he doesn’t think he’ll ever be able to eat a meal without mentally calculating the fat, protein, and carb content. He looks great, he said, but he feels worse.

We are starting to live in a world where the technology knows us better than we know ourselves and it doesn’t always have our best interests at heart. Keep your eye on your own bottom line; don’t just check your apps, check in with yourself. Set your goals, but maybe write them down with pen and paper.

Manoush Zomorodi is the host and managing editor of Note to Self, “the tech show about being human,” from WNYC Studios. Her latest project is "The Privacy Paradox" challenge, taking the mystery out of digital privacy.