It amuses me that people's imaginations always run towards retail discrimination when the technology clearly would be so much more cost effective for wholesale discrimination.
Think "credit scores". Except based off of much, much deeper data sets, and applicable to more fields than credit/insurance/employment.
I mean, I can tell with 80% accuracy what gender somebody is from 400 words they've written. Think of what could hypothetically be done by tying together as much of their online persona as I could possibly get, and then judging them on it.
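(If you're curious what that looks like in practice, here's a minimal sketch in Python: word-count features into a plain logistic regression. The loader is hypothetical; you'd need your own corpus of writing samples with self-reported labels.)

    # Minimal sketch: predict a self-reported label from ~400-word writing
    # samples using word/bigram counts and logistic regression.
    # load_writing_samples() is hypothetical -- supply your own labeled corpus.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline

    def load_writing_samples():
        """Hypothetical loader: returns (texts, labels), one sample per author."""
        raise NotImplementedError("plug in your own data here")

    texts, labels = load_writing_samples()
    train_x, test_x, train_y, test_y = train_test_split(
        texts, labels, test_size=0.2, random_state=0)

    model = make_pipeline(
        TfidfVectorizer(ngram_range=(1, 2), min_df=2),  # word + bigram features
        LogisticRegression(max_iter=1000),              # simple linear classifier
    )
    model.fit(train_x, train_y)
    print("held-out accuracy:", model.score(test_x, test_y))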
Of course, you could get a new free email account and start fresh -- but then, that tells me something too, right? It will be just like a credit score: an absence of a credit score makes you look incredibly risky. Similarly, a few years from now, the only reason someone in their early thirties won't have an online history is that they have something to hide. At the very least, that will be a good enough approximation to work with, given how stupidly effective the technology will be.
I mean, say I'm pricing car insurance for you. (Not for you, really. For a pool of a million people. But in the instant case, for you.) If I can make a determination of your risk algorithmically, even if I botch it some of the time, all I have to do is shave a fraction of a percent of accidents off my total and I save the company millions of dollars. How could I do that? Well, let's see whose Facebook accounts suggest that they routinely enjoy hard partying and alcohol. That might be worth a percent or two. And I won't be firing blind -- I'll have a hundred years of insurance claims, scores for the people involved in 80% of last year's accidents, and a Hadoop cluster at my disposal to torture the data until it gives up its secrets.
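(To put rough numbers on "millions of dollars": every figure below is an illustrative assumption, not actuarial data, but the arithmetic is the point.)

    # Back-of-the-envelope arithmetic behind the "fraction of a percent" claim.
    # Every number below is an illustrative assumption, not real industry data.
    pool_size = 1_000_000     # insured drivers in the pool
    accident_rate = 0.05      # assumed: 5% file a claim in a given year
    avg_claim_cost = 10_000   # assumed average payout per claim, in dollars

    baseline_payouts = pool_size * accident_rate * avg_claim_cost

    # Suppose better risk scoring lets me re-price or drop the riskiest sliver
    # of the pool, trimming total accident costs by just half a percent.
    improvement = 0.005
    savings = baseline_payouts * improvement

    print(f"baseline payouts:        ${baseline_payouts:,.0f}")  # $500,000,000
    print(f"savings from a 0.5% cut: ${savings:,.0f}")           # $2,500,000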
I worry as well. Who stands to profit? Unsurprisingly, PB's employer. I'm not going ad hominem here; I'm just in the habit of asking such questions, and there may be sharks in open water.
The more people share about their lives, the fewer activities can be deemed socially unacceptable. Everyone is on the end of some bell curve or another. The more you share, the more there is to define you. The more there is to define you, the less one data point can define you.
Sometimes, one data point is all that people have the patience to know about you.
There's a joke this reminds me of. Something about an old man at a bar ruminating, "a man can spend his life building bridges. Do they call him John the Bridge Builder? No. A man can spend his life raising crops. Do they call him John the Farmer? No. But you fuck one goat . . ."
On second thought, I'm not sure exactly how well that joke supports my point, but it's worth a thought.
Sure, but that's because, in today's society, fucking one goat is virtually unknown—not because it doesn't happen, but because it goes unmentioned. The first John-the-goat-fucker to be caught by the panopticon might be arrested, but the 9128th? Probably not. When we look at the population curves we'd build from this data and extrapolate that, throughout history, John himself was likely the 340-thousandth or so in his series, we might even let him out, too.
Another thing to keep in mind is that when all information is public to the finest of grains, causality is quite easy to infer. This dataset would be any psychiatrist's dream: a perfect, objective record of everything that happened to a person, to contrast against their subjective interpretations of events and to trace the causal influences on them. When Clippy pops up to some poor sod in 2048 saying "by my interpretation of your recent actions, it looks like you may soon develop an urge to fuck a goat! It'd probably be a good idea to talk to a therapist, even if you don't feel like you need to," I'm guessing truly unethical behavior will take a nosedive. Anyone who's comfortable with everyone else knowing they feel that way about a goat, however, will likely be free to go ahead -- since every action will be an intrinsic, quite-visible scarlet letter, there will be no need for any external punishment mechanism.
(This is assuming, of course, that we get global immigration policies worked out by then, so that people born into an environment whose mores they don't agree with, wherever that may be, can leave for a place whose mores they do.)
It worries me that all of this data about me is stored somewhere, where someone at some point will potentially be able to use it against me or my family.
I mean, I would feel violated if my Google web history was brought into court because I searched "Anthrax" at some point in my life.
Or worse, if I share a link about police brutality on Twitter and am harassed by the local police force for doing so.