Some Further Thoughts On Privacy

The US has a (largely religion-driven) abstinence-until-marriage movement; in some states, schools are not required to provide sexual education to teens, and where it is provided, abstinence from intercourse is promoted as the best method of maintaining sexual health. But a 2007 meta-study found that abstinence-only education at best had no effect on teen sexual health, and at worst led to higher rates of sexually-transmitted infections: in communities where more than 20% of teens were in abstinence-only programs, STD rates were over 60% higher than in communities with regular programs.

Ignorance of their options meant these teens were less likely to use contraception when they did have sex, more likely to engage in oral and anal sex, and less likely to seek medical testing or treatment.

I worry that ‘total privacy’ advocates are causing similar ignorance in people online. An article in the latest Wired UK heavily hypes up the scare of your data being publicly available, without offering any explanation of why that’s bad or how you can take back control, beyond blocking all data sharing. By promoting zero-tolerance privacy, and encouraging people to leave social networks or uninstall apps that share data, total privacy advocates fail to educate people on the privacy options available to them, and the ways they can use data to their own advantage.

Facebook, for example, has excellent explanations of how they use your data, filters and preferences that let you control it, and links to external websites that explain and provide further controls for digital advertising.

My concern is that, if you advise only a zero-tolerance policy, you run the risk of driving people away to alternatives that are less forthcoming with their privacy controls, or making them feel helpless to the point where they decide to ignore the subject entirely. Either way, they’ve lost control over their personal data, and are missing out on the value it could give them.

And I strongly believe there is value in my data. There is value in it for me: I can use it to be more informed about my health, to get a smarter personal assistant, to see ads that can be genuinely relevant to me. And there is value in it for everyone: shared medical data can be used to find environmental and behavioural patterns and improve the quality of public preventative healthcare.

I’m not blithe about it; I don’t want my data sold to unknown third parties, or used against me by insurers. I’m aware of the risks of the panopticon of small HD cameras that could lead to us all becoming witting or unwitting informants, and of the monitoring of communication by people who really have no business monitoring it.

What we need is not total privacy, but control over what we expose. We need transparency in seeing who gets our data, we need legislation to control the flow of data between third parties, we need the right to opt out, and we need better anonymity of our data when we choose to release it into large datasets.

Knowledge is power, and I’d rather have control of that power myself than completely deny it a place in the world.

Sources and further reading