this is a post that i started last year and never finished. i'm posting it as-is just to lay the groundwork for a new post i have in mind on privacy and technology-- laying out the problems and approaches that interest me and that will inform my thinking on these questions going forward. it's twice as long as it should be and stops abruptly. apologies. i'll edit it someday.
before the technological advances of the last quarter century, there was a natural limit to the information about us on which our friends, family, potential employers, our government, creditors, and the community at large could judge us. these were the limits of human memory and the limited life of paper. our indiscretions may have been on record, but they were preserved by means that naturally degrade over the course of a lifetime, and in which information, even where preserved, can nonetheless be buried. or this, at least, is the premise of this week's new york times magazine feature-- jeffrey rosen's "the end of forgetting"-- a meditation on the theme of viktor mayer-schonberger's new book delete: the virtue of forgetting in a digital age.
the notion is that there is something essential in forgetting. the most interesting suggestion, i think, is that there is a necessary relationship between forgetting and forgiving-- that our ability to forgive is imperiled by these advances. but the more general concern is that our newfound (and soon-to-be-found) abilities to collect, record, aggregate, and search through vast amounts of information will result in potential dates, employers, and everyone else making crucial judgments about our character and potential based on information that we haven't chosen to share-- information taken dramatically out of context, or about past behaviors that no subsequent penitence can erase. rosen presents a picture of a world in which
"people will be able to snap a cellphone picture of a stranger [or a job candidate], plug the image into google and pull up all tagged and untagged photos of that person that exist on the web...[and] internet searches for images are likely to be combined with social-network aggregator search engines...which combine data from online searches-- including political contributions, blog posts, youtube videos, web comments, real estate listings, and photo albums...in the web 3.0, [founder of ReputationDefendender michael] fertik predicts, people will be rated, assessed and scored based not on their creditworthiness, but on their trustworthiness as good parents, good dates, good employees, good baby sitters or good insurance risks."
to correct these disturbing trends, it is suggested that we put time limits or expiration dates on the photos, blog comments, and other bits of our digital trail, which would mimic the limitations of human memory. another suggestion (jonathan zittrain's) is that we be allowed to file for some sort of reputation bankruptcy, which would wipe the digital slate clean every so many years.
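(just to make the expiration-date idea concrete, here's a minimal sketch-- my own toy construction in python, not anything rosen, mayer-schonberger, or zittrain actually specifies-- in which each bit of a digital trail carries an expiry timestamp and simply drops out of view once that date passes:)

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# a toy "digital trail" item: a photo, comment, or post we leave behind,
# stamped with an expiration date meant to mimic the fading of human memory.
@dataclass
class TrailItem:
    content: str
    created_at: datetime
    expires_at: datetime

def still_remembered(item: TrailItem, now: datetime) -> bool:
    """an item stays visible only until its expiration date passes."""
    return now < item.expires_at

# example: an indiscreet blog comment set to fade after ten years
comment = TrailItem(
    content="an indiscreet comment",
    created_at=datetime(2010, 8, 1),
    expires_at=datetime(2010, 8, 1) + timedelta(days=365 * 10),
)

visible_trail = [item for item in [comment] if still_remembered(item, datetime.now())]
```

(reputation bankruptcy, on this toy picture, would be the blunter operation: clearing the whole trail at once, regardless of each item's date.)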
this feature was fun to read, but i confess that i got a lot more out of "speak, memory", evgeny morozov's review of delete in last month's boston review. in it he does a pretty good job of defending the view that, though mayer-schonberger's argument is "interesting", it
"suffers from three large and arguably fatal flaws: a very loose account of what memory is, an insufficient appreciation of the value of remembering, and—most important for public policy—an unconvincing effort to distinguish the animating concerns about memory from more conventional (and serious) concerns about privacy."
it's a relatively dry and wonkish set of objections to mayer-schonberger's "romanticist rebellion against technology", but even as someone more prone to (and more qualified for) romanticist cultural criticism than technical assessment of the likely implications of specific new technologies (more of a mayer-schonberger, that is, than an evgeny morozov), i like that morozov does mayer-schonberger the honor of taking his view seriously enough to offer such a useful, academic assessment of what might prevent serious tech thinkers and policy makers from taking mayer-schonberger's critiques and suggestions seriously.
but what i loved most about morozov's review is that it pointed me toward helen nissenbaum's really wonderful book privacy in context, which has dominated my reading list all summer. her argument, in short, is that our privacy interests aren't just a matter of how much of our information is shared (as some brute percentage of what there is to know overall), or whether a particular kind of personal information (income, say, or medical records) is shared at all with anyone ever-- they are, rather, a matter of whether our personal information is being shared in accordance with, or in violation of, what she calls "context-relative informational norms". we live, she argues, within
"a finely calibrated system of [these] social norms, or rules, [that] govern the flow of personal information in distinct social contexts (e.g. education, health care, and politics). these...context relative informational norms define and sustain essential activities and key relationships and interests, protect people and groups against harm, and balance the distribution of power. responsive to historical, cultural, and even geographic contingencies, informational norms evolve over time in distinct patterns from society to society"
in our society, for example, there are norms according to which it is generally proper to share facts about your income with the IRS, and improper to share facts about your income at dinner parties with your coworkers. and there are norms according to which it is generally proper to share facts about your sexual history with your family doctor, and generally improper to share facts about your sexual history with, say, your kids' teacher. these norms arise for all kinds of reasons, often having to do with the nature of particular kinds of relationships. (you have only to describe what kind of a thing a doctor is to start to see why sharing your sexual history with your doctor is appropriate, even though you may withhold this information from even people you are very close to.)
on nissenbaum's view, we ought to assess the alleged intrusiveness of particular instances of information gathering or dissemination according to whether or not they violate the existing social norms of a particular society. and we ought to structure our laws around information gathering and dissemination in such a way as to protect and buttress those norms, even as new technologies make it ever easier to violate those norms on ever grander scales.
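(to make that a little more concrete, here's a minimal sketch-- my own toy rendering in python, loosely inspired by the ingredients nissenbaum describes (contexts, the parties involved, kinds of information, and the terms of transmission), not her actual formalism-- in which a flow of personal information counts as presumptively intrusive when no existing norm covers it:)

```python
from dataclasses import dataclass

# a toy context-relative informational norm: in a given context, information
# of a certain kind may flow from one role to another under some transmission
# principle (e.g. "as required by law", "confidentially").
@dataclass(frozen=True)
class InformationalNorm:
    context: str        # e.g. "health care", "taxation", "dinner party"
    info_type: str      # e.g. "sexual history", "income"
    sender: str         # role of the party sharing
    recipient: str      # role of the party receiving
    transmission: str   # terms under which the flow is appropriate

@dataclass(frozen=True)
class InformationFlow:
    context: str
    info_type: str
    sender: str
    recipient: str
    transmission: str

def violates_contextual_norms(flow: InformationFlow, norms: list[InformationalNorm]) -> bool:
    """a flow is presumptively intrusive if no existing norm covers it."""
    return not any(
        (n.context, n.info_type, n.sender, n.recipient, n.transmission)
        == (flow.context, flow.info_type, flow.sender, flow.recipient, flow.transmission)
        for n in norms
    )

# the examples from above: income to the IRS is covered by an existing norm;
# income gossip at a dinner party with coworkers is not.
norms = [
    InformationalNorm("taxation", "income", "taxpayer", "IRS", "as required by law"),
    InformationalNorm("health care", "sexual history", "patient", "doctor", "confidentially"),
]

ok_flow = InformationFlow("taxation", "income", "taxpayer", "IRS", "as required by law")
bad_flow = InformationFlow("dinner party", "income", "employee", "coworker", "volunteered")

assert not violates_contextual_norms(ok_flow, norms)
assert violates_contextual_norms(bad_flow, norms)
```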
there's a crapload that's problematic about nissenbaum's view-- most centrally its inherent conservatism. her program is essentially one of conserving whatever the existing norms are, however contingent, and without any particular attention being paid to what sorts of contingencies we're talking about. the contingent fact that a particular society exists in an extraordinarily warm climate looks to be different from the contingent fact that a particular society has lived for sixty years under a cruel dictatorship, but it's hard to see how the existing norms established under this latter contingency are any less norm-y than the existing norms in the former. as with all societal norms, informational norms will sometimes seem, upon inspection, sound, and other times silly, or oppressive, or otherwise objectionable. it's at the very least not obvious that both varieties of norm ought equally to be protected by law.
but there's something i fundamentally love about this view: it begins with an understanding of the dynamics underwriting all privacy concerns: the near-illimitable diversity of relations in which a human being can stand to other human beings, each essentially constituted by what the various parties know about one another and how they know it. to be someone's friend, or husband, or doctor is to stand in a particular kind of relation to that person-- a relationship subject to both generalizable norms and a thousand little peculiarities. we might think, for example, that friendship is a particular sort of relation, and that liking or trusting someone is partially constitutive of it, such that you could not truly say of someone you neither like nor trust, 'she is my friend.' and then beyond the general features constitutive of any particular type of relationship, there will be more localized norms (what i share with my best friend, my childhood friends, my friends at work), and ultimately the entirely unique pattern of sharing and expectation that we have with each individual friend, sensitive to the specific contours of who someone is, or who they are to you. (this is the problem with facebook-- its ground-up misunderstanding of how relationships work, its failure to let us distinguish beyond "friend" and share in a much more targeted and discerning way.)
one way of thinking about privacy is that it is ultimately the protection of a precious resource: intimacy. in controlling what people know about us, we control who they are to us and who we are to them. we decide who gets in and to what degree, according to (in the best case) our judgments about who they are.
the particular problem of aggregation.
"a finely calibrated system of [these] social norms, or rules, [that] govern the flow of personal information in distinct social contexts (e.g. education, health care, and politics). these...context relative informational norms define and sustain essential activities and key relationships and interests, protect people and groups against harm, and balance the distribution of power. responsive to historical, cultural, and even geographic contingencies, informational norms evolve over time in distinct patterns from society to society"
in our society, for example, there are norms according to which it is generally proper to share facts about your income with the IRS, and improper to share facts about your income at dinner parties with your coworkers. and there are norms according to which it is generally proper that to share facts about your sexual history with your family doctor, and generally improper to share facts about your sexual history with, say, your kids' teacher. these norms arise for all kinds of reasons, often having to do with the nature of particular kinds of relationships. (you have only to describe what kind of a thing a doctor is to start to see why sharing your sexual history with your doctor is appropriate, though you may refrain from sharing this information from even people you are very close to.)
on nissbaum's view, we ought to assess the alleged intrusiveness of particular instances of information gather or dissemination according to whether or not they violate the existing social norms of a particular society. and we ought to structure our laws around information gathering and dissemination in such a way as to protect and buttress those norms, even as new technologies make it ever easier to violate those norms on ever grander scales.
there's a crapload that's problematic about nissbaum's view-- most centrally its inherent conservatism. her program is essentially one of conserving whatever the existing norms are, however contingent, and without any particular attention being paid to what sorts of contingencies we're talking about. the contingent fact that a particular society exists in an extraordinarily warm climate looks to be different from the contingent fact that a particular society has lived for sixty years under cruel dictatorship, but it's hard to see how the existing norms established under this latter contingency are any less norm-y than the existing norms in the former. as with all societal norms, informational norms will sometimes seem, upon inspection, sound and other times silly, or oppressive, or otherwise objectionable. it's at the very least not obvious that laws that both varieties of norm ought equally be protected by law.
but there's something i fundamentally love about this view: it begins with an understanding of the dynamics underwriting all privacy concerns: the near-illiminable diversity of relations in which a human being can stand to other human beings, each essentially constituted by what the various parties know about one another and how they know it. to be someone's friend, or husband, or doctor is to stand in a particular kind of relation to that person-- a relationship subject to both generalizable norms and a a thousand little peculiarities. we might think, for example, that friendship is a particular sort of a relation, and that liking or trusting someone is partially constitutive of it, such that you could not truly say of someone that you neither like nor trust 'she is my friend.' and then beyond the general features constitutive of any particular type of relationship, there will be more localized norms (what i share with my best friend, my childhood friends, my friends at work), and ultimately the entirely unique pattern of sharing and expectation that we have with each individual friend, sensitive to specific contours of who someone is, or who they are to you. (the problem with facebook-- its ground-up misunderstanding of how relationships works-- the need to distinguish beyond "friend," to share in a much more targeted and discerning way.)
one of the philosophies around privacy issues is that it's ultimately the protection of a precious resource: intimacy. that in controlling what people know about us, we control who they are to us and who we are to them. we decide who gets in and to what degree, according to (best case scenario) our judgments about who they are.
the particular problem of aggregation.