12 February 2015

Disclosure

'Unraveling Privacy: The Personal Prospectus & the Threat of a Full Disclosure Future' by Scott R. Peppet in (2011) Northwestern University Law Review comments 
Information technologies are reducing the costs of credible signaling, just as they have reduced the costs of data mining and economic sorting. The burgeoning informational privacy field has ignored this evolution, leaving it unprepared to deal with the consequences of these new signaling mechanisms. In an economy with robust signaling, those with valuable credentials, clean medical records, and impressive credit scores will want to disclose those traits to receive preferential economic treatment. Others may then find that they must also disclose private information to avoid the negative inferences attached to staying silent. This unraveling effect creates new types of privacy harms, converting disclosure from a consensual to a more coerced decision. This Article argues that informational privacy law must focus on the economics of signaling and its unraveling of privacy.
Peppet states -
Every day that Tom Goodwin drives his Chevy Tahoe, his insurance company uses a small electronic monitor in his car to track his total driving time, speed, and driving habits. If he drives less than ten thousand miles a year, doesn’t drive much after midnight, and avoids frequently slamming on the brakes, at the end of the year he receives up to twenty-five percent off his premiums. “There’s this Big Brother thing, but it’s good,” Goodwin says. “Since I know I’m being watched, I’m on my best behavior.” To date, Progressive Insurance’s MyRate program is available in twenty states and has enrolled roughly ten thousand customers. Other insurance companies are following suit. Some carriers are going further, offering discounts for the use of more sophisticated devices that record geographical location, minute-by-minute speeding violations, and whether seat belts are in use. Rental car companies have also experimented with using such monitors to incentivize safe driving.
Similarly, every day the Mayo Clinic in Rochester, Minnesota, uses remote monitoring devices to check up on the health of residents at the nearby Charter House senior living center. The devices transmit data about irregular heart rhythm, breathing rate, and the wearer’s position and motion. “The goal,” says Dr. Charles Bruce, the lead investigator on the project, “is to have full remote monitoring of people, not patients, just like you measure the pressure of your tires today.” Medical device companies are racing to enter the remote monitoring space. Proteus Biomedical, for example, is testing a wearable electronic device that can sense when patients have taken their pills and transmit that information to the patients’ doctors, and GlySens is working on an implantable subcutaneous blood sugar sensor for diabetics that uses the cellular network to constantly send real-time results to one’s doctor. Although today these devices do not report data to users’ health insurers, it would be a simple step for a patient to provide such access in return for a discount. Indeed, such “pervasive lifestyle incentive management” is already being discussed by those in the healthcare field.
Finally, every day tenants, job applicants, and students voluntarily disclose verified personal information to their prospective landlords, employers, and safety-conscious universities using online services such as MyBackgroundCheck.com. Rather than forcing these entities to run a background check, an applicant can digitally divulge pre-verified information such as criminal record, sex offender status, eviction history, and previous rental addresses. Moreover, these services allow an applicant to augment her resume by having verified drug testing done at a local collection site and added to her digital record. MyBackgroundCheck.com calls this “resume enhancement.”
This Article makes three claims. First, these examples—Tom Goodwin’s car insurance, pervasive health monitoring, and the incorporation of verified drug testing into one’s “enhanced resume”— illustrate that rapidly changing information technologies are making possible the low-cost sharing of verified personal information for economic reward, or, put differently, the incentivized extraction of previously unavailable personal information from individuals by firms. In this new world, economic actors do not always need to “sort” or screen each other based on publicly available information, but can instead incentivize each other to “signal” their characteristics. For example, an insurance company does not need to do extensive data mining to determine whether a person is a risky driver or an unusual health risk—it can extract that information from the insured directly. Second, this change towards a “signaling economy” (as opposed to the “sorting economy” in which we have lived since the late 1800s) poses a very different threat to privacy than the threat of data mining, aggregation and sorting that has preoccupied the burgeoning informational privacy field for the last decade. In a world of verifiable information and low-cost signaling, the game-theoretic “unraveling effect” kicks in, leading self-interested actors to disclose fully their personal information for economic gain. Although at first consumers may receive a discount for using a driving or health monitor, privacy may unravel as those who refuse to do so are assumed to be withholding negative information and therefore stigmatized and penalized. Third, privacy law and scholarship must reorient towards this unraveling threat to privacy. Privacy scholarship is unprepared for the possibility that when a few have the ability and incentive to disclose, all may ultimately be forced to do so. The field has had the luxury of ignoring unraveling because technologies did not exist to make a signaling economy possible. Those days are over. As the signaling economy evolves, privacy advocates must either concede defeat or focus on preventing unraveling. The latter will require both a theoretical shift in our conception of privacy harms and practical changes in privacy reform strategies.
The Article’s three Parts track these claims. Part I explores the emerging signaling economy. In the signaling economy, individuals and firms can seek out verified, high-quality, low-cost data from each other directly rather than searching through mountains of unverified, low-quality information. Developments in information technology make information increasingly verifiable, and thus increasingly useful as signals in conditions of information asymmetry. I propose a simple metaphor to capture the extreme possibilities of this signaling economy: the “personal prospectus.” The personal prospectus would be a compilation of an individual’s verified private information — a digital repository containing the data collected from the sensors and drug tests in the previous examples (and the many other innovative monitors undoubtedly around the corner), plus information from one’s bank accounts, educational records, tax history, criminal history, immigration records, health records, and other private sources. It would be the aggregate of all of one’s private tests, records, and history in one massive digital resume, sharable with others at the click of a button.
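To fix ideas, the prospectus can be imagined as an owner-held bundle of third-party-attested records. The following Python sketch is purely illustrative; the types, field names, and share method are assumptions made for exposition, not anything the Article specifies.

    # Hypothetical sketch of a "personal prospectus": an owner-held bundle
    # of records attested by third parties, shared selectively on request.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class VerifiedRecord:
        category: str   # e.g. "driving", "health", "credit" (illustrative)
        claim: str      # e.g. "fewer than 10,000 miles driven this year"
        verifier: str   # the attesting institution, not the individual

    @dataclass
    class PersonalProspectus:
        owner: str
        records: List[VerifiedRecord] = field(default_factory=list)

        def share(self, categories: List[str]) -> List[VerifiedRecord]:
            """Disclose only the requested categories of records -- the
            'click of a button' sharing described above."""
            return [r for r in self.records if r.category in categories]

Nothing in the sketch actually verifies anything; the point is only that the repository is held by the individual but attested by others, which is what makes its disclosure credible as a signal.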
The personal prospectus provides a useful means to explore the limits and possibilities of this new signaling economy. It also illustrates the shortcomings of existing privacy law and scholarship. Part II takes up the Article’s second claim: that even the first steps we are now taking towards a signaling economy—steps like those in the three examples above—pose a new set of privacy challenges that has so far been largely ignored.
Richard Posner first articulated these challenges decades ago, although at the time they were more theoretical than practical. Even with control over her personal information, he argued, an individual will often find it in her self-interest to disclose such information to others for economic gain. If she can credibly signal to a health insurer that she does not smoke, she will pay lower premiums. If she can convince her employer that she is diligent, she will receive greater pay. As those with positive information about themselves choose to disclose, the economic “unraveling effect” will occur: in equilibrium, all will disclose their information, whether positive or negative, as disclosure by those with the best private information leads to disclosure even by those with the worst.
The classic example of unraveling imagines a buyer inspecting a crate of oranges. The quantity of oranges in the crate is unknown, and opening the crate before purchase is unwise because the oranges will rot before transport. There are stiff penalties for lying, but no duty on the part of the seller to disclose the number of oranges in the crate. The number of oranges will be easy to verify once the crate is delivered and opened. The buyer believes that there can’t be more than one hundred oranges. The unraveling effect posits that all sellers will fully disclose the number of oranges in the crate, regardless of how many their crate contains. Begin with the choice faced by a seller with one hundred oranges in his crate. If the seller stays silent, the buyer will assume there are fewer than one hundred oranges and will be unwilling to pay for the full amount. The seller with one hundred oranges will therefore disclose and charge full price. Now consider the choice of a seller with ninety-nine oranges. If this seller stays quiet, the buyer will assume that there are fewer than ninety-nine oranges and will discount accordingly. The silent seller gets pooled with all the lower-value sellers, to his disadvantage. He will therefore disclose. And so it goes, until one reaches the seller with only one orange and the unraveling is complete. As Douglas Baird, Robert Gertner, and Randal Picker put it, “[s]ilence cannot be sustained because high-value sellers will distinguish themselves from low-value sellers through voluntary disclosure.” The economist Robert Frank coined the term “full disclosure principle” to describe this phenomenon in his classic text Passions Within Reason. The principle is simple: “if some individuals stand to benefit by revealing a favorable value of some trait, others will be forced to disclose their less favorable values.”
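The induction is mechanical enough to simulate. The short Python sketch below is illustrative only (the buyer who pays one unit per expected orange and the range of seller types are assumptions layered onto the example, not part of Peppet’s text), but it shows the pool of silent sellers collapsing from the top down:

    # Illustrative simulation of the unraveling effect: sellers disclose
    # whenever their true count beats the buyer's offer to the silent pool.
    def unravel(types):
        silent = set(types)
        while True:
            # The buyer's offer to a silent seller is the average of the
            # types it believes might still be staying quiet.
            pooled_offer = sum(silent) / len(silent)
            movers = {n for n in silent if n > pooled_offer}
            if not movers:       # no remaining seller gains by disclosing
                return silent
            silent -= movers     # above-average sellers disclose and exit

    print(unravel(range(1, 101)))  # {1}: only the one-orange seller stays
                                   # silent, and silence reveals him anyway

Each pass prices silence at the pool’s average, so the above-average sellers defect and the average falls; the process repeats until silence itself is fully informative, which is Frank’s full disclosure principle in miniature.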
In the decades since Posner’s challenge, however, privacy law has almost entirely overlooked the threat of unraveling. Instead, recent informational privacy scholarship has focused on the privacy threats of firms sorting individuals by mining aggregated public data such as credit histories. Informational privacy law has reacted to sorting becoming more commonplace and sophisticated. The field is dominated by Daniel Solove’s concept of the “digital dossier,” which is a metaphor for the aggregate of information available online about a given person. Privacy scholars fear that we are moving towards a world in which everything becomes public—where all of our personal information becomes easily available to others as part of our digital dossier. In reaction to this fear, the literature is replete with calls to give individuals greater control over their personal information through the common law of property and tort and through stronger statutory privacy rights.
The personal prospectus poses a different threat than Solove’s digital dossier, however, and it demands different solutions than increased control over one’s information. In a signaling economy, even if individuals have control over their personal information, that control is itself the undoing of their privacy. Because they hold the keys, they can be asked—or forced—to unlock the door to their personal information. Those who refuse to share their private information will face new forms of economic discrimination. How long before one’s unwillingness to put a monitor in one’s car amounts to an admission of bad driving habits, and one’s unwillingness to wear a medical monitor leads to insurance penalties for assumed risky behavior? In a signaling economy, forced disclosure will be at least as difficult a problem as data mining and the digital dossier. Part III thus begins to reorient informational privacy law towards the threats of signaling and unraveling. Unlike Posner, however, I do not assume that unraveling necessarily leads to the end of privacy. Instead, Part III explores both the economic limits of the unraveling effect—what empirical investigation has shown us about the conditions necessary for unraveling—and the legal means available to constrain unraveling. In particular, it examines the three possible legal responses to the unraveling of privacy—“don’t ask,” “don’t tell,” and “don’t use” rules—and explores their limitations and implications. I conclude that although constraining the unraveling of privacy is possible, it will require privacy advocates to move well beyond their traditional focus on increasing information control. Instead, the privacy field must wrestle directly with the problems of paternalism inherent in limiting the use of information that at least some consumers may want to disclose for economic advantage. Limiting blood glucose monitoring by insurers may protect the privacy of the least healthy diabetics who might otherwise be forced to disclose by the unraveling effect, but such limits will impose costs on the healthiest and most conscientious patients who would otherwise receive discounts for wearing a glucose monitor. How will legislatures respond to consumer pressure for the right to disclose? If to date the informational privacy field has been unable to muster legislative support even for increasing control over personal data, how persuasive will it be when faced with these more difficult prescriptive debates?
Part III offers the first comprehensive exploration of these questions. This discussion is extremely timely, because as the signaling economy develops, courts and legislatures are increasingly wrestling with these problems. The most prominent example is the recent health care bill, the Patient Protection and Affordable Care Act (PPACA). Incentive-based health insurance premiums were a central battleground in the give-and-take leading up to the PPACA’s passage. The PPACA’s § 2705 increases the degree to which employers and insurers can use discounts on health insurance premiums to incentivize employees to participate in wellness programs and try to achieve specified personal health goals. Disease advocacy groups fought for various limitations on the use of incentives, but, intriguingly, privacy advocates were largely absent from the debate. The privacy field seems to have assumed that the only privacy issues in the bill arose in the context of the use and security of electronic health records. This is a mistake. Incentives to signal raise exactly the questions to which informational privacy law must turn: questions of justice, fairness, paternalism and power; questions about coercion and the limits of “voluntary” disclosure; questions, in short, about how to deal with the threat of privacy’s unraveling.