Highlighted Selections from:

Big Data Ethics

Richards, Neil M. and King, Jonathan H., Big Data Ethics (January 23, 2014). Wake Forest Law Review, 2014. Available at SSRN: http://ssrn.com/abstract=2384174

p.1: In this paper, we argue that big data, broadly defined, is producing increased institutional awareness and power that require the development of a Big Data Ethics. -- Highlighted mar 12, 2014

p.1: Critically, if we fail to balance the human values that we care about, like privacy, confidentiality, transparency, identity, and free choice, with the compelling uses of big data, our Big Data Society risks abandoning these values for the sake of innovation and expediency. -- Highlighted mar 12, 2014

p.3: Technologists often use the technical “3-V” definition of big data as “high-volume, high-velocity and high-variety information assets that demand cost-effective, innovative forms of information processing for enhanced insight and decision making.” -- Highlighted mar 12, 2014

p.5: We show how the prophecies that “privacy is dead” are misguided. Even in an age of surveillance and big data, privacy is neither dead nor dying. Notions of privacy are changing with society as they always have. -- Highlighted mar 12, 2014

p.5: We argue that “privacy” in today’s information economy should be better understood as encompassing information rules that manage the appropriate flows of information in ethical ways. -- Highlighted mar 12, 2014

p.5: Binary notions of privacy are particularly dangerous and can erode trust in our era of Big Data and metadata, in which private information is necessarily shared by design in order to be useful. -- Highlighted mar 12, 2014

p.6: Big data predictions and inferences risk compromising identity by allowing institutional surveillance to identify, categorize, modulate, and even determine who we are before we make up our own minds. -- Highlighted mar 12, 2014

p.22: Such techno-centric worldviews carry an implied undertone of technology infallibility. We must yield our expectations of privacy, they suggest, to make way for the inevitable and get out of the way of technological innovation. -- Highlighted mar 12, 2014

p.23: At a minimum, lawyers use the word “privacy” and the legal rules that govern it to mean four discrete things: (1) invasions into protected spaces, relationships, or decisions; (2) collection of information; (3) use of information; and (4) disclosure of information. In the leading conceptual work on privacy, legal scholar Daniel Solove has taken these four categories and expanded them to sixteen categories, including surveillance, interrogation, aggregation, and disclosure. -- Highlighted mar 12, 2014

p.27: The problem is thus not the Death of Privacy, but rather the need for additional principles to govern information flows. -- Highlighted mar 12, 2014

p.30: We have long had confidentiality rules, like the duties lawyers owe to their clients and doctors owe to their patients, to encourage individuals to feel safe in sharing their confidences, advancing the important societal values of providing effective legal representation and medical care. We also have statutory rules that explicitly create confidential relationships regarding health, financial, and video-records information. We also protect obligations of confidentiality that arise through voluntary promises or confidentiality agreements, like preventing employees from revealing business secrets. Confidentiality law reveals how we have long recognized that shared information can still be kept private using effective legal tools. Expanding confidentiality law approaches would seem one way to help keep shared information private. -- Highlighted mar 12, 2014

p.31: The FTC is now starting to move “beyond the four corners of privacy policies” and shift its focus from enforcing broken promises of privacy to broken expectations of consumer privacy. This subtle but powerful shift puts the FTC in a position to look increasingly at the totality of circumstances surrounding privacy policies, including when consumers assume their shared information is being kept private. This expanded view could put the FTC in a position to “demand that companies engage in practices that will correct mistaken consumer assumptions or at the very least not exploit such assumptions,” like when consumers assume their shared private information is being kept confidential. -- Highlighted mar 12, 2014

p.33: More broadly, Sotomayor questioned the underlying premise “that an individual has no reasonable expectation of privacy in information voluntarily disclosed to third parties.” Justice Sotomayor observed that in the digital age “people reveal a great deal about themselves to third parties in the course of carrying out mundane tasks.” -- Highlighted mar 12, 2014

p.35: In our last paper, we described a Transparency Paradox of big data where all manner of data is collected on individuals by institutions while these same institutions are cloaked in legal and commercial secrecy. -- Highlighted mar 12, 2014

p.45: Given big data’s power to identify, categorize, and nudge us, we will also want to take certain big data predictions and inferences off the table. For example, in the analog world we protect the identity of rape victims. In the big data world it was revealed that data brokers built lists of rape victims for sale. -- Highlighted mar 12, 2014

p.47: Big Data Ethics needs to be part of the professional ethics of all big data professionals, whether they style themselves as data scientists or some other job description. -- Highlighted mar 12, 2014

p.47: Technologists need to drop “privacy is dead” beliefs and move to embrace Big Data Ethics and implement ethical business practices. Privacy by Design is a prominent set of information ethics best practices supported by legal scholars, regulators and technology leaders alike. -- Highlighted mar 12, 2014

p.48: Big data by its very nature requires experimentation to find what it seeks. A central part of this experimentation, if we are to have privacy, confidentiality, transparency, and protect identity in a big data economy, must involve informed, principled, and collaborative experimentation with privacy subjects. To govern big data experimentation, Professor Calo proposes consumer review boards modeled on the long-standing principles of human-subject review boards created by universities to resolve ethical problems involving human-subject research. Calo observes that the power relationship between the experimenter and the subject requires higher standards for minimizing harm or unfairness resulting from the experiment. -- Highlighted mar 12, 2014

Add to Reading List

Peter Swire, Social Networks, Privacy, and Freedom of Association: Data Protection vs. Data Empowerment, 90 N.C. L. REV. 1371 (2012);

Deirdre Mulligan & Jennifer King, Bridging the Gap Between Privacy and Design, 14 U. PA. J. CONST. L. 989 (2012);

M. Ryan Calo, Against Notice Skepticism in Privacy (And Elsewhere), 87 NOTRE DAME L. REV. 1027 (2012);

Ira S. Rubinstein, Regulating Privacy By Design, 26 BERKELEY TECH. L.J. 1409 (2011).