Highlighted Selections from:

Joining the Surveillance Society?

Gangadharan, Seeta. “Joining the Surveillance Society?” New America Foundation (2013): 1–18. Web.

p.1: As tracking and targeting practices become more widespread, members of underserved communities—typically the poor, communities of color, immigrants, and indigenous groups—may be at greater risk of data-driven discrimination than other Internet users. Individuals from these communities have historically suffered from analog forms of data profiling -- Highlighted mar 18, 2014

p.2: But if online privacy and surveillance problems are increasing, and the discriminatory effects of data profiling are becoming ever more apparent, what does being digitally included mean? -- Highlighted mar 18, 2014

p.3: In contrast to digital divide studies, the field of privacy includes a small body of research focused on online privacy, surveillance, and historically marginalized communities. This empirical work speaks to the negative consequences that result from various forms of tracking and monitoring by corporate and government actors and that disproportionately affect the underserved. -- Highlighted mar 18, 2014

p.3: One of the most egregious examples of data profiling took place in the 2000s, when subprime lenders targeted African Americans and Latinos, monitoring their online behavior and plying them with toxic financial products. -- Highlighted mar 18, 2014

p.3: Another body of evidence questions the neutrality of search algorithms and identifies ways in which statistical aggregations magnify racial stereotypes. -- Highlighted mar 18, 2014

p.4: We take a qualitative—or descriptive—approach partly due to the fact that digital inequalities—of access, availability, or skill—are coterminous with other forms of social and economic inequalities. This complex situation makes it difficult to isolate discrete variables that differentiate between cause and effect. -- Highlighted mar 18, 2014

p.6: The constrained environment in which marginal Internet users access digital technology corresponds to a situation of diminished power to self-govern and control personal destinies, as described by researchers. While reliance on staff for help also reveals the importance of trusted institutions in helping the underserved go online, marginal Internet users enter into digital society under already unequal conditions of social status. -- Highlighted mar 18, 2014

p.8: Instead, a different kind of fear than one related to terrorism captivated users as they learned to go online: fear of technology. While some of this technophobia pertained to a general anxiety about one’s ability to learn and conquer technology (e.g., “Will I break the computer if I do X?”), fear also meant trepidation of being exploited, duped, or misled while using a computer and being digitally connected to others. -- Highlighted mar 18, 2014

p.9: Such behavior reveals a complex set of attitudes towards corporate tracking, rather than a simple binary of “yes” or “no” in relation to data collection, profiling, or targeting done by companies. -- Highlighted mar 18, 2014

p.10: One woman at the senior center quibbled with the idea that surveillance was new for her or her community. She advocated visibility rather than retreat from the Internet. -- Highlighted mar 18, 2014

p.10: “We all are targeted, because [companies] do the demographics. They find out who’s in the neighborhood, what schools—just a whole lot of information. If you are not in one system, you’re in another. I feel that part of my protection is being visible. Being visible on the Internet helps my protection. Because if I am visible, maybe if something happens, somebody will say, ‘No, that’s not her.’” -- Highlighted mar 18, 2014

p.11: On top of the challenges felt at the individual level, institutional capacity shapes the scope and quality of education—privacy or otherwise— available to marginal Internet users. -- Highlighted mar 18, 2014

p.13: Our goal in this study has been to illuminate privacy and surveillance issues in the context of digital literacy institutions and, conversely, digital inclusion issues for privacy and surveillance debates. -- Highlighted mar 18, 2014

p.13: The more we know about privacy, surveillance, and historically marginalized communities, the more policymakers can make informed judgments about context-sensitive remedies to a complex online world. Another area of research ought to identify the specific consequences, both immediate and long-term, of data collection, storage, sharing, and analysis on political, economic, and social life of the underserved. -- Highlighted mar 18, 2014

p.13: Marginal Internet users carry existing inequalities with them into digital environments, including a past history of being surveilled, and they encounter the perils and pitfalls of sharing information -- Highlighted mar 18, 2014