Explore our archive of completed work.
Student-Led
Electronic Health Systems in Estonia, a thesis by Ransho Ueno. 2022
Coding Care: My Visit to Green Village Nursing Home, a senior project by Leila Henry. 2022
Collaborative
The Data Will Not Save Us: Afropessimism and racial antimatter in the COVID-19 pandemic. Anthony R. Hatch. 2022
This project emerged while the COVID-19 pandemic was still in its early phase (March–May 2020). That summer, I joined several other sociologists in writing a public reflection for the website of the Science, Knowledge, and Technology (SKAT) Section of the American Sociological Association. This short essay, “Two Meditations in Coronatime,” provided a point of entry for a more extended critical assessment of how racial health disparities data are mobilized in public health crises. I undertook that assessment as part of my 2022 Robin M. Williams Distinguished Lecture for the Eastern Sociological Society and a special issue of Big Data & Society on “Data, Power, and Racial Formation,” guest edited by Kate Henne and Renee Shelby. Xia Xiang and Sarah Asiedu provided research assistance on this manuscript during Black Box Labs’ first year.
To Search and Protect?: Content Moderation and Platform Governance of Explicit Image Material. Mitali Thakor, Sumaiya Sabnam, Ransho Ueno, and Ella Zaslow. 2023
Child pornography, or child sexual abuse material (CSAM), is often held up as the limit case justifying government surveillance of digital platforms and devices. Under the 2008 PROTECT Our Children Act, technology companies are mandated to report CSAM to the National Center for Missing and Exploited Children (NCMEC), the government clearinghouse for such data. However, the means by which companies detect CSAM on their platforms are discretionary and variable, relying on both human review and algorithmic processing. Pro-privacy groups have raised concerns that the mission creep of image-based content moderation and digital search may violate the US Constitution’s Fourth Amendment protection against unreasonable searches and seizures. Nonetheless, legal instruments continue to expand the reach of government power to search and seize incriminating data and to hold technology companies further liable for failing to report CSAM content, chipping away at the publishing immunity they currently hold under US law. This case study describes the current scope of content moderation practices and introduces arguments for and against expanding digital search toward the removal of explicit or violent image content.