How Citizens Are Affected by Algorithmic Systems

Algorithmic systems are a pervasive aspect of modern life in areas such as web searches, hiring decisions, credit scoring, and determining the cost of health insurance policies. Important social issues arise when the data and information produced by algorithmic systems turn out to be inaccurate, biased, or discriminatory. The situation is complicated by the fact that algorithmic systems tend to be black boxes, the inner workings of which are often kept secret for proprietary reasons.

Malte Ziewitz, Science and Technology Studies, is studying how ordinary citizens are affected by, struggle with, and challenge algorithmic systems. Specifically, Ziewitz is using qualitative, historical, and ethnographic methods to understand how people interact with web search engines, what role search engine optimization (SEO) services play, and how the situation of those affected by these systems might be improved.

In one study, he is combining in-depth interviews and participant diaries to understand the lived experiences of data subjects. In a second study, he is blending oral history interviews and document analysis to trace the history of the SEO industry. In a third study, he is using ethnographic participant observation to investigate the ethically ambiguous role of commercial providers in representing individuals in their dealings with these systems. The studies are accompanied by a new clinical program that engages multidisciplinary teams of undergraduate students in research and advocacy on behalf of data subjects who are not able to defend themselves.

Cornell Researchers

Funding Received

$400 thousand spanning 5 years