SARA Book Club



Savitribai Ramabai (SARA) Institute of Data Science
Making Data Science Accessible to All
Free Training in Coding, Statistics, and Research Methods
Priority admission for marginalized communities & women.

Founder of the Algorithmic Justice League
Award-winning researcher and “poet of code”.
Rhodes Scholarship, Fulbright Fellowship, & Stamps President’s Scholarship.
BS Computer Science, Georgia Institute of Technology
MSc Education Learning & Technology, University of Oxford
MS Media Arts & Science, MIT
PhD Media Arts & Sciences, MIT
“the ways in which the priorities, preferences, and prejudices of those who have power to shape technology can propagate harm, such as discrimination and erasure” (p. xiii)
Tip
The “coded gaze,” like the “male gaze,” “white gaze,” or caste gaze.
“already used to determine who gets a mortgage, who gets hired, who gets admitted to college, and who gets medical treatments” (p. xvi)
“Given the harm of AI, how can we center the lives of everyday people, and especially those at the margins, when we consider the designs and deployment of AI? Can we make room for the best of what AI has to offer while also resisting its perils?” (p. xvii)
“If the AI systems … mask discrimination and systematize harmful bias, we entrench algorithmic injustice.” (p. xix)
“when machines fail, the people who often have the least resources and most limited access to power structures are those who have to experience the worst outcomes.” (p. xix)
“AI will not solve poverty … discrimination … climate change” (p. xix-xx)
“we cannot use AI to sidestep the hard work of organising society … to sidestep conversations about patriarchy, white supremacy, ableism, or who holds power and who doesn’t.” (p. xx)
“how technology can encode harmful discrimination and exclusionary practices.” (p. xx)
“deeper understanding of why each and every one of us has a role to play in reaching toward algorithmic justice … you walk away with questions that push us all to rethink, reframe, and recode the future of AI.” (p. xxi)
“this was certainly not the first time cameras had failed me.” (p. 8)
Personal Favs
“my parents taught me that the unknown was an invitation to learn, not a menacing dimension to avoid. Ignorance was a starting place to enter deeper realms of understanding.” (p. 05)
“You will never find your worth in things.” (p. 05)
Felt unwelcome at MIT.
Less funding for her group working on technology’s impact on society.
“What will you do with your privilege?” (p. 19)
Personal Favs
“I was acutely aware of being stereotyped as an angry Black woman, eager to play the race card and find offence in the seemingly innocuous.” (p. 29)
“[Algorithmic bias] occurs when one group is better served than another by an AI system. If you are denied employment because an AI system screened out candidates that attended women’s colleges, you have experienced algorithmic bias.” (p. 34-35)
“Machines are presumed to be free from the societal biases that plague us mortals. My experiences were showing otherwise” (p. 42)
Personal Favs
“Access to the training data is crucial when we want to have a deeper understanding of the risks posed by an AI system. Unless we know where the data comes from, who collected it, and how it is organized, we cannot know if ethical processes were used.” (p. 53)
“Was the data obtained with consent? What were the working conditions and compensation for the workers who processed the data? These questions go beyond the technical.” (p. 53)
“Simply because decisions are made by a computer analyzing data does not make them neutral” (p. 54)
“The coded gaze does not have to be explicit to do the job of oppression. Like systemic forms of oppression, including patriarchy and white supremacy, it is programmed into the fabric of society. Without intervention, those who have held power in the past continue to pass that power to those who are most like them.” (p. 55)
“instead of saying a system uses facial recognition, a company might say ‘face matching’ to distance themselves from scrutiny.” (p. 58)
“the assumption that a system performs the task it was designed to execute as expected.” (p. 59)
“guess the emotions of a person … age estimation … gender classification … go as far as claiming that their systems can predict someone’s sexual orientation, political affiliation, intelligence or likelihood of committing a crime based solely on their facial features” (p. 60-61)
“facial verification (one-to-one matching) and facial identification (one-to-many matching)” (p. 63)
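The one-to-one versus one-to-many distinction can be sketched in code: verification compares a probe face embedding against a single enrolled embedding, while identification searches a whole gallery. This is a minimal illustrative sketch using cosine similarity on toy vectors; the function names, embeddings, and threshold are assumptions for illustration, not from the book or any real system:

```python
import math

def cosine_similarity(a, b):
    # Cosine of the angle between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def verify(probe, enrolled, threshold=0.8):
    """Facial verification (one-to-one): does the probe match this one enrolled face?"""
    return cosine_similarity(probe, enrolled) >= threshold

def identify(probe, gallery, threshold=0.8):
    """Facial identification (one-to-many): which gallery identity, if any, matches best?"""
    best_name, best_score = None, threshold
    for name, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name  # None when nothing clears the threshold

# Toy gallery of pre-computed embeddings (purely hypothetical values).
gallery = {"alice": [0.9, 0.1, 0.2], "bob": [0.1, 0.9, 0.3]}
probe = [0.85, 0.15, 0.25]
print(verify(probe, gallery["alice"]))  # one-to-one check
print(identify(probe, gallery))         # one-to-many search
```

Note that identification multiplies the chances of a false match: every extra gallery entry is another comparison that can go wrong, which is one reason it draws more scrutiny than verification.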
HireVue case of using AI to analyze candidates’ video interviews.
Dutch student Robin Pocornie’s remote examination case.
“When companies require individuals to fit a narrow definition of acceptable behaviour encoded into a machine learning model, they will reproduce harmful patterns of exclusion and suspicion.” (p. 66)
“ImageNet showed that strategic data collection, often seen as grunt work and inferior to the development of algorithms, was just as important for advancing artificial intelligence. Data was queen.” (p. 69)
Tip
Teamwork: you will have to find your own team. You have to believe in your cause and find people to get it done.
“the [LFW] database of images contained 77.5 percent male-labeled faces and 83.5 percent faces labeled white. The gold standard for facial recognition, it turned out, was heavily skewed. I started calling these ‘pale male datasets’.” (p. 77)
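The kind of skew Buolamwini found can be surfaced with a simple label count over a dataset's metadata. A hypothetical sketch: the label list below is fabricated to mirror the 77.5 percent figure quoted above, and is not the actual LFW data:

```python
from collections import Counter

# Fabricated per-image labels standing in for a face dataset's metadata.
labels = ["male"] * 775 + ["female"] * 225

def share(labels, value):
    """Fraction of the dataset carrying a given label."""
    counts = Counter(labels)
    return counts[value] / len(labels)

print(f"{share(labels, 'male'):.1%} male-labeled faces")
```

Auditing label distributions like this is cheap relative to model training, which underlines the book's point that skew persists because no one looked, not because looking is hard.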
“machine learning community was not yet applying these insights from anti-discrimination scholarship.” (p. 80)
“… people underrepresented in the dataset. It would mean automated vehicles would be more likely to crash into some groups of people than others.” (p. 81)
“Power shadows are cast when the biases or systemic exclusion of a society are reflected in the data.” (p. 83)
“India with its vast diversity of skin types has an entertainment and beauty industry that elevates light-skinned actors and actresses. … Beyond beauty, lighter skin is also associated with having more intelligence in societies touched by white supremacy.” (p. 84)
“Relying on convenient data collection methods by collecting what is most popular and most readily available will reflect existing power structures.” (p. 85)
