From statistical literacy to automating bias
We are continually told that, to make sense of the world being pulled over our eyes, we need media literacy: the ability to recognise how and when we are being lied to by generative AI, which amplifies the biases in existing media reporting, entrenches inequalities, and teaches people from early childhood that CEOs and lawyers look like able-bodied middle-class white men, that caring is women's work, and that the lowest-paid workers in our society have darker skin tones.

We know that most people aspire only to roles and spaces where they see people like themselves, and where they therefore believe they are likely to belong. We also know that recruiters favour those they perceive to be a good fit for a role and organisation, and that this perception is shaped by their cultural expectations: who they have seen succeeding in those roles in the past. Black people, women and openly LGBTQ+ people are therefore often excluded from professional roles and senior management positions, because recruiters struggle to picture them there, having little or no experience of people from those social groups occupying, let alone succeeding in, such roles.

This oppression continues long after recruitment: when people from oppressed groups do succeed in a role, they continue to be made to feel they do not belong. To take just one representative example, Black female academics continually find themselves self-censoring and doubling down on efficiency, feeling they will be judged harshly for mistakes that would not even be noticed if made by white male colleagues.
Sins of generative AI in pictures
Generative AI is unthinkingly reinforcing all of these existing prejudices, entrenching societal discrimination and disadvantage, because it enforces the status quo, which violently represses those currently without power. Most people are invested in the status quo to some extent and fear losing status if society fundamentally changes, and so will not object to existing lines of discrimination being reinforced by new technology. Timnit Gebru claims she was driven out of her job at Google for pointing out how harmful generative AI has been to women of colour. We should not be surprised at the attempt to sideline or silence a minority voice speaking out against one of the potentially most profitable technologies ever developed.
It is an increasingly tall order to master digital, data and media literacy, including the ability to investigate video footage, images and text to judge whether they are accurate, truthful and unbiased, and information literacy: determining whether we know enough to form a judgement about something we have been exposed to, what more we need to know, and how to find reliable information to meet our needs in a world threatened with drowning in AI-generated fake news.
Truth and fiction in facts and figures
I started this post intending to argue that we need to master yet another often-overlooked literacy, statistical literacy, before finding myself deluged with the calls for caution raised by AI. "There are lies, damned lies, and statistics," in the remark popularised by nineteenth-century writer Mark Twain, who attributed it to Benjamin Disraeli. It is true that data can be wilfully distorted to create the illusion of support for anything a person wants to claim, just as words can be crafted to subtly misrepresent the facts in support of spurious arguments. Yet carefully interpreted statistical data continues to offer insights into reality. We simply need to be able and willing to examine how the data behind the facts and figures was gathered, whether it is inherently biased by the questions asked and the people consulted, and, crucially, what we are not being shown.

The classic example is the Challenger space shuttle disaster. Engineers examined only the launches on which O-ring faults had occurred, and among those, faults appeared to occur equally often across a range of launch temperatures. What this view ignored was that every fault-free launch had taken place in relatively warm weather. Famed physicist Richard Feynman later demonstrated on television, using a sample of the O-ring rubber, a clamp and a glass of ice water, that the seals in the solid rocket booster joints lost their resilience at low temperatures, silencing the eloquent arguments of the project managers that those in charge were free from fault. Ignoring vital information introduced a perceptual bias that doomed an expensive project and killed the shuttle crew.
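The Challenger lesson can be made concrete with a toy calculation. The launch temperatures and incident counts below are invented for illustration (they are not the real flight records): conditioned only on flights that suffered O-ring incidents, temperature looks unrelated to risk; include the incident-free flights and the pattern appears. A minimal sketch in Python:

```python
# Toy data: (launch temperature in °F, number of O-ring incidents).
# These values are invented to illustrate the selection-bias trap,
# not the real Challenger flight records.
flights = [
    (53, 3), (57, 1), (63, 1), (70, 1), (70, 1), (75, 2),  # flights with incidents
    (66, 0), (67, 0), (67, 0), (68, 0), (69, 0), (70, 0),  # incident-free flights
    (72, 0), (73, 0), (76, 0), (78, 0), (79, 0), (81, 0),
]

# Looking only at flights with incidents, temperatures span 53-75 °F,
# so temperature seems unrelated to failure...
incident_temps = [t for t, n in flights if n > 0]
print(min(incident_temps), max(incident_temps))  # 53 75

# ...but with the full data set, every cold launch had an incident,
# while most warm launches were trouble-free.
cold = [n for t, n in flights if t < 66]
warm = [n for t, n in flights if t >= 66]
cold_rate = sum(1 for n in cold if n > 0) / len(cold)
warm_rate = sum(1 for n in warm if n > 0) / len(warm)
print(cold_rate, warm_rate)  # 1.0 0.2
```

The point is not the particular numbers but the habit of asking which observations were left out before trusting a "no correlation" conclusion.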
Another classic example is the misrepresentation of terrorists. White people, including white supremacists, commit more acts of terrorism each year than people of Middle Eastern appearance, yet the image of the terrorist we are presented with is always that of the Arab, and this prejudice is once again being reinforced by today's generative AI, because its decision engines have been trained on heavily biased datasets that echo society's current prejudices. The visibility of statistical data is being deliberately manipulated to feed our pre-existing prejudices and expectations, and thereby maintain the landscape of public opinion.
"Give me a lever long enough and a fulcrum on which to place it, and I shall move the world." (Archimedes)
Becoming the change we want to see
What is terrifying is that unless we are very careful, we will never notice, because it is not in our nature to question things that meet our expectations. Every Black person has felt oppression, and yet few, if any, White people own to oppressing Black people. Something does not add up. We need to engage in a deeply uncomfortable level of self-scrutiny and a tiresome habit of asking what is being put in front of us, and what is being concealed, to reinforce what we have come to expect our whole lives. We are all guilty of bystander apathy, struggling on with our own lives while those less privileged than ourselves struggle all the harder for even less. Much of the time, our apathy stems from simply not seeing the oppression, because it is normal, expected, the only way we can imagine the world could be. We need to get a lot more creative in how we imagine the world could be, and a lot more critical in discerning fact from carefully crafted fiction, in both words and figures.