We are continually told that, to make sense of the world being pulled over our eyes, we need media literacy to understand how and when we are being lied to by generative AI, which is amplifying the biases in existing media reporting, entrenching inequalities and teaching people from early childhood onwards. What is terrifying is that unless we are very careful, we will never notice, because it is not given to humankind to question things that meet our expectations. A brief tour through the ways we are manipulated shows how we are being led deeper into the mire of our existing societal prejudices, and also how we can choose to begin the slow process of wading back out again.

You may have read a lot recently about tools such as ChatGPT for text or Midjourney for art (there are many others) and realized that artificial intelligence has moved out of the realm of science fiction, and even out of the realm of self-driving cars for that matter, into the realm of a technology that individuals can actually use. There are a lot of reasons you might want to avoid such tools, and yet there are some good academic reasons for wanting to experiment with them.

AI is, predictably, getting into hot water amid lawsuits in which artists and their supporters are trying to sue the companies that created and own AIs that have begun ‘scraping’ (learning from observing) artists’ works, producing new works based on what they have learned and then copyrighting those works.

You can now get an AI artist to create fantasy artwork on demand. While this is a boon for producing a huge volume of new artwork for the publishing industry, it also means that any artist can have the distinctive style they have spent half a lifetime developing emulated in minutes by a computer programme. Such AIs can spin off thousands of original works in the style of an existing artist they have studied. This raises the question of whether, and how, copyright law can protect creatives from having their work emulated. As yet, the law seems able and willing, but the details are still missing.