The Vocabulary of Media Consumption, AI Slop, and the Erosion of ‘Food for Thought’
- Ksenija Carleton
- 9 hours ago
- 3 min read

What happens when there is no longer any ‘food’ to supply the thought?
In 2015, two of Silicon Valley’s sweethearts, Sam Altman and Elon Musk (sigh), teamed up to spawn a love child of their shared hubris: the belief that the hard task of making the world a better place rested upon them. That love child is generative AI, and in December of 2015, OpenAI was born.
Generative AI, which we are now unfortunately all too familiar with, is defined by Merriam-Webster as: “artificial intelligence that is capable of generating new content (such as images or text) in response to a submitted prompt (such as a query) by learning from a large reference database of examples”. The functioning of generative AI is akin to the treatment of a farmed pig: for years and years it’s force-fed all kinds of shit until one day it makes something new out of itself, like bacon. What differentiates AI from the pig, however, is that the pig suffers, whereas AI - a passion project made by self-aggrandizing technocrats - causes the suffering.
A new report released this month by the firm Graphite found that over 50% of content circulating on the Internet is AI slop. This term - AI ‘slop’ - caught my eye when it gained traction as a way to describe the AI-generated videos and images that now infest our social media feeds: bunnies jumping on a trampoline, Jake Paul putting on makeup and coming out as gay, faces that don’t exist, faces that do exist saying words that are fake, ‘content’ that is ever more devoid of anything at all.
I was intrigued by the word ‘slop’ because it allows me to circle back to my pig analogy. Originally, ‘slop’ referred to an unappetizing hodgepodge of food used to feed animals, most often pigs. This raises the question of who even is the pig anymore. We are, after all, the ones now eating the ‘slop’.

This choice of words also fits into the larger semiotic practice of using food, and all its derivative verbs and nouns, as a metaphor for knowledge, media, and our intake of each. Think of how, when discussing the kinds of media we’re interacting with, we almost always talk about “consumption”. Information is debated in terms of how “digestible” it is, media “diets” are picked apart at length, and we are so often left “hungry for more”.
Like most idioms, the origins of the expression “food for thought” are swimming around history, but the idea of deep thinking as a form of mental nourishment - a way to feed the mind, if you will - dates back to ancient philosophy. The idiom itself gained popularity in the 19th and 20th centuries. In A Connecticut Yankee in King Arthur’s Court, for example, Mark Twain writes: “There was food for thought here. I moved along as one in a dream.”
So, ultimately, knowledge and its production are much like the pig: to make bacon, it must be fed. New forms of knowledge are created out of previous iterations, and new frameworks are made with the help of old tools. Everything we create or conjure up in the mind builds off a foundation of someone else’s ideas and creations with which we’ve come into contact. Nothing is ex nihilo, and originality is a myth. Art is always a riff on other art.
Therefore, in our current system of information circulation, where the Internet is the sun around which we all revolve - a sun made up of ‘slop’ - what will be the references from which we pull, the ideas upon which we build? What will art look like when most of the media we’re exposed to is fake images and fake text, jumbled together into a soup of stimuli with no substance?
If ‘slop’ is food, then it is poison for thought. The MIT Media Lab conducted a study this year analyzing how using ChatGPT affects critical-thinking skills. The results showed that, when given an essay-writing exercise, students who used ChatGPT had the lowest levels of brain activity compared with students who used Google Search and those who didn’t use the Internet at all. The study also found that the ChatGPT students essentially “didn’t integrate any [of their essays] into [their] memory networks”.
The brain works like a muscle, and it will grow flaccid when generative AI is used in its place. Similarly, if all the brain is fed is AI ‘slop’, it will be capable of producing nothing but more ‘slop’. The pig will start to eat itself.
Artwork by Chassidy Stephenson