The Internet as a Giant Skinner Box
For both users and workers, it's hyper-gambling all the way down
The Skinner box is a famous model of operant conditioning. It’s really simple: you place a pigeon in a box with a button, and every time the pigeon presses the button, a reward (food) is dispensed.
Psychologist B. F. Skinner tried this with different reward schedules to measure behavior. Sometimes the reward would be dispensed after a set number of presses, sometimes after a set amount of time. But when Skinner made the reward random, the pigeons engaged in addictive behavior, pressing the button as much as possible. They also developed superstitious behaviors, turning around or nodding their heads as if those movements were related to the reward.
Randomizing the reward induced the highest and most consistent rate of response. It was also the most habit-forming: the pressing persisted long after the reward stopped being dispensed. The Skinner box is a very basic form of behavioral conditioning that nonetheless works extremely well. Not only does it keep casinos profitable, but it’s also core internet infrastructure now.
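To make the difference between schedules concrete, here is a toy sketch (my illustration, not Skinner’s actual protocol) contrasting a fixed-ratio schedule, where the reward is predictable, with a variable-ratio schedule, where any press might pay out. The parameter values are arbitrary assumptions chosen so both schedules pay out at the same average rate.

```python
import random

# Toy models of two reward schedules Skinner compared.
# Each function answers: does this press dispense food?

def fixed_ratio(press_count, n=5):
    """Reward on every n-th press (fully predictable)."""
    return press_count % n == 0

def variable_ratio(p=0.2):
    """Reward each press with probability p (unpredictable).
    Same long-run payout rate as fixed_ratio above (one reward
    per 5 presses on average), but any single press might pay out,
    which is the schedule Skinner found most habit-forming."""
    return random.random() < p

# Simulate 10,000 presses under each schedule.
presses = 10_000
fr_rewards = sum(fixed_ratio(i) for i in range(1, presses + 1))
vr_rewards = sum(variable_ratio() for _ in range(presses))

print(fr_rewards)  # exactly 2000: presses // 5
print(vr_rewards)  # roughly 2000, but different every run
```

The point of the sketch is that both schedules deliver the same amount of food overall; only the predictability differs, and that difference alone is what drives the compulsive pressing.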
If you’re online, you most likely already know this. Platforms want to retain users as much as possible, so they gamify engagement through randomized rewards. The algorithmic, infinite scroll on practically every platform nowadays is a prime example.1 Whereas novelty-seeking behavior comes with risks in the real world, the endless scroll lets you pursue that reward in complete safety. It also erodes the social world outside the screen, since you’re too “spent” to save that dopamine for other people.
The Skinner box online goes even deeper though, also affecting the invisible workers who are building AI and the new web. In Empire of AI (2025), journalist Karen Hao interviews ghost workers employed by the data annotation industry. This little-discussed sector has grown into a multibillion-dollar industry in recent years to train and moderate AI systems like ChatGPT. Workers from Venezuela, Kenya, Ghana, Colombia, Bangladesh, and elsewhere are paid very low wages (often $1–2/hr) to quickly sort through images and texts and label them for machine learning: “Should this item be listed under clothing or accessories?” “Does this video contain crime or human rights violations?”2
As I was reading Hao’s anecdotes, I was surprised to find how gamified this sort of work really is. Just like users, ghost workers are conditioned through Skinner-like methods, except the stakes are much higher. In her book, Hao interviews a worker named Oskarina Veronica Fuentes Anaya, a Venezuelan refugee who at the time shared an apartment with half a dozen relatives in Colombia. She worked for Appen, a tech firm that trains chatbots and AI assistants. For workers like her, the tasks and bonus payouts arrived practically at random.
Hao writes:
The erratic, unpredictable nature of when the work came and went began to control Fuentes’s life. Once she was taking a walk when a task arrived that would have earned her several hundred dollars, enough money to live for a month. She sprinted as fast as possible back to her apartment but lost the task to other workers.
From that day on, she stopped leaving the house on weekdays, allowing herself only thirty-minute outings on weekends. She slept fitfully, worried about the tasks that would arrive in the middle of the night.
When things were good, they were really good. When things were bad, she stayed tethered to the platform with the stubborn faith that it would return her loyalty.3
This is only a degree removed from actual gambling, except what’s at stake is one’s entire livelihood. Some of the ghost workers Hao profiles had no workspaces and had to be found through WhatsApp location pins, working exclusively from their phones in impoverished conditions.
Ghost workers include not only data annotators but also content moderators for social media. The moderation work is especially dark, requiring workers to quickly sift through violent and disturbing images and flag them. Virtually all of them develop PTSD from their jobs.4 The rapid piecemeal work, randomized tasks and bonuses, harsh penalties for inaccuracies, and never knowing when the rewards will end all function like a Skinner box.
Demand for this kind of work is only growing. Virtually all of the leading tech companies rely on these contractors to moderate content and train machine learning systems. This hidden industry is not much of a secret anymore: Meta recently purchased a large stake in Scale AI, one of the largest employers of ghost workers, for $14 billion.
With the rise of gamified ghost work, the internet has come full circle. Not only are billions of users exposed to Skinner-like conditioning, but so too is the invisible labor that moderates and annotates the internet’s data. And with AI adoption, platforms will undoubtedly invent new ways to keep attention flowing. In this environment, the best defense is to simply put up as many obstacles as possible to keep the false rewards away.
The mass adoption of the algorithmic infinite scroll by platforms began around the early 2010s. Infinite scroll was invented in 2006 by Aza Raskin, who has since come to regret it.
Empire of AI (2025), p. 197
Empire of AI (2025), pp. 200–201