Since then, other employees have corroborated these findings. A former Facebook AI researcher who joined in 2018 says he and his team conducted “study after study” confirming the same basic idea: models that maximize engagement increase polarization. They could easily track how strongly users agreed or disagreed on different issues, what content they liked to engage with, and how their stances changed as a result. Regardless of the issue, the models learned to feed users increasingly extreme viewpoints. “Over time they measurably become more polarized,” he says.
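The feedback loop the researcher describes can be sketched as a toy simulation: a recommender hill-climbs on "engagement," engagement peaks on content slightly more extreme than the user's current stance, and the stance drifts toward what gets served. All names and numbers here are hypothetical illustration, not Facebook's actual models or findings.

```python
import random

random.seed(0)

def simulate_engagement_loop(steps=200, lr=0.05):
    """Toy model of an engagement-maximizing recommender.
    Assumption (for illustration only): engagement peaks on content a
    bit more extreme than the user's current stance, and the user's
    stance drifts toward the content they consume."""
    stance = 0.1   # user's position on a 0..1 extremity scale
    served = 0.1   # extremity of the content being recommended
    history = []
    for _ in range(steps):
        # current engagement: highest when served content sits just
        # beyond the user's stance
        engagement = 1.0 - abs(served - (stance + 0.1))
        # recommender tries a small random tweak and keeps it if it
        # would raise engagement (greedy hill-climbing)
        trial = min(1.0, served + random.uniform(-0.05, 0.1))
        if 1.0 - abs(trial - (stance + 0.1)) > engagement:
            served = trial
        # the user's stance drifts toward what is served
        stance = min(1.0, stance + lr * (served - stance))
        history.append(stance)
    return history

h = simulate_engagement_loop()
print(f"stance after 10 steps: {h[9]:.2f}; after 200 steps: {h[-1]:.2f}")
```

Because the engagement peak always sits a little past the user, the optimizer and the user chase each other toward the extreme end of the scale, which is the "measurably more polarized over time" dynamic in miniature.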
The pigeon. The object of technological experimentation, manipulation, and control, weaponized.
Source: The Pigeons of Ed-Tech
The diet of engagement and behaviorism served by Facebook and big tech makes for primitive moral development in both server and served.
Nolan’s inability to think about others, combined with his obvious self-interest, certainly exemplifies a primitive level of moral development… but if the adults in his life subscribe to an equally primitive kind of character development, how can we expect anything more? How can we expect kids like Nolan to progress to a higher level of ethical behavior when our dependence on rewards and punishment is precisely what condemns kids to such primitive self-interest?
Where were we radicalized? In a Skinner Box. (Skinner preferred the term “Operant Conditioning Chamber”.)
The beauty of the Internet is that by combining big data, behavioral targeting, wearable and mobile devices, and GPS, application developers can design more effective operant conditioning environments and keep us in virtual Skinner boxes as long as we have a smart phone in our pockets.
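The core Skinner-box mechanic being invoked here is the variable-ratio reinforcement schedule: a reward arrives after an unpredictable number of responses, which produces the highest and most extinction-resistant response rates. A minimal sketch, with hypothetical names and no resemblance to any real app's code:

```python
import random

random.seed(42)

class VariableRatioSchedule:
    """Sketch of a variable-ratio reinforcement schedule, the
    operant-conditioning pattern behind slot machines and, arguably,
    pull-to-refresh feeds: reward after an unpredictable number of
    responses averaging a fixed ratio."""

    def __init__(self, mean_ratio=5):
        self.mean_ratio = mean_ratio
        self._next_reward_at = self._draw()
        self._responses = 0

    def _draw(self):
        # unpredictable threshold, averaging mean_ratio responses
        return random.randint(1, 2 * self.mean_ratio - 1)

    def respond(self):
        """One 'lever press' (a tap, a refresh). Returns True on reward."""
        self._responses += 1
        if self._responses >= self._next_reward_at:
            self._responses = 0
            self._next_reward_at = self._draw()
            return True
        return False

schedule = VariableRatioSchedule(mean_ratio=5)
rewards = sum(schedule.respond() for _ in range(1000))
print(f"rewards in 1000 responses: {rewards}")  # roughly 1000 / mean_ratio
```

The unpredictability is the point: because any given response might pay off, the subject keeps responding at a high rate long after rewards thin out, which is exactly the engagement pattern the quote describes.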
I would argue, in total seriousness, that one of the places that Skinnerism thrives today is in computing technologies, particularly in “social” technologies. This, despite the field’s insistence that its development is a result, in part, of the cognitive turn that supposedly displaced behaviorism.
We’ve constructed vast engines of confirmation bias that bubble many of us in bespoke Skinner box universes of misinformation, though we dress it all in cognitive science and AI.
These are “technologies of behavior” that we can trace back to Skinner – perhaps not directly, but certainly indirectly due to Skinner’s continual engagement with the popular press. His fame and his notoriety. Behavioral management – and specifically through operant conditioning – remains a staple of child rearing and pet training. It is at the core of one of the most popular ed-tech apps currently on the market, ClassDojo. Behaviorism also underscores the idea that data about how we behave when we click can give programmers insight into how to alter their software and into what we’re thinking.
If we look more broadly – and Skinner surely did – these sorts of technologies of behavior don’t simply work to train and condition individuals; many technologies of behavior are part of a broader attempt to reshape society. “For your own good,” the engineers try to reassure us. “For the good of the world.”
So frustrating, especially when the Internet can be used to the opposite effect.
Chance favors the connected mind. Opportunities for serendipity increase with bigger, more diverse networks. Build personal learning networks. Expose yourself to new perspectives. Listen in solidarity. Be in the space. When we seek perspectives different from our own, share hunches, and connect ideas, we participate in creating serendipity.
We can’t make ethical tech if we’re in the behaviorism business.
(It’s worth teasing out a little – but probably not in this talk, since I’ve rambled on so long already – the difference, if any, between “persuasion” and “operant conditioning,” and how each imagines leaving space for freedom and dignity. Rhetorically and practically.)
It’s a hyper-evolutionary process that rewards the most extractive, most addictive, most viral strain from the cohort. The key measurement is ENGAGEMENT.
More engagement. More rage, more fake news, all resulting in more hours spent, more eyeballs fixated, more clicks and taps made.
- Persuasion and Operant Conditioning: The Influence of B. F. Skinner in Big Tech and Ed-tech – Ryan Boren
- The Risks of Rewards – Alfie Kohn
- Rewards Are Still Bad News 25 Years Later – Alfie Kohn
- Challenging Behaviorist Dogma – Alfie Kohn
- It’s Not About Behavior – Alfie Kohn
- Autism and Behaviorism – Alfie Kohn
- Education Technology and Skinner’s Box
- Pigeons, Operant Conditioning, and Social Control
- Behaviorism Won
- Digital Literacies and the Skinner Box | Dr. Ian O’Byrne
- Tech Ethics and the New Behaviorism – Ryan Boren
- Post-truth, Open Society, and the Business of Behaviorism – Ryan Boren