One of the last public events I attended before the COVID lockdowns in 2020 was a panel discussion called Friends, Enemies, Frenemies, on technology, profit, and market trends in the education technology (Ed Tech) sector. The event was an opportunity for Ed Tech reps to get together to talk about industry trends and strategies for promoting the industry. 

One piece of advice was particularly chilling. The panelists noted that they had already enjoyed some success marketing directly to professors in the post-secondary system and that they viewed teachers as more likely than boards to experiment with and adopt new technologies. Their advice was simple: leapfrog Ministry and board policy by marketing directly to teachers, and then let teachers pressure their schools and boards to officially adopt what they are already using. 

Consider the context in which that advice is playing out. In nearly every public education system around the world, teachers and education workers are under-resourced, overworked, expected to take on roles outside their actual job, and struggling to connect with and prepare a population of students with increasingly complex needs for a world of increasingly complex demands, all with drastically scaled-back social, environmental, and community supports. Educators are caught in a perfect storm. Enter Ed Tech and all its promises of reducing workloads, personalizing learning, and generally revolutionizing education. 

So far, artificial intelligence in education (AIED) is following a pattern that is very familiar to observers of Ed Tech. The pattern goes like this: tech’s hyped potential outpaces evidence of its efficacy; profit opportunities incentivize targeted marketing to educators; school boards and unions belatedly scramble to establish policy guardrails. As a result, new technological practices, including AI, become embedded in schools through a for-profit model before pro-public actors are able to articulate, let alone implement, a fully public alternative. 

Education unions and our allies need to work on how to communicate with members and the public about the potential benefits and the very real hazards of AIED. Such communications need to reflect the complexities of AI itself as well as the multiple ways in which educators will use AI in their jobs. To do this, we need explanatory frameworks that help transform complex and abstract realities about AIED into concrete terms. This will be important for providing pedagogical and policy advice, certainly, but it will be absolutely essential for helping educators and the public to understand how AIED threatens to increase privatization. 

Author and tech critic Cory Doctorow has introduced a new word and a new framework for understanding why our collective experience of the internet and apps seems to be getting worse and worse: enshittification. 

Enshittification captures the market incentives and resulting behaviours of internet behemoths such as Facebook, Amazon, Google, X (formerly Twitter), and TikTok and how those incentives inexorably lead to a worse and worse—enshittified—experience for users and advertisers alike. The potential parallels for education privatization are remarkable.

According to Doctorow, enshittification follows a three-stage process through which platforms develop a large user base, hold the user base hostage on behalf of advertisers and publishers, and then hold both hostage so they can rake in massive profits. The result of enshittification is that the experiences of the platforms’ individual and business users alike become gradually and then rapidly worse and worse.

It goes something like this.1

First, a platform provides a novel experience and value with big promises of making that experience better and better as more people join and the platform grows. Think of Facebook for example, with its initial promise to show you content generated by your friends (and only content generated by your friends) without spying on you or harvesting your data. As more people joined, Facebook enjoyed a “network effect,” meaning both that the user experience improved (you have more friends to follow, more groups to like!) and that the cost of remaining outside or leaving got higher (you’ll miss out on all those friends and groups!). In effect, friend groups got locked into Facebook. It was time for stage two. 

In the second stage of enshittification, platforms sell their user base to advertisers and publishers. Facebook broke its promise to only show you the content you asked for (by following friends and subscribing to groups) and instead began to show you ad content. They also broke their promise not to spy on their users and began harvesting data to sell to advertisers. In doing so, Facebook re-allocated the value of their product away from the user base and toward advertisers and publishers. Soon enough, advertisers and publishers also become hostage to the platforms. Practically the only way to get eyeballs on their ads, videos, and stories is to have the platforms force their content into users’ feeds. At this point, the time is right to re-allocate value again; this time it goes to the platforms’ shareholders. 

In third-stage enshittification, advertisers, who had been getting cut-rate deals to show their products to users, start having to compete against each other to be at the top of feeds. Publishers, who used to be able to get users over to their own websites by showing some teaser text with a link, start getting punished (by being sent further down the feed) unless they include full-text articles with no way of redirecting readers off platform. 

At this point, enshittification is complete. Platforms no longer function to provide either an optimal user experience or an optimal business experience. They only provide an optimal profit experience for their shareholders. 

A similar dynamic will almost certainly play out in the education sector. Recall the advice from the Friends, Enemies, Frenemies panel: market directly to teachers. This is how AIED is happening. Teachers need supports and they are excited by the potential AI has to offer a customized, interesting, and novel experience for students. The AI adopters are innovators who see AI as inevitable and are deeply committed to making sure their students are ready for an AI-infused world. The Ministry of Education, school boards, and even educators’ unions have all been left behind as AIED’s astonishing proliferation has outpaced institutions’ capacities to think through pedagogical and policy implications, develop ethical standards, and implement guardrails. In this context, AI companies don’t need to exert pressure on the Ministry and boards to promote adoption of AI: educators are doing that both through their individual use and through the upward pressure they put on boards to accommodate and standardize what is already happening on an ad hoc basis. 

This is stage one enshittification. Teachers and education workers are discovering and making use of all kinds of new teaching and communication strategies made available—and fun!—by AIED. As more and more educators adopt AIED, there will be increasing pressure from students and families for late adopters to jump on board. AIED will become standard and expected, and educators will find themselves ‘stuck’ to various AI apps and software. 

However, boards have a legitimate interest in developing policies and procedures to ensure security and privacy for students. They also need to ensure pedagogical rigour is not abandoned for the sake of technological novelty. Such policies and procedures will almost certainly require standardization and will therefore pressure, or even require, educators to use a specified selection of AI apps. This will help address concerns about security and privacy. It will also enable schools and boards to save money in the short run through more favourable board-wide purchasing agreements. As institutionalization progresses, boards will find themselves stuck to apps as the cost of switching to alternatives gets higher and higher: stage two of enshittification will be complete. 

At that point, the platforms and developers will have considerable leverage to start raising prices and adding new criteria to agreements. Want to renew these licenses? Then you’ll have to bundle them with this additional software. Or you’ll have to purchase proprietary hardware and devices to run the software. Or you’ll have to purchase through a specific portal that brings with it new junk fees and surveillance. In a worst-case scenario, the tech companies force use of their own lesson plans and content as part of access to AIED. This is the potential logic of enshittification in education. 

Suddenly, educators will find that they’re unable to engage in the discovery and innovation that attracted them to AI in the first place. Educators will be bound by board policies meant to ensure high standards for educational practice, while boards will be bound by contracts and service agreements that make switching to a better alternative unworkably costly, both financially and administratively. At that point, AIED will function in ways that are profitable for developers, but that do not attend to what educators want and students need. 

In his analysis of the enshittification of platforms such as Facebook and TikTok, Doctorow points to key constraints that would help stop tech leaders from pushing us inexorably down the enshittification pathway. These include robust and well-enforced competition laws, equally well-enforced regulations, maintaining users’ ability to free themselves from the worst effects of enshittification through ad-blockers and other kinds of technological self-defence, and strong worker protections to enable folks working in the tech industry to push back against their leaders’ worst impulses. 

The work of education unions and our allies in the fight to defend quality public education needs to immediately focus on figuring out comparable constraints within the education sector. At the very least, boards and educators must avoid getting locked into purchase agreements with ballooning rents and hidden fees lying in wait. Educators also need to be empowered to protect themselves and their students from enshittification through access to quality professional development. They also need to be protected through strong collective agreement protections that defend their professionalism and protect them from unanticipated technological developments. 

At a recent Ontario Secondary School Teachers’ Federation/Fédération des enseignantes-enseignants des écoles secondaires de l’Ontario (OSSTF/FEESO) workshop on artificial intelligence, participants reported that they saw potential for AIED. They thought it could assist with time management, help broaden students’ perspectives, help teachers with providing feedback and planning lessons, and even level the playing field among students. However, they also expressed major concerns about bias in AI-enabled decision-making, the potential for cheating, loss of creativity and critical thinking skills, and even the replacement or de-professionalization of teachers. These results are perhaps unsurprising as they reflect AIED’s dual nature: tremendous potential accompanied by serious concerns. Defenders of public education need to take both sides of the duality seriously. 

Happily, unions and boards are beginning to work on these issues, if somewhat belatedly. The Canadian Teachers’ Federation (CTF) has issued a policy brief on AIED, noting that policies around its adoption are murky and variable across the country. They rightly note major risks created by the absence of coherent and well-considered policies, particularly relating to privacy and security, commercial exploitation, discrimination and bias, and de-professionalization of teaching.2

In the United States, both the American Federation of Teachers (AFT) and the National Education Association (NEA) have developed comprehensive policies and guidance documents. Indeed, AFT has gone further by partnering with NewsGuard and GPTZero to provide tools and supports to members so they can safely and effectively integrate AI into their practice.3

This work needs to continue and be expanded upon across Canadian jurisdictions. If we want to avoid an enshittified education system where tech giants make decisions about pedagogy and AIED on educators’ behalf, we need to mobilize now to develop student-centered, indeed human-centered, guardrails for how and to what extent artificial intelligence makes its way into our classrooms and workspaces.

Notes

  1. Cory Doctorow, “TikTok’s enshittification,” (21 January 2023), https://pluralistic.net/2023/01/21/potemkin-ai/#hey-guys; Cory Doctorow, “My McLuhan lecture on enshittification,” (30 January 2024), https://pluralistic.net/2024/01/30/go-nuts-meine-kerle/.
  2. Canadian Teachers’ Federation—Fédération canadienne des enseignantes et des enseignants, Towards a Responsible Use of Artificial Intelligence in Canadian Public Education (Canadian Teachers’ Federation—Fédération canadienne des enseignantes et des enseignants).
  3. American Federation of Teachers, “AFT resolution: Social media, artificial intelligence and generative artificial intelligence,” (1 June 2023), https://www.aft.org/resolution/social-media-artificial-intelligence-and-generative-artificial-intelligence; National Education Association, Report of the NEA Task Force on Artificial Intelligence in Education (Washington, DC, 26 June 2024), https://www.nea.org/resource-library/artificial-intelligence-education.