A small country with a big chance to hit social media platforms where it hurts
2026-03-28 - 16:04
Comment: In the past few weeks, two stories have run in parallel that, on the surface, didn’t appear to have anything to do with each other. The first: New Zealand is moving toward a social media ban for children under 16. The second: a documentary about the manosphere, the loose online ecosystem of misogynistic content, incel forums and alpha-male influencers, which has sparked fresh outrage about what boys are being fed by their algorithms.

Different issues. Different audiences. But pull back the lens a little, and they are the same story. Both are about what happens when a technology sector, which has spent two decades building engagement machines optimised for profit, is allowed to operate without meaningful accountability. And both are pointing, however haltingly, toward the same conclusion: something has to change.

Start with the ban. The instinct behind it is understandable. Parents are frightened – even those who do not typically subscribe to governments intervening in their lives. Schools are overwhelmed. Governments feel pressure to do something visible. And restricting children’s access to platforms feels like a logical lever to pull.

But let’s call it what it is: we are regulating the victim, not the perpetrator. We are telling children – and their parents – that the burden of protection sits with them. Log off. Put the phone away. Opt out. Meanwhile, the platforms that designed the system, that built the recommendation engines and the infinite scroll and the notification loops specifically to maximise time-on-app, carry on largely untouched. The ban is not a solution; it is a confession that we don’t know how to demand one.

And the victims of online harm are not only young. The evidence of harm to adult mental health is substantial and growing. Older users are disproportionately targeted by scams, fraud and financial manipulation that originate on the same platforms.
The Wall Street Journal has described Meta as increasingly a cornerstone of the internet fraud economy. A subsequent Reuters investigation, citing leaked internal documents, found that Meta’s own projections estimated roughly 10 percent of its 2024 revenue – around $16 billion – came from ads linked to scams and banned goods. Meta disputes the figure, but the documents also showed that when the company weighed whether to crack down, the calculation was explicit: the anticipated fines were simply less than the money being made.

The harm runs across every age group. The policy debate has just been slow to catch up.

To be fair, the government knows this. Education Minister Erica Stanford has been explicit that the age restriction is only the first move – stage one of a two-part plan. The real ambition, in her words, is “the bigger piece of work around an act, a serious regulator and that changing of behaviour”. The parliamentary select committee that reported back earlier this month went further, calling for a dedicated national online safety regulator with genuine powers to hold platforms to account.

That is a meaningfully different proposition from a ban. It is about changing what platforms do, not just who can see it.

But here’s the catch. A serious regulator – one with real teeth, real independence and the mandate to go after algorithmic design rather than just user access – is also slower, harder and far more susceptible to being watered down by the time it survives caucus, Cabinet and the lobbying that will intensify the moment it looks like it might actually pass. The ban is the easy part. The regulator is where the real fight will be.

The manosphere problem makes this even clearer, and the recent Netflix documentary has reinvigorated a conversation that many hoped was fading but clearly isn’t. Because this isn’t a story about children stumbling onto something harmful by accident.
It is a story about platforms deliberately surfacing and amplifying content that radicalises young men not despite their algorithms, but because of them. Boys are being fed a diet of rage, contempt and grievance, served up in escalating doses, because that content keeps them watching.

[Image: Meta’s vice president and global head of safety Antigone Davis presents Instagram’s ‘Teen Accounts’ features in Auckland in February. The new features include default restrictions for young people. Screenshot/RNZ]

The result is a generation of young men being quietly reshaped by systems designed not to inform or connect them, but to keep them angry and online. The fear is not just what they believe now; it is what they will do with those beliefs. In households, in relationships, in workplaces. The documentary gave that fear a face. The algorithms have been building it for years.

And no age restriction fixes that. Banning a 15-year-old from Instagram does not touch the machinery distributing the content.

The platforms know this, of course. They have spent years making themselves indispensable – not just to teenagers, but to all of us. The small business owner whose entire customer base lives on Instagram. The community group that organises on Facebook because there is nowhere else everyone already is. The tradesperson, the artist, the journalist, the charity – all of them dependent on infrastructure they do not own and cannot influence.

Social media has become, in any practical sense, public infrastructure. We rely on it the way we rely on roads and electricity. And yet we have allowed it to operate with none of the obligations we place on any other form of essential infrastructure. A power company cannot decide, unilaterally, what you are allowed to see or say. We have rules for utilities because we understand that dependency creates vulnerability.
The same logic applies here, and the fact that these platforms are privately owned does not change the nature of what they have become. Nobody is arguing we burn it down. The argument is that it needs to be governed.

And with this broad impact, a strange coalition has been quietly forming – strange because its members have almost nothing else in common. Some are concerned about what platforms have done to the economics of their own industries, draining revenue while simultaneously extracting and repurposing content without fair compensation. Others are angry about becoming unwilling infrastructure for fraud and financial crime, absorbing the reputational and financial costs of harms that originate on platforms they don’t control.

There are health advocates and researchers. Parent groups and teachers. Those furious about the manosphere. Allies of our most vulnerable communities, who are disproportionately impacted. Journalists being bullied to the point of leaving their jobs. Those furious about what they see as ideological content moderation going the other way.

And now, with new data showing that digital advertising, which reportedly accounts for 72 percent of total New Zealand ad spend, is built on modelled estimates rather than actual disclosed revenue from the platforms themselves, you can add market integrity to the list. The figure that shapes billions of dollars of advertising decisions in this country is, according to industry insiders, effectively unverified. The same opacity that enables online harm, it turns out, also runs through the economics.

A remarkable range of people and sectors, approaching from completely different directions, arriving at roughly the same destination: the platforms need to be held accountable. It is one of the stranger political alignments of our time. And yet it has not, until now, cohered into anything like real, collective pressure.

Why? Part of the answer is that strange bedfellows don’t naturally sleep in the same bed.
Some don’t want to be seen marching alongside others whose motivations differ from their own. Some are cautious about anything that looks like political activism. Everyone has their own grievance, their own preferred solution, their own concern about being associated with someone else’s cause. The coalition exists in theory. In practice, it fragments.

But there’s another part of the answer that’s more interesting and more hopeful. Because the question of whether it matters that your reasons differ, if the change you want is the same, is actually a solvable one. History is full of coalitions held together by a single shared demand, even when the motivations behind it are wildly divergent.

Which brings us to this: In early February, Antigone Davis, Meta’s vice president and global head of safety, flew to Auckland. Not to Sydney. Not to a regional hub. To Auckland. For a country of five million people.

She was there, ostensibly, to promote Meta’s ‘Teen Accounts’, a set of default restrictions for younger users that the company argues addresses safety concerns without the need for legislation. It was a polished move. Here is our solution, she was saying. You don’t need a regulator or a law – trust us.

Think about what that visit actually signals. Meta dispatched one of its most senior global executives to the end of the world to make the case, in person, against legislation that would affect a fraction of a fraction of their user base. That is not courtesy or routine stakeholder management; that is a company that is paying very close attention, and that understands, better than we perhaps do, what is at stake if New Zealand gets this right.

They know that New Zealand is small, connected, with a tradition of moving quickly on social policy – the kind of place where coalitions form faster than in larger, more fragmented markets.
They know that the select committee’s own final report acknowledged NZ cannot regulate global platforms alone, and called for alignment with the EU, the UK and Australia – precisely the kind of international coordination that actually frightens a global platform. They know that if the strange coalition actually coheres somewhere, it might cohere here first. And they came to make sure it doesn’t.

The fact that they showed up is, paradoxically, the strongest argument for why we should. This is the moment. Not because New Zealand is uniquely righteous, or because the age restriction bill is exactly the right instrument (it may not be), but because the conditions for something real are finally in place. The strange bedfellows are in the same room, albeit not quite yet in the same bed. The public appetite is there. The select committee has delivered a blueprint. And the Government has until June to respond.

The question is not whether New Zealand is too small to matter. Small countries change things all the time – in trade, in arms control, in environmental law – when they act with clarity and hold the line. The question is whether we are willing to stop treating this as a collection of separate problems – the children problem, the manosphere problem, the advertising problem, the fraud problem – and start treating it as one. The question is whether the regulator that Stanford has promised actually arrives with teeth intact, or gets quietly defanged on the way through.

The platforms are not ignoring us; they never were. They were just waiting to see if we’d figure that out.