Stop Killing Games, fresh from speaking at the EU Parliament about video game consumer rights, has now joined the pushback against age verification laws, arguing that they don't fix the causes of online harm and that implementing them is prohibitively complex for game preservation once a publisher moves on.
In a statement posted on X, Stop Killing Games founder and public face Ross Scott wrote: “While too big for #StopKillingGames to tackle directly, SKG supports pushback against the age verification laws, which will make [distributions] of Linux illegal in California and have caused the game Urban Dead to be killed. This could potentially outlaw private servers.”
He links to a larger statement on the Stop Killing Games Reddit, where the consumer rights movement confirms it has signed a joint statement alongside other organisations pushing back against these laws, claiming they "can also make private servers, modding communities, fan projects, open-source tools, and preservation work harder or even impossible to operate".
In a section detailing why Stop Killing Games, an initiative specifically started in response to developers removing access to bought games, has attached itself to another big issue like age verification, the statement reads: “SKG is about making sure games are not destroyed when official support ends. That does not just mean ‘publishers should keep servers on forever’. It means players and communities need practical ways to keep games working after publishers move on.”
The argument is that many games are kept alive by community-run efforts like private servers, Discord hubs, open-source tools and wikis, and that these resources are threatened by such laws.
Stop Killing Games points to Urban Dead, a 20-year-old browser game that shut down last year as a result of the UK's Online Safety Act. The game's creator, Kevan Davis, found it infeasible to implement the safety measures the act required, and closed the game down on 14 March.
The Stop Killing Games statement goes on to emphasise that this isn’t just a UK issue, pointing to California’s Digital Age Assurance Act / AB 1043, which it believes makes independent software harder to maintain due to requiring age assurance checks in operating systems, software distribution, and app stores.
It also acknowledges that child safety is important but believes this is the wrong approach to solving the problem: “It is frustrating to see policymakers suddenly claim everything is ‘for our safety’ while young people are often left to deal with bigger problems on their own elsewhere. And even when the goal is reasonable, this approach goes far beyond what is normal or proportionate. Mission creep is real and some actors don’t just creep. The issue is that blunt access bans and mandatory age checks do not fix the root causes of online harm.”
“They often create new gatekeepers, collect more sensitive data, and make the open web harder to use. They also risk punishing the small community projects that are least able to comply, while the largest platforms adapt and become even more entrenched.”
Last month, PlayStation started requiring age checks from users in the UK and Ireland to comply with such laws. Discord, too, started introducing age verification in February for “sensitive content”. Both measures received pushback from those not keen to share personal information with companies.