In October 2025, California's governor signed a new law that seemed small but carries enormous implications. The Digital Age Assurance Act (AB 1043) requires operating system providers to collect age information from users at account setup and transmit that data to app developers via a real-time API starting January 1, 2027.
On the surface, it sounds like a reasonable child safety measure. In practice, it signals something far larger: the operating system (the foundation of your digital life) is becoming an identity machine. And when operating systems become identity machines, anonymity becomes incompatible with the default internet.
AB 1043 is written with specific language: any "operating system provider" must collect age data at account setup and provide an "age bracket signal" to applications that request it. The law doesn't require government-issued ID verification. It accepts self-reported age as "just a birthdate field". It's framed as a way to protect minors from age-inappropriate content without collecting personally identifying information.
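The law describes an "age bracket signal" but not an interface. As a minimal sketch of what an OS-side implementation could look like, assuming the commonly described brackets (under 13, 13–15, 16–17, 18+) and with every function and type name invented for illustration:

```python
# Hypothetical sketch of an OS-side "age bracket signal".
# AB 1043 does not specify an API; all names here are invented.
from datetime import date
from enum import Enum

class AgeBracket(Enum):
    UNDER_13 = "under_13"
    AGE_13_15 = "13_15"
    AGE_16_17 = "16_17"
    ADULT = "18_plus"

def bracket_from_birthdate(birthdate: date, today: date) -> AgeBracket:
    """Map a self-reported birthdate to a coarse bracket.

    Only the bracket, not the birthdate itself, would be exposed to
    apps -- this is how the law frames "no personally identifying
    information."
    """
    # Subtract one if this year's birthday hasn't happened yet.
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )
    if age < 13:
        return AgeBracket.UNDER_13
    if age < 16:
        return AgeBracket.AGE_13_15
    if age < 18:
        return AgeBracket.AGE_16_17
    return AgeBracket.ADULT
```

Even this toy version shows the compliance burden: someone has to build, store, and serve this signal on every install, for every app that asks.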
But here's the catch: the law doesn't distinguish between Apple, Microsoft, and a volunteer-run Linux distribution maintained by a teenager in Brazil.
Under the broadest legal interpretation, every operating system provider (including each of the 600+ active Linux distributions) is a separate entity with separate compliance obligations. Each one must build the infrastructure to collect age data, store it, and transmit it to third-party applications through an API.
Linux distributions don't have centralized servers. They don't have legal departments. They don't have budgets. Debian, Fedora, Ubuntu, Pop!_OS, Linux Mint, Arch, Gentoo: they're maintained by volunteers who contribute in their spare time. The law, as written, demands something they can't build.
The response from major Linux distributions has been more ambiguous than outright refusal.
System76, the company behind Pop!_OS, said it hopes the laws will be "recognized for the folly they are" and overturned or found unconstitutional. However, the company also stated it will likely comply, saying: "We are accustomed to adding operating system features to comply with laws."
Ubuntu's parent company Canonical announced it is reviewing the law with legal counsel but has "no concrete plans on how, or even whether, Ubuntu will change in response."
Fedora and other distributions are discussing technical approaches, with community conversations ongoing but no official refusal statements yet.
However, one new project, "Ageless Linux," emerged in March 2026 as an explicit protest. Designed to "deliberately refuse to comply" with age verification laws, it positions itself as an operating system for "humans of indeterminate age."
What's notable is that no major established Linux distribution has publicly refused to comply. Instead, they're reviewing options, discussing technical workarounds, and hoping the laws get overturned. The real rebellion is coming from new projects rather than existing distros.

California's law reveals a structural problem with how regulation is written in 2026. Legislators were thinking about Apple, Microsoft, and Google. They imagined centralized companies with servers, APIs, legal teams, and budgets.
But the legal text doesn't mention those companies by name. It says "operating system provider." That means it applies equally to Canonical (which makes Ubuntu) and to a volunteer maintaining a Debian spin-off used by thousands of people.
The result is a compliance trap: large companies can build the required infrastructure. Volunteer-run projects cannot. And when only large companies can comply, regulatory pressure becomes a form of competitive advantage for centralized platforms.
This is the pattern that kills open source. Not through direct censorship, but through impossible-to-meet compliance requirements that only well-funded organizations can afford.
On March 17, 2026, Brazil's Digital Statute for Children and Adolescents (ECA Digital – Law No. 15,211/2025) took effect, requiring operating system-level age verification. The response was immediate: Arch Linux 32 restricted access from Brazil entirely. MidnightBSD followed. Other projects issued warnings.
The projects didn't do this because they support age restrictions. They did it because they couldn't build the required infrastructure and couldn't afford the legal risk. Blocking access was the only option available to a volunteer-run project.
This is what regulatory overreach looks like in practice. Within days of a law taking effect, open-source projects that served Brazilian users simply vanished from that country. Not because they were malicious. Because they couldn't comply with impossible-to-meet requirements.
What makes laws like this significant isn't age verification itself. It's that it normalizes the idea that operating systems should collect and transmit identity signals.
For decades, you could use a computer without proving who you are. Windows and macOS have gradually eroded that model through account requirements and sign-ins. Now a government is making identity collection at the OS level a legal mandate.
Age verification is the wedge. But once identity collection is normalized at the operating system level, the next layers (verification for financial services, political participation, professional licensing) become trivial to add.
And the precedent spreads. Other jurisdictions look at California and see a model. If California can require OS-level age collection, why can't Texas require it for a different purpose? Why can't the EU require it for GDPR compliance? Why can't other countries require it for their own policy goals?
This isn't hypothetical. The UK Online Safety Act, which came into force in July 2025, already required age verification for adult content. What started as "protect children from pornography" quickly expanded into a broader surveillance infrastructure.
Platforms were required to implement age checks. Users had to prove age to access restricted content. And the infrastructure that was built to prevent children from seeing adult content became available for reuse.
UK regulators even prohibited platforms from promoting VPN use to circumvent age checks, recognizing that the moment identity becomes mandatory, people will seek ways to maintain privacy. And the moment they do, the regulatory response is to treat privacy tools as suspicious.
That's the pattern to watch. Child safety is the justification. But the infrastructure being mandated can be repurposed for almost anything.

AB 1043 isn't an isolated California story. It's part of a much larger global trend.
The EU is rolling out Digital Identity Wallets by the end of 2026, under eIDAS 2.0 regulations. Each member state must offer at least one wallet to citizens by then. The UK is considering mandatory digital ID for workers. Over 90 countries are either operating or piloting digital identity frameworks that integrate biometric verification and legal compliance.
The common thread is simple: identity is becoming infrastructure. And once identity is infrastructure, the internet changes.
When identity becomes the default, anonymity stops being a technical choice. It becomes a political choice, and political choices are easier to restrict than technical ones.
Here's what's actually happening:
First, anonymity is being stigmatized. Once identity collection is the default and the norm, people who refuse to participate become visible. They stand out. And what stands out attracts suspicion.
Second, the tools that enable privacy are being treated as suspect. VPNs, proxies, Tor, anonymous email services: they're all being reframed as "evasion tools" rather than privacy tools. The UK explicitly prohibits promoting them as ways to circumvent age checks. The framing "privacy = evasion" becomes the norm.
Third, lawful anonymity stops being protected. You can claim "I have nothing to hide." But anonymity isn't only about hiding wrongdoing. It's about protecting journalists, activists, victims of abuse, politically exposed people, and ordinary people who simply don't want to be tracked. When identity becomes mandatory at the OS level, those people lose protection.
For a private or anonymous hosting provider, this is the context you're operating in.
Your value proposition is that you minimize what you collect, reduce unnecessary identity binding, and let people participate online without leaving permanent traces. That's becoming a counter-cultural position.
As identity becomes normalized at infrastructure layers, hosting that doesn't demand identity becomes more valuable. And more pressured.
The pressure comes from laws like AB 1043, which signal that regulators are comfortable pushing identity collection into the deepest layers of infrastructure. That pressure will spread to other layers: hosting infrastructure, payment systems, and identity verification for basic services.
What does this look like in practice? Operating systems are the layer where users begin. If the OS requires age or identity, then every application built on that OS inherits that identity. The entire stack becomes linked to identity from the bottom up.
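The inheritance can be made concrete. In the sketch below (all names hypothetical, continuing the assumption of an OS-level signal), an application never asks the user anything; the identity attribute simply arrives from the layer beneath it:

```python
# Hypothetical sketch: once the OS exposes an age/identity signal,
# every application session built on that OS inherits it.

def os_age_signal() -> str:
    """Stand-in for an invented platform API call.

    A real implementation would query the OS account layer;
    stubbed here with a fixed value for illustration.
    """
    return "18_plus"

class AppSession:
    def __init__(self, user_id: str):
        self.user_id = user_id
        # The app performed no verification of its own -- the
        # identity attribute is inherited from the OS below.
        self.age_bracket = os_age_signal()

session = AppSession("user-42")
```

The design point is that the app's choice is gone: whether or not the developer wants identity data, the platform delivers it, and the whole stack above the OS is identity-linked by construction.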
For hosting providers, the implication is that users will increasingly come from an identity-linked context. That changes what "private" hosting actually means. You can minimize your own collection, but your customers arrive pre-identified at the OS level.
This creates pressure to collect more (to "comply" with regulatory expectations), store longer (because regulators demand audit trails), and cooperate more (because the identity layer is already there).
Some people think the answer is "just don't collect data." But AB 1043 shows the pattern: regulators are moving toward mandatory collection at infrastructure levels. The real counter-move is philosophical and structural: