It is a concern, I just don’t know how it’s meaningfully enforceable at scale. Just like OSA. What do you want me to do about it personally?
I never supported the idea.
Piefed.social Staff
Community owner of !television@piefed.social and !obscuremusic@piefed.social
Is your argument really “this won’t affect Linux, so it doesn’t matter”? At the very least, FOSS development by anyone in California will be a problem, as the law quite literally names “persons” as potentially liable.
I’m taking the position that this is largely unenforceable at a software and OS level beyond larger players that come from California or specifically do a lot of trade in California.
The reality remains, the US is the most thirsty for this kind of thing. Not the least.
This specifically is quite different to most other efforts. Not sure if it might get constitutionally tested.
Windows, and any other OS will be illegal in California unless it implements this.
Right, as I said - I just don’t see how this is meaningfully enforceable. It’s a complete farce. It’s on the level of the Online Safety Act in terms of enforceability.
Apple, for one, is headquartered in California.
Oh, I forgot Apple. Sure.
But there are many other OS. How on earth can they credibly enforce this?
Did you not read my comment? Anyone writing software for an OS that implements this can be sued (in California) if it ignores the API signals from the OS and allows access to age-restricted content.
Yeah, this is just not meaningfully enforceable. Big companies will follow, but it would mostly be ignored by everyone else.
Yes, but if the OS was not designed in California and you are not based in California (you’re not Windows, basically) - I fail to see how they can meaningfully compel anyone to follow this. Moreover, even if an OS somehow could know the user’s age - that doesn’t automatically mean all other software that exists reads that signal and responds to it as necessary.
Does the law compel anyone making software to recognise this?
Whether they do so optionally is a different thing entirely, to be fair.
I’m not even sure how that is remotely enforceable, although this also is a somewhat different thing to what this thread is about.


@rimu@piefed.social This would have been a rimu decision.
Yes? USA is the least likely to do this. Porn laws in various states don’t apply to social media.
Other attempts have been stuck in legislative hell, been unenforced, or have court cases challenging their legality (Mississippi).
I just think this is a logistical dead-end for regulators, who may rely on the chilling effect of the thought of being targeted rather than actually targeting anyone. Unless the Fediverse somehow becomes massive, I don’t see that it’ll ever come to their attention. Especially as many instances will be based in the USA, which is the least likely country to implement these laws, and the most hostile to any threats from foreign regulators (see again the 4chan example).
Do they? There’s one thing to make it law, another thing to enforce it. OSA in the UK has been around since last July and managed to do nothing other than pick a fight with 4chan and get nowhere. I seem to recall someone mentioned Lemmy to Ofcom in a discussion regarding OSA and they were literally like “What’s a Lemmy?”
How on earth do you imagine a regulator is going to work out how to deal with 50+ federated instances (for instance)?
In comparison to Europe/UK/AUS which is far further along this road (and implemented social media age requirements), absolutely. Also, apparently it’s just a checkbox as far as this particular California law goes.