I find it interesting that Meta Platforms, Inc., a company known for harvesting user data, is blocking some servers from fetching its public posts. They decided to implement a feature Mastodon calls Authorized fetch.
This was always going to happen. They will block aggressively, because they can’t have their precious advertising money mixed with CSAM, nazis and other illegal content. And the fedi is full of that.
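For context, the gist of authorized fetch is that a server stops serving public objects to anonymous GET requests: the fetcher has to sign the request, which reveals which domain is asking, and the server can then apply its block list. Here’s a rough sketch of that decision logic, assuming simplified header parsing; the domain list and function names are illustrative, not Mastodon’s actual code.

```python
from urllib.parse import urlparse

# Hypothetical block list for illustration.
BLOCKED_DOMAINS = {"threads.net"}

def key_domain(key_id: str) -> str:
    """Extract the domain from a signature's keyId URL."""
    return urlparse(key_id).netloc

def authorize_fetch(headers: dict) -> tuple[int, str]:
    """Decide whether to serve a public post to a remote fetcher."""
    sig = headers.get("Signature")
    if sig is None:
        # Without authorized fetch, an unsigned request would be served.
        return 401, "signature required"
    # Pull keyId="..." out of the Signature header (very simplified).
    params = dict(
        part.split("=", 1) for part in sig.split(",") if "=" in part
    )
    key_id = params.get("keyId", "").strip('"')
    if key_domain(key_id) in BLOCKED_DOMAINS:
        return 403, "domain blocked"
    # A real server would now fetch the actor's public key and verify
    # the signature cryptographically before serving the object.
    return 200, "ok"
```

The trade-off is that every fetch now costs a signature check, and anonymous consumption of “public” posts stops working, which is exactly what makes domain-level blocking enforceable.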
Well that’s good. It’s a great game. I’ve been spared some of the technical problems, so I’m good - but there are still some bugs lurking around. It could have used a couple more months of polishing before release.
Holy crap this was amazing work.
I was active during “the Providence Wars” (shoutout to all Ushra’Khan peeps) and it’s probably the most immersive MMO experience I’ve ever had.
Good times with 4am alarm clock stront hauling ops
My top 5 would be:
World of Warcraft (easily over 10k hours, been playing since release)
EVE Online
Crusader Kings
Stellaris
Blood Bowl
It’s also a matter of scale. FB has 3 billion users and it’s all centralized, so they are able to police that. Their Trust and Safety team is large (which has its own problems, because they outsource that - but that’s another story). The fedi has somewhere around 11M users (according to fedidb.org).
The federated model doesn’t really “remove” anything, it just segregates the network to “moderated, good instances” and “others”.
I don’t think most fedi admins are actually following the law by reporting CSAM to the police (because that kind of thing requires a lot of resources); they just remove it from their servers and defederate. Bottom line is that the protocols and tools built to combat CSAM don’t work too well in the context of federated networks - we need new tools and new reporting protocols.
Reading the Stanford Internet Observatory report on fedi CSAM gives a pretty good picture of the current situation, and it’s fairly recent:
https://cyber.fsi.stanford.edu/io/news/addressing-child-exploitation-federated-social-media