• 0 Posts
  • 12 Comments
Joined 1 year ago
Cake day: June 11th, 2023

    1. Refurbished ones are just as good as new ones, and they’re basically always “on sale” since their price is reduced.

    2. Valve seems to be moving towards a Steam Deck refresh. Very little is known about when or how this will happen. Based on previous comments and data-mining, the refresh will have the same gaming power. It may, however, have a better WiFi chip, a better screen, and stuff like that. Nothing is certain, and if you want a Deck soon-ish, I wouldn’t recommend waiting for this.






  • The solution I’ve sort-of found is to go to communities of Arch-based systems instead of Arch itself. The same solution should work in most cases*, and the communities are more newbie-friendly.

    *Depends on how close to Arch the distro is in this aspect/subsystem. The Manjaro community is probably less likely to offer AUR based solutions, since the AUR can be unreliable/unsafe on Manjaro.
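
    Roughly, the “AUR based solutions” those communities suggest boil down to the manual build flow below (just a sketch; the package name is a placeholder, not a recommendation):

    ```
    # Manual AUR install sketch -- "some-package" is a placeholder name
    git clone https://aur.archlinux.org/some-package.git
    cd some-package
    less PKGBUILD        # review the build script before running it
    makepkg -si          # build the package and install it (repo dependencies come via pacman)
    ```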





  • Direct link to the (short) report this article refers to:

    https://stacks.stanford.edu/file/druid:vb515nd6874/20230724-fediverse-csam-report.pdf

    https://purl.stanford.edu/vb515nd6874


    After reading it, I’m still unsure exactly what they consider to be CSAM and how much of each category they found. Here are the categories they appear to count as CSAM, as far as I can tell. No idea how much the categories overlap, and therefore no idea how many images beyond the 112 PhotoDNA matches are of actual children.

    1. 112 instances of known CSAM of actual children (identified by PhotoDNA).
    2. 713 posts of assumed CSAM, based on hashtags.
    3. 1,217 text posts about grooming/trading-related topics. These include no actual CSAM or CSAM trading/selling on Mastodon itself, but some link to other sites?
    4. Drawn and computer-generated images. (No quantity given; possibly not counted? Part of the 713 posts above?)
    5. Self-generated CSAM. (The example given is someone literally selling pics of their dick for Robux.) (No quantity given here either.)

    Personally, I’m not sure what the take-away is supposed to be from this. It’s impossible to moderate all user-generated content quickly, and this is not a Fediverse-specific issue: the same is true for Mastodon, Twitter, Reddit, and all the other big content-generating sites. It’s a hard problem to solve. Known CSAM being deleted within hours is already pretty good, imho.

    Meta-discussion in particular is hard to police. Based on the report, it seems that most CSAM by volume is traded through other services (chat rooms).

    For me, there’s a huge difference between actual children being directly exploited and virtual depictions of fictional children. Personally, I consider the latter the same as any other fetish imagery that would be illegal with actual humans (guro/vore/bestiality/rape, etc.).



  • Spiracle@kbin.social to linuxmemes@lemmy.world · Some trouble

    Doesn’t even have to be a “class of idiots”. It would be enough if stuff didn’t just sometimes break, seemingly randomly. (It’s not quite random, obviously.)

    Recent example: I had openSUSE Tumbleweed recommended to me because of its reliability. First tip: install codecs, which requires adding the Packman repository (sketched below). Then simply updating threw errors several times, because Packman and the other repositories are apparently not in sync and some dependencies would break if I updated. (Waiting a few days “fixed” it, but it still shouldn’t happen.)
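
    For reference, the usual Packman setup looks something like this (the gwdg mirror URL and the priority value are just the commonly suggested defaults, not anything official):

    ```
    # Add the Packman repo and switch the multimedia packages over to it
    sudo zypper addrepo -cfp 90 'https://ftp.gwdg.de/pub/linux/misc/packman/suse/openSUSE_Tumbleweed/' packman
    sudo zypper refresh
    sudo zypper dist-upgrade --from packman --allow-vendor-change
    ```

    The out-of-sync errors come from that last vendor-change step: when Packman lags behind the main repos, zypper can’t resolve the dependencies until the mirrors catch up.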

    Depending on which update method you use (YaST, Discover, zypper, or the update widget), you get different error messages, most of which are not informative. This is an established distribution known for its reliability, and this alone would keep me from ever recommending it to normal users, even moderately tech-savvy ones.

    Things are getting better, but I’m still shopping around for a distro that just works. Perhaps that new Fedora version, or one of the immutable ones, now that they are getting popular.