Google’s DeepMind unit is unveiling today a new method it says can invisibly and permanently label images that have been generated by artificial intelligence.

  • Sethayy@sh.itjust.works
    1 year ago

I think you're mixing together a couple of angles here to try and make a point.

‘Unless the open source model is the best…theyre using proprietary code’ – you're talking about a hypothetical program hypothetically being stolen and treating it as a certainty?

As for the companies: of course they only use certain resources; they're companies, and they need returns to exist. A couple million down the drain could be some CEO's next bonus, so they won't do anything they're not sure they'll get something from (even if only short term).

As for the 4chan point, was that a coincidence, or are you referencing Unstable Diffusion? Because they did almost exactly that (before it got mismanaged, of course, since the NSFW industry has always been a bit ghetto).

And sure, run it Folding@home-style or donate toward AWS; it's the same end result either way, so it really just comes down to what users are comfortable with.

And finally, sure, MS bought GitHub, but do you think Stable Diffusion bought the internet? Courts have ruled that web scraping is legal…

I know this is a wall of text, but like I said, these arguments all feel like a bunch of tangentially related thoughts.