Westfield is but one example of an issue all school districts are grappling with as the omnipresence of technology, including artificial intelligence, impacts students’ lives, the district’s superintendent, Raymond González, said in a statement.

  • rar@discuss.online · 1 year ago

    Yes, I suppose that given equal input (model, prompt, seed, etc.) two Stable Diffusion installs should output the same images; what I am curious about is whether the hardware configuration (e.g. the GPU manufacturer) could result in traceable variations. As abuse of this tech gains prominence, tracing a given piece of synthetic media back to its producer via the specific hardware combination could become a thing.
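
    For illustration (an editor's addition, not part of the original comment): a minimal sketch of what "equal input" means in practice, assuming the Hugging Face diffusers library and a placeholder model ID and prompt. Fixing the model, prompt, sampler settings, and seed pins down the output, up to floating-point differences between hardware and software stacks.

    ```python
    import torch
    from diffusers import StableDiffusionPipeline

    # Model ID and prompt below are placeholders for illustration.
    pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
    pipe = pipe.to("cuda")

    def generate(seed: int):
        # Fixing the RNG seed pins down the initial latent noise,
        # so repeated runs start from the same point.
        generator = torch.Generator(device="cuda").manual_seed(seed)
        return pipe(
            "a photo of an astronaut riding a horse",
            num_inference_steps=30,
            generator=generator,
        ).images[0]

    img_a = generate(42)
    img_b = generate(42)
    # On the same install and hardware these should be pixel-identical; across
    # different GPUs or library versions, small numeric differences can creep in.
    ```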

    • JohnEdwa@sopuli.xyz · 1 year ago (edited)

      While it could work like bullet forensics, where, given access to the gun, you can fire it and compare the result to the original bullet, there is no way to look at a generated image and figure out exactly what produced it; there are simply too many variables and random influences. Well, unless the creator is careless enough to leave the metadata enabled: by default, the AUTOMATIC1111 Stable Diffusion webui embeds all of it in the file itself as a PNG text chunk.
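
      For illustration (an editor's addition, assuming the Pillow library and a placeholder filename): a minimal sketch of reading that embedded metadata, which the webui stores as a PNG text chunk commonly keyed "parameters".

      ```python
      from PIL import Image

      # Path is a placeholder; point it at an image saved by the webui.
      img = Image.open("example_output.png")

      # PNG text chunks are exposed via the .info / .text dictionaries.
      params = img.info.get("parameters") or getattr(img, "text", {}).get("parameters")
      if params:
          print(params)  # prompt, negative prompt, seed, sampler, steps, model hash, etc.
      else:
          print("No embedded generation metadata found (stripped or disabled).")
      ```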