• 1 Post
  • 50 Comments
Joined 5 months ago
Cake day: March 26th, 2024

  • “Please don’t assume anything, it’s not healthy.”

    Explicitly stating assumptions is necessary for good communication. That’s why we do it in research. :)

    “it depends on the license of that binary”

    It doesn’t, actually. A binary alone is, by definition, not open source: the binary is the product of the source, much like a model is the product of its training and refinement process.

    “You can’t just automatically consider something open source”

    On this we agree :) which is why saying a model is open source or slapping a license on it doesn’t make it open source.

    “the main point is that you can put closed source license on a model trained from open source data”

    1. Actually, the ability to legally produce closed-source material depends heavily on how the data is licensed in that case.
    2. This is not the main point, at all. This discussion is regarding models that are released under an open source license. My argument is that they cannot be truly open source on their own.

  • Quite aggressive there, friend. No need for that.

    You have a point that an intensive and costly training process is a factor in the usefulness of a truly open source gigantic model. I’ll assume here that you’re referring to the likes of Llama 3.1’s heavy variant or a similarly large LLM. Note that I wasn’t referring to gigantic LLMs specifically when I said “models”; it is a very broad category.

    However, that doesn’t change the definition of open source.

    If I have an SDK to interact with a binary and “use it as [I] please” does that mean the binary is then open source because I can interact with it and integrate it into other systems and publish those if I wish? :)

  • Is there a reason you’re not considering running this in a VM?

    I could see a case where you go for a native install on a virtual machine, attach a virtual disk to isolate your library from the rest of the filesystem, and then move that disk around (or just straight-up mount that directory in the container) as needed.

    That way you can back up your library separately from your Jellyfin server install and go hog wild.
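
    For the container route, something like this rough sketch (host path and volume name are placeholders; assumes Podman or Docker is available on the VM):

        # Jellyfin container with the library disk mounted read-only at /media
        podman run -d --name jellyfin \
          -p 8096:8096 \
          -v /mnt/library-disk:/media:ro \
          -v jellyfin-config:/config \
          docker.io/jellyfin/jellyfin:latest

    Swapping the library out is then just a matter of pointing that first -v at a different mount.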


  • sunstoned@lemmus.org to Selfhosted@lemmy.world: *Permanently Deleted*

    Syntax-wise, Podman is meant to be identical to Docker. I got on board when it was the only one that offered rootless (without admin privileges) mode. That’s no longer a differentiator, since rootless Docker has been out for a while.
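
    For instance, the same invocation works with either tool (the image here is just an example):

        docker run --rm -it docker.io/library/alpine:latest sh
        podman run --rm -it docker.io/library/alpine:latest sh

    In practice, alias docker=podman covers most day-to-day use.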

    I’m personally a fan of the Red Hat docs and how-tos on Podman over the mixed bag of tech-bro Medium articles I associate with Docker.

    At the end of the day, this is a bit of a Pokémon-starter question: if your top priority is to get a reasonably common and straightforward job done, just pick one and see where it takes you! :)


  • Chiming in to note that GNSS communication is actually receive-only on the phone side. A typical phone can’t physically broadcast a strong enough signal to reach medium Earth orbit (where most of those satellites are), so “pinging GPS satellites” isn’t something a phone can actually do.

    Note that this only covers how the signal physically reaches your phone. Once your position is deduced and digitized, there’s an entirely different attack surface.

    The other concerns (especially tracking via cell tower data) are valid, though.