I don’t know anything about GPU design but expandable VRAM is a really interesting idea. Feels too consumer friendly for Nvidia and maybe even AMD though.
I can’t believe someone has paid for that domain name for 23 years… O_O
I like the friendlier feeling of Seaford (the o shapes have a little tilt to them rather than being straight on the grid), but I’m guessing they leaned towards the most “generic” of the five because as a default font you want it to become “invisible” almost. I think a more unique font would stand out and then become a little grating over time given how much it would be seen.
Yup; hopefully there are some advances in the training space, but I’d guess that having large quantities of VRAM is always going to be necessary in some capacity for training specifically.
So I’m no expert at running local LLMs, but I did download one (the 7B Vicuna model recommended by the LocalLLM subreddit wiki) and try my hand at training a LoRA on some structured data I have.
Based on my experience, the VRAM available to you is going to be way more of a bottleneck than PCIe speeds.
I could barely hold a 7B model in 10 GB of VRAM on my 3080, so 8 GB might be impossible or very tight. IMO, to get good results with local models you really need large quantities of VRAM and to be using 13B or above models.
Additionally, when you’re training a LoRA the model + training data gets loaded into VRAM. My training dataset wasn’t very large, and even so, I kept running into VRAM constraints with training.
In the end I concluded that in the current state, running a local LLM is an interesting exercise but only great on enthusiast level hardware with loads of VRAM (4090s etc).
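To put rough numbers on the VRAM point above, here’s a back-of-the-envelope sketch (my own illustration, not from any particular library; assumes 1 GB = 1e9 bytes and only counts the weights themselves, while activations, KV cache, and especially training/optimizer state add a lot on top):

```python
# Rough VRAM estimate for just holding a model's weights.
# Training a LoRA needs considerably more than this.

def weight_vram_gb(n_params: float, bytes_per_param: float) -> float:
    """Approximate GB of VRAM needed for weights alone (1 GB = 1e9 bytes)."""
    return n_params * bytes_per_param / 1e9

# A 7B-parameter model at common precisions:
fp16 = weight_vram_gb(7e9, 2)    # 16-bit floats: 2 bytes/param -> 14.0 GB
int8 = weight_vram_gb(7e9, 1)    # 8-bit quantized: 1 byte/param -> 7.0 GB
int4 = weight_vram_gb(7e9, 0.5)  # 4-bit quantized: 0.5 bytes/param -> 3.5 GB
```

This lines up with the experience above: a 7B model barely fits in 10 GB at 16-bit precision, which is why quantized formats (or cards with far more VRAM) matter so much for local use.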
I appreciate this point of view! My BA is in visual arts, but I’ve also leaned heavily into tech, programming as a hobby, etc.
I think there are a lot of different topical threads at play when it comes to AI art (classism and fine art, what average viewers vs trained viewers find appealing in a visual medium, etc) – but the economic issues that you point out are really key. Many artists rely on their craft for their literal bodily survival, so AI art is very much a real threat to them.
But when I first interacted with Midjourney, and saw my mom (just an average lady) getting excited about AI-generated art, I couldn’t help but see it like photography – all of a sudden the average person gets access to a way of visually capturing things that make them happy, that they think look cool, something they saw in a dream but didn’t have the skill to create visually… and that doesn’t sound like an inherently bad thing to me.
This should just be part of configuring Sonarr/Radarr settings correctly. Do you have a red message in the settings saying a download client is missing, or have you filled out the download clients settings section with your torrent client info? If yes, have you checked the “auto import from client” box? And have you set your root library folder in the media management section?
Yeah after some googling I’m kinda thinking this is a fake screenshot, idk
Damn, never seen that before. Is it a Windows 11 thing? It’s looking more and more like I’ll have to move to Linux on my desktop, I guess.
Edit: hard to find a source for the image; I assume if it was real there’d be a lot more reports of this online but I’m not seeing those.
Interesting. I wonder if it’s worth putting a Faraday cage around a home NAS – but it sounds like the electrical surge from it being plugged in might fry it as well.
Interesting! Sakurai would say keep your params out of the code, so that you can easily tweak all params in one spot when balancing things. But maybe having all params in code is reasonable to handle when you’re a solo dev.
Never heard of them, but just looking at a registrar comparison chart, their renewal costs are pretty high. eg. $20 for a .wiki renewal at Porkbun and $30 at Hover. Maybe they bundle in a lot of services along with it that make the price worth it? But unless you’re taking full advantage of those (if they’re offered), you could def get a better deal elsewhere.
Namecheap has okay starting prices but man their renewal prices aren’t great compared to other registrars.
I just transferred all my domains out of Namecheap into Porkbun. I think Porkbun is 10 to 50 cents more expensive than Cloudflare, but they seemed a bit easier to use and could hold all my TLDs. So far, a way better experience than Namecheap!
I’m curious, how are you discovering new music this way? My understanding of Soulseek and Nicotine+ is that they’re great for finding music by artists you already know, but idk how they would work for discovery…?