I know the feeling, I just had a week off and returned to work on Friday a couple of days ago.
It depends on who you’re referring to as a casual user. My mother for example would certainly have a hard time with it, then figuring out the key to bring up the boot menu (and being faced with a scary dialog that they’ve never seen), then selecting the right device, then likely being faced with GRUB which would also look scary to her, and by then she’d be overwhelmed before even getting to the install portion.
Hmm, gotcha. I just tried out a fresh copy of text-gen-webui and it seems like the latest version is borked with ROCm (I get the “CUDA error: invalid device function” error).
My next recommendation, then, would be LM Studio, which to my knowledge can still expose an OpenAI-compatible API endpoint to be used in SillyTavern. I’ve used it in the past and didn’t even need to run it within Distrobox (I have all of the ROCm stuff installed locally, but I generally run most of the AI stuff in Distrobox since it tends to require an older version of Python than Arch is currently using). It also seems they’ve recently started supporting running GGUF models via Vulkan, which I assume doesn’t require the ROCm stuff to be installed at all?
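If it helps, an OpenAI-compatible endpoint is easy to sanity-check from the command line before wiring it into SillyTavern. A rough sketch, assuming LM Studio’s local server is running on its usual port (1234, last I checked - use whatever the app actually reports):

# Quick test against a local OpenAI-compatible server
# (port and model name are assumptions - use what LM Studio shows you)
curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "local-model", "messages": [{"role": "user", "content": "Hello there"}]}'

SillyTavern then just needs that same base URL (http://localhost:1234/v1) plugged into its OpenAI-compatible connection settings.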
Might be worth a shot, I just downloaded the latest version (the UI has definitely changed a bit since I last used it) and just grabbed a copy of the Gemma model and ran it, and it seemed to work without an issue for me directly on the host.
The advanced configuration settings no longer seem to directly mention GPU acceleration like they used to; however, I can see it utilizing GPU resources in nvtop right now, and the speed it was generating at (the one in my screenshot was 83 tokens a second) couldn’t have possibly been done on the CPU, so it seems to be fine on my side.
Yeah, I’m definitely not a fan of how AMD handles ROCm - there are so many weird cases of “well, this card should work with ROCm, but… [insert some weird quirk you have to deal with, like the one I mentioned, or what you’ve run into]”.
Userspace/consumer side I enjoy AMD, but I fully understand why a lot of devs don’t make use of ROCm and why Nvidia has such a tight hold on the GPU compute world with CUDA.
Ah, strange. I don’t suppose you specifically need a Fedora container? If not, I’ve been using this Ubuntu-based Distrobox container recipe for anything that requires ROCm and it has worked flawlessly for me.
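For reference, the general shape of it is just an Ubuntu image with the ROCm userspace packages installed inside the box. Roughly something like this (the box name and image tag are placeholders, not the exact ones from the recipe):

# Create and enter an Ubuntu-based box (name/image are placeholders)
distrobox create --name rocm-box --image ubuntu:22.04
distrobox enter rocm-box
# ...then install the ROCm userspace packages inside the container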
If that still doesn’t work (I haven’t actually tried out koboldcpp yet), and you’re willing to try something other than koboldcpp, then I’d recommend the text-generation-webui project, which supports a wide array of model types, including the GGUF format that koboldcpp uses. Then, if you really want to get deep into it, you can even pair it with SillyTavern (it’s purely a frontend for a bunch of different LLM backends, and text-generation-webui is one of the supported ones)!
What card do you use? I have a 6700XT, and getting anything with ROCm running for me requires that I pass the HSA_OVERRIDE_GFX_VERSION=10.3.0 environment variable to the related process, otherwise it just refuses to run properly. I wonder if it might be something similar for you too?
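For a quick test you can just prefix whatever launch command you use with it (the script name below is only an example):

# One-off test: pass the override only to this process (script name is an example)
HSA_OVERRIDE_GFX_VERSION=10.3.0 ./start_linux.sh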
I did the same move for similar reasons! Although I still keep Windows around on another SSD - and even the Windows Nvidia drivers were being funky for me.
Nvidia shares a lot of logic between their Windows and Linux driver as far as I’m aware, so I suppose it makes sense.
Along with Helldivers 2, I can confirm Apex Legends works as well. Valorant as far as I’m aware is a definite no-go though.
Just adding on, ProtonDB is a great resource for checking game compatibility!
As long as you’re still signed into BW from any of your devices, you can always export the vault from there.
(But yes, actual backups are always a plus)
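(If you’d rather do it from a terminal, the Bitwarden CLI can handle the export too - roughly like this, assuming the bw tool is installed and you’re already logged in:)

# Unlock prints a session key; pass it to the export (format/path are up to you)
bw unlock
bw export --format json --output ./bitwarden-backup.json --session "<session key from unlock>"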
I’m not the original person you replied to, but I have a similar setup. I’m using a 6700XT, with both InvokeAI and stable-diffusion-webui-forge set up to run without any issues. While I’m running Arch Linux, I have it all set up in Distrobox so it’s agnostic to the distro I’m running (since I’ve hopped between quite a few distros) - the container is actually an Ubuntu-based one.
The only hiccup I ran into is that while ROCm does support this card, you need to set an environment variable for it to be picked up correctly. At the start of both sd-webui’s and InvokeAI’s launch scripts, I just add:
export HSA_OVERRIDE_GFX_VERSION=10.3.0
That takes care of it, and it works perfectly. This is the link to the Distrobox container file I use to get that up and running.
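In practice that just means the very top of each launch script ends up looking roughly like this (the exact script names differ between sd-webui-forge and InvokeAI, so treat it as a sketch):

#!/usr/bin/env bash
# Make ROCm treat the 6700XT (gfx1031) as gfx1030 so the runtime picks it up
export HSA_OVERRIDE_GFX_VERSION=10.3.0
# ...the rest of the original launch script continues below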
We believe in the open internet, but we do not believe in the misuse of public content.
That’s real rich, coming from Reddit.
I had one of these done for an endoscopy - it ended horribly. It got “stuck” so I ended up having to have surgery for it to be removed.
Getting that surgery coordinated and scheduled took months all the while my health was declining. Eventually it got so bad that I couldn’t hold down food and I had to be pre-admitted to the hospital a month before the procedure and put on IV nutrition…
Granted, I do have an autoimmune GI condition, which is what prompted that test in the first place, and the chances of this happening are supposedly quite small, but… yeah, I’ll take the endoscopy and colonoscopy over even that small chance of going through all of that again…
Let us know how it goes, and if you have any further questions, feel free to give us a shout!
Ah, gotcha - yeah that is definitely one of the pitfalls of Linux gaming still, which is the “there’s so many different configurations and some of them just work, others don’t” issue. I have an AMD card so perhaps it is an Nvidia issue - hopefully those sorts of issues smooth out once NVK is fully ready to go.
What issues does Forza have on Linux? I can’t speak for the Motorsport series, but I’ve been playing 4 and 5 on my Steam Deck and desktop for a while now without issue.
They have their own hosting plans, but you can also self host it.
This isn’t a problem with Lemmy itself in terms of the software, so I’m not sure it qualifies… but I find that Lemmy has the same problem as Reddit: if you say something the majority of users disagree with, prepare to be torn apart in the comments. And I don’t just mean being corrected on something you said that was factually incorrect, I mean more of a “your opinion is wrong because…”
For example, in any discussion revolving around Linux (and let me prepend this by saying I am a Linux user), if you happen to prefer using Windows, be prepared to be told all of the reasons why you have to use Linux instead. And that’s usually tame compared to what I’ve seen on other subjects.
Obviously there are cases where, yeah, you absolutely deserve to be torn a new one - the extreme cases where someone is being truly vile, such as advocating for harm to a person or a group of people - but those extremes aren’t what I’m really referring to here.
I’ve also blocked a lot of users who, while I’ve had no direct interaction with them, are clearly engaging with others in, let’s just say, bad faith.
In terms of software-specific issues, I can’t say I’ve really had a lot of problems with Lemmy itself recently. As an instance owner, I used to see a lot of weird (seemingly random, at least) federation issues, but I haven’t seen any federation problems in a while now. Though just today I swear I submitted a comment somewhere and it’s just, poof, not there - not even locally - but I’m chalking that one up to something I’ve done (whether a misclick, or I’m just hallucinating as badly as an LLM) rather than an actual issue.
Ah, well I hope your new system is more to your liking then!
OpenRGB might be able to help you change the colors if that’s something you’re interested in fixing nowadays.
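Once it’s installed, the command line makes it easy to poke at - something along these lines (the device index and color here are just examples):

# List detected RGB devices, then set one to a static color (index/color are examples)
openrgb --list-devices
openrgb --device 0 --mode static --color FF0000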
How about Thunderbolt? This looks like macOS, and while I’m not 100% sure if they utilize HDMI ports anymore, they certainly use Thunderbolt.