[–] Joker@piefed.social 37 points 5 days ago (24 children)

FYI, this is the nouveau driver that no one uses. There is absolutely no reason to use an NVIDIA card with this driver.

[–] malloc@programming.dev 16 points 5 days ago (17 children)

Maybe best to avoid NVIDIA entirely if you're using Linux.

My next build is going to be an AMD GPU and CPU with NixOS. I’ve heard GPU support on Linux is better with AMD cards, but honestly I haven’t delved into whether that holds any truth.

[–] Joker@piefed.social 23 points 5 days ago (3 children)

It’s generally easier because the AMD drivers are built into the kernel. Nvidia is perfectly usable, but it’s more susceptible to breaking during kernel updates. It’s not as bad as everyone makes it sound, though. That said, AMD is usually the way to go on Linux unless your use case requires Nvidia.

[–] NeilBru@lemmy.world 9 points 5 days ago

The use case is precision CAD and DNN development.

cuDNN+CUDA+TensorCores have the best TOPS/$/kWh performance (for now). Plus, I need ECC VRAM for professional CAD calculations.
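
To make that metric concrete, here's a rough back-of-the-envelope sketch; every card name and number below is a made-up placeholder, not a real spec or price:

```python
# Back-of-the-envelope TOPS/$/kWh comparison.
# All figures are hypothetical placeholders for illustration,
# not measured specs or prices of any real GPU.

def tops_per_dollar_kwh(tops: float, price_usd: float, power_kw: float) -> float:
    """Throughput normalized by purchase price and energy drawn in one hour."""
    kwh_per_hour = power_kw  # kW sustained for one hour = kWh consumed
    return tops / (price_usd * kwh_per_hour)

cards = {
    "hypothetical_card_a": dict(tops=1300.0, price_usd=1600.0, power_kw=0.45),
    "hypothetical_card_b": dict(tops=700.0, price_usd=1000.0, power_kw=0.35),
}

for name, spec in cards.items():
    print(f"{name}: {tops_per_dollar_kwh(**spec):.3f} TOPS per ($·kWh)")
```

The real comparison obviously lives or dies on measured throughput and street prices, not spec-sheet TOPS.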

There are plenty of reasons to use an NVIDIA stack.

It's just weird when people say there's no reason to use their products.

[–] DeprecatedCompatV2@programming.dev 8 points 5 days ago (3 children)

I use it because I need HDMI :/

DisplayPort ftw

[–] thingsiplay@beehaw.org 3 points 5 days ago (1 children)

Can you explain what you mean by that? You can't use HDMI with AMD?

[–] MangoPenguin@lemmy.blahaj.zone 4 points 5 days ago (1 children)

Not the newer version of it; they're stuck on the older one.

[–] thingsiplay@beehaw.org 5 points 5 days ago (2 children)

Ah right, I've read about that, just forgot. Man, HDMI is such a mess. Use DisplayPort whenever you can, and don't ever buy a monitor without one again.

It really is. Most people could probably be using DisplayPort anyway, unless they're trying to hook up to a TV, I suppose.

[–] frozen@lemmy.frozeninferno.xyz 2 points 5 days ago (1 children)

The 9070 XT supports HDMI 2.1b, and unfortunately my Sapphire NITRO+ has two HDMI ports and only two DisplayPorts. None of my three monitors support HDMI 2.0 or 2.1, so one of them is stuck at 60 Hz right now, and I'm pretty annoyed about it.

[–] thingsiplay@beehaw.org 1 points 5 days ago (1 children)

Did you make sure it's not an issue with the cable? The cables need to support the "correct" version and features of HDMI too, not just the GPU, the monitor, and the driver. Man, typing that out makes me dizzy.

[–] frozen@lemmy.frozeninferno.xyz 5 days ago

I've checked, just to be sure: it's definitely a 2.1 cable, but unfortunately the cable doesn't matter in this case. My monitors are good, but they're older; HDMI 2.0/2.1 wasn't around back then. I get good refresh rates over DisplayPort (I believe they have DP 1.4), and my RX 6800 XT had three of those, so I naively assumed a 9070 XT would as well.

[–] shadowedcross@sh.itjust.works 2 points 5 days ago (3 children)

What does that have to do with needing NVIDIA?

[–] davidgro@lemmy.world 10 points 5 days ago

Something about AMD not being able to license the HDMI protocol in a way that allows open-source code.

The main Nvidia driver that people use is proprietary, so it doesn't have that problem.

[–] bjoern_tantau@swg-empire.de 10 points 5 days ago

The newer versions of HDMI aren't supported on AMD cards due to licensing issues.

[–] 9tr6gyp3@lemmy.world 5 points 5 days ago* (last edited 5 days ago)

Obviously AMD and Intel don't include HDMI on their cards.

/s

[–] Jumuta@sh.itjust.works 0 points 5 days ago

"perfectly usable" as in you have to install a third party translation layer to make hardware video decoding work on firefox
