this post was submitted on 25 Nov 2025
28 points (100.0% liked)

Linux

10244 readers
428 users here now

A community for everything relating to the GNU/Linux operating system (except the memes!)

Also, check out:

Original icon base courtesy of lewing@isc.tamu.edu and The GIMP

founded 2 years ago
MODERATORS
 

I recently purchased an LG OLED C2 TV, which supports 4K120Hz but only has HDMI 2.1 ports. I am aware that HDMI 2.1 is an issue on AMD; even though my GPU is Nvidia (RTX 3060 Ti), I want to switch to AMD in the future, so I opted to invest in an AMD-friendly setup. I purchased the Cable Matters active DisplayPort -> HDMI 2.1 converter mentioned in this Reddit thread, which purportedly can do 4K120. However, when I change it from the default 4K60 modeline to 4K120, the TV shows a "no signal" message. In fact, to get 120Hz I need to drop the resolution down to 1440p.
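
For context, here is my rough ballpark of what each mode needs versus what each link can carry (Python; the blanking totals and the effective link rates are approximations, not exact spec figures):

```python
# Rough bandwidth ballpark for 3840x2160 modes, assuming CVT-RB-style
# blanking (~3920x2222 total). Link rates are the commonly cited effective
# figures: DP 1.4 HBR3 ~25.92, HDMI 2.0 TMDS ~14.4, HDMI 2.1 FRL ~42 Gbit/s.
H_TOTAL, V_TOTAL = 3920, 2222

def required_gbps(refresh_hz, bits_per_channel, chroma="4:4:4"):
    # 4:4:4 carries three samples per pixel, 4:2:2 two, 4:2:0 one and a half.
    samples = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return H_TOTAL * V_TOTAL * refresh_hz * samples * bits_per_channel / 1e9

links = {"DP 1.4 (HBR3)": 25.92, "HDMI 2.0 (TMDS)": 14.4, "HDMI 2.1 (FRL)": 42.0}

for hz, bpc, chroma in [(60, 10, "4:4:4"), (120, 8, "4:4:4"), (120, 10, "4:4:4")]:
    need = required_gbps(hz, bpc, chroma)
    fits = [name for name, cap in links.items() if need <= cap] or ["none without DSC"]
    print(f"4K{hz} {bpc}-bit {chroma}: ~{need:.1f} Gbit/s -> {', '.join(fits)}")
```

By that math, 4K120 at 10-bit does not fit DP 1.4 at all without DSC, and 8-bit only barely does, which is why I suspect the adapter.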

Even though I'm not going for VRR (yet), I also tried flashing the VRR-enabled "Spyder" firmware just in case it fixed the refresh rate (it did not). I tested every DP port on the 3060 Ti as well, with no change. What might I be doing wrong?
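
In case it helps others debug the same thing, here is a quick sketch for checking what the kernel reports per connector (paths assume the standard /sys/class/drm layout):

```python
# Quick look at what the kernel sees on each display connector, via sysfs.
# The "modes" file only lists resolutions, so exact refresh rates still
# require parsing the EDID blob, e.g. edid-decode < /sys/class/drm/<connector>/edid
from pathlib import Path

for conn in sorted(Path("/sys/class/drm").glob("card[0-9]*-*")):
    status_file, modes_file = conn / "status", conn / "modes"
    if not status_file.exists():
        continue
    status = status_file.read_text().strip()
    modes = modes_file.read_text().split() if modes_file.exists() else []
    print(f"{conn.name}: {status}, modes: {modes}")
```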

top 7 comments
[–] toothbrush@lemmy.blahaj.zone 10 points 21 hours ago

In that Reddit thread, they don't seem all too happy about that converter either; lots of people mention that it doesn't work correctly for them, so maybe your exact setup just isn't compatible with it?

[–] fulg@lemmy.world 6 points 21 hours ago

I have the same TV. It doesn't specifically help you, but HDMI 2.1 sources work fine for 4K HDR 120Hz and VRR. I have too many devices (Xbox Series, PS5, Apple TV, Switch), so they are routed through a GUIDE3 switcher.

So I would suspect either the DP converter or the HDMI cable not really being 2.1.

[–] truthfultemporarily@feddit.org 6 points 22 hours ago

Which adapter are you using, specifically? On their website I can only find one rated for 4K30.

[–] Lemmchen@feddit.org 2 points 21 hours ago (1 children)

Can you elaborate on why HDMI 2.1 is supposed to be problematic on AMD?

[–] Viirax@piefed.social 8 points 20 hours ago

The HDMI Forum blocks the implementation of HDMI 2.1 in open-source drivers, which is why AMD's open driver can't offer it. Really sucks for people who want to use their systems connected to TVs.

[–] afk_strats@lemmy.world 2 points 16 hours ago

I'm familiar with that thread. I have a C1 TV which I wanted to use with an Intel iGPU at 4K120 HDR. The Intel HDMI output can only do 4K60 outside of Windows, but the iGPU's DP output does work at 4K120 HDR. Out of curiosity, I also confirmed it works on a 7900 XT at 4K120 HDR. I did this on Kubuntu and CachyOS, with Plasma on Wayland.

I used this cable: https://www.amazon.com/dp/B0D7VP726N

Not an ideal setup, as HDMI 2.1 has slightly more bandwidth, which translates to better picture quality, but it works better for me.

[–] Hond@piefed.social 2 points 16 hours ago* (last edited 16 hours ago)

I have an LG CX and an AMD 6900 XT. I just settled on some random no-name adapter cable, since I never could find reliable enough information on the net regarding VRR (IIRC you need at least an RX 7000 series AMD card for that). Everything works on my setup except VRR. At least on the CX remote you can press the green button seven times to get an overlay that shows whether 10-bit, VRR, HDR, chroma subsampling, etc. are in place. That helped me a lot when I tried to dial everything in.

4K120 HDR VRR is perfectly possible within HDMI 2.0 bandwidth. It will just be subsampled down to horrible 4:2:0...
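
To ballpark why that works (same rough reduced-blanking 4K120 timing as earlier in the thread; treat the numbers as approximations):

```python
# Approximate reduced-blanking 4K120 timing: ~3920x2222 total at 120 Hz.
pixels_per_second = 3920 * 2222 * 120
for chroma, samples in [("4:4:4", 3.0), ("4:2:0", 1.5)]:
    gbps = pixels_per_second * samples * 8 / 1e9  # 8 bits per channel
    print(f"4K120 8-bit {chroma}: ~{gbps:.1f} Gbit/s")
# ~25.1 vs ~12.5 Gbit/s; only the 4:2:0 figure fits HDMI 2.0's ~14.4 Gbit/s.
```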