xrandr: "Configure crtc 0 failed" when adding a 4K@60Hz mode
I have a 4K monitor (AOC U3277PWQU) that only runs at 30 Hz even though it should be able to handle 4K@60Hz. The monitor is connected via HDMI. (Yes, very similar questions have been asked before, but none had an answer that solved my problem.)
What I tried:
~> cvt 3840 2160 60 -r
# 3840x2160 59.97 Hz (CVT 8.29M9-R) hsync: 133.25 kHz; pclk: 533.00 MHz
Modeline "3840x2160R" 533.00 3840 3888 3920 4000 2160 2163 2168 2222 +hsync -vsync
~> xrandr --newmode "3840x2160R" 533.00 3840 3888 3920 4000 2160 2163 2168 2222 +hsync -vsync
~> xrandr --addmode HDMI-1-1 "3840x2160R"
~> xrandr --output HDMI-1-1 --mode 3840x2160R --verbose --crtc 0
crtc 0: 3840x2160R 59.97 +0+229 "HDMI-1-1"
xrandr: Configure crtc 0 failed
crtc 0: disable
crtc 1: disable
crtc 2: disable
screen 0: revert
crtc 0: revert
crtc 1: revert
crtc 2: revert
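In case the actual rejection reason matters: the X server usually logs why a mode was refused. Assuming the standard log location, something like this should pull the relevant lines:
~> grep -E '\(EE\)|\(WW\)|[Mm]ode' /var/log/Xorg.0.log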
Same story with cvt without -r, and with gtf.
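For reference, those variants were generated with:
~> cvt 3840 2160 60
~> gtf 3840 2160 60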
The reduced-blanking modeline above seems to fit the parameters of the monitor that I found in the manual, i.e.:
Pixel clock: 600 MHz (DP, HDMI 2.0)
Horizontal scan range: 30–160 kHz (DP, HDMI 2.0)
Vertical scan range: 23–80 Hz
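Checking the modeline above against these limits:
hsync = pclk / htotal = 533.00 MHz / 4000 = 133.25 kHz (within 30–160 kHz)
refresh = hsync / vtotal = 133.25 kHz / 2222 ≈ 59.97 Hz (within 23–80 Hz)
pclk = 533.00 MHz < 600 MHz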
What makes the configuration fail? Is it the case that the pixel clock needs to be exactly 600 MHz? If so, how can I modify the modeline? Is it safe to just edit the first parameter?
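(If editing the clock is the way to go, I assume the procedure would be to remove the mode and recreate it with a new first parameter, something like:
~> xrandr --delmode HDMI-1-1 3840x2160R
~> xrandr --rmmode 3840x2160R
~> xrandr --newmode "3840x2160R" <new-clock> 3840 3888 3920 4000 2160 2163 2168 2222 +hsync -vsync
but I don't know which clock value, if any, would be correct.)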
P.S.: The EDID obtained from xrandr --verbose cannot be parsed with this tool I found, so it may be corrupt.
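(If it helps, the EDID can presumably also be read straight from sysfs and fed to edid-decode; the exact card/connector name may differ on my system:
~> edid-decode /sys/class/drm/card0-HDMI-A-1/edid
)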
I have a GeForce GTX 1050 with NVIDIA driver 384 on Ubuntu 16.04. Since this was requested in another question:
~> lspci -k | grep -EA2 'VGA|3D'
00:02.0 VGA compatible controller: Intel Corporation Device 591b (rev 04)
DeviceName: Onboard IGD
Subsystem: Dell Device 07be
--
01:00.0 3D controller: NVIDIA Corporation Device 1c8d (rev a1)
Subsystem: Dell Device 07be
Kernel driver in use: nvidia
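Since both GPUs show up above, this should reveal which provider actually drives the outputs:
~> xrandr --listproviders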
Edit: I noticed that nvidia-settings doesn't show my displays in "X Server Display Information". It just says "X Screen 0 (no Scanout)".
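(Since this is an Optimus laptop on Ubuntu, I assume the active GPU profile can also be queried with nvidia-prime, if installed:
~> prime-select query
)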
I found the solution in this post: my laptop, a Dell XPS 15 9560, apparently can only deliver 4K@30Hz over its HDMI port. The USB-C port, in contrast, can handle 4K@60Hz; perhaps it is wired to the dedicated graphics card while the HDMI port isn't. A bit disappointing, but with a USB-C to HDMI converter the problem went away. I didn't even have to change any xrandr settings.