Why doesn't my 10-bit monitor use 10-bit?
I built my computer in 2014 with a Z87-Pro, a GTX 780 and a Dell U2713HM, and I've been using it ever since. I just bought a new monitor, the Asus PA329C, which is a 10-bit monitor, and I now use it as my primary display with the U2713HM as a secondary.
My issue is that I don't seem to get 10-bit output on the PA329C. Both the Windows 10 display settings and the NVIDIA Control Panel show only 8-bit, with no option to select 10-bit.
I'm also curious about the refresh rate: I can only select 60 Hz, even though it seems to me the monitor should be able to go up to 76 Hz.
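In case it helps, here is a minimal diagnostic sketch (an illustration only, assuming Windows and Python with ctypes; `EnumDisplaySettingsW` is the standard Win32 call for walking a display's mode list) that prints every resolution/refresh-rate combination the driver exposes for the primary display:

```python
import ctypes
from ctypes import wintypes

# DEVMODEW layout (display fields only are read below; the struct must still
# match the full 220-byte Win32 definition so dmSize is correct).
class DEVMODE(ctypes.Structure):
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", ctypes.c_long),
        ("dmPositionY", ctypes.c_long),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

enum_settings = ctypes.windll.user32.EnumDisplaySettingsW
devmode = DEVMODE()
devmode.dmSize = ctypes.sizeof(DEVMODE)

modes = set()
i = 0
# Passing None enumerates the modes of the primary display; the call returns
# FALSE once the mode index runs past the end of the driver's list.
while enum_settings(None, i, ctypes.byref(devmode)):
    modes.add((devmode.dmPelsWidth, devmode.dmPelsHeight, devmode.dmDisplayFrequency))
    i += 1

for width, height, hz in sorted(modes):
    print(f"{width}x{height} @ {hz} Hz")
```

If nothing above 60 Hz shows up at 3840x2160, the driver simply isn't offering a higher mode over the current connection.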
I've tried different cables: three HDMI cables (two older ones I had lying around and the one that came with the monitor) as well as a DisplayPort cable. I've also tried swapping the GPU for an RTX 2060 I had in another machine, but so far nothing has worked.
What am I missing and how do I use the full potential of my monitor?
Solution 1:
The answers are really in the comments (thanks Mokubai and Tonny), but here's a summary:
1. Install the drivers for the display (not sure if this alone did anything, as I also had to do step 2, which I did after this).
2. Connect the monitor using DisplayPort. HDMI didn't work. My PC doesn't have video over USB-C, so I haven't tried that.
I couldn't find any documentation on this for my monitor, but in the end I had to use DisplayPort to get the monitor's full potential. HDMI 2.0b, which both my monitor and GPU have, supports 10-bit and HDR as far as I know, but it didn't work with my GPU and monitor combination.
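A rough bandwidth calculation suggests why: HDMI 2.0(b) tops out at 18 Gbit/s, which after 8b/10b line encoding leaves about 14.4 Gbit/s for pixel data, while 3840x2160 at 60 Hz with full 10-bit RGB needs roughly 17.8 Gbit/s. Over HDMI 2.0 that only fits with chroma subsampling (4:2:2/4:2:0), which the driver may not offer for regular desktop output, whereas DisplayPort 1.4 (HBR3) has the headroom. A quick back-of-the-envelope check, using the standard CTA-861 4K60 timing of 4400 x 2250 total pixels (treat the numbers as approximate):

```python
# Link-bandwidth check for 3840x2160 @ 60 Hz, full RGB (no subsampling).
# CTA-861 4K60 timing: 4400 x 2250 total pixels incl. blanking = ~594 MHz pixel clock.
pixel_clock = 4400 * 2250 * 60  # pixels per second

def needed_gbps(bits_per_channel):
    """Data rate for full RGB (3 channels per pixel)."""
    return pixel_clock * bits_per_channel * 3 / 1e9

# Effective data rates after 8b/10b encoding (raw link rate * 0.8).
links = {
    "HDMI 2.0": 18.0 * 0.8,   # 14.40 Gbit/s
    "DP 1.2":   21.6 * 0.8,   # 17.28 Gbit/s (HBR2)
    "DP 1.4":   32.4 * 0.8,   # 25.92 Gbit/s (HBR3)
}

for bpc in (8, 10):
    need = needed_gbps(bpc)
    fits = ", ".join(f"{name}: {'yes' if need <= rate else 'no'}"
                     for name, rate in links.items())
    print(f"{bpc}-bit RGB needs {need:5.2f} Gbit/s -> {fits}")
```

The same numbers also line up with needing DisplayPort 1.4: the GTX 780's DisplayPort 1.2 output falls just short of 4K 60 Hz 10-bit RGB as well.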
The reason I used HDMI instead of DisplayPort in the beginning was that my secondary monitor only supports its full resolution over DisplayPort and DVI-D, and DisplayPort is the nicer cable. Luckily, the new GPU I needed for DisplayPort 1.4 also has a DVI-D output, so I can use the full potential of both my monitors.
Setting up a multi-monitor system turned out to be trickier than I imagined.