Nvidia and 4k HDR TV

Dazinek

n00b
Joined
Nov 23, 2019
Messages
4
Hey all,

Want to understand this fully and get it figured out.

I want the best possible picture out of my 4K TV (Samsung 55" MU8000). It supports up to 10-bit color and has HDR.

In the Nvidia color settings, what exactly do I set everything to? RGB with full output dynamic range? If so, that apparently only allows 8 bpc. Or do I use YCbCr422 so I can pick 10 bpc? But that forces the output dynamic range to limited...


Anyone wanna help me with this? Thanks!
 

defaultluser

[H]F Junkie
Joined
Jan 14, 2006
Messages
14,399
Use Windows 10's HDR settings. It's necessary because HDR isn't just changing the color depth to 10-bit, it's also changing the color space.

Settings->System->Display->Play HDR Games and Apps. This automatically selects 10-bit and 4:2:0 for you.

If it's greyed-out, try a different cable, and make sure you have HDR enabled under your TV's settings.

You have to have at least a Maxwell video card to support HDMI 2.0. You can do HDR over DisplayPort, but it requires a more expensive v1.3 converter.
 
Last edited:
Reactions: Nenu

Nenu

[H]ardened
Joined
Apr 28, 2007
Messages
20,166
Use Windows 10's HDR settings. It's necessary because HDR isn't just changing the color depth to 10-bit, it's also changing the color space.

Settings->System->Display->Play HDR Games and Apps. This automatically selects 10-bit and 4:2:0 for you.

If it's greyed-out, try a different cable, and make sure you have HDR enabled under your TV's settings.

You have to have at least a Maxwell video card to support HDMI 2.0. You can do HDR over DisplayPort, but it requires a more expensive v1.3 converter.
It may not change the colour space.
I use RGB full normally.
When I turn on HDR in Windows, my Q9FN TV remains RGB full but uses HDR 8 bit.
I can change to 4:2:0 HDR 10 if I choose, but HDR 8 looks great 99% of the time.

Your comment about Maxwell is correct.
 

Dazinek

n00b
Joined
Nov 23, 2019
Messages
4
Are you using an hdmi 2.0a or b cable?
I am using a HDMI 2.0 cable.

Use Windows 10's HDR settings. It's necessary because HDR isn't just changing the color depth to 10-bit, it's also changing the color space.

Settings->System->Display->Play HDR Games and Apps. This automatically selects 10-bit and 4:2:0 for you.

If it's greyed-out, try a different cable, and make sure you have HDR enabled under your TV's settings.

You have to have at least a Maxwell video card to support HDMI 2.0. You can do HDR over DisplayPort, but it requires a more expensive v1.3 converter.
It may not change the colour space.
I use RGB full normally.
When I turn on HDR in Windows, my Q9FN TV remains RGB full but uses HDR 8 bit.
I can change to 4:2:0 HDR 10 if I choose, but HDR 8 looks great 99% of the time.

Your comment about Maxwell is correct.

So I should just enable HDR in Windows (which I do) and have Nvidia set to RGB full? Or should I just have the "default color settings" radio button checked? Also, should I never use YCbCr on the TV?
 

Nenu

[H]ardened
Joined
Apr 28, 2007
Messages
20,166
So I should just enable HDR in Windows (which I do) and have Nvidia set to RGB full? Or should I just have the "default color settings" radio button checked? Also, should I never use YCbCr on the TV?

You need a graphics card that supports HDMI 2.0 (you haven't told us what you are using).
The result with HDR 8 may not look as good on your TV; mine is a newer version of yours.
I made that point in case it doesn't look good enough, to show you might need to change to YCbCr 4:2:0 to get HDR 10.
The issue HDR 8 might present is colour banding. I have only seen it happen in one movie; HDR games look great.

PS: I don't know if your TV supports HDR 8, it might not.
 

Dazinek

n00b
Joined
Nov 23, 2019
Messages
4
You need a graphics card that supports HDMI 2.0 (you haven't told us what you are using).
The result with HDR 8 may not look as good on your TV; mine is a newer version of yours.
I made that point in case it doesn't look good enough, to show you might need to change to YCbCr 4:2:0 to get HDR 10.
The issue HDR 8 might present is colour banding. I have only seen it happen with HDR 8 in one movie; HDR games look great.
Ah, my apologies, I am using a Strix 2080 Ti. When I go into the color settings, the only mode that will let me set 10-bit is 4:2:2. 4:2:0 only offers 8 and 12.

According to Rtings, the MU8000 also supports HDR10.
 
Last edited:

Nenu

[H]ardened
Joined
Apr 28, 2007
Messages
20,166
Ah, my apologies, I am using a Strix 2080 Ti. When I go into the color settings, the only mode that will let me set 10-bit is 4:2:2. 4:2:0 only offers 8 and 12.

According to Rtings, the MU8000 also supports HDR10.
Yeah, I struggle to get 4:2:0 as well.
Use 4:2:2; it's better than 4:2:0 anyway.
 

defaultluser

[H]F Junkie
Joined
Jan 14, 2006
Messages
14,399
Yeah, I struggle to get 4:2:0 as well.
Use 4:2:2; it's better than 4:2:0 anyway.


Right, compressed video just defaults to 4:2:0 because motion doesn't require any higher chroma resolution. 4:2:2 is actually what Windows HDR switches your video card into.
 

Armenius

Extremely [H]
Joined
Jan 28, 2014
Messages
33,184
HDMI 2.0 doesn't have the bandwidth for YCbCr 4:4:4 or full RGB above 8-bit color at 4K.
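The bandwidth math behind that claim can be sanity-checked with a few lines of arithmetic. This is a rough sketch using the standard CTA-861 4K60 timing (594 MHz pixel clock including blanking) and HDMI 2.0's 18 Gbps raw TMDS rate with 8b/10b coding; real HDMI packs 4:2:2 into a fixed 12-bit container, so treat the subsampled figures as a first approximation of why they fit:

```python
# Why HDMI 2.0 can carry 4K60 RGB at 8-bit but not 10-bit,
# while 4:2:2 and 4:2:0 at 10-bit still fit.

PIXEL_CLOCK_4K60 = 4400 * 2250 * 60      # CTA-861 4K60 timing incl. blanking = 594 MHz
HDMI20_DATA_RATE = 18e9 * 8 / 10         # 18 Gbps raw TMDS, 8b/10b coding -> 14.4 Gbps

def fits(bits_per_pixel, pixel_clock=PIXEL_CLOCK_4K60):
    """True if the format's data rate fits within HDMI 2.0's effective bandwidth."""
    return pixel_clock * bits_per_pixel <= HDMI20_DATA_RATE

print(fits(24))  # RGB / 4:4:4, 8-bit  (24 bpp) -> True: 8-bit RGB full works
print(fits(30))  # RGB / 4:4:4, 10-bit (30 bpp) -> False: no 10-bit RGB at 4K60
print(fits(20))  # YCbCr 4:2:2, 10-bit (20 bpp avg, chroma shared by 2 px) -> True
print(fits(15))  # YCbCr 4:2:0, 10-bit (15 bpp avg, chroma shared by 4 px) -> True
```

At 594 MHz, 30 bpp needs about 17.8 Gbps, well over the 14.4 Gbps HDMI 2.0 can deliver, which is exactly why the Nvidia panel only offers 10-bit with chroma subsampling.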
 