Posts: 7,153
Threads: 601
Joined: 2015 Jan
Thanks: 1081
Given 1466 thank(s) in 963 post(s)
Country:
Now that I'm starting to be interested in HDR - it was time! - I need a 10-bit output from my computer.
There is a bit of confusion about this topic... some say that only pro GPUs (like Quadro and Radeon Pro) have a "real" 10-bit output, while others say that top consumer models output 10-bit too, but "only in fullscreen mode" under DX11 and not OpenGL...
I'll also add that a few people suggested that adding a BlackMagicDesign DeckLink card would solve the problem - even if, according to their website, it seems no "simple" 4:2:0 or HEVC codec is supported.
I just want a 10-bit output for everything - starting from HDR video and ending with video editing... am I asking too much?
What should I get? Is a top consumer GPU like the GTX 1080 able to output 10-bit? Or does it need "help" from a BMD DeckLink card? And, if so, will it play a UHD 10-bit HDR HEVC video with the right video output?
HELP!
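One thing worth checking before blaming the GPU is whether the file itself is actually 10-bit. ffprobe reports the pixel format of the video stream (e.g. yuv420p10le for typical UHD HDR HEVC), and the depth can be read off the format name. A minimal sketch - the helper names are my own, and the ffprobe call assumes ffprobe is on your PATH:

```python
import re
import subprocess

def pix_fmt_bit_depth(pix_fmt: str) -> int:
    """Extract bits per component from an FFmpeg pixel format name.
    Formats without an explicit depth suffix (e.g. yuv420p) are 8-bit."""
    m = re.search(r"p(\d+)(?:le|be)?$", pix_fmt)
    return int(m.group(1)) if m else 8

def probe_bit_depth(path: str) -> int:
    """Ask ffprobe for the first video stream's pix_fmt, then decode it."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=pix_fmt",
         "-of", "default=noprint_wrappers=1:nokey=1", path],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    return pix_fmt_bit_depth(out)

print(pix_fmt_bit_depth("yuv420p10le"))  # -> 10 (typical UHD HDR HEVC)
print(pix_fmt_bit_depth("yuv420p"))      # -> 8  (plain 8-bit)
```

Even with a confirmed 10-bit source, of course, the GPU/driver/display chain can still truncate to 8-bit on output - which is exactly the open question here.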
Read a lot of opinions around, still confused... at least I discovered that the DeckLink supports "only" Rec.601/709, hence no Rec.2020 output - so goodbye to "real" UHD WCG!
Anyone with a UHD display connected to their computer who could check whether they can get 10-bit (with or without HDR)? And, if so, with which GPU?
Posts: 598
Threads: 75
Joined: 2015 Feb
Thanks: 391
Given 406 thank(s) in 201 post(s)
Country:
(2019-02-09, 02:44 PM)spoRv Wrote: Read a lot of opinions around, still confused... at least I discovered that the DeckLink supports "only" Rec.601/709, hence no Rec.2020 output - so goodbye to "real" UHD WCG!
Anyone with a UHD display connected to their computer who could check whether they can get 10-bit (with or without HDR)? And, if so, with which GPU?
I'd be interested to know this too - everything I've read suggests you need a Quadro, but I'm not experienced enough to know if they're right!
Posts: 20
Threads: 0
Joined: 2015 Oct
Thanks: 4
Given 16 thank(s) in 9 post(s)
Country:
I can output 10-bit HDR with a GeForce GTX 970.
(2019-02-09, 05:11 PM)babouin Wrote: I can output 10-bit HDR with a GeForce GTX 970.
I knew that could be possible!
1080p, or 2160p?
2160p struggles a bit with a 40ft cable but it works just fine with shorter cables.
Great!
To check whether your display is 10-bit (or more), run Monitor Asset Manager: https://www.entechtaiwan.com/util/moninfo.shtm
If it reports 30bpp (10-bit) it "should" be 10-bit - or is it 8-bit + dithering?!?
To be sure, download the Spears 2160p HEVC "Rotating 8bit, 8bit+dithering, 10bit Quantization Artifact" test pattern: https://drive.google.com/file/d/0B68jIlC...9fRWs/view
Read more here: https://www.avsforum.com/forum/465-high-...depth.html
at the end:
Quote:As Stacey Spears has pointed out to me, there is no way to definitively measure the native bit depth of a display panel without physically dismantling the TV and testing the panel directly, which is clearly impractical. So I'm afraid those who believe that the tests mentioned in this article can provide this information are mistaken. Instead, these tests measure the performance of the system as a whole.
An HDR display with an 8-bit panel and good dithering algorithms can certainly display HDR content in a way that looks better than the best SDR presentation. But I would prefer a panel with a native bit depth of at least 10 bits, along with a signal path that maintains 10-bit precision from one end to the other.
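For the curious: what Monitor Asset Manager decodes is the display's EDID, and in EDID 1.4 the declared bits per primary color sit in byte 20 (the Video Input Definition byte) for digital inputs. A rough Python sketch of that decode, assuming a raw 128-byte EDID base block and an EDID 1.4 display (in older EDID versions those bits mean something else) - and remember, as the quote above says, a declared 10 bpc can still hide an 8-bit + FRC panel:

```python
from typing import Optional

def edid_bits_per_color(edid: bytes) -> Optional[int]:
    """Decode declared bits per primary color from an EDID 1.4 base block.
    Byte 20 (Video Input Definition): bit 7 set = digital input,
    bits 6:4 = color depth code (EDID 1.4 digital displays only)."""
    if len(edid) < 128 or edid[:8] != b"\x00\xff\xff\xff\xff\xff\xff\x00":
        raise ValueError("not a valid EDID base block")
    vid = edid[20]
    if not vid & 0x80:            # analog input: no declared bit depth
        return None
    depth_code = (vid >> 4) & 0x07
    # code 0 = undefined, then 6/8/10/12/14/16 bits per color
    table = {1: 6, 2: 8, 3: 10, 4: 12, 5: 14, 6: 16}
    return table.get(depth_code)

# Synthetic example: digital input, depth code 3 (10 bits per color)
fake = bytearray(128)
fake[:8] = b"\x00\xff\xff\xff\xff\xff\xff\x00"
fake[20] = 0x80 | (3 << 4)
print(edid_bits_per_color(bytes(fake)))  # -> 10
```

On a real machine you would feed this the EDID blob the OS exposes (e.g. /sys/class/drm/*/edid on Linux); that path is only an example of where to find it.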