

[Help] 10bit output from a computer
#1
Now that I'm starting to get interested in HDR - it was about time! - I need a 10bit output from my computer.

There is a bit of confusion about this topic... some say that only pro GPUs (like the Quadro and Radeon Pro) have a "real" 10bit output; others say that top consumer models output 10bit too, but only in fullscreen DX11 mode, not in OpenGL...
I'll also add that a few people suggested that adding a Blackmagic Design DeckLink card would solve the problem - even if, according to their website, it seems no "simple" 4:2:0 or HEVC codec is supported.

I just want a 10bit output for everything - from HDR video playback to video editing... am I asking too much?

What should I get? Is a top consumer GPU like the GTX 1080 able to output 10bit? Or does it need "help" from a BMD DeckLink card? And, if so, will it play a UHD 10bit HDR HEVC video with the right video output?

HELP!
#2
I've read a lot of opinions around, and I'm still confused... at least I discovered that the DeckLink supports "only" rec.601/709, hence no rec.2020 output - so goodbye to "real" UHD WCG!

Is there anyone with a UHD display connected to their computer who could check whether they can get a 10bit output (with or without HDR)? And, if so, with which GPU?
#3
(2019-02-09, 02:44 PM)spoRv Wrote: I've read a lot of opinions around, and I'm still confused... at least I discovered that the DeckLink supports "only" rec.601/709, hence no rec.2020 output - so goodbye to "real" UHD WCG!

Is there anyone with a UHD display connected to their computer who could check whether they can get a 10bit output (with or without HDR)? And, if so, with which GPU?

I'd be interested to know this too - everything I've read suggests you need a Quadro, but I'm not experienced enough to know if they're right!
#4
I can output 10bit HDR with a GeForce GTX 970.
#5
(2019-02-09, 05:11 PM)babouin Wrote: I can output 10bit HDR with a GeForce GTX 970.

I knew it could be possible! :)

1080p, or 2160p?
#6
2160p struggles a bit with a 40ft cable, but it works just fine with shorter cables.
#7
Great! :)
#8
To check if your display is 10 bit (or more), run Monitor Asset Manager: https://www.entechtaiwan.com/util/moninfo.shtm

[Image: 1766257102c5d3a7eb.jpg]
If it reports 30bpp (10 bit per channel) it "should" be 10 bit - or is it 8 bit + dithering?!?
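If you'd rather skip the extra tool, you can read the advertised bit depth straight out of the monitor's EDID (on Linux it's usually exposed under /sys/class/drm/*/edid). A minimal sketch, assuming an EDID 1.4 base block - where byte 20 is the "Video Input Definition" and, for digital inputs, bits 6-4 encode the color bit depth; the `edid_bit_depth` helper name is mine:

```python
def edid_bit_depth(edid):
    """Return the bits-per-color advertised in an EDID 1.4 block,
    or None if the field is undefined or the input is analog.
    Note: EDID 1.3 uses byte 20 differently, so check the version first."""
    if len(edid) < 128 or edid[0:8] != b'\x00\xff\xff\xff\xff\xff\xff\x00':
        raise ValueError("not a valid EDID base block")
    video_input = edid[20]            # "Video Input Definition" byte
    if not video_input & 0x80:        # bit 7 clear -> analog input
        return None
    depth_code = (video_input >> 4) & 0x07
    # EDID 1.4 encoding: 000 = undefined, 001..110 = 6, 8, 10, 12, 14, 16 bpc
    if depth_code in (0, 7):
        return None
    return 4 + 2 * depth_code
```

Keep in mind this only tells you what the display's interface accepts - a "10 bit" EDID doesn't prove the panel itself is natively 10 bit rather than 8 bit + dithering.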

To be sure, download the Spears 2160p HEVC Rotating 8bit, 8bit+dithering, 10bit Quantization Artifact test pattern: https://drive.google.com/file/d/0B68jIlC...9fRWs/view

Read more here: https://www.avsforum.com/forum/465-high-...depth.html

At the end of that thread:

Quote:As Stacey Spears has pointed out to me, there is no way to definitively measure the native bit depth of a display panel without physically dismantling the TV and testing the panel directly, which is clearly impractical. So I'm afraid those who believe that the tests mentioned in this article can provide this information are mistaken. Instead, these tests measure the performance of the system as a whole.

An HDR display with an 8-bit panel and good dithering algorithms can certainly display HDR content in a way that looks better than the best SDR presentation. But I would prefer a panel with a native bit depth of at least 10 bits, along with a signal path that maintains 10-bit precision from one end to the other.
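As a side note on why a quantization test pattern like the Spears one works: a smooth gradient quantized to 8 bits has four times fewer gray steps than at 10 bits, so across a 3840-pixel UHD ramp the 8 bit version shows ~15-pixel-wide bands while the 10 bit one shows ~4-pixel bands that are far harder to see. A tiny pure-Python sketch (the `ramp` helper name is mine):

```python
def ramp(width, bits):
    """Horizontal grayscale ramp quantized to the given bit depth."""
    levels = (1 << bits) - 1
    return [round(i / (width - 1) * levels) for i in range(width)]

# Across a 3840-pixel UHD-wide gradient:
print(len(set(ramp(3840, 8))))   # 256 distinct steps -> visible banding
print(len(set(ramp(3840, 10))))  # 1024 distinct steps -> much smoother
```

If the display (or a dithering stage somewhere in the chain) smooths the 10 bit pattern but the 8 bit one still bands, the whole signal path is genuinely carrying more than 8 bits.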

