Hey, I already have the newest firmware. I installed it when I first got the TV!
Are you sure that my GPU's HDMI port can output 4K@30Hz@4:4:4? It has an HDMI 1.4 port, not HDMI 2.0!! That's why I wanted the adapter, to get more features! That said, the native HDMI connection actually gives me more features than it should!! I can set it to 4K@30Hz RGB 4:4:4 AND I can enable 10-bit HDR in Windows settings!! I thought HDMI 1.4 can't do this! Maybe that's why my display is WAY TOO DIM when I enable all of the above settings?
Can you confirm that HDMI 1.4 can output 4K@30Hz@4:4:4? Because if it can, it's strange that this specific combo is one of the worst-looking ones as far as the test goes!
Hey, I hope my posts aren't too long; I am trying hard to keep them short and to the point.
In my opinion, I don't get 4:4:4 on any connection with any setting combo! I could start a thread on the LG or AMD forums about this problem, but they will probably tell me it's a limitation of the HDMI 1.4 port. Am I wrong about that? The question is why the picture doesn't improve with the adapter.
I should also mention that I discovered that if I disable HDMI Ultra HD Deep Color on my TV and then enable HDR in Windows, the TV detects it and tells me it will automatically enable HDMI Ultra HD Deep Color! Only HDR automatically enables HDMI Ultra HD Deep Color; changing 4:2:2 to 4:4:4 had no effect! Is it possible that my TV needs HDR to properly show 4:4:4?
At this point I am willing to purchase and test a different adapter! Can you suggest a DP to HDMI adapter that will give me 4K@60Hz with 10-bit HDR @ 4:2:2? According to my research, this should be possible with DP 1.2 and HDMI 2.0.
Sorry for the confusion; both of the above pictures have at best 4:2:2 compression. The HDMI port is HDMI 1.4, which supports 4K @ 30Hz with 8 bits per color channel and 4:4:4 sub-sampling. The DisplayPort to HDMI adapter supports HDMI 2.0, which supports 4K @ 60Hz with 8 bits per color channel and 4:4:4 sub-sampling. Neither device will be able to do 10 bits per color without affecting one of the other features ( for instance, on the onboard HDMI, setting 10 bits per channel will change the sub-sampling setting to 4:2:0 or 4:2:2 ).
When you have the DisplayPort to HDMI adapter connected, please verify you have 8 bits per color enabled ( instead of 10 bits per color ) and let me know if this changes the sub-sampling quality. Additionally, manually setting the refresh rate to 30Hz may bring the sub-sampling up to 4:4:4 ( with 8 bits per color ). Please let me know if you can get a clean image with these settings.
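If it helps to see the arithmetic behind this, here is a rough back-of-the-envelope check in Python ( a sketch only: it assumes the approximate CTA-861 4K frame timing of 4400 x 2250 total pixels and effective payload rates of roughly 8.16 Gbit/s for HDMI 1.4 and 14.4 Gbit/s for HDMI 2.0 after TMDS 8b/10b encoding; these figures are my own approximations, not official specifications ):

```python
# Approximate payload rate needed for a 3840x2160 mode, including
# blanking ( CTA-861 4K total frame: 4400 x 2250 pixels ).
BPP_FACTOR = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def data_rate_gbps(refresh_hz, bits_per_channel, subsampling,
                   total_w=4400, total_h=2250):
    """Rough payload data rate in Gbit/s for a 4K mode."""
    bits_per_pixel = bits_per_channel * BPP_FACTOR[subsampling]
    return total_w * total_h * refresh_hz * bits_per_pixel / 1e9

HDMI_14_GBPS = 8.16   # ~10.2 Gbit/s raw minus 8b/10b overhead
HDMI_20_GBPS = 14.4   # ~18 Gbit/s raw minus 8b/10b overhead

for hz, bpc, sub in [(30, 8, "4:4:4"), (30, 10, "4:4:4"),
                     (60, 8, "4:4:4"), (60, 10, "4:4:4"),
                     (60, 10, "4:2:2")]:
    rate = data_rate_gbps(hz, bpc, sub)
    print(f"4K@{hz}Hz {bpc}-bit {sub}: {rate:5.2f} Gbit/s "
          f"( fits HDMI 1.4: {rate <= HDMI_14_GBPS}, "
          f"HDMI 2.0: {rate <= HDMI_20_GBPS} )")
```

Those numbers line up with the behavior above: 10 bits per color at 4:4:4 overshoots both links, while dropping to 8 bits per color ( or to 4:2:2 ) brings the rate back under the limit.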
Thank you for trying these reduced settings. At these settings there should be plenty of bandwidth for the television to display 4:4:4, yet this still looks like 4:2:2 chroma sub-sampling is enabled.
To recap what we have done so far:
The input on the television is configured as a ‘PC’ input
The settings on the PC, using the DisplayPort to HDMI 2.0 adapter:
Resolution: 3840x2160
Color Depth: 8 bits per color channel ( HDR10 is not supported )
Refresh rate: 60Hz
Chroma Sub-Sampling: 4:4:4
The settings on the PC, using the native HDMI 1.4 connection:
Resolution: 3840x2160
Color Depth: 8 bits per color channel ( HDR10 is not supported )
Refresh rate: 30Hz
Chroma Sub-Sampling: 4:4:4
Tested with a 2-meter cable
All of these settings are within the specifications for the HDMI versions.
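To double-check that claim, the two recap modes work out like this ( using the same rough timing and payload assumptions as the sketch earlier in this thread ):

```python
# 4K@30Hz, 8-bit, 4:4:4 over HDMI 1.4 ( limit ~8.16 Gbit/s payload )
print(4400 * 2250 * 30 * 8 * 3 / 1e9)   # 7.13 Gbit/s -> fits
# 4K@60Hz, 8-bit, 4:4:4 over HDMI 2.0 ( limit ~14.4 Gbit/s payload )
print(4400 * 2250 * 60 * 8 * 3 / 1e9)   # 14.26 Gbit/s -> fits ( barely )
```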
Is the HDMI cable labeled as a “Premium High Speed HDMI Cable”? These are capable of handling up to 18 Gbit/s between the computer and the television. Previous “High Speed” HDMI cables should work with 4K@30Hz; however, older or unlabeled cables may not support this resolution ( there are currently four generations of HDMI cable; almost everything on the market should be at least the second generation, but some older cables may be first generation ).
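As a quick reference, the four cable generations and the raw bit rates they are rated for can be summarized like this ( my own rough summary of the published cable categories, not Plugable test data ):

```python
# HDMI cable certification tiers and the approximate raw TMDS rate
# each is rated to carry.
CABLE_TIERS_GBPS = {
    "Standard": 2.25,              # 720p / 1080i era
    "High Speed": 10.2,            # HDMI 1.3 / 1.4, 4K@30Hz
    "Premium High Speed": 18.0,    # HDMI 2.0, 4K@60Hz
    "Ultra High Speed": 48.0,      # HDMI 2.1
}

def rated_tiers(raw_gbps):
    """Tiers rated for at least the given raw TMDS bit rate."""
    return [t for t, limit in CABLE_TIERS_GBPS.items() if limit >= raw_gbps]

# Raw rate = payload rate x 10/8 ( TMDS 8b/10b encoding ).
print(rated_tiers(7.13 * 1.25))    # 4K@30Hz 8-bit 4:4:4 -> High Speed and up
print(rated_tiers(14.26 * 1.25))   # 4K@60Hz 8-bit 4:4:4 -> Premium and up
```

By that rough math, even a plain “High Speed” cable has headroom for 4K@30Hz, so the cable tier mainly matters for the 4K@60Hz adapter path.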
Unfortunately, the next step is to contact LG to see if perhaps this TV is defective or if they have an unlisted setting to enable 4:4:4 chroma sub-sampling. LG support can be reached through the LG website ( lg.com/support ).
I am sorry, but I have no more ideas on why this is not working with either the native HDMI 1.4 output or the DisplayPort to HDMI adapter; it seems the problem is on the TV end.
Please let me know if LG has a solution that works, or if you have any additional questions.
Hey, thank you very much for all the help so far. I agree that at this point I should contact LG and see if they can help further. I will report back here with anything new, and I will also show them this thread.
Just for reference, I am attaching a pic of the cables I'm using! Both the 2m and 5m cables are the same brand.
Your recap above is correct, but I will add one small note. When using DisplayPort, Radeon Settings doesn't give you the option to select the pixel format (chroma sampling). With HDMI I can confirm that I chose 4:4:4, but with DisplayPort it just gets auto-negotiated, so I can't really tell what is being used. I guess that's just how that tech works.
One final question: does Plugable offer a DisplayPort to HDMI adapter that can give me 4K@60Hz with 10-bit HDR (without the 4:4:4 support)? As I understand it, this should theoretically be possible. I would like to purchase such an adapter regardless of the results with my current one!
Thanks for the additional information; the cable should support HDMI 2.0 ( which should work well with our adapter ). Unfortunately, we do not currently have a DisplayPort to HDMI adapter that supports HDMI 2.0b ( required for HDR10 / 10-bit color ). Some conversion chips do exist to support HDMI 2.0b; however, we have not yet found a solution that meets our standards. With the ever-increasing number of HDR televisions, I expect to see more manufacturers moving to HDMI 2.0b converter chips and availability to increase, but there will be some lag behind the market as we await development and testing.
We’re closing this thread due to inactivity, but if you have any further questions please feel free to contact support@plugable.com and we’ll be happy to help.