I’m trying to use the HDMI 2.0 “Plugable Active DisplayPort to HDMI Adapter” (full-size DisplayPort version) to connect my PC to my receiver. This is with Windows 10, an NVIDIA 2080 Ti, and a Yamaha RX-V777BT. Connecting directly from the graphics card’s HDMI port to my receiver works fine: all channels show up, and all frequencies and bit depths function properly, 44.1 kHz through 192 kHz at 16-bit and 24-bit.
With the same cables and the same port on the receiver, your adapter only gives me stereo 16-bit 48 kHz. Even using a Vertex between them to emulate a different EDID does nothing. I did get it to partially work before by using “Custom Resolution Utility” and manually editing 5.1 into the EDID. That did allow me to select 5.1, but even while it worked, anything over 48 kHz put no actual audio out to my speakers, and the fix does not survive any changes to my setup.
Another issue is that I don’t seem to be able to select YUV 4:4:4 or RGB Full, just RGB Limited. Both are selectable with a direct connection. This is less important than the audio issue, but it would be nice to find a solution for it as well.
Is there a firmware update available for me to try?
I have tried two different 5’ cables as well as a 6’ and a 15’ cable between the adapter and the receiver with no change; all of these cables work fine in the graphics card’s HDMI 2.0 port.
Switching down to 1080p makes no difference as far as YUV goes; the only options are RGB Limited and YUV 4:2:2.
Television is an LG OLED65C9
No change when connecting directly to the television, same color options
With some additional testing I have found that if I connect to only the receiver, with the receiver itself disconnected from the display, I am offered 5.1, though if I pick anything above 48 kHz it sounds distorted. As soon as a TV is connected to the receiver, the name in Windows changes from the receiver name to the television name, and 5.1 and 7.1 vanish, leaving only stereo. Changing the receiver options to disable full 4K 4:4:4 also has no effect.
Same results with a Samsung 4K television as far as RGB Limited goes.
Thanks for the additional details. This television supports 4K with HDR (High Dynamic Range) and requires HDMI 2.0b, which has increased bandwidth and adds support for HDR; our adapter only supports up to HDMI 2.0. Normally this will cause the refresh rate to be reduced to 60Hz, but it can also cause the YUV 4:2:2 compression depending on how the television and graphics card negotiate the connection.
The limited audio can also be caused by the system attempting to negotiate as much bandwidth as possible for the video signal even though that bandwidth is not available over this connection.
I am sorry this graphics adapter is not well suited for this television, and I would be happy to take the adapter back and refund your original purchase. If you would like to go ahead with a refund, please send our support team an email at ‘support@plugable.com’ with the subject line ‘Ticket #286336 - Attention Pat’ and include your Amazon Order ID (available from Amazon.com/orders) for the original purchase of the adapter, and I will provide a prepaid return shipping label for sending the adapter back.
I am sorry this adapter is not compatible with the television; that is causing the graphics issue and possibly the audio issues as well. Unfortunately we do not have a DisplayPort to HDMI adapter that supports HDR at this time.
Thank you for the reply, but I’m not enabling HDR, my receiver does not support HDR either, and I used your adapter with another television that doesn’t support HDR with the same results.
HDMI 2.0 and 2.0b are both 18 Gbps, so it would be strange for this to be a bandwidth issue; my receiver is actually 2.0 anyway, and even when it is forced to support only 4:2:2, the available audio channels don’t change.
https://imgur.com/VqAq87x Here is your adapter being used with a 1080p television; it also doesn’t offer 4:4:4, nor does RGB Full work.
I don’t have any ATI graphics cards with DisplayPort out to test with, so maybe this is an NVIDIA issue. I could probably test it with an Intel output later.
I’m not really after a refund at this point, but thank you; I’m sure I can find another use for the adapter.
One last question for future reference: are there any receiver brands that you’ve found to generally be compatible with these adapters?
Thanks for sending the picture. Does setting the drop-down under “Dynamic Range” from Limited to Full resolve the YUV 4:2:2 issue?
HDMI 2.0 and 2.0b do have the same theoretical maximum bandwidth; however, the encoding changes made to allow for HDR provide some additional usable bandwidth in HDMI 2.0b, not much, but enough to enable 10 bits per color channel at 4K. HDR compatibility in televisions varies by manufacturer and model line: some support HDMI 2.0b or HDMI 1.4 and will negotiate an HDMI 2.0 connection down to HDMI 1.4; others are much more forgiving and will accept an HDMI 2.0 signal and operate with HDR disabled; still others will not accept anything except an HDMI 2.0b signal and are non-functional at lower signal levels unless manually set to HDMI 1.4, with no option for HDMI 2.0.
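As a rough illustration (assuming the standard CTA-861 4K60 timing of 4400 x 2250 total pixels and HDMI’s 8b/10b TMDS line coding), a quick calculation shows why a 10-bit 4K60 4:4:4 signal does not fit within 18 Gbps and why chroma subsampling such as 4:2:2 comes into play:

```python
# Rough HDMI bandwidth check for 4K60, assuming standard CTA-861 timing
# (4400 x 2250 total pixels per frame) and HDMI's 8b/10b TMDS encoding.
pixel_clock_hz = 4400 * 2250 * 60        # ~594 MHz
tmds_overhead = 10 / 8                   # 8b/10b line coding

def link_rate_gbps(bits_per_pixel):
    """Approximate total TMDS link rate in Gbit/s."""
    return pixel_clock_hz * bits_per_pixel * tmds_overhead / 1e9

print(f"8-bit RGB/4:4:4 : {link_rate_gbps(24):.1f} Gbps")   # ~17.8, fits in 18 Gbps
print(f"10-bit 4:4:4    : {link_rate_gbps(30):.1f} Gbps")   # ~22.3, exceeds 18 Gbps
# Dropping to YCbCr 4:2:2 or 4:2:0 lowers the effective bits per pixel
# enough to bring a 10-bit signal back under the 18 Gbps limit.
```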
I am sorry, we don’t test this adapter with receivers. From our customers’ reports, some receivers will work just fine until a receiver firmware update, at which point the audio or resolution stops working. Normally I recommend not using this adapter with a receiver, as most graphics cards have a native HDMI output that can be used instead. This adapter is best suited for use with an HDMI monitor without DisplayPort input, or a television up to 4K without HDR capability.
Please let me know if the Intel DisplayPort behaves any differently; we haven’t seen any issues with NVIDIA in our testing, and most cards still use the reference designs from NVIDIA.
It is not possible to set full dynamic range for 4:2:2. In fact, while using RGB with this adapter, Full does show, but when you pick it, it resets back to Limited immediately afterward. This behavior is consistent across every NVIDIA graphics card I’ve tried with it (980 Ti, 1070, 1080 Ti, 2080 Ti) and across every television I’ve tried (Sony, Samsung, Vizio, and LG).
No difference in terms of audio output with an Intel DisplayPort as far as the number of channels goes, but it does offer a lot more frequencies. While using NVIDIA I’m only offered 48 kHz, nothing lower or higher, but on Intel I see the actual audio capabilities of the TV.
As for NVIDIA, I did manage to get 5.1 working earlier with my receiver and TV without a custom EDID in use, but after a few reboots it went away again. This was while using an HDMI splitter between the adapter and the receiver and running at 4K60.
I believe I’ve gotten to the bottom of the stereo sound issue; it seems to be related to the size of the EDID. I was able to get 5.1 working easily from an Intel DisplayPort using the adapter and my Vertex, then plugged that right into my desktop and it went back to stereo. I deleted some of the audio formats and resolutions from my EDID and now 5.1 is working and the other frequencies are at least showing again.
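One detail for anyone trying the same kind of edit: after bytes are removed from or changed in a 128-byte EDID/CEA block, that block’s checksum byte has to be recomputed or the driver may reject it. Custom Resolution Utility normally handles this for you, but a hand-edited binary will not; a minimal sketch (the file names are just placeholders):

```python
# Recompute the checksum of one 128-byte EDID/CEA block so that the sum
# of all 128 bytes is 0 modulo 256, as the EDID spec requires.
def fix_block_checksum(block: bytes) -> bytes:
    assert len(block) == 128
    body = block[:127]
    return body + bytes([(256 - sum(body) % 256) % 256])

# Example: patch the second block (the CEA-861 extension) of an edited dump.
with open("edited_edid.bin", "rb") as f:      # placeholder file name
    edid = bytearray(f.read())
edid[128:256] = fix_block_checksum(bytes(edid[128:256]))
with open("edited_edid_fixed.bin", "wb") as f:
    f.write(edid)
```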
That is very strange; the EDID should not be affecting the audio capability of the graphics card or the pass-through capability of the graphics adapter. Would it be possible to attach the good and bad EDID files to this thread or send them to me via email (to ‘support@plugable.com’ with the subject line ‘Ticket #286336 - Attention Pat’)?
When connected to the Intel graphics DisplayPort, did the television support 4:4:4 or RGB mode, or was it still limited to 4:2:2?
Your attachment system only allows JPG images here, so here’s a link to two files: an unedited copy of my original television’s EDID, which, as soon as my PC shows the name of the TV (be it through the receiver or my Vertex), removes all PCM audio formats except 16-bit 48 kHz, and a copy of the edited EDID that allows it to work. I could upload another edited file that also didn’t work right, but it fails in the same way; it’s just a matter of it being stereo only or 7.1.
Except for the last few days, for the past month and a half I’ve had 5.1 working with this same TV and receiver with the adapter, though it was limited to 48 kHz at 16-bit and 24-bit. It worked fine while using a custom EDID; other frequencies showed, they just didn’t make any actual sound from the speakers when selected.
My television did have a new update around the end of last week, and NVIDIA released a new driver at the same time. The update was meant to add official G-Sync support to the television (G-Sync already worked before, it just didn’t list itself as officially supported in the driver), and once it dropped I could no longer get 5.1 sound to work.
I’ll probably edit the EDID a different way later and remove some things that are less useful for me, but I know LG’s extension block is at a point where there’s almost no room to add any other resolutions. Using these EDIDs directly with the HDMI port didn’t have the same issue.
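For anyone curious how full that extension block actually is, here is a small sketch (assuming a standard 128-byte CEA-861 extension as the second block of the EDID; the file name is a placeholder) that walks the data block collection and shows where the room runs out before the detailed timing descriptors:

```python
# List the data blocks inside the CEA-861 extension (block 1 of the EDID)
# to see how much of the 128-byte block is already in use.
TAGS = {1: "audio", 2: "video", 3: "vendor-specific", 4: "speaker allocation",
        5: "VESA DTC", 7: "extended"}

def walk_cea_block(edid: bytes) -> None:
    cea = edid[128:256]
    assert cea[0] == 0x02, "second block is not a CEA-861 extension"
    dtd_offset = cea[2]                      # where detailed timings begin
    pos = 4
    while pos < dtd_offset:
        tag, length = cea[pos] >> 5, cea[pos] & 0x1F
        print(f"offset {pos:3d}: {TAGS.get(tag, tag)} block, {length} payload bytes")
        pos += 1 + length
    print(f"data blocks end at byte {pos}; detailed timings start at byte {dtd_offset}")

with open("tv_edid.bin", "rb") as f:         # placeholder file name
    walk_cea_block(f.read())
```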
Our adapter supports LPCM up to 8 channels but does not support the additional formats used by this receiver:
AC-3, max channels 6
  Supported sample rates (kHz): 48, 44.1, 32
  Maximum bit rate: 640 kbit/s
Dolby Digital+, max channels 8
  Supported sample rates (kHz): 48, 44.1, 32
MAT (MLP), max channels 8
  Supported sample rates (kHz): 48
DTS, max channels 6
  Supported sample rates (kHz): 96, 88.2, 48, 44.1
  Maximum bit rate: 1536 kbit/s
DTS-HD, max channels 8
  Supported sample rates (kHz): 192, 176.4, 96, 88.2, 48, 44.1
Most likely the receiver supports 6-channel LPCM but it is not being advertised, so the mismatched capabilities are falling back to the lowest common supported feature set, which is 2-channel LPCM.
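For reference, those formats come from the 3-byte Short Audio Descriptors (SADs) in the EDID’s CEA-861 audio data block. A minimal sketch of how one descriptor decodes (only a subset of the CEA-861 format codes is listed):

```python
# Decode one 3-byte CEA-861 Short Audio Descriptor (SAD) from an EDID
# audio data block.
FORMATS = {1: "LPCM", 2: "AC-3", 7: "DTS", 10: "Dolby Digital+",
           11: "DTS-HD", 12: "MAT (MLP)"}            # subset of the CEA-861 codes
RATES_KHZ = (32, 44.1, 48, 88.2, 96, 176.4, 192)     # bits 0..6 of byte 2

def decode_sad(sad: bytes) -> dict:
    fmt = (sad[0] >> 3) & 0x0F
    info = {
        "format": FORMATS.get(fmt, f"code {fmt}"),
        "max_channels": (sad[0] & 0x07) + 1,
        "sample_rates_khz": [r for i, r in enumerate(RATES_KHZ) if sad[1] & (1 << i)],
    }
    if fmt == 1:                  # LPCM: byte 3 lists supported bit depths
        info["bit_depths"] = [d for i, d in enumerate((16, 20, 24)) if sad[2] & (1 << i)]
    elif 2 <= fmt <= 8:           # AC-3/DTS etc.: byte 3 is max bit rate / 8 kbit/s
        info["max_bitrate_kbps"] = sad[2] * 8
    return info

# Example: an AC-3 descriptor advertising 6 channels, 32/44.1/48 kHz, 640 kbit/s
print(decode_sad(bytes([0x15, 0x07, 0x50])))
```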
Also interestingly, the EDID shows a lot of modes and options I would not expect to see from this television.
Would it be possible to connect the television directly to the computer and capture the EDID file (instead of through the receiver)? We can then see if the receiver is injecting additional modes into the EDID.
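If it helps, here is a minimal sketch of one way to grab the raw EDID on Windows without extra tools; it reads the EDID bytes Windows caches in the registry for each monitor instance (the exact key names will vary by system, and the dump reflects whatever the monitor was last attached through):

```python
# Dump the EDIDs Windows has cached in the registry. Each monitor instance
# stores its raw EDID under Device Parameters\EDID.
import winreg

BASE = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def subkeys(key):
    i = 0
    while True:
        try:
            yield winreg.EnumKey(key, i)
            i += 1
        except OSError:
            return

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, BASE) as display:
    for model in subkeys(display):
        with winreg.OpenKey(display, model) as model_key:
            for instance in subkeys(model_key):
                try:
                    with winreg.OpenKey(model_key, instance + r"\Device Parameters") as p:
                        edid, _ = winreg.QueryValueEx(p, "EDID")
                except OSError:
                    continue
                print(f"{model}\\{instance}: {len(edid)} bytes")
                # open(f"{model}_{instance}.bin".replace("&", "_"), "wb").write(edid)
```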
Actually, that non-working one was captured with the TV directly connected (it loses frequencies when used with the adapter versus a direct connection); I had the best results doing it that way and then editing it. When I capture it from the PC through the receiver first, I lose 120 Hz support; the receiver passes the 120 Hz signal along fine, but it gets removed from the EDID unless I inject it manually or place an EDID emulator in between.
Once I get home I can extract the EDID from the receiver with the TV connected, if you like, though in that case as well I lose 32 kHz, 44.1 kHz, and everything above 48 kHz, along with 24-bit audio, and nothing shows but stereo.
The receiver itself supports 8 channels, but I just use it in a 5.1 setup, so that was how I had set the channels. I’ve since changed it to 8 channels to experiment a bit.
I’ve found that the most reliable setup is injecting the EDID with a monitor INF file, which was how I used to use it. I still can’t hear anything above 48 kHz, but 24-bit works fine this way, and 5.1 and 7.1 function.
Powering the TV off and on also doesn’t reset the audio this way.
I could share those files as well, but it’s simply an EDID override with a file similar to the one I shared before.
Thanks for the additional details on that! Based on this it looks like an EDID issue, then. If you would like to send the EDID file I can take a look, but it sounds like you have found the solution by injecting a fixed EDID.
We’re closing this thread due to inactivity, but if you have any further questions please feel free to contact support@plugable.com and we’ll be happy to help.