Posted to sci.electronics.repair
Mike S
Denon AVR HDMI question

On 5/11/2019 2:41 PM, root wrote:
> Neither Denon nor NVidia has been able to resolve a problem I have.
> My system consists of a Linux box running MPlayer through an
> NVidia GeForce 610 video card. The HDMI output of the card
> goes to the DVD input of a Denon AVR-3313CI receiver. The
> output of the Denon goes to a 65" LG 4K OLED display. The
> resulting display on the LG is as if there is a serious interlace
> mismatch.
>
> If the HDMI signal goes directly to the LG the picture is perfect.
>
> I have tried other NVidia cards, several versions of the 610, and
> all show the same problem. Other NVidia cards such as the 210 and
> 710 do not show the interlace problem.
>
> As with any receiver, I would expect the Denon to query the
> display for its EDID capability and forward that information to
> any source connected to the Denon's inputs. Apparently not so.
>
> For the time being I have put a 2-way HDMI splitter in the
> line coming from the computer to the Denon. One output of
> the splitter goes to the Denon while the other output
> goes directly to HDMI1 of the LG display. Simultaneously
> the HDMI output of the Denon goes to HDMI2 of the LG.
>
> This arrangement works, and nvidia-settings shows the display
> characteristics are set by the LG, but the Denon is mentioned in
> the nvidia-settings output.
>
> Starting with power off on all three components, I first
> turn on the LG, then the Denon, then I boot the computer.
> The computer runs 24/7 thereafter.
>
> Could there be a simple solution for connecting the
> components together which might correct the problem?
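
On the EDID question: before concluding that the Denon is not repeating
the LG's EDID, it is worth dumping the EDID block the card is actually
being handed on each path. Below is a rough sketch in Python that reads
it from the kernel's DRM nodes; it assumes the driver registers its
connectors under /sys/class/drm, which is not always the case with the
proprietary NVidia driver (if nothing shows up, "xrandr --props" should
list the same raw EDID bytes), and connector names like card0-HDMI-A-1
will differ from system to system. LG panels normally carry the
manufacturer ID GSM, so it is easy to see whose EDID arrived.

#!/usr/bin/env python3
# Sketch: print which EDID each connected HDMI connector received,
# so you can tell whether it came from the LG or from the Denon.
# Assumes connectors are exposed under /sys/class/drm (names vary).
import glob

def manufacturer_id(edid: bytes) -> str:
    """Bytes 8-9 of the base EDID block hold a 3-letter PNP vendor code."""
    word = (edid[8] << 8) | edid[9]
    return "".join(chr(((word >> shift) & 0x1F) + ord("A") - 1)
                   for shift in (10, 5, 0))

for status_path in sorted(glob.glob("/sys/class/drm/card*-HDMI-*/status")):
    connector = status_path.rsplit("/", 2)[-2]
    status = open(status_path).read().strip()
    edid = open(status_path.replace("status", "edid"), "rb").read()
    if status == "connected" and len(edid) >= 128:
        print(f"{connector}: connected, EDID vendor {manufacturer_id(edid)}, "
              f"{len(edid)} bytes")
    else:
        print(f"{connector}: {status}, no EDID read")

If the connector feeding the Denon reports a vendor other than GSM, or a
much shorter EDID with no 4K modes in it, the receiver is substituting
its own EDID rather than passing the LG's through, which would also
explain why the splitter arrangement behaves differently.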


Another thought: could it be that the signal going from the Denon to the
television is HDMI 1.4 while the signal out of the video card is HDMI 2.0?
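
One quick way to check is to compare the mode X actually negotiates on
the two paths (through the Denon vs. direct to the LG). Here is a rough
sketch that parses xrandr output and flags interlaced modes; it assumes
a running X session with xrandr installed, and the output names (HDMI-0
and so on) will differ per system.

#!/usr/bin/env python3
# Sketch: report the active mode on each connected X output and flag
# interlaced modes, to see whether the Denon path ends up at e.g. 1080i
# while the direct LG path gets 2160p.
import re
import subprocess

out = subprocess.run(["xrandr"], capture_output=True, text=True,
                     check=True).stdout

output = None
for line in out.splitlines():
    head = re.match(r"^(\S+) (connected|disconnected)", line)
    if head:
        output = head.group(1) if head.group(2) == "connected" else None
        continue
    if output and "*" in line:                    # '*' marks the active mode
        mode, rates = line.split(None, 1)
        active = [r for r in rates.split() if "*" in r]
        if not active:
            continue
        interlaced = mode.endswith("i")           # e.g. 1920x1080i
        print(f"{output}: {mode} @ {active[0].rstrip('*+')} Hz"
              f"{'  (interlaced)' if interlaced else ''}")

If the path through the Denon comes up interlaced, or at a different
resolution than the direct path, that would fit the artifact you are
seeing; forcing a known-good mode on that output with
"xrandr --output <name> --mode 1920x1080 --rate 60" is a quick test.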

".... the primary reason for the switch to HDMI 2.0 is that 4K Ultra HD
televisions require much more bandwidth to realize their full potential.
Since 4K Ultra HD is four times the resolution of 1080p, the former HD
standard, it requires more throughput to handle extra data going back
and forth. Lots more.

HDMI 1.4 supported 4K resolutions, yes, but only at 24 or 30 frames per
second (fps). That works fine for movies but isn't useful for gaming and
many TV broadcasts, which require 50 or 60 fps. Also, HDMI 1.4 limited
4K Ultra HD content to 8-bit color, though it is capable of 10- or
12-bit color. HDMI 2.0 fixed all of that because it could handle up to
18 gigabits per second -- plenty enough to allow for 12-bit color and
video up to 60 frames per second."

https://www.digitaltrends.com/home-t...t-is-hdmi-2-0/
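
For a rough check on those numbers: raw HDMI (TMDS) traffic is the total
pixel clock times 3 colour channels times the bits per channel, times
10/8 for the TMDS line coding, and the standard 4K timings use a
4400 x 2250 total raster. A quick back-of-the-envelope sketch:

#!/usr/bin/env python3
# Rough HDMI bandwidth arithmetic for 3840x2160 (total raster 4400x2250
# per CTA-861). Wire rate = pixel clock * 3 channels * bits/channel * 10/8.
def tmds_gbps(h_total, v_total, fps, bits_per_channel):
    pixel_clock = h_total * v_total * fps            # Hz
    payload = pixel_clock * 3 * bits_per_channel     # bits/s before coding
    return payload * 10 / 8 / 1e9                    # Gbit/s on the wire

for fps, bpc in [(30, 8), (60, 8), (60, 12)]:
    print(f"2160p{fps} at {bpc}-bit: ~{tmds_gbps(4400, 2250, fps, bpc):.1f} Gbit/s")

# 2160p30 8-bit  -> ~8.9 Gbit/s   (fits HDMI 1.4's 10.2 Gbit/s)
# 2160p60 8-bit  -> ~17.8 Gbit/s  (needs HDMI 2.0's 18 Gbit/s)
# 2160p60 12-bit -> ~26.7 Gbit/s  (even 2.0 needs 4:2:0 subsampling there)

So 4K at 30 Hz fits inside HDMI 1.4's 10.2 Gbit/s, while 4K at 60 Hz
needs essentially all of HDMI 2.0's 18 Gbit/s, which is why a 1.4-only
link in the chain tends to drop back to 30 Hz or to 1080p instead.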