Mike S wrote:
could it be that the signal going from the Denon to the television is
HDMI 1.4 and the signal out of the video card is HDMI 2.0?
".... the primary reason for the switch to HDMI 2.0 is that 4K Ultra HD
televisions require much more bandwidth to realize their full potential.
Since 4K Ultra HD is four times the resolution of 1080p, the former HD
standard, it requires more throughput to handle extra data going back
and forth. Lots more.
HDMI 1.4 supported 4K resolutions, yes, but only at 24 or 30 frames per
second (fps). That works fine for movies but isn't useful for gaming and
many TV broadcasts, which require 50 or 60 fps. Also, HDMI 1.4 limited
4K Ultra HD content to 8-bit color, though it is capable of 10- or
12-bit color. HDMI 2.0 fixed all of that because it could handle up to
18 gigabits per second, plenty enough to allow for 12-bit color and
video up to 60 frames per second."
https://www.digitaltrends.com/home-t...t-is-hdmi-2-0/
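For anyone curious where those numbers come from, here is a rough back-of-the-envelope calculation. It only counts the raw pixel payload and ignores blanking intervals and TMDS encoding overhead, so the real link rate needed is noticeably higher, and the 10.2/18 Gb/s limits are the commonly quoted approximate maximums for HDMI 1.4 and 2.0:

```python
# Back-of-the-envelope uncompressed video payload for 4K Ultra HD.
# Ignores blanking intervals and TMDS encoding overhead, so the real
# HDMI link rate required is noticeably higher than these figures.

WIDTH, HEIGHT = 3840, 2160      # 4K Ultra HD
HDMI_1_4_GBPS = 10.2            # approx. max data rate of HDMI 1.4
HDMI_2_0_GBPS = 18.0            # approx. max data rate of HDMI 2.0

def payload_gbps(fps, bits_per_color, channels=3):
    """Raw RGB pixel payload in gigabits per second."""
    return WIDTH * HEIGHT * fps * bits_per_color * channels / 1e9

for fps, depth in [(24, 8), (30, 8), (60, 8), (60, 12)]:
    print(f"4K @ {fps:2d} fps, {depth:2d}-bit: "
          f"~{payload_gbps(fps, depth):.1f} Gb/s raw payload")
```

The 4K/60 figures already sit above what HDMI 1.4 can carry (roughly 10.2 Gb/s), which is why the jump to HDMI 2.0's 18 Gb/s matters for gaming and broadcast frame rates.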
Thanks for responding. I bought a replacement (identical) video card
and the problem was fixed. The Denon receiver specs do not indicate
whether it is 1.4 or 2.x, but it talks about 3D passthrough and
4K upconversion. I added an HDMI splitter in front of the receiver
so that one output goes to the receiver and the other goes
directly to the TV. As I understand HDMI, the source interrogates
the display and sends the appropriate output. The TV responds
with its characteristics, but the receiver cannot respond
because nothing is connected to its output. At least that is how
I figure it works. With this arrangement, and a new video card,
the system is working very well.
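If anyone wants to see what the source actually learns during that interrogation: the display's capabilities are carried in an EDID block that the video card reads over the cable. On a Linux box the kernel exposes it under /sys/class/drm; the connector name in this sketch (card0-HDMI-A-1) is just an example and will differ per machine and port. A minimal sketch that checks the EDID header and decodes the manufacturer ID:

```python
# Minimal EDID peek: verify the 8-byte header and decode the 3-letter
# manufacturer ID. The sysfs path is an example; the connector name
# (card0-HDMI-A-1) depends on the machine and which port is in use.
from pathlib import Path

EDID_PATH = Path("/sys/class/drm/card0-HDMI-A-1/edid")  # example path
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def manufacturer_id(edid: bytes) -> str:
    """Decode bytes 8-9: three 5-bit letters packed big-endian ('A' = 1)."""
    word = (edid[8] << 8) | edid[9]
    return "".join(
        chr(((word >> shift) & 0x1F) + ord("A") - 1)
        for shift in (10, 5, 0)
    )

edid = EDID_PATH.read_bytes()
if edid[:8] == EDID_HEADER:
    print("Valid EDID header")
    print("Manufacturer:", manufacturer_id(edid))
    print("EDID version: %d.%d" % (edid[18], edid[19]))
else:
    print("No EDID (or unexpected data) at", EDID_PATH)
```

A check like this is enough to confirm whether a given HDMI leg actually presents itself as a display, which is consistent with the splitter behavior described above.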