Posted to uk.d-i-y
From: Dave Plowman (News)
Subject: HDMI output from PC.

In article , Paul wrote:
> My guess is: single-link DVI output at 1920x1200 max (CVT-RB,
> 165 MHz) comes into the chip, HDCP is present, the signal is
> digitally scrambled, no sync can be extracted (DE, or data
> enable), and so a black screen is passed through to the
> output side at 330 MHz.


> Usually, companies are forbidden from giving details of HDCP,
> even though the early version was cracked and the keys are
> known. There is probably a per-dongle fee to be charged if
> HDCP materials are included in the design. The only way you'd
> get a full datasheet is under NDA.


> I would test with some other potential signal sources, and
> see whether older or newer DVI kit works with the thing.
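
For what it's worth, the arithmetic in that first paragraph checks out.
With CVT reduced blanking, 1920x1200 at 60 Hz only needs about 154 MHz
of pixel clock, so it does fit inside single-link DVI's 165 MHz cap. A
back-of-envelope sketch (Python; the blanking totals are the published
CVT-RB figures for this mode, refresh rounded up from 59.95 to 60 Hz):

  # Check that 1920x1200@60 with CVT reduced blanking fits
  # single-link DVI's 165 MHz pixel clock limit.
  H_TOTAL = 1920 + 160   # active width + reduced horizontal blanking
  V_TOTAL = 1200 + 35    # active lines + vertical blanking
  REFRESH = 60           # Hz

  pixel_clock_mhz = H_TOTAL * V_TOTAL * REFRESH / 1e6
  print(f"{pixel_clock_mhz:.1f} MHz")  # ~154.1 MHz, under the 165 MHz cap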


I tried it with the only two DVI outputs I have. One is from a pretty
high-spec PC and graphics card. The other is Viewfinder, an aftermarket
graphics card for the ancient Acorn, made by a cottage industry and no
doubt based on what was then commercially available for PCs.

I've no idea why HDCP signals should be present on a DVI video output from
a PC.
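
HDCP 1.x itself is no secret these days, mind; since the master key
leaked in 2010 the whole handshake has been public. Each end holds
forty 56-bit secret keys plus a 40-bit key selection vector (KSV) with
exactly 20 bits set, and sums its own keys at the positions of the
other end's set KSV bits to reach the same shared session key. A toy
sketch with made-up key material (real keys are issued by DCP LLC
under licence):

  def ksv_valid(ksv: int) -> bool:
      # A legal HDCP 1.x KSV is 40 bits with exactly 20 set.
      return ksv < (1 << 40) and bin(ksv).count("1") == 20

  def shared_key(my_keys: list[int], their_ksv: int) -> int:
      # Sum (mod 2**56) my 40 secret keys at the bit positions
      # where the other side's KSV has a 1; both ends arrive at
      # the same session key Km.
      total = 0
      for bit in range(40):
          if their_ksv >> bit & 1:
              total = (total + my_keys[bit]) % (1 << 56)
      return total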

So the whole video chain I'm working with is DVI-D, the exception being
the TV set working as a 'mirror' to the PC monitor. And that works just
fine with a simple DVI-D to HDMI cable. The PC monitor is DVI-D too,
and has a simple splitter to two DVI outputs, one of which feeds the TV
via the DVI-D to HDMI cable.
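
If there were a Linux box anywhere in the chain, the DRM subsystem
would show what each connector sees, which helps narrow down where a
link goes dead. A rough sketch, assuming the card exposes its
connectors under /sys/class/drm (connector names vary by machine):

  from pathlib import Path

  # List each connector, whether a sink is detected, and how much
  # EDID data made it through the cable/adapter chain.
  for conn in sorted(Path("/sys/class/drm").glob("card*-*")):
      status_file = conn / "status"
      if not status_file.exists():
          continue
      status = status_file.read_text().strip()
      edid_file = conn / "edid"
      edid = edid_file.read_bytes() if edid_file.exists() else b""
      print(f"{conn.name}: {status}, {len(edid)} bytes of EDID")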

So it would appear this StarTech device (unless faulty) introduces the
HDCP requirement, as neither video nor audio got through it.

--
*Remember: First you pillage, then you burn.

Dave Plowman London SW
To e-mail, change noise into sound.