UK diy (uk.d-i-y) For the discussion of all topics related to diy (do-it-yourself) in the UK. All levels of experience and proficiency are welcome to join in to ask questions or offer solutions.

  #1   HDMI output from PC.

Before I start chasing things - ie RTFM - should the HDMI feed from a
graphics card also be capable of carrying system audio? On Win10, if it
helps.

--
*Born free - taxed to death *

Dave Plowman London SW
To e-mail, change noise into sound.
  #2   jon

On Tue, 16 Mar 2021 14:57:35 +0000, Dave Plowman (News) wrote:

Before I start chasing things - ie RTFM - should the HDMI feed from a
graphics card also be capable of carrying system audio? On Win10, if it
helps.


For digital audio, if an HDMI device has audio, it is required to
implement the baseline format: stereo (uncompressed) PCM. ... With version
1.3, HDMI allows lossless compressed audio streams Dolby TrueHD and DTS-HD
Master Audio. As with the Y′CBCR video, audio capability is optional.
  #3


Dave Plowman wrote:

should the HDMI feed from a graphics card also be capable of carrying
system audio?


Yes, but it'll show up as a different output device from your analogue
output, but any worthwhile program should let you choose
  #4

On 16/03/2021 14:57, Dave Plowman (News) wrote:
Before I start chasing things - ie RTFM - should the HDMI feed from a
graphics card also be capable of carrying system audio? On Win10, if it
helps.


Yes. It will add an additional audio output device to the machine. So
when you look at the volume control in windows there will be a gadget at
the top to let you select the desired output device.
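
For anyone who'd rather check from a script than the volume control, a
rough sketch (mine, not John's - it assumes Python plus the third-party
"sounddevice" package, installed with "pip install sounddevice") that
lists the playback devices Windows exposes; the HDMI output normally
shows up with the graphics vendor's name in it:

  # List every device that can play audio; the HDMI path usually appears
  # as something like "NVIDIA High Definition Audio" or "AMD HDMI Output".
  import sounddevice as sd

  for index, dev in enumerate(sd.query_devices()):
      if dev["max_output_channels"] > 0:   # playback devices only
          print(index, dev["name"])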



--
Cheers,

John.

/=================================================================\
| Internode Ltd - http://www.internode.co.uk                      |
|-----------------------------------------------------------------|
| John Rumm - john(at)internode(dot)co(dot)uk                     |
\=================================================================/
  #5

Dave Plowman (News) wrote:
Before I start chasing things - ie RTFM - should the HDMI feed from a
graphics card also be capable of carrying system audio? On Win10, if it
helps.


If you put your mind to it, you can come up with a way to be
holding an HDMI connector in hand, which has no audio.

But normally, up to 8 channel LPCM can be carried.
And LPCM (linear pulse code modulation), is just
straight samples, with no A-law, u-law, or compression
scheme. There is plenty of bandwidth on HDMI, so it's not
pinched for bandwidth like SPDIF was.
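
(As a back-of-envelope check - my arithmetic, not Paul's - 8 channels of
24-bit LPCM at 192 kHz is only 8 x 192,000 x 24 = 36,864,000 bit/s, call
it 37 Mbit/s, against the multi-gigabit TMDS video links it rides on.
A stereo SPDIF link at 48 kHz carries 2 x 48,000 x 32-bit subframes =
3.072 Mbit/s, which is why SPDIF ends up squeezed.)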

Video cards generally come in three generations. For the
longest while, there was no audio at all, associated with
video cards.

The first video card with sound, actually relied on the
user running the SPDIF signal from the sound chip on
the motherboard, over to a connector on the top edge of
the NVidia video card.

https://www.geeks3d.com/20080810/gef...ctor-tutorial/

Then, and only then, did some flavor of digital audio go
across HDMI. It's possible DisplayPort didn't even exist yet.
Since SPDIF had limited bandwidth, you would not expect to
be running 8 channel LPCM with that generation of hardware.
SPDIF is 2 or 4 channel, with 4 channel being, um, nonexistent.
And running AC3 would ruin the effect (it's compressed).

The era today, the "CODEC" if you will, is on the video card.
It no longer relies on SPDIF passthru as a mechanism. And
more audio channels are available as a result. The logic block
that does it, might be considered HDAudio of some sort. Or
maybe it declares itself that way.

AMD chose a different route to get audio on theirs. The driver
package (the jumbo video driver), had a RealTek folder on it,
and the suspicion was, that AMD had bought an intellectual
property block from RealTek, to implement the HDAudio function.
It's not clear today, whether they're using an in-house design
and have dumped the RealTek interim solution. AMD is also rumored
to have bought a USB3 block for their Southbridge, rather than
design their own. These are time savers and mean hiring fewer
"specialists" to make stuff.

*******

https://www.audioholics.com/hdtv-for...-hdmi-versions

HDMI 1.0 December 2002 8-channels of 192kHz/24-bit audio (PCM)

HDMI 1.1 May 2004 + high resolution audio format???

HDMI 1.2 August 2005 + DSD (Direct Stream Digital), + Super Audio CD (SACD)

HDMI 1.2 December 2005

HDMI 1.3 June 2006 + native Dolby TrueHD and DTS-HD Master Audio streams
for external decoding by AV receivers

HDMI 1.3a November 2006
....
HDMI 2.0 September 2013 + Up to 32 audio channels
Up to 1536kHz audio sample frequency

Just because the standards say this, does not mean the computer
industry immediately implemented this "with haste". The passthru
SPDIF idea was particularly egregious (it smacked of "we don't
give a ****"). I'm not aware of anyone wiring that up, so I don't
even know if it works (that's because the SPDIF output from
sound chips, is TTL level and not cable level, and the signal
might not be standardized well enough to be connecting it up!).
But once the video card got the onboard CODEC, that's when
people started using it.

You can start by checking whether a suitable "digital audio" item
is in the playback choices in Windows. I have some with the
word "NVidia" on mine. But no equipment to test with (no fancy
monitors or TV sets). The video card was purchased for OpenCL
or CUDA compatibility, and for the video SIP block (video
encoder). I tried cracking a password on the video card, but
the software said it would "take 13 years" :-)

Paul


  #6

On 16/03/2021 14:57, Dave Plowman (News) wrote:
Before I start chasing things - ie RTFM - should the HDMI feed from a
graphics card also be capable of carrying system audio? On Win10, if it
helps.

If not working, I would go to Control Panel/Sound and check that the
HDMI output is selected.

--
Michael Chare
  #7

In article ,
   Paul wrote:

[Paul's explanation quoted in full - snipped]

Thanks for the excellent explanation Paul. An HDMI cable from my laptop
to the TV carries audio just fine. But with the desktop here, it's not so
easy to try without moving TVs around. But it is a new and pricey MB and
graphics card so you've given me the hope to actually try it. The laptop
does identify the TV as a selectable sound output in the windows settings.

--
*A conclusion is the place where you got tired of thinking *

Dave Plowman London SW
To e-mail, change noise into sound.
  #8

On 16/03/2021 16:57, Dave Plowman (News) wrote:
In article ,
   Paul wrote:

[Paul's explanation snipped]

Thanks for the excellent explanation Paul. An HDMI cable from my laptop
to the TV carries audio just fine. But with the desktop here, it's not so
easy to try without moving TVs around. But it is a new and pricey MB and
graphics card so you've given me the hope to actually try it. The laptop
does identify the TV as a selectable sound output in the windows settings.



Being recent I would have thought the card should produce good audio.
Most HDMI ICs I have come across have audio and I guess the same should
be the case for a PCIe card.

Can you provide the make and model for the card?

Quite often the drivers shipped with Windows are a subset of what you
can get from the board manufacturer. If you don't see an audio output
device in Device Manager (with a suitable monitor attached, i.e. one
that can take audio [1]) then the appropriate drivers are not installed.

[1] Monitor's video details are provided in EDID through a DDC channel.
You can use:
https://www.entechtaiwan.com/util/moninfo.shtm
I think that this is also how audio details are conveyed to the host system.
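
For what it's worth, the audio capability travels the same way: the TV or
monitor's EDID can carry a CEA-861 extension block containing "short audio
descriptors" that list the formats and channel counts it accepts. A rough
sketch of digging those out of a raw EDID dump (my code, not Fredxx's -
it assumes Python and that you've already saved the 256-byte EDID to a
file, e.g. exported from moninfo):

  # Parse the CEA-861 extension of an EDID dump for audio support.
  # "edid.bin" is assumed to hold the base block plus one extension block.
  with open("edid.bin", "rb") as f:
      edid = f.read()

  ext = edid[128:256]                      # first extension block
  if len(ext) == 128 and ext[0] == 0x02:   # 0x02 = CEA-861 extension tag
      dtd_start = ext[2]                   # where detailed timings begin
      print("Basic audio flag:", bool(ext[3] & 0x40))
      i = 4
      while i < dtd_start:                 # walk the data block collection
          tag, length = ext[i] >> 5, ext[i] & 0x1F
          if tag == 1:                     # Audio Data Block holds 3-byte SADs
              for j in range(i + 1, i + 1 + length, 3):
                  fmt = (ext[j] >> 3) & 0x0F       # format code, 1 = LPCM
                  channels = (ext[j] & 0x07) + 1
                  print("format code", fmt, "- up to", channels, "channels")
          i += 1 + length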
  #9

Fredxx wrote:

Being recent I would have thought the card should produce good audio


There's no analogue audio circuitry involved, the graphics card just has
to insert audio data packets into "data islands" during blanking periods
  #10

On 16/03/2021 18:18, Andy Burns wrote:
Fredxx wrote:

Being recent I would have thought the card should produce good audio


There's no analogue audio circuitry involved, the graphics card just has
to insert audio data packets into "data islands" during blanking periods


Quite, the digital serial stream has to come from somewhere.

It all depends on what you call "audio circuitry" which in this case
would be LVDS drivers within the HDMI IC, a couple of tracks and some
protection against ESD.


  #11

Fredxx wrote:
On 16/03/2021 18:18, Andy Burns wrote:
Fredxx wrote:

Being recent I would have thought the card should produce good audio


There's no analogue audio circuitry involved, the graphics card just
has to insert audio data packets into "data islands" during blanking
periods


Quite, the digital serial stream has to come from somewhere.

It all depends on what you call "audio circuitry" which in this case
would be LVDS drivers within the HDMI IC, a couple of tracks and some
protection against ESD.


There aren't usually outboard converters on GPUs.

Even though, yes, there are Silicon Image chips for such
purposes, which have reclocking options giving x1 pixel
resolution capability (don't need to follow the divisible
by 8, divisible by 2 rule), those are not typically used.

Some of the output buses on the video card, are reconfigurable.
Some are not. There is usually an LVDS bus, but that sort
of thing hasn't been used in some time on desktop video.
Perhaps back in the dual lane DVI era, there was the odd
video card that used an outboard device to add capability.
But so many designs follow the NVidia reference design,
that anything like this is a low runner. There was some kind
of general purpose parallel bus, but I don't remember the name
of that bus which connects to external converters.

The GPU is, most of the time, direct drive, and the only
things you need on outputs, are ESD protection solutions,
capacitive coupling with ceramic caps or whatever. Now, this
design, the SMPS is on the wrong side, but you don't really
see any LSI between the GPU and the front connectors.

https://www.eevblog.com/forum/repair...short-circuit/

To keep software happy, there'd be a bus master so the
HDAudio bus could exist (inside the GPU, can't see or touch
it). That will have a DMA block associated with it, so
HDAudio FIFO queues can be kept filled, and fed to the
serial bus the HDAudio uses. Then, the HDAudio codec block,
with registers, hooks to that. Then, the ADC and DAC are
stripped off, and the output is pure digital. The analog part
of the mixed analog-digital design is removed. Then there'd be the
GPU crossbar, where all the digital goodies are mixed and output.
What is this in aid of ? If you look in the hardware info,
there should be an HDAudio "bus number". The motherboard
HDAudio chip would be like "Bus 1" and the video card block
shown here, would be "Bus 2", and it's the serial high speed
interface which is the bus being enumerated. NVidia could
extract the HDAudio bus master block, from one of their
Southbridge projects and reuse it.

          XBAR                        serialHS
 ===   Mix to   ----  HDAudio  -----  HDAudio bus master block --- DMA buffer fill
 LCD   HDMI           Codec                                        via PCIe
 Mon   via TSI        minus DACs

TSI = time slot interchanger (shorthand for digital insertion in fixed pattern)
XBAR = crossbar for same purpose (digital routing of display channel to connector)
DAC = Digital to analog convert (as seen on motherboard HDAudio chips)
Bus Master Block = what a Southbridge would use, to drive an HDAudio codec 48 pin chip
DMA = direct memory access, which pulls buffers of audio data to keep
the FIFO feeding the HDAudio bus filled. Repeating sound effects
heard, if this ever crashes (in a game, you can hear the FIFO
replayed over and over again).

( This isn't current generation, but it's one of the few crossbar
diagrams I have to offer. Page 8, and be patient, as it can take
Acrobat Reader 20 seconds to render the diagram on the bottom of Pg.8)

https://web.archive.org/web/20061126...Whitepaper.pdf
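
If you just want to see whether the card's HDAudio function has been
enumerated at all, here is a sketch of mine (not anything Paul posted),
using the third-party Python "wmi" package on Windows:

  # List the sound devices Windows has enumerated. A graphics card with a
  # working HDAudio block usually shows up here alongside the motherboard
  # codec, e.g. "NVIDIA High Definition Audio".
  import wmi   # pip install wmi

  for dev in wmi.WMI().Win32_SoundDevice():
      print(dev.Name, "-", dev.Status)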

Somewhere in the ROM onboard the video card, is a declaration
(that the driver reads), as to what wiring is doing what
feature. I don't know any of the details of that. Like if
three of five outputs are available on the faceplate,
something has to know the other two are No Connects.

A laptop GPU is slightly different, in that an LVDS bus
goes direct to the panel, has no Plug and Play information,
and the PNP has to be "faked". If you buy a laptop which
has low-res and high-res panel options, you crack your panel,
then buy the wrong replacement LCD, there will be black bars on two
sides of the displayed image. And that's because the laptop
doesn't know which panel is connected, and the information
is being faked. And to fix that, you would have to figure
out which bit needs updating, so that the fake declaration
matches the panel dimensions. The other buses on the laptop
GPU (single VGA or HDMI output), they would come direct from the
GPU as appropriate. As TMDS bus, for the HDMI.

There is a rich collection of converter devices, for TV sets,
and some of the whizzy gadgets we use, have been re-purposed
digital components from TV sets. And there's a subset of
components (ones with HDCP capability), that are restricted
and not for sale to punters and the datasheets are NDA only.
On the BlackMagic HDMI capture card, the first card, there
are three Analog Devices front ends that could be fitted
to the design, and only the non-HDCP version has info.
The others are hidden from view. And each had differences in
the analog capture features too. This is how some Chinese
designs slipped out that functioned as HDCP strippers - by reusing
parts that could do the stripping - and those were stopped at the
border by US Customs once it was figured out.

But as far as the parts bin for video cards, a metric ton
of SMPS components and designs, a ROM for declaration and
read-in by the BIOS at startup, ESD protection components,
most of the heavy lifting is done by the GPU itself.
This keeps down the BOM cost. The most wasteful part, is
the SMPS (the Volterra stuff on high end designs). When
you want 1V @ 200A, who ya gonna call. Oh, and we can't
forget the VRAM.

Paul
  #12

Fredxx wrote:

Andy Burns wrote:

Fredxx wrote:

Being recent I would have thought the card should produce good audio


There's no analogue audio circuitry involved, the graphics card just
has to insert audio data packets into "data islands" during blanking
periods


Quite, the digital serial stream has to come from somewhere.

It all depends on what you call "audio circuitry" which in this case
would be LVDS drivers within the HDMI IC, a couple of tracks and some
protection against ESD.


But it shares those pins/tracks with video data, there aren't separate
ones for audio, so other than timing when to send audio rather than send
video, it doesn't take much.

  #13

Yes it bloody does, in fact stopping the darned thing is my biggest
nightmare. I had to use a vga to hdmi in the end to keep the audio going via
the sound card I wanted it to when the input was selected on my Samsung TV.

Brian

--

This newsgroup posting comes to you directly from...
The Sofa of Brian Gaff...

Blind user, so no pictures please
Note this Signature is meaningless.!
"Dave Plowman (News)" wrote in message
...
Before I start chasing things - ie RTFM - should the HDMI feed from a
graphics card also be capable of carrying system audio? On Win10, if it
helps.

--
*Born free - taxed to death *

Dave Plowman
London SW
To e-mail, change noise into sound.



  #15

Brian Gaff (Sofa) wrote:
Yes it bloody does, in fact stopping the darned thing is my biggest
nightmare. I had to use a vga to hdmi in the end to keep the audio going via
the sound card I wanted it to when the input was selected on my Samsung TV.

Brian


There's a control for setting where the output goes,
whether HDMI digital audio, sound chip SPDIF digital,
or sound chip analog audio. Each output has a name.

There's a tendency for the wrong thing to be selected,
if changing drivers, or, right after the OS gets installed.
You just change it to suit the situation.
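
A quick way to prove whichever output you've picked is actually alive is
to fire a test tone straight at it. A sketch of mine (not Paul's),
assuming Python with numpy and sounddevice; the device name below is
only an example, to be replaced with whatever your own HDMI output is
called:

  # Play a one-second 440 Hz tone out of a named output device.
  import numpy as np
  import sounddevice as sd

  fs = 48000
  t = np.arange(fs) / fs
  tone = (0.2 * np.sin(2 * np.pi * 440 * t)).astype("float32")
  sd.play(tone, samplerate=fs, device="NVIDIA High Definition Audio")
  sd.wait()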

The most evil situation I've run into, is installing two
sound cards, one driver had a bad registry setting behavior,
and damaged a shared setting -- both sound cards then
failed to work. That was on Windows XP. I haven't had that
happen in other situations, but that's because my collection
of junk sound cards is retired. The onboard HDAudio is sufficient.

Paul


  #16

In article ,
Brian Gaff \(Sofa\) wrote:
Yes it bloody does, in fact stopping the darned thing is my biggest
nightmare. I had to use a vga to hdmi in the end to keep the audio going via
the sound card I wanted it to when the input was selected on my Samsung TV.


Surely you could do this in software? Most progs give you the choice of
selecting audio in and out?

--
*All generalizations are false.

Dave Plowman London SW
To e-mail, change noise into sound.
  #17

On 17/03/2021 14:15, Paul wrote:
[snip]

The most evil situation I've run into, is installing two
sound cards, one driver had a bad registry setting behavior,
and damaged a shared setting -- both sound cards then
failed to work. That was on Windows XP. I haven't had that
happen in other situations, but that's because my collection
of junk sound cards is retired. The onboard HDAudio is sufficient.


The worst I had was also back in the days of XP. A new motherboard, with
onboard sound that didn't work. Replacing the motherboard didn't help. I
ended up disabling the onboard sound and plugging in a sound card -
which irritated me, because the onboard sound still showed up as present.

It was more than a year later that I discovered the source of the
problem, corrected it and switched to using the onboard sound - by
downloading a new driver ... for my *NON*-sound equipped *VIDEO* card!
  #18

Bit more info.

DVI to HDMI lead direct from PC graphic card DVI output to TV works fine.
This adaptor - DVI-D to HDMI + USB to add sound, no picture. PC does
recognise the adaptor USB connection in device manager, so I'd guess it is
getting power.

--
*Work is for people who don't know how to fish.

Dave Plowman London SW
To e-mail, change noise into sound.
  #19

In article ,
Dave Plowman (News) wrote:
Bit more info.


DVI to HDMI lead direct from PC graphic card DVI output to TV works
fine. This adaptor - DVI-D to HDMI + USB to add sound, no picture. PC
does recognise the adaptor USB connection in device manager, so I'd
guess it is getting power.


Contacted the seller, but I'd guess they are just a box shifter, and all
they could help with was a return for refund.

--
*A cartoonist was found dead in his home. Details are sketchy.*

Dave Plowman London SW
To e-mail, change noise into sound.
  #20

"Dave Plowman (News)" wrote:
In article ,
Dave Plowman (News) wrote:
Bit more info.


DVI to HDMI lead direct from PC graphic card DVI output to TV works
fine. This adaptor - DVI-D to HDMI + USB to add sound, no picture. PC
does recognise the adaptor USB connection in device manager, so I'd
guess it is getting power.


Contacted the seller, but I'd guess they are just a box shifter, and all
they could help with was a return for refund.


What's the model number of the adapter, OOI?

Theo


  #21

In article ,
Theo wrote:
"Dave Plowman (News)" wrote:
In article ,
Dave Plowman (News) wrote:
Bit more info.


DVI to HDMI lead direct from PC graphic card DVI output to TV works
fine. This adaptor - DVI-D to HDMI + USB to add sound, no picture. PC
does recognise the adaptor USB connection in device manager, so I'd
guess it is getting power.


Contacted the seller, but I'd guess they are just a box shifter, and
all they could help with was a return for re-fund.


What's the model number of the adapter, OOI?


StarTech DVI2HD. Talked to the StarTech help line, and as usual, I'd
already tried everything they suggested. And short of it being faulty,
reckoned it was just a compatibility problem. And I'm not going to change
the TV or computers.

It's sad as it should have done everything I wanted in the simplest way.
Powered by USB which also provided the audio, and the right DVI and HDMI
connector.

I've returned it and ordered up an HDMI embedder at a higher cost which
will also need adaptors.

--
*Why do they put Braille on the drive-through bank machines?

Dave Plowman London SW
To e-mail, change noise into sound.
  #22

Dave Plowman (News) wrote:
[snip]
StarTech DVI2HD. Talked to the StarTech help line, and as usual, I'd
already tried everything they suggested. And short of it being faulty,
reckoned it was just a compatibility problem. And I'm not going to change
the TV or computers.

It's sad as it should have done everything I wanted in the simplest way.
Powered by USB which also provided the audio, and the right DVI and HDMI
connector.

I've returned it and ordered up an HDMI embedder at a higher cost which
will also need adaptors.


LT8522EX

http://www.lontiumsemi.com/product/View_53.html

Technically, mentions HDCP, without even showing an
interface for an EEPROM on the chip diagram. Is it
doing passthru from the host, of some sort ?

My guess is, DVI single lane output at 1920x1200 max CVT-RB
165MHz comes into the chip, there is HDCP present, the signal
is digitally scrambled, no sync can be extracted (DE or
digital enable), and so a black screen is passed through
to the output side at 330MHz.

Usually, companies are forbidden from giving details of HDCP,
even though the early version was cracked and the keys are
known. There is probably a fee per dongle to be charged,
if HDCP materials are included in the design. The only way
you'd get a full datasheet, is under NDA.

I would test with some other potential signal sources, and
see whether older or newer DVI kit, works with the thing.

The CM108 is a USB to stereo audio adapter, and it's
part of your product too. They use the I2S bus interface
on the CM108, to provide the USB audio to the design.
All four pins of the USB interface are being used.

But unless the original digital signal is in plaintext
(decrypted, no HDCP present), it's probably the front end
of the chip that has "nothing but digital snow" to work
with. Unless the device is equipped with either HDCP
stripping or HDCP passthru, then some steps could be
missing. Is the audio multiplexed in, covered by HDCP
copy prevention ? Dunno.

How do you check out technical capability on a dongle like this ?

Dunno. There are various adjunct items on HDMI, but I don't
know if they can be used in a plug and play sense, to
give details of standards supported. Presumably something
like that is required, when long chains of adapters
are tied together in series. CEC flows backwards for control,
but I don't know what additional information is sent that way.

Paul
  #23

In article ,
Paul wrote:
[snip]

I would test with some other potential signal sources, and
see whether older or newer DVI kit, works with the thing.


I tried it with the only two DVI outputs I have. One from a pretty high
spec PC and graphics card. The other, an aftermarket graphics card for the
ancient Acorn called Viewfinder, and made by a cottage industry no doubt
based on what was then commercially available for a PC etc.

I've no idea why HDCP signals should be present on a DVI video output from
a PC.

So all the video chain I'm working with is DVI-D. The exception being the
TV set working as a 'mirror' to the PC monitor. And that works just fine
with a simple DVI-D to HDMI cable. The PC monitor is DVI-D too - and has a
simple splitter to two DVI outputs, one of which fed the TV via the DVI to
HDMI cable.

So it would appear this StarTech device (unless faulty) introduces the
HDCP signal, as neither video nor audio got through it.

--
*Remember: First you pillage, then you burn.

Dave Plowman London SW
To e-mail, change noise into sound.
  #24

On Sat, 27 Mar 2021 16:21:52 +0000 (GMT), "Dave Plowman (News)"
wrote:


[snip]

So it would appear this StarTech device (unless faulty) introduces the
HDCP signal. As neither video or audio got through it.


I didn't know what HDCP is, so I've just read the Wikipedia article
about it. There is so much incredible complication there to stop some
things and not others depending on connections, I'm not surprised you
have unexplained effects.
--
Dave W