Posted to comp.sys.ibm.pc.hardware.misc,alt.comp.hardware,rec.video.desktop,sci.electronics.repair
From: kony
Subject: Observations on a UPS - follow up to a previous post

On Thu, 16 Aug 2007 23:03:26 GMT, "Arfa Daily" wrote:

> Only if it has DVI output, and you are making use of it.


False. DVI's advantage certainly grows with resolution, but it is a
separate factor.

> Many video cards still in common use output three analogue waveforms,
> created by high-speed DACs with at least 16-bit inputs, via the VGA
> output socket, which the monitor, CRT or LCD, displays via pixels made
> up either from phosphor triads or LC cells. As we live in an analogue
> world, I fail to see how you can contend that something which looks
> "more digital" is not flawed.


It's pretty easy to understand once you realize that the picture the
video card is attempting to display, the one generated by the OS, IS
DIGITAL. Everyone knows that converting back and forth between digital
and analog causes loss, and it must be a fairly large loss if you think
the conversion changes the image enough to somehow look better.
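
To put a rough number on it, here's a toy Python sketch. The gain-error
and noise figures are made up and real RAMDACs and monitor ADCs differ,
but it shows why an 8-bit code pushed through a DAC and re-sampled by an
ADC generally doesn't come back unchanged:

# Illustrative sketch only: an 8-bit framebuffer value sent through an
# idealized VGA DAC with a small gain error, then re-sampled (with a
# little noise) by an idealized 8-bit ADC in the monitor.
import random

def dac(code, gain=1.003):
    """Convert an 8-bit code to a voltage on a nominal 0..0.7 V VGA scale."""
    return (code / 255.0) * 0.7 * gain

def adc(voltage, noise_mv=1.0):
    """Sample the analogue level back to an 8-bit code, with a little noise."""
    noisy = voltage + random.gauss(0, noise_mv / 1000.0)
    code = round(noisy / 0.7 * 255)
    return max(0, min(255, code))

random.seed(0)
mismatches = sum(1 for c in range(256) if adc(dac(c)) != c)
print(f"{mismatches} of 256 codes fail to survive the round trip")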

> If the display looks anything different from how the real world looks,
> then it is an inaccurate representation,


WRONG. An accurate representation preserves as much of the input
information as possible. Blurring it may make it look smoother, but it
simultaneously loses information in the process and becomes less
detailed.
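
And to be concrete about "blurring loses information", here's a toy
Python example. The 1-D scanline and the simple 3-tap box average are
made up for illustration, not any monitor's actual filtering, but the
point stands: the sharp edge gets smeared and the original samples can't
be recovered from the result.

# Toy illustration: a sharp 1-D edge blurred with a 3-tap box filter.
# The edge is softened, and the original values cannot be recovered
# exactly: high-frequency detail has been thrown away, not "improved".
scanline = [0, 0, 0, 255, 255, 255]          # hard black-to-white edge

def box_blur(samples):
    """Average each sample with its neighbours (edges clamped)."""
    out = []
    for i in range(len(samples)):
        left = samples[max(i - 1, 0)]
        right = samples[min(i + 1, len(samples) - 1)]
        out.append(round((left + samples[i] + right) / 3))
    return out

print("original:", scanline)            # [0, 0, 0, 255, 255, 255]
print("blurred: ", box_blur(scanline))  # edge smeared across several pixels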

If all you want is blur, smear some bacon grease on your screen!

Sorry but you are 100% wrong.