Posted to comp.sys.ibm.pc.hardware.misc,alt.comp.hardware,rec.video.desktop,sci.electronics.repair
Arfa Daily

Observations on a UPS - follow up to a previous post


"kony" wrote in message
...
On Thu, 16 Aug 2007 05:13:09 GMT, "James Sweet"
wrote:


If you can't actually see it, does it matter if it exists?
I can play 50 FPS video or games running at over 50 FPS on a
19" LCD computer monitor and not see any problems except the
obvious lack of contrast (but with CRT I am spoiled in this
respect, having bought Diamondtron tube based monitors for
the last few I used myself before switching to primarily LCD
usage).



I sure can; maybe my eyes are just better than average. There are those
"golden ear" audiophools I always thought were nuts, but maybe some of
them aren't as nutty as I thought. I've got a high-end 20" flat panel on
my desk at work, and it looks really good, but still not as good as the
22" flat Trinitron CRT I have at home. The geometry is flawless, but the
picture doesn't look as smooth and clean as the CRT's; it looks more
"digital".


I did not write "some LCD", I wrote about current generation
19" and lower.

It doesn't matter if you see ghosting on 20"+, for the
purpose of the discussion, which is whether smaller
comparable resolutions exhibit it.

If we were talking about resolutions higher than native to
19", then CRTs lose on another front, because their refresh
rate and pixel boundaries get so blurred it is no longer an
accurate output.

Looking more "digital" is not necessarily a flaw. A video
card does not transmit an infinitely high res, flawless
image, it transmits pixels. Accurately representing those
pixels is the monitor's job, not blurring them so they look
more lifelike.


Only if it has DVI output, and you are making use of it. Many video cards
still in common use output three analogue waveforms, created by high-speed
DACs with at least 16-bit inputs, via the VGA socket, which the monitor,
CRT or LCD, then displays via pixels made up of either phosphor triads or
LC cells. As we live in an analogue world, I fail to see how you can
contend that something which looks "more digital" is not flawed. If the
display looks any different from how the real world looks, then it is an
inaccurate representation, which by definition makes it flawed. If the CRT
display does anything to make the picture look closer to reality, then
that must make it more accurate, and thus less flawed.
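
Just to illustrate what that analogue path amounts to - and this is only a
rough sketch of my own, with the bit depth and the nominal 0.7 V video swing
picked purely for the sake of the numbers, not the internals of any
particular card - the conversion from a framebuffer value to the level on
each of the R, G and B lines is essentially this bit of Python:

# Rough sketch: framebuffer colour value -> analogue VGA level per channel.
# Assumes the nominal 0.7 V full-scale video swing into 75 ohms; the bit
# depth per channel varies by card, so it is a parameter here (illustrative
# figures only, not any specific card).

FULL_SCALE_VOLTS = 0.7   # nominal peak video level on each colour line

def dac_output_volts(code, dac_bits=8):
    """Analogue level the RAMDAC drives onto the VGA connector for one
    framebuffer colour value on one channel."""
    max_code = (1 << dac_bits) - 1
    return FULL_SCALE_VOLTS * (code / max_code)

# A mid-grey pixel (R = G = B = 128) comes out as three equal analogue
# levels of roughly 0.35 V, one per colour line.
print(tuple(round(dac_output_volts(c), 3) for c in (128, 128, 128)))

The monitor then has to turn those three continuously varying levels back
into discrete phosphor triads or LC cells, which is where the two
technologies part company.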

I'm not too sure why you feel that a CRT monitor's refresh rate has any
impact on the accuracy of the displayed rendition of the input data. High
refresh rates are a necessity to facilitate high resolutions, and the
response times of the phosphors are plenty short enough for that not to be
a problem. I also don't understand what you mean by a CRT's pixel
boundaries (?) getting blurred, or how that fits in with refresh rate.
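
To put some rough numbers on that: the DAC has to clock out every pixel of
every frame plus the blanking intervals, so the required pixel rate scales
with resolution and refresh rate together. A quick back-of-envelope sketch
of my own (assuming roughly 25% total blanking overhead rather than the
exact GTF/CVT timing figures):

# Back-of-envelope pixel clock needed for a given mode on an analogue CRT.
# Assumes roughly 25% total blanking overhead instead of exact GTF/CVT
# timings (illustrative only).

BLANKING_OVERHEAD = 1.25  # total pixels / visible pixels, rough assumption

def pixel_clock_mhz(width, height, refresh_hz):
    """Approximate DAC pixel rate in MHz for the given visible resolution
    and vertical refresh rate."""
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

# 1280 x 1024 at 85 Hz versus 1600 x 1200 at 85 Hz:
print(round(pixel_clock_mhz(1280, 1024, 85), 1))  # ~139 MHz
print(round(pixel_clock_mhz(1600, 1200, 85), 1))  # ~204 MHz

Even 1600 x 1200 at 85 Hz only needs a pixel clock in the low 200s of MHz
by this rough estimate, which is well within what the 350 to 400 MHz
RAMDACs on typical cards are rated for, so the refresh rate itself is not
what limits accuracy.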

The last thing that you say is a very odd statement. If the CRT monitor does
anything to make the image more lifelike, how do you make that out to be a
bad thing? By logical deduction, if any display technology reproduces the
data being sent to it more accurately than any other, and this actually
looks less lifelike than reality, then the data being sent must be
inaccurate, and thus flawed ...

Arfa