Posted to comp.sys.ibm.pc.hardware.misc,alt.comp.hardware,rec.video.desktop,sci.electronics.repair
kony

Observations on a UPS - follow up to a previous post

On Fri, 17 Aug 2007 08:40:09 GMT, "Arfa Daily" wrote:


> Well I'm sorry too, but it is you who is wrong. You would be right if we
> were talking about a signal that was being converted back and forth
> between types or standards, but in the case of a computer generated
> picture, we are not. We are talking about a digitally created image of
> something that needs to be an analogue one for our eyes to see. Whether
> the conversion from digital to analogue takes place at the video card, or
> at the face of the monitor, it is still a necessity that it takes place.
> The ultimate goal is to make it look as lifelike as possible. If you
> think that accuracy means making it look sharper or in some way different
> (or, in your opinion, better) than real life, then you have a very odd
> understanding of what the word 'accuracy' means in this context.
>
> Bacon grease?? What a silly thing to throw into a discussion.



It is your goal to blur the information, which is what the
grease would do.

Pixel data is output by a computer to a video card. Since
human vision has far finer granularity, the image is not
expected to look like reality except down to the level of
detail the pixel data, the resolution, can carry. If the
pixel data is not preserved but instead smoothed to reduce
your perception of the pixels, data is being removed from
the image, and the result is less accurate than the output
was intended to be. Monitor manufacturers strive to
accurately reproduce the image, not to make it aesthetically
pleasing.
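
To put a number on it, here is a toy Python sketch (my own illustration,
assuming a simple 3-tap box blur as the kind of smoothing being argued for,
not what any particular monitor actually does internally). It shows a
one-pixel-wide detail being spread out so the original values can no longer
be read back:

def box_blur(pixels):
    # Average each pixel with its two neighbours (edges clamped).
    out = []
    n = len(pixels)
    for i in range(n):
        left = pixels[max(i - 1, 0)]
        right = pixels[min(i + 1, n - 1)]
        out.append((left + pixels[i] + right) / 3.0)
    return out

sharp = [0, 0, 255, 0, 0]      # a single bright pixel: the data the card put out
smoothed = box_blur(sharp)     # what a softening display stage would show instead

print(sharp)     # [0, 0, 255, 0, 0]
print(smoothed)  # [0.0, 85.0, 85.0, 85.0, 0.0]

The one bright pixel the card output becomes three dimmer ones; the detail
is gone and nothing downstream can reconstruct it. That is exactly the loss
of data described above.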

The goal is accuracy, not "lifelike". Lifelike and accuracy
can coexist, but that comes from higher resolution, not from
degrading the signal on output as you propose.