Electronics Repair (sci.electronics.repair) Discussion of repairing electronic equipment. Topics include requests for assistance, where to obtain servicing information and parts, techniques for diagnosis and repair, and anecdotes about successes, failures, and problems.
#81
Posted to sci.electronics.repair,uk.d-i-y
"Dave Plowman (News)" wrote in message ...
> In article , William Sommerwerck wrote:
>> I don't have the time to discuss this at length, but NTSC's unfortunate reverse-acronym was the result of poor studio standards, and is not inherent in the system. PAL incorporated phase alternation to partly compensate for transmission problems (non-linear group delay) in Europe.
> IIRC, nowt to do with studios, but the transmission process. Hence the tint control on NTSC sets which is absent on PAL ones.

The implication of "never twice the same color" was that there was something inherently unstable in the system. The US had high-quality microwave transmission systems with excellent timing and group delay characteristics. Europe did not.

To those in the US... When was the last time you adjusted the hue control on your analog TV?

> If I remember my BBC training correctly, NTSC gives theoretically better 'studio' pictures than PAL.

Yes, because it has wider chroma bandwidth. Other than that, they are essentially the same system.
#82
Posted to sci.electronics.repair,uk.d-i-y
wrote in message ...
> On 24 May, tony sayer wrote:
>>> I don't have the time to discuss this at length, but NTSC's unfortunate reverse-acronym was the result of poor studio standards, and is not inherent in the system. PAL incorporated phase alternation to partly compensate for transmission problems (non-linear group delay) in Europe.
>> Wasn't something done to either the NTSC transmission spec or the sets that largely alleviated that .. sometime after the original system started?..
> Wasn't it the improved standards in receivers following the introduction of solid state technology? The transistorised sets didn't drift as much.

No, tube sets were stable. Remember, the demodulator is locked to the burst signal.
#83
Posted to sci.electronics.repair,uk.d-i-y
In article , William Sommerwerck scribeth thus:
> "Dave Plowman (News)" wrote in message ...
>> In article , William Sommerwerck wrote:
>>> I don't have the time to discuss this at length, but NTSC's unfortunate reverse-acronym was the result of poor studio standards, and is not inherent in the system. PAL incorporated phase alternation to partly compensate for transmission problems (non-linear group delay) in Europe.
>> IIRC, nowt to do with studios, but the transmission process. Hence the tint control on NTSC sets which is absent on PAL ones.
> The implication of "never twice the same color" was that there was something inherently unstable in the system. The US had high-quality microwave transmission systems with excellent timing and group delay characteristics. Europe did not.

Are you referring to the studio to transmitter links?...

> To those in the US... When was the last time you adjusted the hue control on your analog TV?
>> If I remember my BBC training correctly, NTSC gives theoretically better 'studio' pictures than PAL.
> Yes, because it has wider chroma bandwidth. Other than that, they are essentially the same system.

--
Tony Sayer
#84
Posted to sci.electronics.repair,uk.d-i-y
Dave Plowman (News) wrote:
> In article , The Natural Philosopher wrote:
>> But then different makes of transparencies give different results... And transparencies are usually used for top quality magazine prints not 'projected onto a screen' anyway.
> And are adjusted as part of the printing process.

Precisely.
#85
Posted to sci.electronics.repair,uk.d-i-y
William Sommerwerck wrote:
>>> I don't have the time to discuss this at length, but NTSC's unfortunate reverse-acronym was the result of poor studio standards, and is not inherent in the system.
>> It is. Multipath effects caused unacceptable phase and color shifts.
> This is like saying that the design of eggs is fundamentally flawed, because if you drop them, they break.

It's more akin to saying that if you want to play handball with eggs, don't do it on a concrete patio.
#86
Posted to sci.electronics.repair,uk.d-i-y
>>>> I don't have the time to discuss this at length, but NTSC's unfortunate reverse-acronym was the result of poor studio standards, and is not inherent in the system. PAL incorporated phase alternation to partly compensate for transmission problems (non-linear group delay) in Europe.
>>> IIRC, nowt to do with studios, but the transmission process. Hence the tint [sic] control on NTSC sets which is absent on PAL ones.
>> The implication of "never twice the same color" was that there was something inherently unstable in the system. The US had high-quality microwave transmission systems with excellent timing and group delay characteristics. Europe did not.
> Are you referring to the studio to transmitter links?...

No, when I say "poor studio standards", I'm talking about such things as the failure to set up cameras correctly, keep a close eye on burst phase, etc., etc. Garbage in, garbage out.
#87
Posted to sci.electronics.repair,uk.d-i-y
>>> And transparencies are usually used for top-quality magazine prints not 'projected onto a screen' anyway.
>> And are adjusted as part of the printing process.
> Precisely.

Many years ago I read about the work at National Geographic that was put into making color separations and printing plates to produce extremely high-quality images in the magazine. It was not simple.
#88
Posted to sci.electronics.repair,uk.d-i-y
William Sommerwerck wrote:
> Many years ago I read about the work at National Geographic that was put into making color separations and printing plates to produce extremely high-quality images in the magazine. It was not simple.

At that time they only accepted Kodachrome slides.

Geoff.

--
Geoffrey S. Mendelson, Jerusalem, Israel N3OWJ/4X1GM
#89
Posted to sci.electronics.repair,uk.d-i-y
"William Sommerwerck" wrote in message ...
>> The LCD only filters light from the backlight. If you don't have a full spectrum white in the first place then you can't expect decent colour.
> Not so. All you have to do is hit the defined points in the CIE diagram. The Pioneer plasma sets hit them dead-on.

There is a little more to it than that, Bill. The points in the CIE diagram are only part of what makes for a proper image. If you are referring to the triangle of points on the colorimetry plot, you are only seeing the color of the points, not the luminance. The resulting image is a matter of saturation, hue, and luminance. You only see the first two with the CIE chart that shows gamut.

The other aspect of getting the points on the colorimetry chart right is that it tells you nothing about the colors in between and at different levels. All it tells you is the color of the points you measure. How they are mixed to create intermediate colors depends on the spectrum of each of the primaries and the color decoding scheme in the display. The underlying assumption in video is that we have a spectrum for each primary that is similar to the CIE standard observer curves, which are approximations of how we perceive color. When you deviate from those spectra you have to compensate, or you will get intermediate colors that have too much or too little energy in a particular primary. There are no standards for how this is done, because there are so many variations in backlighting and filters in the displays. There are not even good metrics for getting to the bottom of the problem yet. The result is that the LED-backlit displays can look very good, but sometimes have a little strange color reproduction.

As for the black level and contrast ratios, they are only an improvement to the degree that they can control backlighting locally. As the number of controlled areas increases, the useful contrast ratio in an actual image may begin to approach the on/off numbers that they advertise, but with fewer zones of control, those numbers are simply meaningless to real video.

Leonard
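[Editor's illustration] Leonard's caveat, that the measured chromaticity points only pin down the linear mixing and say nothing about the spectra behind it, can be seen in the textbook derivation of an RGB-to-XYZ matrix. The sketch below assumes Rec.709/sRGB primaries and a D65 white point (values from the published standards); note that only the (x, y) points enter the calculation, so any two displays landing on the same points share this matrix regardless of their spectra:

```python
# Derive the RGB->XYZ matrix implied by primary chromaticities and a
# white point. Only the (x, y) coordinates enter the calculation.

# Rec.709 / sRGB primaries and D65 white point (x, y chromaticities)
primaries = {'R': (0.640, 0.330), 'G': (0.300, 0.600), 'B': (0.150, 0.060)}
white = (0.3127, 0.3290)  # D65

def xy_to_XYZ(x, y, Y=1.0):
    """Chromaticity (x, y) at luminance Y -> tristimulus (X, Y, Z)."""
    return (x * Y / y, Y, (1 - x - y) * Y / y)

def solve3(m, v):
    """Solve a 3x3 linear system m . s = v by Cramer's rule."""
    def det(a):
        return (a[0][0] * (a[1][1] * a[2][2] - a[1][2] * a[2][1])
              - a[0][1] * (a[1][0] * a[2][2] - a[1][2] * a[2][0])
              + a[0][2] * (a[1][0] * a[2][1] - a[1][1] * a[2][0]))
    d = det(m)
    out = []
    for i in range(3):
        mi = [row[:] for row in m]
        for r in range(3):
            mi[r][i] = v[r]
        out.append(det(mi) / d)
    return out

# Matrix columns are the primaries' XYZ at unit luminance
cols = [xy_to_XYZ(*primaries[c]) for c in 'RGB']
m = [[cols[j][i] for j in range(3)] for i in range(3)]  # rows = X, Y, Z

# Scale each primary so that R=G=B=1 produces the white point
s = solve3(m, xy_to_XYZ(*white))
M = [[m[i][j] * s[j] for j in range(3)] for i in range(3)]

# The middle row gives the familiar Rec.709 luma weights,
# approximately 0.2126, 0.7152, 0.0722
print(['%.4f' % v for v in M[1]])
```

The same derivation with any other spectra that happen to hit those points gives the identical matrix, which is exactly why the points alone cannot tell you how intermediate colors will render.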
#90
Posted to sci.electronics.repair,uk.d-i-y
wrote in message ...
> William Sommerwerck wrote:
>>> The LCD only filters light from the backlight. If you don't have a full spectrum white in the first place then you can't expect decent colour.
>> Not so. All you have to do is hit the defined points in the CIE diagram. The Pioneer plasma sets hit them dead-on.
> Indeed. None of the major display technologies deliver full spectrum, nor do they need to.
> NT

This is true only if you have custom LUTs or decoding algorithms for a display based on the relationship between the spectra of the lighting and the CIE standard observer functions that cameras are generally aligned to approximate. The other thing that no one mentions is that trying to make up for spectral shortcomings with different filters and decoding reduces the efficiency of the lighting system.

There is usually a "rest of the story" beyond the naive assumptions that get thrown around about reproducing color. This thread is full of examples.

Leonard
#91
Posted to sci.electronics.repair,uk.d-i-y
tony sayer wrote:
> In article , William Sommerwerck scribeth thus:
>>>>> That may be a different story because PAL TV sets never had them. NTSC sets needed them because the phase of the color carrier wandered and often shifted to the green, while PAL sets reset the phase each line and therefore were always "correct".
>>>> NTSC does not, and never had, an inherent problem with phase stability.
>>> I can't conclude anything, but I know 2 things: 1. NTSC is widely known as Never The Same Color twice. 2. The PAL system includes measures to counter phase shift causing colour issues, so I can only conclude that the system engineers thought this was a problem with NTSC.
>> I don't have the time to discuss this at length, but NTSC's unfortunate reverse-acronym was the result of poor studio standards, and is not inherent in the system. PAL incorporated phase alternation to partly compensate for transmission problems (non-linear group delay) in Europe.
> Wasn't something done to either the NTSC transmission spec or the sets that largely alleviated that .. sometime after the original system started?..

VIR was introduced decades ago. It inserted reference signals into the vertical interval, near the start of each field of video. That allowed automatic adjustment of the equipment, and eliminated the video gain, black level, chroma gain, and phase controls that each operator could adjust to 'their' preference. NTSC wasn't the problem; it was that everyone along the signal path could play with it. A system that had VIR from the cameras to the transmitter had no problems. Of course, that doesn't stop opinionated people from bashing a system they don't understand.

--
You can't have a sense of humor, if you have no sense!
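[Editor's illustration] The phase-alternation point being argued above can be sketched numerically. In the toy model below (an illustration only, not a model of real demodulator circuitry), a differential phase error rotates the chroma vector; a plain NTSC-style demodulator reads that rotation directly as a hue shift, while averaging a PAL line pair (with V negated on alternate lines) cancels the hue error, leaving only a small saturation loss:

```python
import math

def rotate(u, v, phi):
    """Rotate the chroma vector (U, V) by a phase error phi (radians)."""
    return (u * math.cos(phi) - v * math.sin(phi),
            u * math.sin(phi) + v * math.cos(phi))

def hue(u, v):
    return math.degrees(math.atan2(v, u))

U, V = 0.3, 0.4          # an arbitrary chroma vector
phi = math.radians(10)   # 10 degrees of differential phase error

# NTSC-style: the rotated vector is taken at face value -> hue shifts
nu, nv = rotate(U, V, phi)

# PAL-style: line A carries (U, V), line B carries (U, -V).
# The receiver negates V on line B and averages the two lines.
au, av = rotate(U, V, phi)
bu, bv = rotate(U, -V, phi)
pu, pv = (au + bu) / 2, (av - bv) / 2

print(round(hue(nu, nv) - hue(U, V), 2))   # NTSC hue error: ~10 degrees
print(round(hue(pu, pv) - hue(U, V), 2))   # PAL hue error: ~0 degrees
print(round(math.hypot(pu, pv) / math.hypot(U, V), 3))  # saturation ~cos(10 deg)
```

The algebra behind it: the averaged pair comes out as (U cos phi, V cos phi), so the hue is preserved exactly and only the saturation drops by cos phi, which is why PAL sets need no tint control.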
#92
Posted to sci.electronics.repair,uk.d-i-y
#93
Posted to sci.electronics.repair,uk.d-i-y
"Arfa Daily" wrote in message ...
> wrote in message ...
>> Arfa Daily wrote:
>>>> The LCD only filters light from the backlight. If you don't have a full spectrum white in the first place then you can't expect decent colour. White LEDs aren't quite there yet are they?
>>>> Archie
>>> Absolutely true, except that this particular TV doesn't use white LEDs in its 'revolutionary' backlighting scheme. It uses small RGB arrays, which is why I was questioning whether there was any control over the individual elements in each array, such that the colour temperature of the nominally white light that they produce could be varied. Which would then, of course, have a corresponding effect on the displayed colour balance. It just seemed to me that given they have gone to the trouble of using RGB arrays, rather than white LEDs, the reason for that might have been to get a full(er) spectrum white.
>>> Arfa
>> Colour temp can be controlled using the LEDs or the LCD; I'm not sure it makes any big difference which one. RGB LEDs would give the same white as a triphosphor & UV white LED, but with more colour control. The standard 2-colour white LED would be useless on a 3-channel display. And fwiw bichromic white LEDs have huge colour balance variation, way outside of what's acceptable for a display.
>> NT
> Which is why, given that they've put these LEDs under at least some kind of control in order to implement their (claimed) enhanced black reproduction scheme, I was questioning whether the scheme maybe allowed for a degree of user intervention under the guise of "tint" or whatever, which might have accounted for why on this particular TV - the only example that I've seen on and working so far - the flesh tones were so poor compared to the Pan and Sony offerings in the same display stack, showing the same picture.
>
> I'm trying to get a handle on why a company with the products and reputation of Sammy are a) using advertising terminology that appears to be questionable in the context in which it appears, and b) producing a set, claiming it to be the dog's ******** of display technology, which does not appear - to my eye at least - to be as good as their traditionally CCFL-backlit offerings, or those of other manufacturers.
>
> I saw the latest all-singing-and-dancing LCD HD Pan, just released, in my friend's shop yesterday. Uses conventional CCFL backlighting. Not as thin as the Sammy, but getting there. Apart from the usual slight gripes that you could direct at any LCD panel when examined closely, the picture was quite stunning, and the colour rendition was as close to 'perfect' as you could reasonably expect. Certainly, flesh tones *appeared* accurate, but I accept that is subjective. Anyway, whichever-whatever, more accurate than they appeared on the LED-backlit Sammy ...
> Arfa

The why is pretty clear. Samsung is a whore, like all of the other vendors, only a little more so than some others. They are interested in market share and will create whatever hype they think will help them sell sets. The degree to which it is actually better only has to matter up to the point that too many people figure it out and it hurts sales.

As we get better at quantifying why the sets look a little weird on certain colors, they will improve the spectrum of the backlighting and improve the color decoding to compensate. It won't happen if people keep spewing the nonsense that all you have to do is hit the primary and secondary colorimetry targets to get perfect color. That is just a starting point, and for some sets that do not have proper color decoding or gamut, may actually be the wrong compromise.

Leonard
#94
Posted to sci.electronics.repair,uk.d-i-y
tony sayer wrote:
> In article , William Sommerwerck scribeth thus:
>> "Dave Plowman (News)" wrote in message ...
>>> In article , William Sommerwerck wrote:
>>>> I don't have the time to discuss this at length, but NTSC's unfortunate reverse-acronym was the result of poor studio standards, and is not inherent in the system. PAL incorporated phase alternation to partly compensate for transmission problems (non-linear group delay) in Europe.
>>> IIRC, nowt to do with studios, but the transmission process. Hence the tint control on NTSC sets which is absent on PAL ones.
>> The implication of "never twice the same color" was that there was something inherently unstable in the system. The US had high-quality microwave transmission systems with excellent timing and group delay characteristics. Europe did not.
> Are you referring to the studio to transmitter links?...

The cross-country network feeds, which were owned and operated by AT&T. Those were replaced by C- and Ku-band satellite feeds in the '80s. Some TV stations now feed CATV headends via fiber optic. They maintain the off-air equipment as a backup, in case of a failure in the F-O path.

I was a TV Broadcast Engineer in the '70s-'90s in the US.

--
You can't have a sense of humor, if you have no sense!
#95
Posted to sci.electronics.repair,uk.d-i-y
"William Sommerwerck" wrote in message ...
>>> The LCD only filters light from the backlight. If you don't have a full spectrum white in the first place then you can't expect decent colour. White LEDs aren't quite there yet are they?
>> Absolutely true, except that this particular TV doesn't use white LEDs in its 'revolutionary' backlighting scheme. It uses small RGB arrays, which is why I was questioning whether there was any control over the individual elements in each array, such that the colour temperature of the nominally white light that they produce could be varied. Which would then, of course, have a corresponding effect on the displayed colour balance. It just seemed to me that given they have gone to the trouble of using RGB arrays, rather than white LEDs, the reason for that might have been to get a full(er) spectrum white.
> In a very broad sense, the last thing you want is a "full-spectrum" light. The standard primaries are diluted with too much white as it is.

In a very broad sense you are correct, but in terms of understanding what is going on with color reproduction in LCD displays (and others), you are making a point that is the equivalent of trying to make D65 with an incandescent lamp. White is a rather useless term. All "white" has a color and is a mix of other colors. Primaries do not get diluted with white. They get desaturated by adding the other colors. What you want is a spectrum that is correct, not "white," nor "full spectrum," nor narrow-band RGB. Correct depends upon the assumptions that are made in recording the image, as well as upon the filters and color decoding that you implement in the display.

As I have said many times, the underlying assumption that video has used is that RGB spectral densities should follow the standard observer curves. When you violate that assumption on the display end, you get some unusual results with some colors, and you have to compensate in your color decoder. The degree to which this is done, and the techniques used, are unclear in these sets. The results are mixed. Given the sloppy nature of color decoding and color management in consumer displays in general over the years, however, these sets are likely to be perfectly acceptable to most consumers. They are likely better than much of what they have been viewing in the past, by quite a margin. That does not mean that they are going to accurately reproduce color. Most consumers, and most of the posters here, would not likely know (and may not prefer) accurate color reproduction in a display if they were to happen to see it.

Leonard
#96
Posted to sci.electronics.repair,uk.d-i-y
In article , William Sommerwerck wrote:
> No, when I say "poor studio standards", I'm talking about such things as the failure to set up cameras correctly, keep a close eye on burst phase, etc., etc. Garbage in, garbage out.

I'd be most surprised if any but the very smallest station with only one camera made mistakes like this.

--
*Constipated People Don't Give A Crap*

Dave Plowman London SW
To e-mail, change noise into sound.
#97
Posted to sci.electronics.repair,uk.d-i-y
"William Sommerwerck" wrote in message ...
>> I guess it comes down to definitions and how 'full spectrum' is perceived. Rightly or wrongly, I tend to think of it as a spectrum which contains the same component colours in the same ratios as natural daylight...
> That's a reasonable definition for a video display, but it's not sufficient for source lighting. It's difficult to make a "full spectrum" fluorescent lamp, especially one that produces good color rendition for photography.
>> but I guess even that varies depending on filtering effects of cloud cover and haze and so on. Even so, I'm sure that there must be some definition of 'average spectrum daylight', and I would expect that any display technology would aim to reproduce any colour in as closely exact a way as it would appear if viewed directly under daylight.
> The standard is D6500, a 6500K continuous spectrum from a black-body source. What you suggest is, indeed, the intent.

There is no such standard as D6500. One standard, the one used for most video for the color of white, is D65. D65 specifies NOTHING about the spectrum, only the x,y coordinates of the COLOR of light. It happens to be approximately 6504K. The term D6500 is slang and sloppy usage that confuses the issues of colorimetry and correlated color temperature. There are other standards for the color of white that are used for purposes other than video. Some specific purposes in film and cinema, as well as in video, use other standards, but for the most part D65 is accepted as the color of white for modern video.

The truth is that virtually no consumer displays come out of the box set anywhere near D65, nor producing correct color for any color, including white. What you see in showrooms and when you take a set out of the box is likely a color temp for white that is nearly twice what it should be.

Leonard
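[Editor's illustration] Leonard's distinction, that D65 is a chromaticity whose correlated colour temperature merely happens to be about 6504 K, can be checked with McCamy's published CCT approximation (a sketch; McCamy's formula is itself only an approximation, good to a few kelvin near the daylight locus):

```python
def mccamy_cct(x, y):
    """Approximate correlated colour temperature from CIE 1931 (x, y)
    chromaticity, using McCamy's 1992 cubic formula."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# D65 is defined by its chromaticity (0.3127, 0.3290); the temperature
# is derived from it, not the other way round.
print(round(mccamy_cct(0.3127, 0.3290)))  # about 6504-6505 K
```

Running the conversion backwards is what is ill-defined: many different spectra, and many chromaticities off the Planckian locus, share the same correlated colour temperature, which is Leonard's point about "D6500" being sloppy usage.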
#98
Posted to sci.electronics.repair,uk.d-i-y
"William Sommerwerck" wrote in message ...
>>> Every LCD TV I have seen has colour temp adjustments.
>> What, readily user accessible?
> It depends on what you define as a color temperature adjustment. Many (if not most) sets do not have the detailed adjustments that make possible both correct color temperature and good grayscale tracking. When they do, these are not usually available to the customer.

You have not looked at many modern displays carefully. Many, actually most, of the better sets have these controls in the user menus now. Some even have color management that goes far beyond gray scale and lets you adjust the colorimetry (and perhaps luma) of the primaries and secondaries. Most professional calibrations these days never involve going into a service menu.

Leonard
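[Editor's illustration] The gain/cut (drive/bias) controls behind grayscale tracking can be pictured with a toy two-point model of one channel (a simplification; real sets add gamma and often more adjustment points): each channel's light output is roughly cut + gain x signal, so the gains dominate the colour of white and the cuts dominate the colour of near-black:

```python
# Toy two-point grayscale model: per-channel output ~ cut + gain * signal.
# Hypothetical numbers, chosen only to show where each control bites.

def out(signal, gain, cut):
    return cut + gain * signal

# Trimming a channel's GAIN from 1.10 to 1.00 changes white a lot and
# near-black only a little:
d_white = out(1.0, 1.10, 0.02) - out(1.0, 1.00, 0.02)
d_black = out(0.1, 1.10, 0.02) - out(0.1, 1.00, 0.02)
print(round(d_white, 3), round(d_black, 3))  # big at white, small at black

# Trimming the CUT shifts every level by the same amount, which is felt
# proportionally far more in the dark end. Hence the calibrator's rule of
# thumb: cuts for black, gains for white, iterated because they interact.
```

This is why, as Leonard describes, a grayscale alignment alternates between a dark and a bright test stimulus rather than being set at a single level.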
#99
Posted to sci.electronics.repair,uk.d-i-y
wrote in message ...
> Arfa Daily wrote:
>> wrote in message ...
>>> William Sommerwerck wrote:
>>>>> I guess it comes down to definitions and how 'full spectrum' is perceived. Rightly or wrongly, I tend to think of it as a spectrum which contains the same component colours in the same ratios as natural daylight...
>>>> That's a reasonable definition for a video display, but it's not sufficient for source lighting. It's difficult to make a "full spectrum" fluorescent lamp, especially one that produces good color rendition for photography.
>>>>> but I guess even that varies depending on filtering effects of cloud cover and haze and so on. Even so, I'm sure that there must be some definition of 'average spectrum daylight', and I would expect that any display technology would aim to reproduce any colour in as closely exact a way as it would appear if viewed directly under daylight.
>>>> The standard is D6500, a 6500K continuous spectrum from a black-body source. What you suggest is, indeed, the intent.
>>> TBH I think this is overplaying the significance of daylight. Almost any monitor is adjustable to suit preferences of anything from 5000K to 10,000K, and some go lower. None make any attempt to copy the colour spectrum of daylight; they merely include the same colour temp as daylight as one of the options. None of the major display types have any ability to copy a daylight spectrum, as they're only RGB displays.
>>> NT
>> But take account of the fact that we're talking domestic television sets here, not computer monitors. For the most part, TV sets do not display the same type of content as a computer monitor, and do not include user accessible colour temperature presets or adjustments,
> fwiw my main set does, and I'm sure it's not unique. Generally though a TV is a much lower quality animal than a monitor, and displays much lower quality data.
>> which is why I made the point earlier that in general, LCD TVs are set correctly 'out of the box'.
> because they can be. CRTs are more variable, and the circuits used to drive them a lot less precise, partly because CRT sets are generally older, and the sort of standards expected in monitors have only begun crossing over to TVs in recent years.
>> As far as overplaying the significance of daylight goes, I'm not sure that I follow what you mean by that. If I look at my garden, and anything or anybody in it, the illumination source will be daylight, and the colours perceived will be directly influenced by that. If I then reproduce that image on any kind of artificial display, and use a different reference for the white, then no other colour will be correct either,
> what makes you think that just one specific colour temp is 'correct'? Real daylight is all over the place colour temp wise, and the end user experiences those changes without any problem. Also any self respecting monitor offers a range of colour temps, since it's nothing but a taste matter
>> which was ever the case when CRTs were set up to give whites which were either too warm or too cold, even by a fraction.
> but that's down to historic reasons; customers never expected precise colour temp, and screens were routinely set up by eye. The circuits involved couldn't set themselves up the way a modern LCD set can; there was normally no feedback on colour channels, just open loop CRT gun drive on top of a massive dc offset, so the systems were inherently variable. Plus the fact that CRT gamma was often way off from the real world made it hard, or should I say impossible, to set such sets to give a faithful reproduction in other respects anyway.
>> Maybe we're talking at cross purposes here, or I'm not understanding something properly, but it seems to me that the colour temperature and CRI of the backlighting on an LCD TV would be crucially important to correct reproduction of colours.
> It has almost nothing to do with it, because the level of each colour channel output on the screen depends on both the light source and the settings of the LCD R, G, B channels. Within reason, any temperature colour backlight can produce any temperature colour picture.
>> All I know is that the flesh tones were poor on the example that I saw, compared to other LCD TVs which were showing the same picture. The fundamental difference between those sets and the Sammy was the CCFL vs LED backlighting, so it seems reasonable to draw from that the inference that the backlighting scheme may well be the cause, no?
>> Arfa
> It's just a guess. In fact any desired flesh tone can be reproduced using almost any colour temp backlight, certainly anything from 3,000K to 10,000K. Think about the process: you've got 3 colour channels, each of which has a given level of light from the backlight, which is then attenuated to any desired degree by the LCD pixel.
> NT

While this is true, it would be virtually impossible to get all colors right with some arbitrary color backlight. You could get a subset right and get all the others completely wrong.

Leonard
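[Editor's illustration] NT's claim, that within reason any backlight colour temperature can yield any picture white because the LCD attenuates each channel independently, can be sketched with hypothetical channel numbers (idealized, fully independent RGB channels; real filter spectra overlap, which is where Leonard's objection about the colors in between comes in):

```python
# Sketch: retargeting the white point of a warm-ish backlight purely by
# per-channel LCD attenuation. All channel luminances are hypothetical
# relative numbers, and the channels are assumed perfectly independent.

# Relative R, G, B luminances the backlight delivers at full transmission:
backlight = {'R': 0.30, 'G': 0.72, 'B': 0.05}   # blue-deficient, "warm"

# Relative channel luminances wanted for the target white
# (Rec.709 luma weights for a D65 white):
target = {'R': 0.2126, 'G': 0.7152, 'B': 0.0722}

# An LCD can only remove light, so scale the target until the most
# deficient channel runs wide open (attenuation = 1.0):
scale = min(backlight[c] / target[c] for c in 'RGB')
attenuation = {c: scale * target[c] / backlight[c] for c in 'RGB'}

print({c: round(a, 3) for c, a in attenuation.items()})
# Blue is the limiting channel here: it runs at 1.0 while red and green
# are throttled, so peak brightness drops -- the efficiency cost Leonard
# mentioned earlier in the thread.
```

So the white point is indeed recoverable, but only by throwing away light in the stronger channels, and only for the white itself; the spectra still decide how every mixed colour in between comes out.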
#100
Posted to sci.electronics.repair,uk.d-i-y
> There is usually a "rest of the story" beyond the naive assumptions that get thrown around about reproducing color. This thread is full of examples.

This is not a simple subject. I have Mees' "The Reproduction of Color" (which is, what, 40+ years old?) and it's tough sledding. I had less trouble with integral calculus.
#101
Posted to sci.electronics.repair,uk.d-i-y
"Dave Plowman (News)" wrote in message ...
> In article , William Sommerwerck wrote:
>> No, when I say "poor studio standards", I'm talking about such things as the failure to set up cameras correctly, keep a close eye on burst phase, etc., etc. Garbage in, garbage out.
> I'd be most surprised if any but the very smallest station with only one camera made mistakes like this.

You would be very surprised.

Leonard
#102
Posted to sci.electronics.repair,uk.d-i-y
> Primaries do not get diluted with white.

What I was implying was that you could reproduce a wider range of colors if the primaries weren't as close to the center of the chart. Radial movement represents changes in saturation -- dilution with white.

As for not knowing accurate color reproduction when you see it... What sorts of preferences does the average viewer have? If you don't have the original for comparison, it can be difficult to judge.

I never considered myself an expert in color reproduction, but your comments have encouraged me to dig out Mees and give him another try. (I'm not promising anything.)
#103
Posted to sci.electronics.repair,uk.d-i-y
> You have not looked at many modern displays carefully. Many, actually most, of the better sets have these controls in the user menus now.

My Pioneer does, but heck if I'm touching them without instrumentation.

> Some even have color management that goes far beyond gray scale and lets you adjust the colorimetry (and perhaps luma) of the primaries and secondaries.

The Pioneer has six adjustments, for RGB and CMY.
#104
Posted to sci.electronics.repair,uk.d-i-y
"Dave Plowman (News)" wrote in message ...
> In article , William Sommerwerck wrote:
>> No, when I say "poor studio standards", I'm talking about such things as the failure to set up cameras correctly, keep a close eye on burst phase, etc., etc. Garbage in, garbage out.
> I'd be most surprised if any but the very smallest station with only one camera made mistakes like this.

If you look at the first season of "Barney Miller", you'll see poor camera convergence and slight color shifts between the cameras. And this was in the 1970s, at ABC's studios.
#105
Posted to sci.electronics.repair,uk.d-i-y
"William Sommerwerck" wrote in message ...
>> Primaries do not get diluted with white.
> What I was implying was that you could reproduce a wider range of colors if the primaries weren't as close to the center of the chart. Radial movement represents changes in saturation -- dilution with white.
>
> As for not knowing accurate color reproduction when you see it... What sorts of preferences does the average viewer have? If you don't have the original for comparison, it can be difficult to judge.
>
> I never considered myself an expert in color reproduction, but your comments have encouraged me to dig out Mees and give him another try. (I'm not promising anything.)

Most modern consumers have been conditioned to higher and higher color temps for white and over-saturated color over the last thirty years or so. Manufacturers realized years ago that in the first few seconds of viewing, where most impressions are made in showrooms, the impression is dominated by contrast and color saturation. This has nothing to do with perceiving color naturally, but everything to do with marketing and competing with a wall of other sets.

It is not uncommon for displays to be sold with factory settings that have color temps in the 13000K range, completely crushed blacks and whites, and far too saturated color. Many consumers like this more VIVID look. Others prefer to see a more accurate reproduction of the product as it was produced, and a more realistic portrayal of color. This requires substantial changes from OOB settings for most consumer displays, at least in the USA.

Leonard
#106
Posted to sci.electronics.repair,uk.d-i-y
"William Sommerwerck" wrote in message ...
>> You have not looked at many modern displays carefully. Many, actually most, of the better sets have these controls in the user menus now.
> My Pioneer does, but heck if I'm touching them without instrumentation.
>> Some even have color management that goes far beyond gray scale and lets you adjust the colorimetry (and perhaps luma) of the primaries and secondaries.
> The Pioneer has six adjustments, for RGB and CMY.

I would agree. It is very unlikely that most consumers could do more than make a mess. Even someone like myself, having calibrated displays for 30 years, can't do much to align a color management system without a GOOD meter. I can get gray scale improved, but not really accurate. Most sets now have RGB gains and cuts for gray scale in the user menu. Some have far more available.

Leonard
#107
Posted to sci.electronics.repair,uk.d-i-y
William Sommerwerck wrote:
If you look at the first season of "Barney Miller", you'll see poor camera convergence, and slight color shifts between the cameras. And this was in the 1970s, and at ABC's studios. Look at the early shows of Star Trek: The Next Generation. They were lit and photographed as if they were films. There are many scenes where there is action in the shadows. You would have seen what was happening if you were watching it on film; on TV it was just a grayish blur. If I remember correctly, they were shot on film. Geoff. -- Geoffrey S. Mendelson, Jerusalem, Israel N3OWJ/4X1GM
#108
Posted to sci.electronics.repair,uk.d-i-y
Most modern consumers have been conditioned to higher and higher color
temps for white and oversaturated color over the last thirty years or so. Manufacturers realized years ago that in the first few seconds of viewing, where most impressions are made in showrooms, the impression is dominated by contrast and color saturation. This has nothing to do with perceiving color naturally, but everything to do with marketing and competing with a wall of other sets. It is not uncommon for displays to be sold with factory settings that have color temps in the 13000K range, completely crushed blacks and whites, and far too saturated color. Many consumers like this more VIVID look. Others prefer to see a more accurate reproduction of the product as it was produced, and more realistic portrayal of color. This requires substantial changes from OOB settings for most consumer displays, at least in the USA. You will be pleased to hear that my Pioneer is set to PURE, with all the controls at their default settings (except for a bit of Sharpness goosing). The image is just plain gaw-juss. I considered having a $350 calibration performed, but decided that I wasn't going to pay that much for a technician who knows even less about colorimetry than I do. The Pioneers are supposedly nearly correct out of the box. If you want a demo disk, get the Blu-ray of "The Searchers". I don't care for the movie, but the VistaVision photography is jaw-dropping. "Amadeus" and "2001" are almost as good. With the best material, you sometimes think you're looking through a sheet of glass at the thing itself.
#109
Posted to sci.electronics.repair,uk.d-i-y
"Geoffrey S. Mendelson" wrote in message
... William Sommerwerck wrote: If you look at the first season of "Barney Miller", you'll see poor camera convergence, and slight color shifts between the cameras. And this was in the 1970s, and at ABC's studios. Look at the early shows of Star Trek: The Next Generation. They were lit and photographed as if they were films. There are many scenes where there is action in the shadows. You would have seen what was happening if you were watching it on film; on TV it was just a grayish blur. If I remember correctly, they were shot on film. I don't understand what you're talking about. "Barney Miller" was videotape, "ST TNG" was film. Regardless of whether tape or film is used, the cinematographer is likely to light the scenes according to what he thinks the average TV is able to reproduce.
#110
Posted to sci.electronics.repair,uk.d-i-y
The ability of most consumers to do more than make a mess is
very unlikely. Even someone like myself, having calibrated displays for 30 years, can't do much to align a color management system without a GOOD meter. I can get gray scale improved, but not really accurate. Does anyone make cheap-but-good instrumentation? I could justify a $500 investment. (I can hear you laughing now.) |
#111
Posted to sci.electronics.repair,uk.d-i-y
William Sommerwerck wrote:
Regardless of whether tape or film is used, the cinematographer is likely to light the scenes according to what he thinks the average TV is able to reproduce. That was my point. They lit (and photographed it) as if it were going to be shown in theaters and not on TV. Geoff. -- Geoffrey S. Mendelson, Jerusalem, Israel N3OWJ/4X1GM
#112
Posted to sci.electronics.repair,uk.d-i-y
In article ,
Leonard Caillouet wrote: No, when I say "poor studio standards", I'm talking about such things as the failure to set up cameras correctly, keep a close eye on burst phase, etc., etc., etc. Garbage in, garbage out. I'd be most surprised if all but the very smallest station with only one camera made mistakes like this. You would be very surprised. Perhaps standards are higher in the UK, then. -- *Laugh alone and the world thinks you're an idiot. Dave Plowman London SW To e-mail, change noise into sound.
#113
Posted to sci.electronics.repair,uk.d-i-y
"William Sommerwerck" wrote in message
... Most modern consumers have been conditioned to higher and higher color temps for white and oversaturated color over the last thirty years or so. Manufacturers realized years ago that in the first few seconds of viewing, where most impressions are made in showrooms, the impression is dominated by contrast and color saturation. This has nothing to do with perceiving color naturally, but everything to do with marketing and competing with a wall of other sets. It is not uncommon for displays to be sold with factory settings that have color temps in the 13000K range, completely crushed blacks and whites, and far too saturated color. Many consumers like this more VIVID look. Others prefer to see a more accurate reproduction of the product as it was produced, and more realistic portrayal of color. This requires substantial changes from OOB settings for most consumer displays, at least in the USA. You will be pleased to hear that my Pioneer is set to PURE, with all the controls at their default settings (except for a bit of Sharpness goosing). The image is just plain gaw-juss. I considered having a $350 calibration performed, but decided that I wasn't going to pay that much for a technician who knows even less about colorimetry than I do. The Pioneers are supposedly nearly correct out of the box. If you want a demo disk, get the Blu-ray of "The Searchers". I don't care for the movie, but the VistaVision photography is jaw-dropping. "Amadeus" and "2001" are almost as good. With the best material, you sometimes think you're looking through a sheet of glass at the thing itself. There are lots of calibration techs out there that know little more than how to point a probe at the set and adjust gray scale. There are a few dozen, perhaps, that really understand what it takes to make an accurate display. I suggest you look at the list at ISF Forum.
The couple of hundred members who subscribe there are among the best in the world, and all but a handful of the elite calibration pros are found there. Leonard
#114
Posted to sci.electronics.repair,uk.d-i-y
"Geoffrey S. Mendelson" wrote in message
... William Sommerwerck wrote: If you look at the first season of "Barney Miller", you'll see poor camera convergence, and slight color shifts between the cameras. And this was in the 1970s, and at ABC's studios. Look at the early shows of Star Trek: The Next Generation. They were lit and photographed as if they were films. There are many scenes where there is action in the shadows. You would have seen what was happening if you were watching it on film; on TV it was just a grayish blur. If I remember correctly, they were shot on film. Geoff. -- Geoffrey S. Mendelson, Jerusalem, Israel N3OWJ/4X1GM Your memory is incorrect, in this case. Leonard
#115
Posted to sci.electronics.repair,uk.d-i-y
"William Sommerwerck" wrote in message
... The ability of most consumers to do more than make a mess is very unlikely. Even someone like myself, having calibrated displays for 30 years, can't do much to align a color management system without a GOOD meter. I can get gray scale improved, but not really accurate. Does anyone make cheap-but-good instrumentation? I could justify a $500 investment. (I can hear you laughing now.) The cheapest I would even consider for most current displays is the i1 Pro. None of the tristimulus colorimeters will be able to measure the narrow spectrum of many modern displays, nor likely match the filters in wider-spectrum light sources. Even the i1 Pro is marginal for the LED and laser sets, from what I understand. Better meters will be many thousands of dollars. The best pricing that you will find is packaged with the CalMAN software. It is also one of the few software packages that is versatile enough to do just about everything that you might need with most meters. Leonard
#116
Posted to sci.electronics.repair,uk.d-i-y
"Dave Plowman (News)" wrote in message
... In article , Leonard Caillouet wrote: No, when I say "poor studio standards", I'm talking about such things as the failure to set up cameras correctly, keep a close eye on burst phase, etc., etc., etc. Garbage in, garbage out. I'd be most surprised if all but the very smallest station with only one camera made mistakes like this. You would be very surprised. Perhaps standards are higher in the UK, then. -- *Laugh alone and the world thinks you're an idiot. Dave Plowman London SW To e-mail, change noise into sound. Probably. Not as many stations, either. Leonard
#117
Posted to sci.electronics.repair,uk.d-i-y
Leonard Caillouet wrote:
wrote in message ... William Sommerwerck wrote: The LCD only filters light from the backlight. If you don't have a full spectrum white in the first place then you can't expect decent colour. Not so. All you have to do is hit the defined points in the CIE diagram. The Pioneer plasma sets hit them dead-on. Indeed. None of the major display technologies deliver full spectrum, nor do they need to. NT This is true only if you have custom LUTs or decoding algorithms for a display based on the relationship between the spectra of the lighting and the CIE standard observer functions that cameras are generally aligned to approximate. What that has to do with it I don't know. If you find an RGB display with violet output, I'm all ears. NT
#118
Posted to sci.electronics.repair,uk.d-i-y
Leonard Caillouet wrote:
wrote in message ... Arfa Daily wrote: wrote in message ... William Sommerwerck wrote: I guess it comes down to definitions and how 'full spectrum' is perceived. Rightly or wrongly, I tend to think of it as a spectrum which contains the same component colours in the same ratios, as natural daylight... That's a reasonable definition for a video display, but it's not sufficient for source lighting. It's difficult to make a "full spectrum" fluorescent lamp, especially one that produces good color rendition for photograpy. but I guess even that varies depending on filtering effects of cloud cover and haze and so on. Even so, I'm sure that there must be some definition of 'average spectrum daylight', and I would expect that any display technology would aim to reproduce any colour in as closely exact a way as it would appear if viewed directly under daylight. The standard is D6500, a 6500K continuous spectrum from a black-body source. What you suggest is, indeed, the intent. TBH I think this is overplaying the significant of daylight. Almost any monitor is adjustable to suit preferences of anything from 5000K to 10,000K, and some go lower. None manke any attempt to copy the colour spectrum of daylight, they merely include the same colour temp as daylight as one of the options. None of the major display types have any ability to copy a daylight spectrum, as they're only RGB displays. NT But take account of the fact that we're talking domestic television sets here, not computer monitors. For the most part, TV sets do not display the same type of content as a computer monitor, and do not include user accessible colour temperature presets or adjustments, fwiw my main set does, and I'm sure its not unique. Generally though a TV is a much lower quality animal than a monitor, and displays much lower quality data. which is why I made the point earlier that in general, LCD TVs are set correctly 'out of the box'. because they can be. 
CRTs are more variable, and the circuits used to drive them a lot less precise, partly because CRT sets are generally older, and the sort of standards expected in monitors have only begun crossing over to tvs in recent years. As far as overplaying the significance of daylight goes, I'm not sure that I follow what you mean by that. If I look at my garden, and anything or anybody in it, the illumination source will be daylight, and the colours perceived will be directly influenced by that. If I then reproduce that image on any kind of artificial display, and use a different reference for the white, then no other colour will be correct either, what makes you think that just one specific colour temp is 'correct'? Real daylight is all over the place colour temp wise, and the end user experiences those changes without any problem. Also any self respecting monitor offers a range of colour temps, since its nothing but a taste matter which was ever the case when CRTs were set up to give whites which were either too warm or too cold, even by a fraction. but thats down to historic reasons, customers never expected precise colour temp, and screens were routinely set up by eye. The circuits involved couldnt set themselves up the way a modern LCD set can, there was normally no feedback on colour channels, just open loop CRT gun drive on top of a massive dc offset, so the systems were inherently variable. Plus the fact that CRT gamma was often way off from the real world made it hard, or should I say impossible, to set such sets to give a faithful reproduction in other respects anyway. Maybe we're talking at cross purposes here, or I'm not understanding something properly, but it seems to me that the colour temperature and CRI of the backlighting on an LCD TV, would be crucially important to correct reproduction of colours. 
It has almost nothing to do with it, because the level of each colour channel output on the screen depends on both the light source and the settings of the LCD R,G,B channels. Within reason, any temperature colour backlight can produce any temperature colour picture. All I know is that the flesh tones were poor on the example that I saw, compared to other LCD TVs which were showing the same picture. The fundamental difference between those sets and the Sammy was the CCFL vs LED backlighting, so it seems reasonable to draw from that the inference that the backlighting scheme may well be the cause, no? Arfa It's just a guess. In fact any desired flesh tone can be reproduced using almost any colour temp backlight, certainly anything from 3,000K to 10,000K. Think about the process: you've got 3 colour channels, each of which has a given level of light from the backlight, which is then attenuated to any desired degree by the LCD pixel. NT While this is true, it would be virtually impossible to get all colors right with some arbitrary color backlight. You could get a subset right and get all the others completely wrong. Leonard With each colour channel you've got everything available from backlight output x LCD max down to backlight output x LCD minimum. AFAIK that covers every flesh tone on this planet, unless one goes down to 2000K backlight or some other very extreme value. NT
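Both sides of this exchange can be sketched numerically. Per channel, the panel can deliver anything between backlight level times minimum transmission and backlight level times maximum transmission, so a target linear-RGB colour is reachable iff each channel's required transmission falls inside that window. Every number below is made up for illustration:

```python
# Sketch of the backlight-vs-LCD argument: per colour channel the
# panel output is backlight_level * lcd_transmission, with the
# transmission limited to a [T_MIN, T_MAX] window. A target colour is
# reachable iff each channel's required transmission fits the window.
# All numbers are hypothetical.

T_MIN, T_MAX = 0.002, 0.95  # assumed panel transmission limits

def required_transmission(backlight_rgb, target_rgb):
    """Per-channel transmission needed to hit target, or None if unreachable."""
    trans = [t / b for b, t in zip(backlight_rgb, target_rgb)]
    if all(T_MIN <= x <= T_MAX for x in trans):
        return trans
    return None

warm_backlight = (1.00, 0.85, 0.60)  # warm channel balance, linear RGB
flesh_tone = (0.55, 0.35, 0.25)      # a mid flesh tone, linear RGB
deep_blue = (0.01, 0.02, 0.90)       # a highly saturated blue

print(required_transmission(warm_backlight, flesh_tone))  # reachable
print(required_transmission(warm_backlight, deep_blue))   # None
```

The flesh tone is reachable even with the warm backlight (NT's point), but the saturated blue would need more blue than the backlight supplies, so it is out of range (Leonard's point): the backlight shifts the reachable gamut even though it does not fix the white point.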
#119
Posted to sci.electronics.repair,uk.d-i-y
wrote in message
... Leonard Caillouet wrote: wrote in message ... Arfa Daily wrote: wrote in message ... William Sommerwerck wrote: I guess it comes down to definitions and how 'full spectrum' is perceived. Rightly or wrongly, I tend to think of it as a spectrum which contains the same component colours in the same ratios, as natural daylight... That's a reasonable definition for a video display, but it's not sufficient for source lighting. It's difficult to make a "full spectrum" fluorescent lamp, especially one that produces good color rendition for photograpy. but I guess even that varies depending on filtering effects of cloud cover and haze and so on. Even so, I'm sure that there must be some definition of 'average spectrum daylight', and I would expect that any display technology would aim to reproduce any colour in as closely exact a way as it would appear if viewed directly under daylight. The standard is D6500, a 6500K continuous spectrum from a black-body source. What you suggest is, indeed, the intent. TBH I think this is overplaying the significant of daylight. Almost any monitor is adjustable to suit preferences of anything from 5000K to 10,000K, and some go lower. None manke any attempt to copy the colour spectrum of daylight, they merely include the same colour temp as daylight as one of the options. None of the major display types have any ability to copy a daylight spectrum, as they're only RGB displays. NT But take account of the fact that we're talking domestic television sets here, not computer monitors. For the most part, TV sets do not display the same type of content as a computer monitor, and do not include user accessible colour temperature presets or adjustments, fwiw my main set does, and I'm sure its not unique. Generally though a TV is a much lower quality animal than a monitor, and displays much lower quality data. which is why I made the point earlier that in general, LCD TVs are set correctly 'out of the box'. because they can be. 
CRTs are more variable, and the circuits used to drive them a lot less precise, partly because CRT sets are generally older, and the sort of standards expected in monitors have only begun crossing over to tvs in recent years. As far as overplaying the significance of daylight goes, I'm not sure that I follow what you mean by that. If I look at my garden, and anything or anybody in it, the illumination source will be daylight, and the colours perceived will be directly influenced by that. If I then reproduce that image on any kind of artificial display, and use a different reference for the white, then no other colour will be correct either, what makes you think that just one specific colour temp is 'correct'? Real daylight is all over the place colour temp wise, and the end user experiences those changes without any problem. Also any self respecting monitor offers a range of colour temps, since its nothing but a taste matter which was ever the case when CRTs were set up to give whites which were either too warm or too cold, even by a fraction. but thats down to historic reasons, customers never expected precise colour temp, and screens were routinely set up by eye. The circuits involved couldnt set themselves up the way a modern LCD set can, there was normally no feedback on colour channels, just open loop CRT gun drive on top of a massive dc offset, so the systems were inherently variable. Plus the fact that CRT gamma was often way off from the real world made it hard, or should I say impossible, to set such sets to give a faithful reproduction in other respects anyway. Maybe we're talking at cross purposes here, or I'm not understanding something properly, but it seems to me that the colour temperature and CRI of the backlighting on an LCD TV, would be crucially important to correct reproduction of colours. 
It has almost nothing to do with it, because the level of each colour channel output on the screen depends on both the light source and the settings of the LCD R,G,B channels. Within reason, any temperature colour backlight can produce any temperature colour picture. All I know is, is that the flesh tones were poor on the example that I saw, compared to other LCD TVs which were showing the same picture. The fundamental difference between those sets and the Sammy, was the CCFL vs LED backlighting, so it seems reasonable to draw from that, the inference that the backlighting scheme may well be the cause, no ? Arfa Its just a guess. In fact any desired flesh tone can be reproduced using almost any colour temp backlight, certainly anything from 3,000K to 10,000K. Think about the process, you've got 3 colour channels, each of which has a given level of light from the backlight, which is then attenuated to any desired degree by the LCD pixel. NT While this is true, it would be virtually impossible to get all colors right with some arbitrary color backlight. You could get a subset right and get all the others completely wrong. Leonard With each colour channel you've got everything available from backlight output x LCD max down to backlight output x LCD minimum. AFAIK that covers every flesh tone on this planet, unless one goes down to 2000K backlight or some other very extreme value. NT This is simply not true. Every display has a color gamut that is limited by the maximum saturation of its primaries. You can produce any color within that gamut but not any outside. Even if every flesh tone is in that gamut, that does not mean that you will get the right flesh tones for a given combination of RGB. In order to do so, you must have the same spectrum in the primaries that you have in the camera filters, the correct colorimetry for the white point, and the correct application of the decoding matrix. 
If you depart from any of these, you can adjust a display for ONE color to be correct, but everything else will be off. Leonard |
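The gamut limit described here is geometric: a display's reachable chromaticities are the triangle spanned by its primaries on the CIE diagram, and a simple point-in-triangle test shows which colours fall outside. The sketch below assumes the Rec.709/sRGB primaries as the example gamut:

```python
# Point-in-triangle test in CIE xy: a display can only reproduce
# chromaticities inside the triangle of its primaries. The Rec.709 /
# sRGB primaries are used as the example gamut.

R, G, B = (0.640, 0.330), (0.300, 0.600), (0.150, 0.060)

def in_gamut(p, a=R, b=G, c=B):
    """True if chromaticity p lies inside triangle abc (barycentric test)."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    det = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    l1 = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / det
    l2 = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / det
    l3 = 1.0 - l1 - l2
    return min(l1, l2, l3) >= 0.0

print(in_gamut((0.313, 0.329)))  # D65-region white: inside
print(in_gamut((0.160, 0.400)))  # highly saturated cyan: outside
```

Any colour outside the triangle can only be rendered as its nearest in-gamut stand-in, which is why no amount of adjustment recovers colours beyond the primaries' saturation.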
#120
Posted to sci.electronics.repair,uk.d-i-y
wrote in message
... Leonard Caillouet wrote: wrote in message ... William Sommerwerck wrote: The LCD only filters light from the backlight. If you don't have a full spectrum white in the first place then you can't expect decent colour. Not so. All you have to do is hit the defined points in the CIE diagram. The Pioneer plasma sets hit them dead-on. Indeed. None of the major display technologies deliver full spectrum, nor do they need to. NT This is true only if you have custom LUTs or decoding algorithms for a display based on the relationship between the spectra of the lighting and the CIE standard observer functions that cameras are generally aligned to approximate. What that has to do with it I don't know. If you find an RGB display with violet output, I'm all ears. NT It has everything to do with accurate reproduction of color in video. What you seem to miss is that the underlying assumption in color reproduction in video is that the display and the camera both approximate the CIE standard observer curves for red, green, and blue spectral response. If this is the case, and you encode properly, you can use a standard decoding matrix on the display end and get a reasonable reproduction of what was recorded. If you have a very narrow spectrum on either end, some colors will be reproduced with less energy than with the proper spectrum. This can be compensated for using a customized matrix or LUTs. Again, while it is true that you can make any color within a given gamut with some combination of R, G, & B, it is NOT true that you will get the CORRECT color for ALL colors if the decoding matrix is not correct (very common in many consumer sets over the years), if the gamut is wrong, if the gray scale is wrong, or if the spectrum is wrong. To get the right mix of colors for all colors in a given system, you have to play by the rules for that system. If you change them, such as is the case when you deviate in spectral response from the CIE curves, you have to make it up somewhere else.
This gets very complicated and is precisely why some people who are sensitive to color reproduction have noticed that LED-based displays have had trouble with some colors. Leonard
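The wrong-decoding-matrix failure is easy to demonstrate numerically: encode a colour with the BT.601 luma/colour-difference coefficients, then decode it with the BT.709 constants. Gray survives the round trip; saturated colours come back shifted.

```python
# Encode with BT.601 coefficients, decode with BT.709 coefficients:
# the decoding-matrix mismatch described above. Gray is unaffected;
# saturated colours shift visibly.

def encode_601(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b
    return y, (b - y) / 1.772, (r - y) / 1.402   # Y, U, V

def decode_709(y, u, v):
    r = y + 1.5748 * v
    g = y - 0.18732 * u - 0.46812 * v
    b = y + 1.8556 * u
    return r, g, b

print(decode_709(*encode_601(0.5, 0.5, 0.5)))  # gray: comes back exact
print(decode_709(*encode_601(1.0, 0.0, 0.0)))  # pure red: comes back wrong
```

Pure red decodes to roughly (1.09, 0.10, -0.01) instead of (1, 0, 0): because U and V are zero for any neutral, gray-scale tracking can look fine while every saturated colour is off, which is exactly why a set can measure well on a gray ramp and still render colour incorrectly.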