Bit of a con, really ... ?
In fact, I'm pretty certain that most of what's being posted about color
TV and color analysis/reproduction is utter bilge. But I don't have a
comprehensive understanding of this material (it's not easy), so I'm
pretty much keeping my mouth shut.
Quite a bit of what I'm saying comes from working in TV production -
although I'm on the sound side. But I hear plenty from those who work
on the vision side of things. ;-) And it was certainly the case that Grade
1 picture monitors continued with delta gun tubes long after PIL tubes were
introduced domestically -- and stuck with the original NTSC phosphors,
as this was the standard the cameras were 'calibrated' to.
But there was no "original" NTSC red phosphor -- just a standard for it,
one that the original red phosphors didn't meet.
I /remember/ this from 45 years ago. (I read "Radio-Electronics" and
"Electronics World".) The available red phosphor was not very efficient, and
it turned "orangey" when driven hard at high current levels.
As for any differences between professional and consumer CRTs... It's true
that consumer CRTs were often designed for brightness * (rather than color
accuracy or gamut). About 15 years ago, Mitsubishi brought out a consumer
CRT with "filtered" phosphors that more closely approached the NTSC standard
** -- at the expense of brightness. The sets using it quickly flopped,
because (at least then) people were more interested in brightness than
color accuracy.
* The default setting for most LCD and plasma sets is the "burn the viewer's
eyes" mode.
** The closer a primary is to the edge of the chromaticity diagram, the
more saturated it is (i.e., the less its output is diluted with white) -- and
the less bright it therefore is.