Posted to sci.electronics.repair,sci.electronics.design
William Sommerwerck
Question of TV technology, if anyone can answer two questions

wrote in message
...

And further, the other question, why don't they use electrostatic
deflection? At least for the horizontal. I am pretty sure that
today's transistors would have a lot of trouble doing 1080i, if they
ever can at all, because the yoke is inductive. Start kicking H up to
67.2 kHz, it is no fun. But with a non-inductive load, wouldn't the
scan rate changes be easier to manage? They do it in spades in
scopes.
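
As a sanity check on the quoted figure: the horizontal scan rate is just the total line count per frame (blanking included) times the frame rate. By that arithmetic, standard 1080i comes out near 33.7 kHz, roughly half the 67.2 kHz quoted above. A quick sketch (line counts per NTSC and 1125-line HD practice):

```python
# Horizontal scan rate = total lines per frame * frame rate.
# Total lines include blanking: 525 for NTSC, 1125 for 1080-line HD.

def h_rate(total_lines, frame_rate):
    """Horizontal scan frequency in Hz."""
    return total_lines * frame_rate

ntsc_frame = 30000 / 1001   # ~29.97 Hz

print(f"NTSC:     {h_rate(525, ntsc_frame)/1e3:.3f} kHz")    # ~15.734 kHz
print(f"4x NTSC:  {4*h_rate(525, ntsc_frame)/1e3:.3f} kHz")  # ~62.937 kHz
print(f"1080i/60: {h_rate(1125, ntsc_frame)/1e3:.3f} kHz")   # ~33.716 kHz
```

The 4x NTSC figure is the rate a quadruple-scan set runs at, which matches the ~63 kHz mentioned later in this post.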


Using electrostatic deflection requires a CRT with deflection plates. Such a
tube would have (I believe) a thicker neck. Also, the output transistors
would have to swing at least a couple hundred volts to deflect the beam.
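
A back-of-envelope check (all geometry assumed for illustration, not taken from a real tube) suggests the swing would in fact be far more than a couple of hundred volts at TV anode voltages:

```python
# Rough estimate of the plate voltage an electrostatic TV CRT would need.
# Small-angle deflection for parallel plates (standard CRT formula):
#   y = (l * L * Vd) / (2 * d * Va)
# where l = plate length, L = plate-to-screen throw, d = plate gap,
# Va = accelerating voltage, Vd = differential plate voltage.
# All geometry figures below are assumed/illustrative.

l  = 0.03    # plate length, m (assumed)
d  = 0.01    # plate separation, m (assumed)
L  = 0.30    # plates-to-screen throw, m (assumed short-neck TV tube)
Va = 25e3    # anode voltage, V (typical colour TV)
y  = 0.20    # half-screen deflection needed, m (assumed ~32" tube)

Vd = 2 * d * Va * y / (l * L)
print(f"Required plate swing: {Vd:,.0f} V")   # ~11,000 V with these numbers
```

A 'scope CRT gets away with hundreds of volts only because its accelerating voltage is a kilovolt or two; at a 25 kV final anode the numbers scale up accordingly.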

My guess is that Sony et al. stick with magnetically deflected tubes
because they've been the standard for 60 years. That's the kind of tube they
build, and the kind of deflection circuits they design.

However "correct" your theories might or might not be, they run up against
industry practice. Electrostatic deflection is considered obsolete, at least
with respect to video displays. And pretty soon CRTs will be obsolete with
respect to video displays.

I have a Toshiba CZ-3299K HDTV that's about 12 years old. It has a 32"
magnetically deflected CRT, and runs at four times the normal scanning rate
(~63 kHz) without problems.


Electrostatic deflection might be more affected by beam current
changes.


I don't think so. Try changing the screen brightness of a 'scope's CRT. Does
the deflection change?


I do not know enough about CRT technology to judge something of that
nature. However, they have already found out that steady deflection
along with precise HV regulation does not work. The raster will get
smaller because beam density affects deflection sensitivity.
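
The effect usually cited here is raster "blooming": magnetic deflection sensitivity goes roughly as 1/sqrt(Va) and electrostatic as 1/Va, so when heavy beam current loads the HV supply and the final anode voltage sags, the raster size changes, and electrostatic deflection would be about twice as sensitive to the sag. A toy illustration with assumed numbers:

```python
import math

# Deflection sensitivity scaling with accelerating voltage Va:
#   magnetic:      angle ~ 1/sqrt(Va)
#   electrostatic: angle ~ 1/Va
# A sag in Va under high beam current therefore makes the raster grow.
# Voltages below are assumed/illustrative.

Va_nominal = 25e3   # anode voltage, V (typical)
Va_loaded  = 23e3   # sagged under heavy beam current (assumed 8% sag)

growth_mag = math.sqrt(Va_nominal / Va_loaded)
growth_es  = Va_nominal / Va_loaded
print(f"Magnetic raster growth:      {100*(growth_mag-1):.1f} %")
print(f"Electrostatic raster growth: {100*(growth_es-1):.1f} %")
```

With these numbers the magnetic raster grows about 4%, the electrostatic one about 9%, which is why sets either regulate the HV tightly or compensate the deflection against beam current.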


Again, I don't think so. If this were true, the image on any magnetically
deflected CRT would show severe geometric distortion that varied with image
brightness. It doesn't.


That's why there are separate resistors going to each CRT anode in a
high-voltage splitter. That is also why they have abandoned extremely
tight HV regulation in favor of more precise, modulated control of the
deflection. They have integrated the HV level with beam current, and also
use it to control the vertical drive now.


Are you sure? How can you change beam current without changing brightness?