On Wed, 30 Mar 2011 16:41:19 -0500, dpb wrote:
> On 3/30/2011 4:30 PM, wrote:
> ...
> Only one POSSIBLE result.
> Well, if I think of it as a voltage divider, the 200W-er is roughly 4X
> the R of the 50W and thus the voltage drop will be about 80% across
> it...R1/(R1+R2)
You have it totally backwards. The 200 watt bulb is roughly 1/4 the
resistance of the 50, so it will drop roughly 20% of the voltage across
it, assuming equal filament temperatures. On a 240 volt supply, that
means the 50 watt bulb gets 192 volts across it when cold - and as the
50 watt filament heats up faster, its resistance rises relative to the
200 watt bulb's, the current drops, and the 50 watt bulb sees closer to
210 volts - for a split second.
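To make the divider arithmetic concrete, here is a minimal sketch in
Python. It assumes a 240 volt supply (implied by the 192 volt figure
above) and the textbook rated resistances R = V^2/P; actual cold
filament resistances are far lower, but the roughly 4:1 ratio between
the two bulbs is what sets the split:

    # Series voltage divider with a 50 W and a 200 W bulb.
    # Assumes a 240 V supply and rated resistances R = V^2 / P.
    SUPPLY = 240.0  # volts (assumed from the 192 V figure above)

    r50 = SUPPLY**2 / 50.0    # ~1152 ohms
    r200 = SUPPLY**2 / 200.0  # ~288 ohms (roughly 1/4 of r50)

    # Each bulb drops a share of the supply proportional to its resistance.
    v50 = SUPPLY * r50 / (r50 + r200)
    v200 = SUPPLY * r200 / (r50 + r200)
    print(f"50 W bulb:  {v50:.0f} V ({100 * r50 / (r50 + r200):.0f}%)")
    print(f"200 W bulb: {v200:.0f} V ({100 * r200 / (r50 + r200):.0f}%)")

Running it prints 192 V across the 50 watt bulb and 48 V across the 200
watt bulb - i.e. the 50 watt bulb takes about 80% of the supply and the
200 watt bulb about 20%, matching the figures above.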