Posted to uk.d-i-y
Johnny B Good
Voltage optimisers

On Fri, 04 May 2018 15:46:37 +0100, ARW wrote:

> Just fitted my first ever voltage optimiser.
>
> Not much to it but I am puzzled.
>
> I tapped off the 10% drop terminals. The incoming supply to the
> optimiser is 251V and the outgoing load is 236V.
>
> So where does the 10% come into this?


That was a 6% drop. Perhaps the other 4% kicks in at maximum rated load
when leakage inductance and transformer winding resistance volt drop take
their toll?
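For anyone who wants to sanity-check the arithmetic, a quick Python sketch using the figures quoted above (the voltages are ARW's, not mine):

```python
# Off-load drop on the "10% drop" tapping, from the quoted figures:
# 251V in, 236V out.
v_in, v_out = 251.0, 236.0
drop_pct = (v_in - v_out) / v_in * 100
print(f"Measured drop: {drop_pct:.1f}%")  # about 6%, not 10%
```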

It sounds to me like it's a basic, if BFO[1], autotransformer. It also
sounds like it's intended to be installed between the meter and CU to
reduce the whole house supply, presumably to compensate for a PSU line
voltage that's consistently on the high side of the nominal 240v used in
the UK.

If it included automatic voltage adjustment to compensate for variations
in supply voltage, it would make sense. If it's just an autotransformer,
prewired to provide a fixed 10% reduction as your post implies, it
doesn't make much sense since the proper solution is to complain to your
PSU about the overvoltage and arrange for them to adjust the supply back
to the nominal 240v.

However, 251v is within the +10% upper limit for a notional (harmonised)
230v supply upon which all domestic kit, excluding incandescent lamps, is
optimised. If the supply, measured with an accurate voltmeter, never
exceeds 253vac then it's within tolerance and, quite frankly, unless it
regularly exceeds this, I can't really see the point of inserting such a
BFO autotransformer between the meter and the CU.

If this has been installed in the mistaken belief that it will reduce
electrical consumption by household appliances, as almost everyone else
has pointed out, it won't. Indeed the additional losses, even if quite
tiny, will increase consumption, negating any small gain made with
incandescent lamps, which will become noticeably dimmer yet at the same
time may last twice as long as normal (so, not all bad news if you don't
mind the dimmer look).
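That "twice as long" figure is plausible on the usual rule-of-thumb lamp scaling laws. A rough Python sketch (the exponents are approximate textbook values and the 240v rating is my assumption, nothing from the post):

```python
# Rule-of-thumb incandescent lamp scaling (approximate exponents):
# life varies roughly as (V_rated/V)**13, light output as (V/V_rated)**3.4.
# Comparing running at 251V versus the "optimised" 236V:
v_high, v_opt = 251.0, 236.0
life_gain = (v_high / v_opt) ** 13        # relative lamp life at 236V vs 251V
dimming = 1 - (v_opt / v_high) ** 3.4     # fractional loss of light output
print(f"Roughly {life_gain:.1f}x the life, about {dimming:.0%} dimmer")
```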

If the average line voltage is above nominal, the 10% reduction
shouldn't be a problem for white goods appliances which use electric
motors, especially if they're modern 'Harmonised' appliances. However,
too little voltage can cause motors to overheat and burn out, especially
during startup surges with compressor loads such as fridges and freezers
and, to a lesser extent, washing machine drum motors.

Kit that uses SMPSUs, such as TV sets, desktop computers and the like,
will automatically draw the same power, typically over a supply voltage
ranging from a low of 186vac to a high of 265vac. Some of this kit, like
universal mains voltage laptop chargers and other battery chargers, may
function over the full range of 90vac to the 265vac limit, drawing an
almost constant power level according to the demand from their electronic
loads.

Most appliances that rely on heating elements for their function will
simply use a longer duty cycle controlled by a thermostat. So, again, no
net saving in energy consumption there. Even electric kettles will simply
take longer to boil before their anti-boil dry sensor conveniently
switches the kettle off as a kindness to the user. The more protracted
heating-up time will allow a little more heat to escape, increasing the
energy consumption slightly, as it will for every other appliance that
uses energy for heating.
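To put a rough number on that longer duty cycle, here's a sketch assuming a hypothetical 3kW kettle element rated at 240v (my figures, purely for illustration):

```python
# Resistive element: power scales as V**2 for a fixed resistance, so the
# same energy into the water just takes proportionally longer to deliver.
v_high, v_opt = 251.0, 236.0
rated_power_w = 3000.0              # hypothetical 3kW element at 240V
r = 240.0 ** 2 / rated_power_w      # fixed element resistance, ohms
p_high = v_high ** 2 / r            # power at 251V
p_opt = v_opt ** 2 / r              # power at 236V
extra_time = p_high / p_opt - 1     # fractional increase in heat-up time
print(f"{p_high:.0f}W drops to {p_opt:.0f}W, ~{extra_time:.0%} longer to boil")
```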

In short, if this "Voltage Optimiser" has been sold to your customer on
the basis of reducing electrical consumption, then it has been sold as a
"Snake Oil Solution" to a non-existent problem.

[1] Assuming a 10% drop on a 250v 100A supply, you'd need a 2.5kVA rated
autotransformer which would weigh in, afaicr, at around 25 to 30kg which,
in a domestic installation, is one Big **** Off autotransformer! :-)
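The 2.5kVA figure checks out, since an autotransformer only has to transform the bucked fraction of the throughput rather than the full 25kVA passing through:

```python
# Autotransformer sizing for the footnote's figures: the core only
# handles (buck voltage) x (load current), not the full throughput.
v_supply, i_max, buck_fraction = 250.0, 100.0, 0.10
transformed_va = v_supply * buck_fraction * i_max
throughput_va = v_supply * i_max
print(f"Core rating: {transformed_va/1000:.1f}kVA "
      f"(vs {throughput_va/1000:.0f}kVA passing through)")  # 2.5kVA vs 25kVA
```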

--
Johnny B Good