Posted to alt.energy.homepower,alt.energy.renewable,alt.home.repair
Subject: Feeding solar power back into municipal grid: Issues and finger-pointing

On 12/04/2011 16:50, Home Guy wrote:
g wrote:

So when a secondary electricity source comes on-line (like a
small PV system), then in order to push its current into the
local grid it will have to *try* to raise its output voltage
in order to see some current flow. It might only be a few volts,
maybe less.


1) The actual voltage increase will relate to the ratio of grid
impedance vs local impedance, i.e. your local power consumers
(fridges, heaters etc.) have a much higher relative impedance,
thus the grid will "take" the majority of the generated power.


I'm not arguing that the grid can't or won't take any, the majority, or
all of the generated power.

The question here is - what exactly must the inverters do in order to
get as much current as the PV system can supply into the grid.


The inverter must convert the DC from the PV array to the frequency and
voltage of the grid. To push current into the grid it has to raise its
output voltage slightly; because there are losses between the inverter
and the grid, the voltage at the inverter will sit a little above the
grid voltage.
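
To put rough numbers on that, here is a small Python sketch. All of the
values are assumed for illustration, not taken from any particular
installation:

# Minimal sketch, assuming a stiff 120 V grid and a made-up line resistance:
# how far above grid voltage the inverter must sit to push a given current out.
V_GRID = 120.0   # grid voltage at the connection point (V), assumed
R_LINE = 0.05    # assumed resistance between inverter and grid (ohms)

def inverter_voltage(i_export_amps):
    """Voltage the inverter must present so that i_export_amps flows to the grid."""
    return V_GRID + i_export_amps * R_LINE   # Ohm's law: V = V_grid + I*R

for amps in (0, 10, 20, 42):
    print(f"{amps:>3} A export -> inverter terminal at {inverter_voltage(amps):.2f} V")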


If our analogy is pipes, water, and water pressure, then we have some
pipes sitting at 120 PSI and we have a pump that must generate at least
121 PSI in order to push water into the already pressurized pipes.


Fairly good analogy: the internal resistance of the pipe has to be
overcome by a slightly higher pressure. Don't forget that somewhere,
someone else has to reduce the flow of water into the pipe system in
order to avoid a pressure build-up, because the water in the pipes is
used up at the same rate it is supplied.


The _only_ increase in voltage you will see results from the
voltage drop in the grid components.


Not sure I understand what you're trying to say there.


See the pipe analogy above: the power lines from the inverter have some
resistance, which results in a voltage drop. Therefore the voltage
measured at the inverter will be slightly higher than the voltage
measured some distance away.
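
If you want to see that gradient along the line, here is a toy
calculation; the cable resistance and export current are my own assumed
numbers, just to show the shape of it:

# Toy sketch: with a fixed export current, the voltage falls off along the
# cable, so a meter at the inverter reads a bit more than one near the
# transformer. Cable resistance and current are assumed values.
V_TRANSFORMER = 120.0   # voltage at the utility transformer (V), assumed
R_PER_FOOT = 0.0001     # assumed round-trip cable resistance, ohms per foot of run
I_EXPORT = 40.0         # assumed current flowing from the inverter to the grid (A)
CABLE_FEET = 300

for feet_from_inverter in (0, 100, 200, 300):
    remaining = CABLE_FEET - feet_from_inverter   # cable left between here and the transformer
    v_here = V_TRANSFORMER + I_EXPORT * R_PER_FOOT * remaining
    print(f"{feet_from_inverter:>3} ft from inverter: {v_here:.2f} V")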


But does that mean there will be a measurable net reduction
in the current being supplied by the high-voltage substation
for that corner of the city?


2) Pretty complex calculation, but yes, _somewhere_ one or
more pieces of generating machinery will reduce their output.
Makes sense intuitively, does it not?


No, I don't agree.


Why? Take a hypothetical grid with 1 megawatt of consumption. Generating
machinery produces that energy at a set voltage. Mr Homeowner connects to
the grid with a 10 kW PV array. If the power utility made no adjustment,
the overall voltage of the grid would increase. That is fine for small
fluctuations, but if enough PV arrays came online, energy production
would have to decrease somewhere, or bad things would happen due to high
grid voltage.
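
Here is a toy DC model of that hypothetical grid. Every number below is
assumed, and a real grid is AC with reactive effects; this only shows
the direction of the effect:

# Toy model: a generator holds a fixed source voltage behind a small grid
# impedance, the 1 MW load is a fixed resistance, and PV injects current at
# the load bus. With no utility adjustment, the bus voltage creeps up as
# more PV comes online.
V_SOURCE = 240.0                  # generator setpoint (V), held constant (assumed)
R_GRID = 0.001                    # assumed grid/source impedance (ohms)
P_LOAD_NOMINAL = 1_000_000.0      # the 1 MW of load
R_LOAD = V_SOURCE**2 / P_LOAD_NOMINAL   # load modelled as a constant resistance

def bus_voltage(p_pv_watts):
    """Bus voltage with p_pv_watts of PV injected and no utility adjustment."""
    i_pv = p_pv_watts / V_SOURCE                  # rough estimate of PV current
    # Node equation at the bus: (V_SOURCE - V)/R_GRID + i_pv = V/R_LOAD
    return (V_SOURCE / R_GRID + i_pv) / (1 / R_GRID + 1 / R_LOAD)

for pv_kw in (0, 10, 100, 500):
    print(f"{pv_kw:>4} kW of PV online -> bus voltage {bus_voltage(pv_kw * 1000):.2f} V")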



Hypothetically speaking, let's assume the local grid load is just a
bunch of incandescent lights. A typical residential PV system might be,
say, 5 kW. At 120 volts, that's about 42 amps. How are you going to
push 42 amps out to the grid?

You cannot, unless your local load is zero. If the house load is lower
than the PV array output, what goes out to the grid is the PV output
minus the local load. If the house load is higher than the PV array
output, all of the PV power is used locally and the difference is
supplied from the grid.
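
In code, that bookkeeping is just this (a sketch with made-up wattages):

# Sketch of the house/grid power split described above.
def power_flows(p_pv_watts, p_house_watts):
    """Return (export to grid, import from grid) in watts."""
    export = max(0.0, p_pv_watts - p_house_watts)
    grid_import = max(0.0, p_house_watts - p_pv_watts)
    return export, grid_import

for pv, house in ((5000, 0), (5000, 1500), (5000, 7000)):
    exp, imp = power_flows(pv, house)
    print(f"PV {pv} W, house {house} W -> export {exp:.0f} W, import {imp:.0f} W")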


They're going to burn a little brighter -

Correct, due to a slightly raised voltage if there is a voltage drop
between the inverter and the grid. (There is some drop.)

they're going to use all of the current that the local grid was already
supplying to them, plus they're going to use your current as well.

Not possible: the current is set by the internal resistance of the
lamps. They draw a current given by voltage/resistance (I = V/R).
So when the PV array supplies current, the grid current is reduced.
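
A quick sketch of that arithmetic; the lamp count and PV currents are
assumptions, just to show the subtraction:

# The lamps draw a current fixed by V/R; whatever the PV inverter supplies
# is subtracted from what the grid has to deliver.
V_BUS = 120.0            # assumed bus voltage (V)
LAMP_RESISTANCE = 144.0  # a 100 W incandescent at 120 V: R = V**2 / P
N_LAMPS = 30             # assumed local load: 30 such lamps

i_lamps = N_LAMPS * V_BUS / LAMP_RESISTANCE     # total lamp current, fixed by V/R
for i_pv in (0.0, 10.0, 20.0, 25.0):            # current supplied by the PV inverter
    i_grid = i_lamps - i_pv                     # the grid makes up the rest
    print(f"PV supplies {i_pv:4.1f} A -> grid supplies {i_grid:4.1f} A of {i_lamps:.1f} A total")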

The voltage increase you will see at the output of the inverter is very
small, but it does depend on the cables used.

An example: I have 300 feet of underground cable to the nearest utility
transformer and a 100 A service panel.

If I max out the power, I will have a voltage drop across the cable of
about 6 volts - much higher than in a normal household.
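
As a rough sanity check on those figures (assuming roughly 1/0 AWG
copper at about 0.1 ohm per 1000 ft per conductor - my assumption; plug
in your own numbers for a different cable):

# Back-of-the-envelope check: 300 ft out, 300 ft back, full 100 A.
OHMS_PER_1000_FT = 0.1   # assumed resistance of one conductor
ROUND_TRIP_FT = 2 * 300  # current flows out and back
I_MAX = 100.0            # full 100 A service

r_cable = OHMS_PER_1000_FT * ROUND_TRIP_FT / 1000.0
print(f"Cable resistance ~{r_cable:.3f} ohm, drop at {I_MAX:.0f} A ~{I_MAX * r_cable:.1f} V")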

When your PV array is producing full power and your house load matches
that, the voltage difference between the grid and the inverter is zero.

At any other house load, current will flow in the power utility lines,
and the inverter voltage increase is a function of the loss in those
lines.
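
Putting that last point into a small sketch (grid voltage, line
resistance and PV output are assumed values):

# When the house load matches the PV output, no current flows in the utility
# line and the inverter sees plain grid voltage; any mismatch drives current
# through the line resistance and the inverter voltage moves by I*R.
V_GRID = 120.0    # voltage at the utility transformer (V), assumed
R_LINE = 0.06     # assumed line resistance between inverter and transformer (ohms)
P_PV = 5000.0     # assumed PV output (W)

def inverter_voltage(p_house_watts):
    i_line = (P_PV - p_house_watts) / V_GRID   # >0 exporting, <0 importing (rough)
    return V_GRID + i_line * R_LINE            # voltage at the inverter terminals

for p_house in (0.0, 2500.0, 5000.0, 8000.0):
    print(f"house load {p_house:6.0f} W -> inverter at {inverter_voltage(p_house):.2f} V")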