Posted to uk.d-i-y
Martin Brown
Subject: 6 ohm speakers - uprate to 8 ohm?

On 20/02/2012 12:05, Terry Casey wrote:
In , says...

I see no reason that it should matter. The closer the speaker's impedance
rating matches the output impedance of the amp, the more power will be
available to the speakers - AFAIK. However, there are, I think, some here
who've actually designed this stuff, so I defer to them.

The amplifier's output will be essentially a voltage source, and its
impedance will probably be only a small fraction of an ohm. A 6 ohm
speaker will draw around 8/6ths of the 8 ohm current, and the power will
be (8/6) squared times as much (approx. 1.78 times).


Not quite as much - but significant.

Assume, for simplicity, that the amplifier output is 8V RMS.

Power is E^2/R = 64/8 = 8 W for an 8 ohm speaker

and 64/6 = 10.67 W for a 6 ohm speaker, which is an increase of a third.

As the voltage is fixed, the current delivered by the amplifier also
increases by 33% - so the important bit here is whether the amplifier
can deliver this extra power without damage.
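Terry's figures are easy to sanity-check with a couple of lines of code (Python is my choice here, not anything from the thread) - a stiff 8 V RMS source into each load:

```python
# Sanity-check the worked example above: a near-zero-impedance
# amplifier output of 8 V RMS driving an 8 ohm and then a 6 ohm load.
V = 8.0  # volts RMS, as assumed above

for R in (8.0, 6.0):
    P = V**2 / R   # power dissipated in the load
    I = V / R      # current drawn from the amplifier
    print(f"{R:g} ohm load: {P:.2f} W, {I:.2f} A")
# prints: 8 ohm load: 8.00 W, 1.00 A
#         6 ohm load: 10.67 W, 1.33 A

# With the voltage fixed, the 6 ohm load takes 8/6 of the current
# and 8/6 of the power - a third more, not (8/6) squared.
print(f"power ratio: {(V**2 / 6.0) / (V**2 / 8.0):.3f}")
# prints: power ratio: 1.333
```

Both power and current scale by the same 8/6 factor because P = V^2/R and I = V/R are each linear in 1/R at fixed voltage.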


It would be a very poor amplifier that couldn't cope. Most are supposed
to shut down or reduce output power if they don't like their load,
though some will oscillate if they really can't cope with exotic
low-impedance speakers, silly types of audiophile cable, or both.

A nominal 8 ohm speaker's actual impedance varies across the audio
range, depending on frequency and on how the crossovers are implemented:
it is not uncommon for a nominal 8R speaker to dip below 4R and rise
above 16R, e.g.

http://www.stereophile.com/content/m...r-measurements

Fig 1, solid line, is the impedance in ohms
(with a silly number of digits on the axes).
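The sort of swing Fig 1 shows can be mimicked with a toy driver model: voice-coil resistance and inductance in series with a parallel RLC tank standing in for the fundamental resonance. All the component values below are made-up illustrative numbers, nothing measured from the linked article:

```python
import math

def speaker_impedance(f, Re=6.0, Le=0.5e-3, fs=50.0, Rres=30.0, Qms=3.0):
    """Toy model of a 'nominal 8 ohm' driver: Re + jwLe in series with a
    parallel RLC tuned to the free-air resonance fs. All values invented."""
    w = 2 * math.pi * f
    ws = 2 * math.pi * fs
    Ls = Rres / (ws * Qms)       # pick L and C so the tank peaks at fs
    Cs = 1 / (ws**2 * Ls)
    Ztank = 1 / (1 / Rres + 1 / (1j * w * Ls) + 1j * w * Cs)
    return Re + 1j * w * Le + Ztank

for f in (20, 50, 200, 1000, 10_000):
    print(f"{f:6d} Hz: {abs(speaker_impedance(f)):5.1f} ohm")
```

Even this crude model swings from about 6.5 ohm in the midband to roughly 36 ohm at the 50 Hz resonance, and climbs back past 30 ohm at 10 kHz as the voice-coil inductance takes over - which is why a single "nominal" figure tells you so little about the load the amplifier actually sees.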

--
Regards,
Martin Brown