Alfa Romeo/Alfa Romeo Digest Archive


[alfa] Alternator cable voltage drop



Hi All,

The answer to the question of why you see a greater voltage at the alternator output
terminal than at the battery is quite simple.

For current to flow in a circuit there has to be a voltage difference between the
voltage source and the load; in electrical engineering parlance this is known as a
"Potential Difference".  In this case the source is the alternator and the load is
the battery plus all the attached accessories.  Modern alternators have the voltage
regulator built in, so they sample the voltage at the source.  The resistance of the wire
between the alternator and the battery is small, but still significant at the charging
currents an alternator can produce.
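That voltage drop is just Ohm's law at work.  A minimal sketch, using made-up but
plausible figures for the wire resistance and charging current (not measurements from
any particular car):

```python
# Voltage drop across the charging wire (Ohm's law): V = I * R.
# Both figures below are illustrative assumptions.

WIRE_RESISTANCE_OHMS = 0.010   # assumed alternator-to-battery wire resistance
CHARGING_CURRENT_A = 60.0      # assumed charging current

drop_v = CHARGING_CURRENT_A * WIRE_RESISTANCE_OHMS
print(f"Voltage drop along the wire: {drop_v:.2f} V")
# With these assumed figures the battery terminal sits about 0.6 V below
# the alternator output terminal while charging.
```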

The voltage regulator tries to maintain a fixed voltage at the alternator output terminal.
It does this by varying the current flowing in the alternator excitation or field winding.
As the alternator load goes up, the internal resistance of the alternator due to the stator
windings, rectifier diodes etc. causes an internal voltage drop.  The biggest contributors
are the rectifier diodes, which are arranged as a 3-phase bridge.  Rectifier diodes
typically have a forward voltage drop of about 0.6V at small currents, rising to
around 1.5V at full charging current.  These "resistances" are before the alternator
regulator's sampling point, but they limit the safe maximum current that can be sourced
to some value, mainly determined by the allowable dissipation in the diode
assembly.
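To put rough numbers on that dissipation: in a 3-phase bridge the output current always
flows through two diodes in series, so the total drop is about twice the per-diode
forward voltage.  A sketch using the per-diode figures quoted above and an assumed
100A full-output current:

```python
# Rough rectifier loss in a three-phase bridge.  At any instant the output
# current passes through two diodes in series, so total drop ~ 2 * Vf and
# heat ~ 2 * Vf * I.  The 1.5 V figure is the full-load per-diode drop
# mentioned in the text; the 100 A load is an assumed full output.

def bridge_loss(vf_per_diode_v, output_current_a):
    """Approximate total drop across, and heat dissipated in, the diode bridge."""
    total_drop_v = 2 * vf_per_diode_v          # two diodes conduct at any instant
    dissipation_w = total_drop_v * output_current_a
    return total_drop_v, dissipation_w

drop, heat = bridge_loss(1.5, 100.0)
print(f"Bridge drop ~{drop:.1f} V, dissipation ~{heat:.0f} W")
```

That much heat in a small assembly is exactly why the allowable dissipation in the
diode pack sets the safe maximum current.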

The manufacturers insert a type of "current limiter" between the alternator and the load -
the battery - to help protect the internal diode assembly.  This "current limiter" is the
wire between the alternator and the battery.  If the resistance of the wire between the
alternator and the battery is too low, the peak charging current can rise to a value
that endangers the diode assembly: a diode will overheat and go open circuit.  This
then becomes a catastrophic failure mode, as the remaining diodes have to handle more
current and then start to fail in sequence.  Under a short circuit condition on the
alternator output the only things limiting the peak current are the diodes and, to a
lesser degree, the stator windings.  The regulator is not a clever device: it does not
measure the current flowing, it only measures the voltage at its output terminal and
tries to maintain it at a constant value.  If it sees the voltage below the set point it
cranks up the field winding current to address the imbalance; similarly, if the voltage
rises above the set point it backs off the field current to reduce the output current and
bring the alternator voltage down to the correct value.
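That voltage-only feedback loop can be sketched in a few lines.  The set point matches
the 14.5V figure discussed below; the field-current limit and step size are made-up
illustrative values, not any real regulator's calibration:

```python
# Minimal sketch of the regulator behaviour described above: it compares the
# terminal voltage to a set point and nudges the field current up or down.
# It never looks at the output current.  Gains and limits are assumptions.

SET_POINT_V = 14.5
FIELD_MAX_A = 4.0
FIELD_STEP_A = 0.1

def regulate(terminal_voltage_v, field_current_a):
    """One control step: raise field current below set point, lower it above."""
    if terminal_voltage_v < SET_POINT_V:
        field_current_a = min(FIELD_MAX_A, field_current_a + FIELD_STEP_A)
    elif terminal_voltage_v > SET_POINT_V:
        field_current_a = max(0.0, field_current_a - FIELD_STEP_A)
    return field_current_a

print(regulate(13.8, 2.0))  # below set point: field current rises
print(regulate(15.0, 2.0))  # above set point: field current falls
```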

So the wire between the alternator and the battery needs to have some resistance to act
as a safety device.
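A quick worked example of the wire acting as that safety device, again with illustrative
assumed figures for the regulator set point, a discharged battery's terminal voltage,
and the wire resistance:

```python
# With the regulator holding the alternator terminal at ~14.5 V and a
# discharged battery sitting near ~12 V, the wire resistance sets the peak
# charging current via Ohm's law: I = (V_alt - V_batt) / R.
# All three figures are illustrative assumptions.

ALTERNATOR_V = 14.5
DISCHARGED_BATTERY_V = 12.0
WIRE_RESISTANCE_OHMS = 0.025

peak_current_a = (ALTERNATOR_V - DISCHARGED_BATTERY_V) / WIRE_RESISTANCE_OHMS
print(f"Peak charging current ~{peak_current_a:.0f} A")
# Halve the wire resistance and the peak current doubles - which is why an
# oversized replacement cable can endanger the diode assembly.
```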

Most alternator regulators today aim to hold the voltage at about 14.5V; the exact figure
depends on the type of battery chemistry in use.  Earlier batteries required a lower
terminal voltage because of the chemical composition then used, typically 14V to 14.2V.
Later battery technology, using lead-calcium for example, as found in "Maintenance-Free"
batteries, requires a higher terminal voltage to ensure a full charge.  If you install a
modern voltage regulator and are still using an older technology battery, then the
terminal voltage - often known as the "Cut-Off Voltage" - will cause the battery to be
overcharged, and a constant loss of electrolyte due to boiling will quickly destroy the
battery.  High battery temperatures, due to high under-hood temperatures, aggravate this
problem.  Very cold batteries, as you would experience during winter in some countries
(fortunately not here in Africa), require a higher charging voltage until the electrolyte
temperature rises to about 20C.  Chrysler used a battery temperature monitoring sensor in
the SBEC ECUs to modify the alternator charging regime; the regulator was in the SBEC ECU
and fed the external field windings.
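A sketch of temperature-compensated charging in the spirit of that Chrysler arrangement.
The -18 mV per degree C slope (roughly -3 mV/degC per cell for a 6-cell lead-acid
battery) and the 20C reference are common rules of thumb, not Chrysler's actual
calibration:

```python
# Temperature-compensated charging set point: raise the voltage for a cold
# battery, lower it for a hot one.  Slope and reference temperature are
# assumed rule-of-thumb values for lead-acid chemistry.

NOMINAL_SET_POINT_V = 14.5
REFERENCE_TEMP_C = 20.0
COMP_V_PER_DEGC = -0.018     # ~ -3 mV/degC per cell, six cells

def charging_set_point(battery_temp_c):
    """Set point adjusted for battery electrolyte temperature."""
    return NOMINAL_SET_POINT_V + COMP_V_PER_DEGC * (battery_temp_c - REFERENCE_TEMP_C)

print(f"{charging_set_point(-10.0):.2f} V")  # cold winter battery: higher set point
print(f"{charging_set_point(50.0):.2f} V")   # hot under-hood battery: lower set point
```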

Going back to my opening statement about voltage drop and why it occurs: if the voltage
at the alternator output and the battery were exactly the same, then no current could
flow, because no potential difference would exist.  A battery, when it is discharged,
presents a pretty low resistance and will take as much current as the alternator can
deliver.  The alternator is rated at a certain maximum charging current; many modern
alternators can deliver 100A or more into a discharged battery or other load.  If there
were nothing to limit the peak current, the diodes and stator windings would be
overstressed and likely to fail.

Admittedly, some manufacturers take this protection scenario a little too far, and in
some cases you can prudently reduce the voltage drop by increasing the wire size to lower
the resistance - but, as with most things, there is a safe limit.

John
Durban
South Africa

