On Sun, Nov 20, 2005 at 08:23:34PM +0000, Wayne Stallwood wrote:
On Fri, 2005-11-18 at 12:44 +0000, Chris Green wrote:
The 'constant voltage' is only an indication of how the charger controls the charging current. Unless the charger has a *ridiculously* low internal resistance the voltage will be dictated by the state of charge of the battery and little else.
I agree that lead acid battery chargers are nominally 'constant voltage' as they control their charging current according to the voltage being output and detect a fully charged battery by the voltage. NiCd and NiMH chargers are (basically, but with caveats) constant current devices.
I think perhaps we are talking about the same thing but using different terminology...hence the confusion.
By my understanding:
Most lead acid chargers charge at a constant voltage. A car battery charging circuit, for example, will charge at a constant 13.8v (or whatever, depending on the delivery capacity of the charging circuit; 12v being the nominal voltage of a car battery). This voltage is output regardless of battery charge condition... naturally the charge current then goes down as the charge progresses (and therefore, as you say, the battery voltage goes up). Because the output voltage is higher than the nominal voltage of the battery, the offset is calculated so that, once the battery is charged, it equals an acceptable trickle charge (given the internal resistance of the average battery).
No, I think this is where we differ. The charger will *attempt* to charge at a constant 13.8 volts (or whatever) but the actual voltage across the battery terminals will be determined by the state of charge of the battery. The charger will (hopefully) be designed so that the charging current reduces to a trickle charge when the battery voltage reaches the desired voltage.
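To illustrate with a quick toy model (every component value here is an assumption of mine, not anyone's real charger): with a fixed setpoint and a bit of source resistance, the terminal voltage tracks the battery and it's the *current* that tapers as the charge progresses. Something like this in Python:

# Toy model of a "constant voltage" lead acid charger: setpoint V_SET
# behind an assumed series resistance R_SRC, feeding a battery whose
# open-circuit voltage rises with state of charge.
V_SET = 13.8       # charger setpoint, volts
R_SRC = 0.5        # assumed charger + lead resistance, ohms
CAPACITY_AH = 40   # assumed battery capacity

def ocv(soc):
    """Crude assumed open-circuit voltage vs. state of charge (0..1)."""
    return 11.8 + 1.8 * soc    # 11.8 V flat -> 13.6 V full

soc, dt_h, step = 0.2, 0.1, 0
while soc < 0.99:
    i = (V_SET - ocv(soc)) / R_SRC   # current is set by the voltage gap
    if i <= 0:
        break
    soc = min(soc + i * dt_h / CAPACITY_AH, 1.0)
    step += 1
    if step % 10 == 0:               # report roughly once per simulated hour
        print(f"t={step * dt_h:5.1f} h  OCV={ocv(soc):5.2f} V  I={i:5.2f} A")

Run it and the current starts around 3 amps and falls below half an amp as the battery voltage climbs towards the setpoint, which is the behaviour we're both describing from different ends.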
This is one of the (many) reasons why we now use alternators in cars rather than dynamos: it is hard to build a constant voltage charging circuit around a dynamo supply unless the dynamo is running at a constant speed. This is also why a current meter in a modern car is far more useful than a voltage meter.
Alternators are the most 'non constant voltage' thing you could imagine. It's the electronic regulator and rectifiers that have made the use of alternators possible. The one thing that the alternator does that's "clever" is limit the charging current by its internal resistance. The major reason, as I understand it, for the move from dynamos to alternators is the availability of cheap electronics and the better reliability (because of their simplicity) of alternators compared with dynamos.
Most smart charging circuits don't actually measure the battery voltage because this is difficult to do during the charge process. They simply measure charge current, and when it hits a low trigger level the charge is considered complete. This is important in a fast charger because in order to achieve a fast charge the output voltage may be too high to drop to an acceptably low trickle charge (stick 15 volts into a 12 volt lead acid indefinitely and you will just boil off the acid, which, apart from not being too good for the battery, is pretty dangerous in a confined space).
As I understand it, what happens is that the 'fast' charger will be current limited (by its own internal circuitry) until the battery voltage rises to a voltage which indicates that it is fully charged. You simply *can't* "stick 15 volts into a 12 volt lead acid"; the voltage will stay obstinately at whatever is dictated by the state of charge of the battery.
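Some quick arithmetic backs that up (R_INT here is just an assumed typical figure for a healthy 12 volt battery): on charge the terminal voltage is the battery's open-circuit voltage plus I times the internal resistance, so with a current-limited charger you get nowhere near 15 volts until the battery itself gets there.

R_INT = 0.02    # assumed internal resistance of a healthy 12 V battery, ohms
I_LIMIT = 8.0   # charger current limit, amps

for ocv in (11.8, 12.4, 12.8):
    v_term = ocv + I_LIMIT * R_INT
    print(f"OCV {ocv:5.2f} V -> terminal voltage on charge {v_term:5.2f} V")

That prints terminal voltages of 11.96, 12.56 and 12.96 volts, all pinned close to the battery's own voltage.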
This is certainly what my Gunsons 'automatic' charger says it does in the blurb that comes with it. While the battery voltage is significantly less than the voltage indicating fully charged, the charger runs at its maximum current output. Then, as the battery voltage approaches a 'fully charged' voltage, the current reduces. It has two modes: one which will put a really full charge in the battery but which will make it gas slightly if left connected, and one (which stops at a slightly lower voltage) which will not quite fully charge the battery but won't make it gas if left connected. It's very explicit that the charging current is determined by voltage; that's how it can be used on just about *any* capacity of lead acid battery. As long as the battery can cope with (for this particular charger) a charging current of around 8 amps then it will work. Any discharged lead-acid battery connected to it will get 8 amps pushed into it initially.
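As a sketch of that control law (the 8 amp limit is the Gunsons figure; the two cutoff voltages are typical lead acid numbers I'm assuming, roughly 2.4 and 2.3 volts per cell, not quoted from the Gunsons blurb):

I_MAX = 8.0                                # charger's current limit, amps
V_CUTOFF = {"full": 14.4, "float": 13.8}   # assumed cutoffs for the two modes

def charger_current(v_batt, mode):
    """Current the charger delivers at a given battery terminal voltage."""
    v_lim = V_CUTOFF[mode]
    if v_batt >= v_lim:
        return 0.0                         # battery at cutoff: charge done
    # Taper over the last 0.8 V below the cutoff; the taper shape is
    # made up, only the full-current-then-taper behaviour matters.
    return min(I_MAX, I_MAX * (v_lim - v_batt) / 0.8)

for v in (11.5, 12.5, 13.8, 14.2, 14.4):
    print(f"{v:5.1f} V -> {charger_current(v, 'full'):4.1f} A")

Any deeply discharged battery gets the full 8 amps; the current only backs off as the terminal voltage closes in on the selected cutoff.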
Dumb (quick and dirty) NiCd (and NiMH) chargers operate in the opposite way: you set a hard current limit (for trickle this is usually capacity/10; at least that is what I have set for the charging circuit I have just built for my portable MP3 streamer project). This current limit is held regardless of the internal resistance or voltage of the pack connected, although naturally if the pack voltage gets close to the charge voltage this may drop. You can determine the charge state by looking at the voltage across the pack (even when on charge) with this circuit, because your constant current circuit will be adjusting the charge voltage to achieve the desired charge current; the higher the charge voltage, the nearer you are to full charge.
Yes, simple NiCd and NiMH chargers are designed to run at a constant current of capacity/10 (or, better for long life, capacity/14) because the batteries will survive this rate of charge for a long time without damage even if they're already fully charged.
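The numbers are trivial but worth writing down, and if you build the constant current source around an LM317 (its standard current-regulator formula is I = 1.25V / R) the resistor falls straight out. The 2000mAh pack is just an assumed example, not Wayne's actual pack:

V_REF = 1.25    # LM317 reference voltage, volts

def trickle_ma(capacity_mah, divisor):
    """Trickle current for the capacity/N rule, in mA."""
    return capacity_mah / divisor

cap = 2000      # assumed 2000 mAh NiMH pack
for d in (10, 14):
    i = trickle_ma(cap, d)
    r = V_REF / (i / 1000.0)
    print(f"C/{d}: {i:6.1f} mA -> LM317 R = {r:5.2f} ohms")

So C/10 is 200 mA (a 6.25 ohm resistor) and C/14 is about 143 mA (8.75 ohms).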
I doubt it; the voltage across the battery will be dictated by the state of charge of the battery. It's the *voltage* of the battery while charging that the charger uses to detect how near fully charged the battery is. A 50% charged 100Ah battery will show the same voltage as a 50% charged 10Ah battery (with some assumptions of course: similar battery type and construction, similar state of decay, etc.). The charger will push the same current into both; it's just that the voltage of the lower capacity one will rise faster as it gets charged.
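Put numbers on it (the 2 amp figure and the linear voltage curve are crude assumptions for illustration): the same current raises the smaller pack's state of charge, and hence its voltage, ten times faster.

I_CHARGE = 2.0   # amps, same charger feeding both packs (assumed)

def ocv(soc):
    return 11.8 + 1.8 * min(soc, 1.0)    # same crude assumed 12 V curve

for cap_ah in (10.0, 100.0):
    soc = 0.5 + I_CHARGE * 2.0 / cap_ah  # two hours of charge from 50%
    print(f"{cap_ah:5.0f} Ah: after 2 h at {I_CHARGE} A, "
          f"SoC {min(soc, 1.0):.2f}, OCV {ocv(soc):.2f} V")

After two hours the 10Ah battery is at 90% and about 13.4 volts while the 100Ah one has only crept up to 54% and about 12.8 volts.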
Sorry, I wasn't very clear; my point was that I bet a lot of the cheaper wall wart chargers are duty cycle limited. Given a constant voltage charge, the charger will be delivering the same current at the start of a charge for a 10Ah pack as for a 100Ah pack, but it will be delivering that maximum current for ten times as long; given manufacturers' habit of designing everything down to the minimum, this could quite likely result in meltdown.
It could do, I suppose, but I would have thought most small chargers reach thermal equilibrium long before the battery they're charging, whatever its capacity, has been charged. I suppose being hot for 20 hours will be more likely to kill the charger than being hot for 2 hours, but I doubt if the temperature reached will be much different.
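A first-order thermal model makes the point (every figure here is an assumption for illustration, not a measurement): the charger heads for the same equilibrium temperature whether it runs for 2 hours or 20.

import math

P = 5.0       # watts dissipated in the charger (assumed)
R_TH = 10.0   # case-to-ambient thermal resistance, K/W (assumed)
TAU = 0.5     # thermal time constant, hours (assumed)
T_AMB = 25.0  # ambient temperature, deg C

def case_temp(t_hours):
    """Standard first-order rise towards T_AMB + P * R_TH."""
    return T_AMB + P * R_TH * (1.0 - math.exp(-t_hours / TAU))

for t in (0.5, 2.0, 20.0):
    print(f"after {t:4.1f} h: {case_temp(t):5.1f} deg C")

After 2 hours it's within a degree of the 75 degree equilibrium, and another 18 hours adds essentially nothing; the extra time at temperature might age it, but the peak temperature is the same.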