Best way to determine free amp load?



Painless
07-15-2008, 01:44 PM
I was wondering if any of the vehicle electrics gurus out there could explain the best way to determine how much of the alternator's output is being used to run the vehicle and how much is 'free' to pump into an HHO system without compromising battery charge?

I certainly think it would be worth making these calculations to make the most of whatever current is available, but I'm unsure how to calculate it, or whether I'm on the wrong track and heading down a dangerous road.

Any input appreciated.

Smith03Jetta
07-15-2008, 02:18 PM
I'm no expert, but I don't think more electricity can be put into a circuit than is used. It's that pesky law of conservation of energy at work again. If more electricity is put into a circuit than can be used, something will burn up or circuitry will overheat, and electricity will be wasted in the form of heat.

Working on that premise, I would theorize that there is NO electricity coming out of the alternator/generator that is not being used. Once the battery has reached its charging potential, demand is reduced, so the alternator quits making so much electricity.

The job of the Voltage Regulator on a charging system is to regulate the amount coming out of the alternator/generator so that too much electricity is not put into the circuit.

Now... if you introduce a new electrical device such as a Hydrogen on Demand system, the demand for electricity increases. The voltage regulator tells the alternator/generator to produce more electricity, and the field magnetism in the alternator is increased to create it. That increased alternator demand reduces available engine power a little bit. Based on my studies, the miles per gallon will be reduced by about 0.4 mpg for every extra 10 amps needed.

Now let's change gears and use an example. Assume the electrical load for running a hypothetical automobile, including keeping the battery charged, is 75 amps, and the alternator can produce 200 amps. You can add additional electrical devices to the car up to a theoretical 125 amps without discharging the battery. With that level of demand on the alternator and engine, though, you can expect a large reduction in gas mileage, and your alternator will not last very long.

If I were you, I would look up the production potential of your car's alternator (it should be written on the alternator label), then subtract the amps used by your car with ALL accessories on. This will give you the available amperage.
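
As a rough sketch of that arithmetic, here is how the headroom and the mileage penalty could be estimated in Python. The 200 A rating and 75 A base load are just the hypothetical numbers from the example above, and the 0.4 mpg per 10 A figure is the rule of thumb quoted earlier, not a measured constant:

    # Headroom estimate using the hypothetical numbers from this post.
    ALTERNATOR_RATING_A = 200.0   # from the alternator label (example value)
    BASE_LOAD_A = 75.0            # vehicle load with ALL accessories on (example value)
    MPG_PENALTY_PER_10A = 0.4     # rule-of-thumb estimate quoted above, not a measured constant

    def available_amps(rating_a, base_load_a):
        """Amps left over before the battery starts discharging."""
        return max(rating_a - base_load_a, 0.0)

    def mpg_penalty(extra_load_a):
        """Estimated mpg loss for a given extra electrical load."""
        return (extra_load_a / 10.0) * MPG_PENALTY_PER_10A

    headroom = available_amps(ALTERNATOR_RATING_A, BASE_LOAD_A)
    print(f"Available headroom: {headroom:.0f} A")
    print(f"Estimated penalty for a 20 A electrolyzer: {mpg_penalty(20):.2f} mpg")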

Does anybody else have another theory?

timetowinarace
07-15-2008, 04:57 PM
No, Mr. Smith, I think you're right. If accuracy is needed, the old amp gauges will measure the amperage output to the whole electrical system, minus the starter. The one I bought at Carquest goes from -60 amps to +60 amps and included instructions on how to wire it for monitoring the total current usage of the electrical system, bypassing the starter of course.

I use mine to monitor my electrolyzer only.

Painless
07-15-2008, 08:53 PM
Thanks for your replies guys, that definitely clears things up in my mind.

I couldn't make up my mind as to whether the alternator always puts out the same amperage or increases its output with demand. Your explanation, Smith03Jetta, makes perfect sense.

I had been confused by some tests I'd run myself on my Ram pickup. I tested my mpg (using the built-in meter) over the same stretch of open road over a couple of days to see how much difference there was between running the air conditioning and running with it off and the windows shut (not too fun a test; it was almost 90°F that day!). I couldn't discern any difference in my results, other than a tenth of an mpg or two, which could easily have been due to other factors. On each run I got up to speed (50 mph), set the cruise control and then reset the mpg meter. The total distance was about 9 miles.

dennis13030
07-16-2008, 01:23 AM
(quoting Smith03Jetta's post above)

This is mostly correct.

The alternator is a voltage source that requires mechanical motion to operate. This mechanical motion comes from the engine.

If we ran a normal car without using any accessory power, the alternator would only be providing power for the ignition coil and recharging the battery. The more accessory items we power up, the greater the demand on the alternator. As demand on the alternator increases, so does the mechanical loading on the alternator and engine.

So when we add an electrolyzer to a vehicle, we will lose some efficiency due to the mechanical loading on the alternator/engine.
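
To put a rough number on that mechanical loading, here is a small Python sketch. The 14 V system voltage and 55% alternator efficiency are assumed ballpark figures, not values for any particular vehicle:

    SYSTEM_VOLTAGE_V = 14.0        # typical charging voltage (assumed)
    ALTERNATOR_EFFICIENCY = 0.55   # rough ballpark for an automotive alternator (assumed)
    WATTS_PER_HP = 745.7

    def engine_load_watts(current_a):
        """Mechanical power the engine must supply for a given electrical load."""
        electrical_w = current_a * SYSTEM_VOLTAGE_V
        return electrical_w / ALTERNATOR_EFFICIENCY

    for amps in (10, 20, 30):
        watts = engine_load_watts(amps)
        print(f"{amps:>2} A -> about {watts:.0f} W mechanical (~{watts / WATTS_PER_HP:.2f} hp)")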

Painless
07-16-2008, 11:23 AM
Ok, moving outside of the realms of what we are discussing here, but wanting to satisfy my own curiosity :)

The alternator is effectively driven by the crankshaft of the vehicle and in that manner is a full-time load on the engine. However, how is this load made 'harder to turn' by increases in amp requirements?

Smith03Jetta mentions changes in magnetism in reaction to load requirements; is it this magnetism that directly increases the force needed to turn the alternator?

Q-Hack!
07-29-2008, 10:49 PM
I'll take a stab at that...

Since the alternator is effectively a coil of wire spinning through a magnetic field, the more amperage you pull, the stronger the electromagnetic field and the harder it is to spin the coil of wire through it.

I am sure that we all played with various magnets as kids (I still do...); the larger the magnets, the more force required to pull them apart. Similar concept.
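
Another way to look at it is through torque: the mechanical power the alternator absorbs is the electrical power divided by its efficiency, and the counter-torque at the pulley is that power divided by shaft speed. A minimal sketch along those lines, where the pulley speed, voltage and efficiency are assumed ballpark numbers:

    import math

    SYSTEM_VOLTAGE_V = 14.0       # assumed charging voltage
    ALTERNATOR_EFFICIENCY = 0.55  # assumed ballpark efficiency
    PULLEY_RPM = 6000.0           # assumed alternator shaft speed at cruise

    def counter_torque_nm(current_a):
        """Extra shaft torque needed to supply a given electrical current."""
        mech_w = (current_a * SYSTEM_VOLTAGE_V) / ALTERNATOR_EFFICIENCY
        omega_rad_s = PULLEY_RPM * 2.0 * math.pi / 60.0
        return mech_w / omega_rad_s

    print(f"A 20 A load adds about {counter_torque_nm(20):.2f} N*m of torque at the pulley")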

dennis13030
07-30-2008, 10:36 AM
(quoting Q-Hack!'s post above)

Well said....

This is the way that many pieces of exercise equipment function. On an exercise bike with programmable exercise routines, for example, the turning action provided by the user is converted to electricity via a small generator. That generator powers a microcontroller that gives the user program selections and switches in various loads to simulate flat land or steep mountains.

HHOhoper
07-30-2008, 12:15 PM
I actually worked at an auto parts store for a while and worked with alternators a little bit. I'm no expert by any means, but I have a little experience with them. If you've ever tried to spin the pulley on an alternator by hand, you can easily tell the difference between an alternator rated for high amp output and a lower-amp unit. If you have the engine running and have someone inside the vehicle turn on accessories, you can (most of the time) actually hear the alternator make a little more of a whirring noise. Q-Hack put it perfectly: "Since the alternator is effectively a coil of wire spinning through a magnetic field, the more amperage you pull, the stronger the electromagnetic field and the harder it is to spin the coil of wire through it."
All alternators have a range they are designed to work in. The more time spent at the high end of that range, the shorter your alternator's life will be.
With all the accessories that are put on vehicles today (additional lighting, larger stereo systems, etc.), I would think a hydrogen generator wouldn't be much of a concern as long as you don't go crazy.