Distance from battery to inverter

Anything electric, AC or DC

Postby misterW » Tue Sep 14, 2010 11:04 am

I have been reading that it is more efficient to have your inverter close to your battery and run your AC wires the longer distance. My original plans had the inverter a good 10 feet from the battery. I have come up with an alternative plan, but I am wondering:

Exactly how much difference does it make? Is it simply the correct thing to do in theory? Or would I notice a significant loss of efficiency if the inverter was 10 feet away?

Postby Cliffmeister2000 » Tue Sep 14, 2010 12:01 pm

Higher voltage travels more efficiently than lower voltage: for the same power, the current (and therefore the loss in the wire) is much smaller, so 120V AC wiring loses far less over a long run than 12V DC. I'm guessing you got some good advice.
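To put rough numbers on that, here's a minimal Python sketch; the 0.08 ohm wire resistance is just an assumed example figure, not anyone's actual wiring.

Code:
# Wire loss is I^2 * R. For a fixed power P, the current is I = P / V,
# so the loss in the wire shrinks with the square of the voltage.
P = 120.0   # watts delivered to the load
R = 0.08    # ohms of round-trip wire resistance (assumed example value)

for volts in (12.0, 120.0):
    amps = P / volts
    watts_lost = amps ** 2 * R
    print(f"{volts:5.0f} V: {amps:4.1f} A, {watts_lost:5.2f} W lost in the wire")

# 12 V:  10.0 A,  8.00 W lost
# 120 V:  1.0 A,  0.08 W lost -- 100x less loss for 10x the voltage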

I use just a portable inverter that plugs into cigarette lighter type sockets I have in both the galley and the cabin. I only use it to charge my cell phones and laptop.
God Bless

Cliff

♥God. ♥People.
1 John 4:9-11

My Teardrop build pictures

Postby misterW » Tue Sep 14, 2010 12:45 pm

Oh, I don't doubt that it does... I was just curious how MUCH of a difference it makes. If it's barely enough to detect over that distance, I probably won't revise my plans. If it makes a noticeable difference, I'll change things around.

Postby ERV » Mon Oct 04, 2010 8:04 pm

I run a 2000 watt inverter in the back of the pickup; I think that is about 12-15 feet from the battery. You have to use big wire, though. I also have a fuse at the front and the back, so if it grounds out anywhere along the frame it is safe.
I also have a switch under the hood to lock out the two batteries in the back. This does three things: it lets me start the truck if I have run the inverter too much without the engine running, it lets me use the back batteries to start the truck if the truck battery goes dead, and it charges the back batteries as I run down the road.
In the back of the truck I have two marine batteries I feed off of, with a total of about 3 feet of wire to the inverter.
If I am at camp just using the hot plate or coffee maker, it works great. I can also charge the tear battery at the same time by just plugging into the trailer.
If I am working a side job in someone's backyard, I have the truck running. I build decks, so I have lights going and power tools. It is a lot better than a loud generator running.
My first inverter was under the hood so I didn't have to run the large wire to the back. But inverters do not like heat: with the motor running it would shut down from the heat of working plus the engine, and it didn't last long.
I don't use the inverter a lot during the colder months. I keep a maintainer on the batteries at night when I am at home.
Hope this helps.
ERV & JAN
Medina, Ohio

Postby wannabefree » Mon Oct 04, 2010 10:19 pm

Time to get acquainted with Ohm's Law (don't forget, you asked for this):

V = I*R, where V is volts, I is current, and R is resistance.

You can calculate the voltage drop across your wiring if you know how much current you are drawing, and the resistance of the wire.

Get the resistance from a wire table: http://www.thelearningpit.com/elec/tools/tables/Wire_table.htm

This will tell you Ohms per 1000 feet. For example, 16 gauge wire is about 4 Ohms/1000 ft, or .004 Ohms/ft.

So, let's assume you use 16 gauge wire and the inverter is 10 feet from the battery. That's 20 feet of wire (you have to account for both power and ground wires).

Now, you want to power a modest 120W light bulb. That bulb draws 1 Amp at 120V (P = V*I, where P is power; V and I you have already met).

Now, that inverter is going to require 120W on its input side, assuming it is 100% efficient, which it won't be. 120W at 12V requires 10A (12V * 10A = 120W) to produce the equivalent 120V at 1A.

Now plug and chug:
20ft * .004 Ohms/ft = 0.08 Ohms, and 0.08 Ohms * 10A = 0.8V dropped across the wire.

That's not bad if all you need is 120W. But suppose you want to power a 10A load at 120V, like a small table saw. That's 1200W, or 100A on the 12V side, so now you will drop 8V across the wire (100A * 0.08 Ohms), and, well, that just won't work. As long as you can keep the wiring losses below 10% you can probably make it work, so you would have to use 6 gauge wire, or cut the distance from the battery to less than a foot. There are some other considerations like heat rise, but we won't go there.
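If you'd rather let a computer do the plug and chug, here is a small Python sketch of the same arithmetic. The ohms-per-1000-ft figures are ballpark copper values of the kind you'd read off a wire table like the one linked above; treat them as approximations.

Code:
# Voltage drop on the 12 V side of an inverter, per the Ohm's Law
# walkthrough above. Resistances are approximate ohms per 1000 ft
# for copper wire (ballpark figures from a standard wire table).
OHMS_PER_1000FT = {16: 4.0, 12: 1.6, 10: 1.0, 8: 0.63, 6: 0.40}

def dc_voltage_drop(awg, distance_ft, ac_watts, battery_volts=12.0):
    """Volts dropped across the wiring between battery and inverter.

    Assumes a 100%-efficient inverter (real ones are less), and
    doubles the distance to count both power and ground conductors.
    """
    wire_ohms = OHMS_PER_1000FT[awg] / 1000.0 * (2 * distance_ft)
    dc_amps = ac_watts / battery_volts   # e.g. 120 W -> 10 A at 12 V
    return wire_ohms * dc_amps

# The 120 W light bulb, 10 ft away on 16 gauge:
print(dc_voltage_drop(16, 10, 120))      # 0.8 V -- acceptable
# The 10 A (1200 W) table saw on the same wire:
print(dc_voltage_drop(16, 10, 1200))     # 8.0 V -- way too much
# Same saw on 6 gauge:
print(dc_voltage_drop(6, 10, 1200))      # 0.8 V -- back under the 10% rule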

In summary, it is the resistance of the DC wiring that is the limiting factor. Use either larger wire or shorten the distance to get better efficiency.
In anything at all, perfection is finally attained not when there is no longer anything to add, but when there is no longer anything to take away.
-- Antoine de Saint-Exupery

