Time to get acquainted with Ohm's Law (don't forget, you asked for this):
V = I*R, where V is voltage in volts, I is current in amps, and R is resistance in ohms.
You can calculate the voltage drop across your wiring if you know how much current you are drawing and the resistance of the wire.
Get the resistance from a wire table:
http://www.thelearningpit.com/elec/tools/tables/Wire_table.htm
This will tell you Ohms per 1000 feet. For example, 16 gauge wire is about 4 Ohms/1000 ft, or .004 Ohms/ft.
So, let's assume you use 16 gauge wire and the inverter is 10 feet from the battery. That's 20 feet of wire (you have to account for both power and ground wires).
Now, you want to power a modest 120W light bulb. That bulb draws 1 Amp at 120V (P = V*I, where P is power; V and I you have already met).
That inverter is going to require 120W on its input side, assuming it is 100% efficient, which it won't be. 120W at 12V requires 10A (12V * 10A = 120W) to produce an equivalent 120V at 1A.
Now plug and chug:
(20ft * .004 Ohms/ft) * 10A = 0.8V dropped across the wire.
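
If you'd rather let Python do the plug and chug, here is a quick sketch of that same calculation. It only uses the numbers already worked out above (16 gauge at roughly .004 Ohms/ft, a 10 foot run, a 120W load, and an ideal inverter):

# Voltage drop for the 120W bulb example above.
OHMS_PER_FT_16GA = 0.004      # ~4 Ohms per 1000 ft, from the wire table
round_trip_ft = 2 * 10        # 10 ft to the inverter, power plus ground wires

load_watts = 120
battery_volts = 12.0
input_amps = load_watts / battery_volts        # 10A on the 12V side (assumes 100% efficiency)

wire_ohms = round_trip_ft * OHMS_PER_FT_16GA   # 0.08 Ohms
drop_volts = input_amps * wire_ohms            # V = I*R -> 0.8V

print(f"Wire resistance: {wire_ohms:.2f} Ohms")
print(f"Voltage drop:    {drop_volts:.1f} V")
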
That's not bad if all you need is 120W. But suppose you want to power a 10A load, like a small tablesaw. A 10A load at 120V is 1200W, which means roughly 100A on the 12V side, and 100A through that 0.08 Ohms of wire drops 8V. Well, that just won't work. As long as you can keep the wiring losses below 10% you can probably make it work, so you would have to use 6 gauge wire, or cut the distance from the battery to less than a foot. There are some other considerations like heat rise, but we won't go there.
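
Here is the same idea turned into a little gauge hunt. The Ohms/ft numbers for the heavier gauges are ballpark copper values I'm assuming for illustration, so check them against the wire table linked above before you buy anything:

# Find a gauge that keeps the drop under 10% of 12V for the ~100A tablesaw case.
# Ohms/ft values are approximate; verify against a real wire table.
OHMS_PER_FT = {16: 0.0040, 14: 0.0025, 12: 0.0016, 10: 0.0010, 8: 0.00063, 6: 0.00040}

battery_volts = 12.0
max_drop = 0.10 * battery_volts      # the 10% rule of thumb = 1.2V
input_amps = 1200 / battery_volts    # 10A at 120V is 1200W, so ~100A at 12V
round_trip_ft = 2 * 10

for gauge in sorted(OHMS_PER_FT, reverse=True):    # thinnest wire first: 16, 14, ...
    drop = input_amps * round_trip_ft * OHMS_PER_FT[gauge]
    verdict = "OK" if drop <= max_drop else "too much"
    print(f"{gauge:>2} AWG: {drop:.2f} V drop ({verdict})")

Run it and 6 gauge is the first one that stays under the 1.2V budget, which is where the "use 6 gauge or move the battery" advice comes from.
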
In summary, it is the resistance of the DC wiring that is the limiting factor. Use either larger wire or shorten the distance to get better efficiency.
In anything at all, perfection is finally attained not when there is no longer anything to add, but when there is no longer anything to take away.
-- Antoine de Saint-Exupery