Wayne B
posted to rec.boats.electronics
Subject: Temperature Rise in Cable Based on Watts per Foot

On Thu, 23 Jun 2011 08:39:24 +0700, Bruce
wrote:

On Wed, 22 Jun 2011 13:33:09 -0400, Wayne B
wrote:

Does anyone know how to calculate the temperature rise above ambient in a
cable that is dissipating power due to voltage drop and current flow?

For example, a 12 ft length of #4 cable carrying 100 amps dissipates
about 35 watts total, approximately 3 watts per foot. That causes a
certain amount of heating. Assuming free space and decent air flow,
how warm is the cable likely to get at 100 degrees F ambient
temperature?

Are there any good web sites for that calculation?

Thanks in advance.
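
As a sanity check of the numbers above: using a typical wire-table value
of about 0.249 ohms per 1,000 ft for #4 AWG copper (an assumed figure,
not quoted in the thread), the wire alone works out to roughly 30 watts,
so the 35-watt total evidently includes some resistance beyond the bare
cable. A minimal Python sketch:

    R_PER_FT = 0.000249    # ohms/ft, #4 AWG copper (assumed table value)
    LENGTH_FT = 12.0
    CURRENT_A = 100.0

    r_total = R_PER_FT * LENGTH_FT        # ~0.0030 ohms for the run
    p_total = CURRENT_A ** 2 * r_total    # P = I^2 * R, ~30 W wire alone
    print(f"{p_total:.1f} W total, {p_total / LENGTH_FT:.2f} W/ft")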


Try:
http://www.standard-wire.com/current...ty_copper.html
http://donsnotes.com/home_garden/wire.html
http://www.delcowireus.com/current-c...g-capacity.php

I think that your calculations are off. A quick look shows a
resistance for #1 cable of 0.12 ohms/1,000 ft, or 0.00012 ohms/ft, so a
12 ft length of #1 has a resistance of 0.00144 ohms. The voltage drop
across the cable is V = IR = 100 x 0.00144 = 0.144 volts. The power
absorbed by the cable is then P = V x I = 0.144 x 100 = about 14.4 watts.
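
Bruce's arithmetic, written out as a small Python sketch (the cable_drop
helper is illustrative, not anything from the thread; note that 0.12
ohms/1,000 ft is a published value for #1 cable, and #4 runs roughly
twice that):

    def cable_drop(ohms_per_kft, length_ft, amps):
        """Voltage drop and dissipated power from a wire-table
        resistance given in ohms per 1,000 ft."""
        r = ohms_per_kft / 1000.0 * length_ft   # total resistance, ohms
        v = amps * r                            # V = I * R
        p = v * amps                            # P = V * I
        return v, p

    v, p = cable_drop(0.12, 12, 100)   # Bruce's numbers: 0.144 V, ~14.4 W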


===

Bruce, lots of good information there, thanks. My calculations don't
add up to yours because mine include an allowance for connector
resistance in addition to the cable resistance.

It looks like the maximum current should not exceed 140 amps in order
to keep the temperature below 110 C. That's within my expected
charging limit, so #4 cable should be OK, but I'll keep an eye on it
for a while.
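
For the temperature-rise question itself, one rough approach (not from
the thread) is a simple natural-convection balance: dissipated watts =
h x surface area x delta-T, solved for delta-T. The film coefficient h
and the cable OD below are assumed values, so treat this as an
order-of-magnitude sketch:

    import math

    def temp_rise_c(watts_per_ft, od_inches, h=15.0):
        """Rough temperature rise (deg C) of a cable in free air.
        h: convective film coefficient, W/m^2-K; ~10-25 is a common
        assumption for still to gently moving air."""
        watts_per_m = watts_per_ft / 0.3048          # W/ft -> W/m
        area_per_m = math.pi * (od_inches * 0.0254)  # surface per meter, m^2
        return watts_per_m / (h * area_per_m)

    # ~3 W/ft at 100 A, assuming ~0.35 in overall diameter for #4 cable:
    rise_100a = temp_rise_c(3.0, 0.35)           # ~23 C above ambient
    rise_140a = rise_100a * (140 / 100) ** 2     # power scales as I^2: ~46 C

At 100 F (38 C) ambient that puts 140 amps at roughly 84 C, consistent
with staying under the ~110 C figure above. Ignoring radiation and
conduction into the lugs makes the estimate err on the warm side.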