Subject: Temperature Rise in Cable Based on Watts per Foot
Newsgroups: rec.boats.electronics
From: Bruce

On Thu, 23 Jun 2011 08:39:24 +0700, Bruce
wrote:

On Wed, 22 Jun 2011 13:33:09 -0400, Wayne B
wrote:

Does anyone know how to calculate temperature rise above ambient in a
cable which is dissipating power due to voltage drop and current flow?

For example, a 12 ft length of #4 cable carrying 100 amps dissipates
about 35 watts total, approximately 3 watts per foot. That causes a
certain amount of heating. Assuming free space and decent air flow,
how warm is the cable likely to get at 100 degrees F ambient
temperature?

Are there any good web sites for that calculation?

Thanks in advance.


Try:
http://www.standard-wire.com/current...ty_copper.html
http://donsnotes.com/home_garden/wire.html
http://www.delcowireus.com/current-c...g-capacity.php

I think that your calculations are off. A quick look shows a
resistance for #1 cable of 0.12 ohms/1,000 ft., or .00012 ohms/ft., so
a 12 ft. length of #1 has a resistance of .00144 ohms. Voltage through
the cable is: V = IR, or V = 100 x .00144 = 0.144 volts. Power
absorbed by the cable is then P = V x I = 0.144 x 100 = 14.4 watts.
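
For anyone who wants to plug in other numbers, here is the same
arithmetic as a short Python sketch. The ohms-per-foot figure is just
the table value quoted above; substitute the resistance for your own
gauge:

# Same arithmetic as above; the ohms-per-foot value is an assumption
# taken from a wire table -- substitute the value for your own gauge.
R_PER_FT = 0.12 / 1000    # ohms per foot (#1 copper, ~0.12 ohms/1,000 ft)
LENGTH_FT = 12            # cable length, feet
CURRENT_A = 100           # load current, amps

resistance = R_PER_FT * LENGTH_FT      # total cable resistance, ohms
v_loss = CURRENT_A * resistance        # voltage loss, V = I x R
power = v_loss * CURRENT_A             # heat dissipated, P = V x I
print(f"R = {resistance:.5f} ohm  loss = {v_loss:.3f} V  P = {power:.1f} W")
# prints: R = 0.00144 ohm  loss = 0.144 V  P = 14.4 W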

Cheers,

Bruce



Correction. I should have written "voltage loss through the cable"
rather than "voltage through the cable".
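
While I'm at it, here is a rough stab at the temperature-rise question
itself. Treat the cable as a cylinder shedding its heat by convection:
rise = P / (h x A), where A is the cable's surface area and h is a
convection coefficient. This is strictly an order-of-magnitude sketch
built on assumed round numbers: an 8 mm overall cable diameter and an
h of roughly 10 W/m^2-K for still air (decent air flow raises h and
cuts the rise):

import math

# Very rough temperature-rise estimate: cable as a cylinder cooled by
# convection, rise = P / (h x A). Every input here is an assumption.
WATTS_PER_FT = 3.0    # dissipation, per the original post
DIA_M = 0.008         # assumed overall diameter, ~8 mm with insulation
H = 10.0              # assumed still-air convection coeff., W/(m^2*K)

FT_TO_M = 0.3048
area_per_ft = math.pi * DIA_M * FT_TO_M   # surface of 1 ft of cable, m^2
rise_c = WATTS_PER_FT / (H * area_per_ft)
print(f"rise ~ {rise_c:.0f} C ({rise_c * 9 / 5:.0f} F) above ambient")
# prints: rise ~ 39 C (70 F) above ambient, in still air

That ignores radiation, and the convection coefficient itself climbs
as the cable gets hotter, so treat it as ballpark only; the ampacity
tables in the links above are the safer guide.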

Cheers,

Bruce