posted to rec.boats.electronics
Wayne B
First recorded activity by BoatBanter: Sep 2009
Temperature Rise in Cable Based on Watts per Foot

Does anyone know how to calculate the temperature rise above ambient in a
cable that is dissipating power due to voltage drop and current flow?

For example, a 12 ft length of #4 cable carrying 100 amps dissipates
about 35 watts total, approximately 3 watts per foot. That causes a
certain amount of heating. Assuming free space and decent air flow,
how warm is the cable likely to get at 100 degrees F ambient
temperature?
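For a rough ballpark, the steady-state rise can be estimated by balancing the I²R heating against free-convection loss from the cable surface (P = h·A·ΔT). The short Python sketch below does this per meter of cable; the convection coefficient (~15 W/m²·K, lumping in some radiation) and the insulated outside diameter (~8 mm) are assumed values for illustration, not measured ones, so treat the result as an order-of-magnitude check rather than a definitive answer.

```python
import math

# Rough estimate of conductor temperature rise from I^2*R heating,
# balanced against free convection from the cable surface.
# Assumed values (h, insulated OD) are illustrative only.

def temp_rise_per_meter(current_a, r_ohm_per_m, od_m, h_w_m2k):
    """Steady-state rise above ambient from P = h * A * dT."""
    p = current_a ** 2 * r_ohm_per_m      # watts dissipated per meter
    area = math.pi * od_m                 # lateral surface area per meter
    return p, p / (h_w_m2k * area)

# #4 AWG copper: ~0.000815 ohm/m (0.2485 ohm per 1000 ft)
# Assumed insulated OD ~8 mm; assumed h ~15 W/m^2-K (free air + radiation)
p_w, dt_c = temp_rise_per_meter(100, 0.000815, 0.008, 15.0)
print(f"{p_w:.1f} W/m dissipated, rise ~{dt_c:.0f} C ({dt_c * 9 / 5:.0f} F)")
```

With these assumptions the model predicts roughly a 20 °C (about 40 °F) rise, i.e. a cable surface around 140 °F in 100 °F ambient with still air; real airflow would lower it, and bundling or conduit would raise it.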

Are there any good web sites for that calculation?

Thanks in advance.