I responded privately to the OP earlier in the day with the following.  Since 
this thread has taken on new meaning, I'm posting it here:

Frank,

Maybe a little thought experiment would help.

Consider that a K3/100 (I assume that is your worry) puts out at best 100 W, 
and with most modes and operating styles the average is much, much less.

If you were running a full-carrier mode like FM or RTTY, the duty cycle is 100% 
while transmitting, but assuming a 50-50 split between transmitting and 
listening, it's 50%.  So in the worst case, let's say the average output is 50 
W.  On CW or SSB it will be even lower.
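The duty-cycle arithmetic above can be put in one place.  A minimal sketch: the 50-50 transmit/receive split is from the reasoning above, but the per-mode duty cycles while keyed (CW and SSB figures) are my own ballpark assumptions, not measurements:

```python
# Back-of-the-envelope average power at the output connector.
PEAK_W = 100.0        # K3/100 maximum output
TX_FRACTION = 0.5     # assume half the time transmitting, half listening

duty_while_keyed = {
    "FM/RTTY": 1.0,   # full carrier while keyed
    "CW": 0.4,        # typical keying duty cycle (assumption)
    "SSB": 0.2,       # typical voice duty cycle (assumption)
}

for mode, duty in duty_while_keyed.items():
    avg_w = PEAK_W * duty * TX_FRACTION
    print(f"{mode:8s} ~{avg_w:.0f} W average")
```

Even the full-carrier worst case lands at the 50 W used below; CW and SSB come in well under that.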

Now this is at the output connector.  With almost any run of cable, there will 
be some loss, but to be conservative, assume that it is zero.  So at the 
antenna feedpoint the average power is 50 W.  Hopefully, most of this is 
radiated and escapes the confines of your attic.  In fact, a dipole is very 
nearly 100% efficient, but assume for a moment that it isn't and 50% of the 
available power is converted to heat.

This means that 25 W is dissipated as heat, and the other 25 is radiated.  Most 
of the heat will be in the neighborhood of the feedpoint because that is the 
area of higher current.  However, it isn't a point loss but spread over some 
length of the wire.
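The whole worst-case budget, from connector to heat, fits in a few lines.  All the numbers are the deliberately pessimistic ones from the paragraphs above:

```python
# Worst-case power budget: 50 W average at the connector,
# lossless feedline (conservative), 50% antenna efficiency
# (pessimistic -- a dipole is actually very nearly 100%).
avg_out_w = 50.0
feedline_loss_fraction = 0.0          # assume zero cable loss
antenna_efficiency = 0.5              # deliberately pessimistic

at_feedpoint_w = avg_out_w * (1.0 - feedline_loss_fraction)
radiated_w = at_feedpoint_w * antenna_efficiency   # escapes the attic
heat_w = at_feedpoint_w - radiated_w               # warms the wire
print(f"radiated: {radiated_w:.0f} W, heat in wire: {heat_w:.0f} W")
```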

But let's pretend that all of the heat is generated in a very small spot on the 
wire and contemplate this.

I like to sometimes turn these things into other questions, so I would ask 
myself, "Self, if I was trying to heat up that wire to solder the transmission 
line to the wire, what is the likelihood of doing the job with a 25-W soldering 
iron?"

Pretty unlikely, isn't it?  And that is the worst-case scenario.  With a small 
amount of power distributed over any appreciable length of copper there simply 
isn't going to be any significant temperature rise.
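To put an order-of-magnitude number on "distributed over any appreciable length": the 20 m wire length and the 80%-in-the-middle-half split below are hypothetical figures of my own, chosen only to illustrate the scale:

```python
# Even if all 25 W of loss were real, it is spread along the wire.
TOTAL_LOSS_W = 25.0
WIRE_LENGTH_M = 20.0               # e.g. a 40 m-band dipole (assumption)

central_m = WIRE_LENGTH_M / 2.0    # high-current middle half of the wire
central_loss_w = 0.8 * TOTAL_LOSS_W  # assume most loss where current is high
w_per_metre = central_loss_w / central_m
print(f"~{w_per_metre:.1f} W per metre near the feedpoint")
```

A couple of watts per metre of copper in free air is nowhere near enough to produce a temperature rise you could feel, let alone solder with.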

End Quote.

Now to address David's remarks that follow:

A lot of this is conjecture, since the size and composition of the attic, the 
electrical length of the dipole, and the frequency are all unknown at this 
point.

However, using my "backwards" logic from above, consider actually trying to 
absorb the transmitter power in the building without it leaking to the 
outside.  Having done measurements in anechoic chambers and free-space antenna 
ranges, I can tell you that it's difficult.

Or try to imagine purposely attempting to heat the attic with a 50 W heating 
element.  In a cold climate, it's not going to happen, and if it's like where 
I live, where last week the air temperature was 102 F and the solar insolation 
on the roof was 1 kW/m^2, what effect would another 50 W make?

Answer: none.  This is a non-issue.
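The comparison is easy to quantify.  The 1 kW/m^2 insolation figure is from above; the 40 m^2 roof area is an assumption of mine for illustration:

```python
# Compare 50 W of worst-case RF heat with the solar load on the roof.
INSOLATION_W_PER_M2 = 1000.0   # 1 kW/m^2, as measured above
ROOF_AREA_M2 = 40.0            # modest roof (assumption)
RF_HEAT_W = 50.0               # worst-case average RF power

solar_w = INSOLATION_W_PER_M2 * ROOF_AREA_M2
fraction = RF_HEAT_W / solar_w
print(f"solar load: {solar_w:.0f} W; RF adds {fraction:.3%} on top")
```

The transmitter's contribution is roughly a tenth of a percent of what the sun is already delivering.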

Wes Stewart  N7WS


 
> Frank MacDonell wrote:
> > I am using a center fed dipole in the attic for a K3.  Does the antenna
> > generate any measurable amount of heat during TX?  Thanks.
>
> Yes.  Subject to using suitable measuring instruments.
>
> If the antenna is a reasonable length, a lot more heat will be generated
> in the building structure (conceivably more than 50% of the power) than
> in the antenna wire.  If the antenna is, in particular, electrically very
> short, a lot of the power could go to directly heating it.
>
> It will also depend on the size and construction of the wire.
>
> Why?  If you are considering the total thermal load, I think the
> building structure dissipation will be the most important factor.





      
______________________________________________________________
Elecraft mailing list
Home: http://mailman.qth.net/mailman/listinfo/elecraft
Help: http://mailman.qth.net/mmfaq.htm
Post: mailto:Elecraft@mailman.qth.net

This list hosted by: http://www.qsl.net
Please help support this email list: http://www.qsl.net/donate.html