
The message from Andy Wade contains these words:

>> I'm not convinced that the marginally greater radiation from warmer
>> walls would have any significant effect (see final para for the reverse
>> effect), but to the extent that it does it makes dIMM's conjecture even
>> less tenable.


> I'm having difficulty following your argument. I agree that the
> difference in surface temperature, at least in the steady state, is
> pretty small. If you compare two walls, one with a U-value of 0.4
> W/(m^2.K) and the other with U = 2.2, assuming 21 deg. inside temp and
> -3 outside, and using the usual value of 0.06 m^2.K/W for the resistance
> of the internal boundary layer, the difference in the inside surface
> temperature works out at only 2.6 K, according to my back-of-envelope
> calculation. However that 2.6 K difference is about 15% of the
> temperature difference between the couch-potato body and the wall, so it
> will have a fairly significant effect on the heat flux. With
> intermittent heating in the poorly insulated place the difference will
> tend to be larger, due to the lag introduced by the thermal mass of (for
> example) solid brick walls.
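
For anyone who wants to check that back-of-envelope figure, here is a
quick Python sketch of the same steady-state sum, using the R_si = 0.06
m^2.K/W quoted above:

  R_SI = 0.06                 # internal surface resistance, m^2.K/W
  T_IN, T_OUT = 21.0, -3.0    # inside and outside air temperatures, deg C

  def inside_surface_temp(u_value):
      q = u_value * (T_IN - T_OUT)   # steady-state heat flux, W/m^2
      return T_IN - q * R_SI         # same flux crosses the boundary layer

  print(inside_surface_temp(0.4))    # ~20.4 C, well insulated wall
  print(inside_surface_temp(2.2))    # ~17.8 C, poorly insulated wall

which does indeed give a difference of about 2.6 K.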


I may be venturing into the unknown (or at least somewhere I haven't
been for the best part of 40 years), but the difference between, say,
290 K and 287.6 K is only just significant at the 4th power, and it
pales into insignificance when the temperature difference is, say, 15 K.
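
To put rough numbers on that, a sketch of the net radiant exchange,
sigma*(T_body^4 - T_wall^4); the body surface temperature of 305 K is my
assumption (clothed surface, not core), with emissivity and view factor
taken as 1:

  SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2.K^4)
  T_BODY = 305.0    # assumed body surface temperature, K

  def net_radiant_loss(t_wall):
      # net loss per unit area to a large surrounding surface
      return SIGMA * (T_BODY**4 - t_wall**4)

  print(net_radiant_loss(290.0))   # ~90 W/m^2 against the warmer wall
  print(net_radiant_loss(287.6))   # ~103 W/m^2 against the cooler wall

On those assumptions the 2.4 K wall difference shifts the net loss by
about 15%, while the 15 K body-to-wall gap is what drives the loss in
the first place.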


>> The lower the air temperature (at which the body feels comfortable)
>> the greater the temperature difference between it and the warm body
>> and hence the greater the heat loss.


By "warm body" here I presume you mean the heat source. In "the
greater the temperature difference between it and the warm body" does
"it" refer to the air mass or the (human) body? If the latter, it's at
a fairly well-regulated 37 deg. or so and the air temperature doesn't
affect the heat flow very much from the radiation point if view.


No. The warm body in question is the human body that dIMM maintains
loses heat very much faster when the ambient temperature is raised to
compensate for the discomfort caused by having a poorly insulated house.

>> Have you considered the fact that in a poorly insulated room there is
>> much more radiant energy about than in a well insulated room?


> Well, no, my argument was constructed on the principle of there being
> less. Why is there more? It does depend on the heating source of
> course. A good blazing fire can make you feel quite warm in a very cold
> room, so in some cases you may be right, but so what? It doesn't alter
> the radiation-to-the-walls argument.


Well, for a start a poorly insulated room needs much more heat to keep it
up to temperature than a well insulated one, and if that heat is supplied
by a radiator some 40% of it is probably radiant heat. If supplied by a
fire it will of course be a much higher percentage.

Secondly, the radiator (if that is the heat source), being considerably
hotter, is a much better radiator than the cool walls. If the walls are
at say 16 C, the warm body at 39 C and the radiator at say 60 C, I will
leave you to work out what those figures mean in kelvin at the 4th power.
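
For the record, a sketch of that sum (blackbody emissive power sigma*T^4
only; emissivities and view factors ignored):

  SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2.K^4)

  for name, t_c in [("walls", 16.0), ("body", 39.0), ("radiator", 60.0)]:
      t_k = t_c + 273.15   # convert deg C to kelvin
      print(f"{name:9s} {t_k:6.1f} K  sigma*T^4 = {SIGMA * t_k**4:.0f} W/m^2")

  # -> walls ~396 W/m^2, body ~538 W/m^2, radiator ~698 W/m^2

So the radiator emits getting on for twice as strongly per unit area as
the walls do.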

--
Roger