Posted to uk.d-i-y
NY
It was fifty years ago today (well, yesterday)

"nightjar" wrote in message
...

I never had a feel for Fahrenheit. It didn't really matter to me as a kid
what the outside temperature was beyond whether I needed to wrap up warm
or not. It was only when I started science at school that exact
temperatures mattered and we used the cgs system for that.


Fahrenheit has to be the most hare-brained temperature scale ever devised -
apart from the one which used an inverse scale, so a higher temperature was a
lower number. It seems very obvious that you devise a temperature scale that
is based on (and is easily compared with) the properties of the most
abundant liquid on Earth: water. Make 0 the freezing point of water and some
larger number the boiling point. Given that we count in base 10, it's
probably sensible to make that larger number 10, 100 or 1000. But at a pinch
I could cope with a power of 12 or 16 as the interval, as long as boiling
water is a round number in that base (e.g. 144 if you're using base 12, or
256 if you're using base 16).
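(Not in the original post, but a quick Python sketch makes the point: Celsius pins water's phase changes to round base-10 numbers, while Fahrenheit puts them at 32 and 212.)

```python
def c_to_f(c):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return c * 9 / 5 + 32

print(c_to_f(0))    # freezing point of water -> 32.0 F
print(c_to_f(100))  # boiling point of water -> 212.0 F
```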

Units should be devised for ease (rather than complexity) of calculation.
If we had twelve fingers/thumbs, we'd count in base 12 and our units would be
related by base-12 ratios. But we have 10 digits and are taught to count in
base 10, so that is the *only* ratio that should be used for a units system.

Computers are an exception, but there is a good scientific reason why binary
is used: logic gates have two states. Because binary is exceptionally tedious
for calculation and for expressing large quantities, we have evolved octal
and hexadecimal. Given that computers have generally standardised on
"molecules" (bytes) of 8 "atoms" (bits), hexadecimal is more logical than
octal, in that it divides a byte into an *even* number of *same-size* chunks
(4 and 4), rather than unequal chunks of 3, 3 and 2 bits.
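(Again not from the post itself, but a small Python sketch of that 4+4 versus 2+3+3 split, using an arbitrary example byte:)

```python
byte = 0b10110101  # example byte, 181 decimal

# Hexadecimal: two equal 4-bit nibbles.
high, low = byte >> 4, byte & 0x0F
print(f"{high:X}{low:X}")  # prints "B5"

# Octal: uneven chunks of 2, 3 and 3 bits (top digit only spans 2 bits).
o1, o2, o3 = byte >> 6, (byte >> 3) & 0o7, byte & 0o7
print(f"{o1}{o2}{o3}")     # prints "265"
```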