Posted to uk.d-i-y
The Natural Philosopher
Default It was fifty years ago today (well, yesterday)

On 17/02/2021 11:26, NY wrote:
"nightjar" wrote in message
...

I never had a feel for Fahrenheit. It didn't really matter to me as a
kid what the outside temperature was beyond whether I needed to wrap
up warm or not. It was only when I started science at school that
exact temperatures mattered and we used the cgs system for that.


Fahrenheit has to be the most hare-brained temperature scale ever
devised - apart from the one which used an inverse scale so a higher
temperature was a lower number. It seems obvious that you should devise
a temperature scale based on (and easily compared with) the properties
of the most abundant liquid on Earth: water. Make 0 the freezing point
of water and some larger number the boiling point. Given that we count
in base 10, it's probably sensible to make that larger number 10, 100 or
1000. But at a pinch I could cope with a power of 12 or 16 as the
interval, as long as boiling water is a round number in that base (e.g.
144 in base 12, or 256 in base 16).


Fahrenheit was extremely logical. Mr Fahrenheit put ticks on his
thermometer for the hottest ever summer day and the coldest ever winter
night and divided it into 100.

It was a perfect scale for weather.
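For what it's worth, the two scales being compared are just a linear mapping of each other - a sketch, assuming the standard fixed points (0/100 C and 32/212 F for freezing and boiling water):

```python
# Linear mapping between the Celsius and Fahrenheit fixed points:
# 0 C = 32 F (freezing), 100 C = 212 F (boiling), so the slope is 9/5.
def c_to_f(c: float) -> float:
    return c * 9 / 5 + 32

def f_to_c(f: float) -> float:
    return (f - 32) * 5 / 9

print(c_to_f(0), c_to_f(100))   # freezing and boiling points in Fahrenheit
```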



Units should be devised for ease (rather than complexity) of
calculations. If we had twelve fingers/thumbs, we'd count in base 12 and
our units would be related by base-12 ratios. But we have 10 digits and
are taught to count in base 10, so that is the *only* ratio that should
be used for a units system.

We have 12 finger joints on each hand.

And, with toes, twenty digits across both hands and feet.
We have two eyes and one arsehole.

Computers are an exception, but there are good scientific reasons why
binary is used - because logic gates can have two states.


Logic gates can have a lot more states than that, and it's a bit sad
that people now think in Boolean logic rather than in real-world terms.

But that's art students and lefty****s for you:
Four legs good, two legs bad.


Because binary
is exceptionally tedious for calculation


Not with a computer it ain't; it's a doddle.

and for expressing large
quantities, we have evolved octal and hexadecimal.


That is simply a more compact way to display binary.

Given that computers
have generally standardised on "molecules" (bytes) of 8 "atoms" (bits),
hexadecimal is more logical than octal, in that you are dividing a byte
into an *even* number of *same-size* chunks (4 and 4), rather than
unequal chunks of 3, 3 and 2 bits.


And it avoids having to memorise 256 symbols.
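The point about hex being a compact re-spelling of binary - and about the even 4+4 split versus octal's uneven 2+3+3 - can be sketched in a few lines (Python used purely for illustration):

```python
b = 0b10110111            # one byte (decimal 183)

# Hex splits the byte into two equal 4-bit nibbles, one hex digit each:
hi, lo = b >> 4, b & 0xF
assert f"{hi:X}{lo:X}" == f"{b:02X}"   # the two nibbles ARE the two hex digits

# Octal groups the same 8 bits unevenly (2 + 3 + 3 from the left),
# so the top octal digit only ever uses 2 of its 3 bits:
print(f"{b:08b} -> hex {b:02X}, octal {b:03o}")
```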


--
There is something fascinating about science. One gets such wholesale
returns of conjecture out of such a trifling investment of fact.

Mark Twain