Posted to alt.home.repair
rbowman
Default Thermometers: What's the Problem with Accuracy?

On 02/04/2018 11:01 AM, wrote:
Digital stuff itself is what I call the great lie. It gives you a
very precise reading that is usually wrong. I have an $8000 (MSRP)
digital water tester from the state that we have to calibrate before
and after every use, and about 10-15% of the time it does not pass the
verification calibration after the test on at least one parameter.


There are a lot of slips betwixt the cup and the lip. I worked for a
company that made pH and ion concentration meters. First you start with
the sensor, which was a Ross electrode in this case. They aren't
identical out of the gate and their characteristics change over time.
The output is fed to an A/D converter, which introduces sampling errors
depending on how good the converter is and the rate at which you read
the output. There is noise that has to be cleaned up, and finally a
number is obtained. That is fed through the algorithm in question to get
an ion concentration or pH. Finally you wind up with the magic 5.376,
which is precisely displayed on the readout.
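To make that chain concrete, here's a minimal Python sketch of the same pipeline: an electrode voltage, A/D quantization, a little averaging to knock down noise, and the standard Nernst-slope conversion (about 59.16 mV per pH unit at 25 C) to a pH value. The drift, offset, and noise numbers are invented for illustration; the point is that the display shows tidy decimals no matter how far the electrode has wandered.

```python
import random

def adc_read(voltage_mv, bits=12, full_scale_mv=2000.0):
    """Quantize an analog voltage the way an A/D converter does."""
    step = full_scale_mv / (2 ** bits)          # one LSB in mV
    return round(voltage_mv / step) * step

def ph_from_mv(mv, slope_mv_per_ph=59.16, offset_mv=0.0):
    """Ideal Nernst conversion at 25 C: 59.16 mV per pH unit from pH 7."""
    return 7.0 - (mv - offset_mv) / slope_mv_per_ph

true_ph = 5.0

# A real electrode: slope has drifted below the Nernst ideal, the
# offset is nonzero, and there is noise on every reading.
# (All of these numbers are made up for the sketch.)
aged_slope = 56.0       # mV/pH instead of 59.16
aged_offset = 4.0       # mV asymmetry at pH 7
random.seed(1)
readings = []
for _ in range(16):
    mv = aged_offset + (7.0 - true_ph) * aged_slope + random.gauss(0, 0.5)
    readings.append(adc_read(mv))            # quantization error here

avg_mv = sum(readings) / len(readings)       # averaging cleans up noise
displayed = ph_from_mv(avg_mv)               # meter still assumes the ideal slope

print(f"true pH:      {true_ph:.3f}")
print(f"displayed pH: {displayed:.3f}")      # precise-looking, but off
```

The averaging takes out the random noise just fine; what it can't touch is the systematic error from the aged electrode, so the meter confidently displays three decimals of a wrong number.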

Thermistors, thermocouples, glass electrodes, LVDTs and so forth are a
pain in the butt to calibrate and keep in calibration.
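The usual way to chase that drift on a pH meter is a two-point calibration against standard buffers (typically pH 7.00 and pH 4.01): two known points let you solve for the electrode's actual slope and offset instead of assuming the Nernst ideal. A sketch, with illustrative buffer readings:

```python
def two_point_cal(mv_buf1, ph_buf1, mv_buf2, ph_buf2):
    """Solve for the electrode's actual slope (mV/pH) and offset (mV at pH 7)."""
    slope = (mv_buf1 - mv_buf2) / (ph_buf2 - ph_buf1)
    offset = mv_buf1 - (7.0 - ph_buf1) * slope
    return slope, offset

def ph_from_mv(mv, slope, offset):
    return 7.0 - (mv - offset) / slope

# Readings in pH 7.00 and pH 4.01 buffers (numbers invented for the sketch)
slope, offset = two_point_cal(3.5, 7.00, 171.0, 4.01)

print(f"slope  = {slope:.2f} mV/pH")   # an aged electrode: below 59.16
print(f"offset = {offset:.2f} mV")
print(f"sample at 118.0 mV reads pH {ph_from_mv(118.0, slope, offset):.2f}")
```

And of course the calibration is only good until the electrode drifts again, which is why the tester in the quote above has to be checked before and after every use.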

Philosophically speaking, the entire project of converting the real world
into mathematical equations has been a product of human hubris. It's been
handy at times, but it isn't real.