Posted to sci.electronics.repair
From: mike
Subject: any way to calibrate digital thermometer?

On 5/30/2012 11:06 AM, John Robertson wrote:
Jeff Liebermann wrote:
On Tue, 29 May 2012 22:00:36 -0700, John Robertson wrote:

So is that how some researchers get accuracy to .001 using
hundreds/thousands of devices calibrated to 0.1?


They probably bribed the peer reviewers or made some manner of quid
pro quo deal. The lab assistant who ran the numbers probably didn't
care about significant figures or the difference between resolution
and accuracy. If it fits in the spreadsheet box, it must be correct.

Incidentally, at 0.001C resolution, the heat emitted by the observer
becomes significant.
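To illustrate the resolution-versus-accuracy point above: averaging N independent readings shrinks *random* error by roughly sqrt(N), but a *systematic* bias shared by the whole sensor batch survives untouched. A quick simulation with made-up numbers (the 0.3 C shared bias and 0.1 C noise are hypothetical):

```python
import random

random.seed(42)

TRUE_TEMP = 25.000   # actual temperature, degrees C
SHARED_BIAS = 0.30   # systematic error common to the whole sensor batch
NOISE_SD = 0.10      # per-sensor random error (the "0.1 C" class)
N = 10000            # number of sensors averaged

readings = [TRUE_TEMP + SHARED_BIAS + random.gauss(0, NOISE_SD)
            for _ in range(N)]
mean = sum(readings) / N

# The mean is repeatable to ~NOISE_SD/sqrt(N) ~ 0.001 C (resolution),
# but it is still ~0.3 C away from the truth (accuracy).
print(f"mean of {N} sensors: {mean:.4f} C")
print(f"error vs truth:     {mean - TRUE_TEMP:+.4f} C")
```

The mean comes out stable in the third decimal place, which is exactly how a 0.001 figure can end up in a spreadsheet box while the answer is still off by 0.3 C.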

I'll confess to having done the ice and boiling water calibration
ceremony to various thermometers while in college, but not to a
wireless sensor.
http://www.in.gov/isdh/files/ThermometerCalibration__3_.pdf
Put the sensor in a baggie, suck the air out, then see how it
measures up...


If you let everything equalize to ambient temperature, you'll
eventually get an accurate reading. Incidentally, many black plastic
shipping bags are somewhat transparent to IR.


I meant to seal the device in a bag and then put it in boiling (@ sea
level) or ice water for calibration. Sorry that I wasn't clear enough,
my bad!

John :-#)#
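For what it's worth, the two-point ceremony in that calibration PDF amounts to fitting a straight line through the ice point and the boiling point and correcting readings off that line. A minimal sketch, assuming the sensor error is linear between the two references (the raw readings 0.8 and 99.1 are invented):

```python
def two_point_cal(raw_ice, raw_boil, t_ice=0.0, t_boil=100.0):
    """Return a function mapping raw readings to corrected temperature,
    assuming sensor error is linear between the two reference points."""
    slope = (t_boil - t_ice) / (raw_boil - raw_ice)
    return lambda raw: t_ice + slope * (raw - raw_ice)

# Hypothetical thermometer: reads 0.8 in ice water, 99.1 in boiling water.
correct = two_point_cal(raw_ice=0.8, raw_boil=99.1)
print(correct(0.8))    # pinned to 0.0 at the ice point
print(correct(99.1))   # pinned to 100.0 at the boiling point
print(correct(50.0))   # mid-scale, interpolated
```

That linearity assumption, and the true boiling temperature at your actual air pressure, are exactly where the scheme falls down for an electronic sensor, as discussed below.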

This is a bad idea on many levels.
Check the operating temperature range. It's unlikely that it includes
100C.
Even if it survives...
Most electronic temperature measurement methods rely on some form
of linearization. Calibrating at twice the intended operating temperature
range is unlikely to improve the readings at normal temps.
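To see why a calibration point at 100C doesn't help much at room temperature: model a sensor with a small nonlinear error and apply the offset found at the boiling point. The quadratic error term here is purely illustrative:

```python
def sensor_reading(true_t):
    # Hypothetical sensor: small quadratic nonlinearity on top of the truth
    return true_t + 0.0002 * true_t**2

# Single-point offset calibration performed at the boiling point:
offset = 100.0 - sensor_reading(100.0)   # comes out to -2.0

for t in (0.0, 20.0, 37.0, 100.0):
    corrected = sensor_reading(t) + offset
    print(f"true {t:6.1f} C -> corrected {corrected:7.3f} C "
          f"(error {corrected - t:+.3f} C)")
```

The corrected reading is exact at 100C and roughly 2C *worse* at room temperature than the uncalibrated sensor was, because the offset soaks up nonlinearity that simply isn't present at normal temps.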

Do the math on boiling temperature. You've just converted your
inability to measure temperature into an inability to measure
atmospheric pressure.
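Sketching that math via the Clausius-Clapeyron relation: near 100C the boiling point moves with pressure at roughly dT/dP = RT^2/(L*P), so ordinary weather-scale pressure swings shift it by a few tenths of a degree:

```python
R = 8.314       # J/(mol K), gas constant
L = 40660.0     # J/mol, latent heat of vaporization of water near 100 C
T = 373.15      # K, boiling point at standard pressure
P = 101325.0    # Pa, one standard atmosphere

dT_dP = R * T**2 / (L * P)          # K per Pa
print(f"sensitivity: {dT_dP * 100:.4f} K per hPa")

# A +/-10 hPa weather swing (quite ordinary) moves the boiling point by:
print(f"+/-10 hPa -> +/-{dT_dP * 1000:.2f} K")
```

That works out to about 0.03 K per hPa, so unless you also know the barometric pressure at calibration time, the "100C" reference point is only good to a few tenths of a degree.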