Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
Robert Baer
Calibration Of Electronic Equipment In The Home Workshop

MassiveProng wrote:

On Sat, 03 Mar 2007 05:22:58 GMT, ehsjr gave us:


Too_Many_Tools wrote:

I have a well stocked test bench at home containing a range of analog,
digital and RF test equipment as I am sure most of you also do.

Well, the question I have is: how do you handle the calibration of your
equipment? What do you use for calibration standards for resistance,
voltage, current and frequency?

Links to recommended circuits, pictures and sources would be
appreciated.

Since this is a need for anyone who has test equipment, I hope to see
a good discussion on this subject.

Thanks

TMT


The real question is how much precision do you
really need in the home "lab"? How often have
you needed to use your DMM with how many
*accurate* significant digits? 100 minus some
*very* small percent of the time, 2 significant
digits is all you need. Do you _really_ care
if your 5.055 volt reading is really 5.06 or 5.04?
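
For a sense of scale, here is a quick back-of-the-envelope check, sketched
in Python, with 5.05 V assumed as the nominal value purely for illustration:

# Rough sketch: how much does the last digit of a 5.055 V reading matter?
# The 5.05 V "true" value here is an assumption for illustration only.
nominal = 5.05
for reading in (5.04, 5.055, 5.06):
    err_pct = 100 * (reading - nominal) / nominal
    print(f"{reading:.3f} V -> {err_pct:+.2f} % from nominal")
# All three readings sit within about +/-0.2 % of the nominal, which is
# comparable to the basic DC accuracy spec of many handheld DMMs.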

Oh hell yes, I want to puff out my chest like everyone
else and think I have *accurate* equipment.

But I'm curious as to what home circuits need meters
that can read voltage accurately to 3 decimal places?
2 decimal places? The question for current measurement:
in what home brew circuit design/troubleshooting do you
need accuracy below the tens-of-mA digit? *Need*, not
*want*. Do you even trust your DMM on an amps setting
for those measurements, or do you measure the current
indirectly? How about ohms? Would you trust any
DMM, regardless of who calibrated it, to measure
down in the milliohm numbers?
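
For what it's worth, the usual dodge for both of those is the same trick:
measure a voltage across a known resistance instead of trusting the amps or
low-ohms range directly. A rough sketch in Python follows; the shunt value,
lead resistance and readings are invented for illustration only:

# Indirect current measurement: put a known shunt in series, read the voltage.
shunt_ohms = 0.100        # assumed 0.1 ohm shunt (hypothetical value)
v_across_shunt = 0.0237   # volts measured across it (made-up reading)
current_a = v_across_shunt / shunt_ohms     # Ohm's law: I = V / R
print(f"I = {current_a * 1000:.1f} mA")     # about 237 mA

# Why 2-wire milliohm readings are suspect: lead/contact resistance adds in.
lead_ohms = 0.15          # rough guess at combined lead + contact resistance
dut_ohms = 0.025          # the 25-milliohm part you actually care about
print(f"2-wire reading ~ {(dut_ohms + lead_ohms) * 1000:.0f} milliohms")

That lead-resistance problem is why low-ohms work is normally done 4-wire
(Kelvin), with separate current and sense leads.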

To me, the design of the circuit being measured has
to take care of all of that crap. If it is so
poorly designed that a 10 mV departure from nominal
(that is missed by my inaccurate meter) will keep
it from working, that suggests other problems.
Yes, the home "lab" person wants extreme accuracy
to as many decimal places as he can get. But when does
he ever really need it?

None of this is to argue against having the best
instrumentation you can afford, or references to
check it against, or paying for calibration and so
forth. But for myself, I need a dose of reality
from time to time when I start drooling over some
accuracy specs that I will never need at home. My
bet is that most of us are seduced by that same muse.




Modern instruments are so accurate, and hold their adjustments so
well, that opening one up and tweaking it with less than a professional
calibration standard available is ludicrous in the extreme.

No matter how smart one is, if one has an instrument and wants to
test its accuracy, one should make an appearance somewhere where an
already recently calibrated instrument is available to EXAMINE one's
own instrument against.

NONE should be "adjusted" at all if the variance is too small
to warrant it, and even pro calibrators follow this creed. If at all
possible, their main task is to VERIFY an instrument's accuracy
WITHOUT making ANY adjustment. Any that DO need adjustment are
typically marked "defective" and require a factory inspection/repair.
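
That "verify, don't adjust" policy boils down to a very small checklist; a
minimal sketch in Python, where the reference value, reading and tolerance
are invented for illustration:

# Minimal "verify, don't adjust" check: compare the instrument against a
# known reference and only flag it if it is outside its published tolerance.
def verify(reference, reading, tolerance_pct):
    error_pct = 100 * abs(reading - reference) / reference
    if error_pct <= tolerance_pct:
        return "in tolerance - leave it alone"
    return "out of tolerance - factory inspection/repair, not a home tweak"

# Hypothetical 10.000 V reference checked against a meter spec'd at 0.05 %:
print(verify(10.000, 10.003, 0.05))   # in tolerance
print(verify(10.000, 10.020, 0.05))   # out of tolerance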

I speak from experience, so I don't care what the ToolTard thinks
about his capacity for the task, he is a ****ing retard if he tries it
without first checking his gear against known good gear.

It really is THAT SIMPLE.

Check, and checkmate; end game.