Electronics Repair (sci.electronics.repair) Discussion of repairing electronic equipment. Topics include requests for assistance, where to obtain servicing information and parts, techniques for diagnosis and repair, and anecdotes about successes, failures and problems.

  #81
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 632
Default Calibration Of Electronic Equipment In The Home Workshop

On Sat, 03 Mar 2007 05:22:58 GMT, ehsjr Gave
us:

Too_Many_Tools wrote:
I have a well stocked test bench at home containing a range of analog,
digital and RF test equipment as I am sure most of you also do.

Well the question I have is how do you handle the calibration of your
equipment? What do you use for calibration standards for resistance,
voltage, current and frequency?

Links to recommended circuits, pictures and sources would be
appreciated.

Since this is a need for anyone who has test equipment, I hope to see
a good discussion on this subject.

Thanks

TMT


The real question is how much precision do you
really need in the home "lab"? How often have
you needed to use your DMM with how many
*accurate* significant digits? 100 minus some
*very* small percent of the time, 2 significant
digits is all you need. Do you _really_ care
if your 5.055 volt reading is really 5.06 or 5.04?

Oh hell yes, I want to puff out my chest like everyone
else and think I have *accurate* equipment.

But I'm curious as to what home circuits need meters
that can read voltage accurately to 3 decimal places?
2 decimal places? The question for current measurement:
in what home brew circuit design/troubleshooting do you
need accuracy below the tens of mA digit? *Need*, not
*want*. Do you even trust your DMM on an amps setting
for those measurements, or do you measure the current
indirectly? How about ohms? Would you trust any
DMM, regardless of who calibrated it, to measure
down in the milliohm numbers?

To me, the design of the circuit being measured has
to take care of all of that crap. If it is so
poorly designed that a 10 mV departure from nominal
(that is missed by my inaccurate meter) will keep
it from working, that suggests other problems.
Yes, the home "lab" person wants extreme accuracy
to as many decimal places as he can get. But when does
he ever really need it?

None of this is to argue against having the best
instrumentation you can afford, or references to
check it against, or paying for calibration and so
forth. But for myself, I need a dose of reality
from time to time when I start drooling over some
accuracy specs that I will never need at home. My
bet is that most of us are seduced by that same muse.



Modern instrument accuracies are so good, and they hold their setup so
well, that opening one up and tweaking it with anything less than a
professional calibration standard available is ludicrous in the extreme.

No matter how smart one is, if one has an instrument and wants to
test its accuracy, one should go somewhere an already recently
calibrated instrument is available to EXAMINE one's own instrument
against.

NONE should ever be "adjusted" at all if the variance is too small
to warrant it, and even pro calibrators follow this creed. If at all
possible, their main task is to VERIFY an instrument's accuracy
WITHOUT making ANY adjustment. ANY that DO need adjustments are
typically marked "defective" and require a factory inspection/repair.

I speak from experience, so I don't care what the ToolTard thinks
about his capacity for the task, he is a ****ing retard if he tries it
without first checking his gear against known good gear.

It really is THAT SIMPLE.
  #82
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 203
Default Calibration Of Electronic Equipment In The Home Workshop

MassiveProng wrote:
On 2 Mar 2007 15:09:30 -0800, "David L. Jones"
Gave us:

Which is why you do it for each range and then spot check it to see
that there is no funny business. Perfectly valid technique for home
calibration of a scope vertical scale.

Dave


It doesn't matter how many "places" you "spot check" it, you are not
going to get the accuracy of your comparison standard on the device
you intend to set with it. What you do is take the basic INaccuracy
of the device needing to be set, and add to it the basic INaccuracy of
the standard to which you are setting it. You CANNOT get any closer
than that. So, a 0.5% meter, and a 0.5% scope cannot be used together
to make the scope that accurate. You need a *finer* standard than the
accuracy level you wish to achieve.

You need to understand that as a basic fact, chucko.


The "basic fact" here is that we were talking about adjusting a 3% scope
with a .03% meter. Now that the numbers are back where they belong, please
proceed to restate your case. The scope's vertical sensitivity could easily
be adjusted to within 3% using said meter, now can't it? Just like Keith
says......
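Both sides here are trading on the same worst-case arithmetic: when one instrument is set against another, their tolerance bands add. A minimal sketch of that bookkeeping, using the figures thrown around in this thread (the function name is mine):

```python
def worst_case_pct(standard_pct, device_pct):
    """Worst-case combined error, in percent, when a device is
    set against a reference standard: the tolerances simply add."""
    return standard_pct + device_pct

# A 0.5% meter used to set a 0.5% scope: 1.0% worst case,
# so the scope cannot be certified to 0.5% this way.
print(worst_case_pct(0.5, 0.5))   # 1.0

# A 0.03% meter used to set a 3% scope: the meter's contribution
# is negligible next to the scope's own figure.
print(worst_case_pct(0.03, 3.0))  # 3.03
```

In the second case the standard contributes almost nothing, which is why a .03% meter is a reasonable reference for a 3% vertical spec while two 0.5% instruments cannot certify each other.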


  #83
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 632
Default Calibration Of Electronic Equipment In The Home Workshop

On Sat, 3 Mar 2007 23:56:14 +0000, Clint Sharp
Gave us:

So you don't know how to access the service menu and make changes to the
setup of your boob tube.



Sorry, you dumb****, but your assuming that all TVs have this
capability proves even further how little you know about it.
  #84
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 632
Default Calibration Of Electronic Equipment In The Home Workshop

On Sat, 3 Mar 2007 23:56:14 +0000, Clint Sharp
Gave us:

Good for you, please explain how the OP was going to use a DVD to
calibrate his test equipment.



Stupid ****. The suggestion I posted mine against was some twit
suggesting WWV and a 1kHz tone, which is about as old hat as it gets.

You should really learn to read ENTIRE threads before you mouth off,
jackass.
  #85
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 632
Default Calibration Of Electronic Equipment In The Home Workshop

On Sat, 3 Mar 2007 23:56:14 +0000, Clint Sharp
Gave us:

No, you don't, all the adjustments are done via menu now.


Wrong again, dumbass. You'd like to think that your guess is
correct, but it is not, dip****.


  #86
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 632
Default Calibration Of Electronic Equipment In The Home Workshop

On Sat, 3 Mar 2007 23:56:14 +0000, Clint Sharp
Gave us:

I think, you just rant. Please get it right. Maybe you could use that
DVD to calibrate your anger response, maybe you could eBay it and your
home audio system to pay for some anger management?



**** you, you ****ing retard. Meet up with me, and I'll show you
how I manage it.
  #87
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 632
Default Calibration Of Electronic Equipment In The Home Workshop

On 3 Mar 2007 17:20:54 -0800, "Too_Many_Tools"
Gave us:

Good comments Ed.

I want to thank everyone else who has offered *positive* comments
also.


I want you to leave the group and never return, you top posting
Usenet RETARD!

Like I said, I think this is a need for anyone who has equipment at
home.


Like I said, anyone as dumb as you are, regardless of your "tool
count", should not be futzing with perfectly good instruments.

You are simply too ****ing stoopid to do it correctly.

TMT


Yes, YOU!

Learn about top posting, asswipe, and how it is frowned upon in
Usenet, or are you just another pants down past the asscrack, gang boy
retard?
  #88
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 632
Default Calibration Of Electronic Equipment In The Home Workshop

On Sat, 3 Mar 2007 20:44:19 -0600, "Anthony Fremont"
Gave us:

MassiveProng wrote:
On 2 Mar 2007 15:09:30 -0800, "David L. Jones"
Gave us:

Which is why you do it for each range and then spot check it to see
that there is no funny business. Perfectly valid technique for home
calibration of a scope vertical scale.

Dave


It doesn't matter how many "places" you "spot check" it, you are not
going to get the accuracy of your comparison standard on the device
you intend to set with it. What you do is take the basic INaccuracy
of the device needing to be set, and add to it the basic INaccuracy of
the standard to which you are setting it. You CANNOT get any closer
than that. So, a 0.5% meter, and a 0.5% scope cannot be used together
to make the scope that accurate. You need a *finer* standard than the
accuracy level you wish to achieve.

You need to understand that as a basic fact, chucko.


The "basic fact" here is that we were talking about adjusting a 3% scope
with a .03% meter.

Nope. READ HIS replies. He was talking about using a 3% meter.

Now that the number are back where they belong, please
procede to restate your case.


**** you. Read HIS criteria, dip****, don't impose yours. Remember,
it was ME that stated that the cal device had to be ten times more
accurate than the target to be cal'd. So **** off.

The scope's vertical sensitivity could easily
be adjusted to within 3% using said meter, now can't it? Just like Keith
says......


That is NOT what the retarded ******* said, you retarded *******.
  #89
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 40
Default Calibration Of Electronic Equipment In The Home Workshop

On Sat, 03 Mar 2007 19:21:44 -0800, MassiveProng
wrote:

On Sat, 3 Mar 2007 20:44:19 -0600, "Anthony Fremont"
Gave us:

MassiveProng wrote:
On 2 Mar 2007 15:09:30 -0800, "David L. Jones"
Gave us:

Which is why you do it for each range and then spot check it to see
that there is no funny business. Perfectly valid technique for home
calibration of a scope vertical scale.

Dave

It doesn't matter how many "places" you "spot check" it, you are not
going to get the accuracy of your comparison standard on the device
you intend to set with it. What you do is take the basic INaccuracy
of the device needing to be set, and add to it the basic INaccuracy of
the standard to which you are setting it. You CANNOT get any closer
than that. So, a 0.5% meter, and a 0.5% scope cannot be used together
to make the scope that accurate. You need a *finer* standard than the
accuracy level you wish to achieve.

You need to understand that as a basic fact, chucko.


The "basic fact" here is that we were talking about adjusting a 3% scope
with a .03% meter.

Nope. READ HIS replies. He was talking about using a 3% meter.

Now that the number are back where they belong, please
procede to restate your case.


**** you. Read HIS criteria, dip****, don't impose yours. Remeber,
it was ME that stated that the cal device had to be ten times more
accurate than the target to be cal'd. So **** off.

The scope's vertical sensitivity could easily
be adjusted to within 3% using said meter, now can't it? Just like Keith
says......


That is NOT what the retarded ******* said, you retarded *******.


Ahhh, who actually uses a scope to make accurate measurements?
  #90
Posted to sci.electronics.repair
external usenet poster
 
Posts: 3,103
Default Calibration Of Electronic Equipment In The Home Workshop

"Anthony Fremont" wrote in
:

MassiveProng wrote:
On 2 Mar 2007 15:09:30 -0800, "David L. Jones"
Gave us:

Which is why you do it for each range and then spot check it to see
that there is no funny business. Perfectly valid technique for home
calibration of a scope vertical scale.

Dave


It doesn't matter how many "places" you "spot check" it, you are not
going to get the accuracy of your comparison standard on the device
you intend to set with it. What you do is take the basic INaccuracy
of the device needing to be set, and add to it the basic INaccuracy
of the standard to which you are setting it. You CANNOT get any
closer than that. So, a 0.5% meter, and a 0.5% scope cannot be used
together to make the scope that accurate. You need a *finer*
standard than the accuracy level you wish to achieve.

You need to understand that as a basic fact, chucko.


The "basic fact" here is that we were talking about adjusting a 3%
scope with a .03% meter. Now that the number are back where they
belong, please procede to restate your case. The scope's vertical
sensitivity could easily be adjusted to within 3% using said meter,
now can't it? Just like Keith says......




Actually, one CAN calibrate an instrument to a greater accuracy than its
specified accuracy (for a short time); it's called a transfer standard.
Of course, there are limits to how much greater accuracy you can
achieve, based on resolution and repeatability.

For ordinary cals, your standard should be at least 4x better than the DUT.
10x is great.
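The 4x/10x rule of thumb above is easy to encode as a simple accuracy-ratio check; a hedged sketch (the 4:1 threshold comes from the post, the function names are mine):

```python
def accuracy_ratio(dut_tol_pct, standard_tol_pct):
    """Ratio of the device-under-test's tolerance to the standard's
    tolerance; the bigger the ratio, the safer the calibration."""
    return dut_tol_pct / standard_tol_pct

def standard_is_adequate(dut_tol_pct, standard_tol_pct, minimum=4.0):
    """At least 4x better than the DUT for ordinary cals; 10x is great."""
    return accuracy_ratio(dut_tol_pct, standard_tol_pct) >= minimum

print(standard_is_adequate(3.0, 0.03))  # True  (100:1 ratio)
print(standard_is_adequate(0.5, 0.5))   # False (1:1 ratio)
```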


--
Jim Yanik
jyanik
at
kua.net


  #91
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 493
Default Calibration Of Electronic Equipment In The Home Workshop


"The Real Andy"


Ahhh, who actually uses a scope to make accurate measurements?



** Anyone who needs to.

Low frequency (1 to 30Hz), single shot, asymmetrical or pulse waves and
high frequencies are all "grist for the mill" even with a CRT based
scope.

Shame what happens with a DMM used on the same.



........ Phil






  #92
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 47
Default Calibration Of Electronic Equipment In The Home Workshop

On Mar 4, 12:38 pm, MassiveProng
wrote:
On 2 Mar 2007 15:09:30 -0800, "David L. Jones"
Gave us:

Which is why you do it for each range and then spot check it to see
that there is no funny business. Perfectly valid technique for home
calibration of a scope vertical scale.


Dave


It doesn't matter how many "places" you "spot check" it, you are not
going to get the accuracy of your comparison standard on the device
you intend to set with it. What you do is take the basic INaccuracy
of the device needing to be set, and add to it the basic INaccuracy of
the standard to which you are setting it. You CANNOT get any closer
than that. So, a 0.5% meter, and a 0.5% scope cannot be used together
to make the scope that accurate. You need a *finer* standard than the
accuracy level you wish to achieve.

You need to understand that as a basic fact, chucko.


LMAO!
If I use a 0.5% accurate meter to adjust something, then the accuracy
of that adjusted device at that point in time at that adjusted value
*becomes* 0.5%. The device that was adjusted only gets its accuracy
figure of 0.5% *after* the adjustment. The 0.5% of the device does NOT
get added to the 0.5% of the meter in this particular case!

Dave

  #93
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 632
Default Calibration Of Electronic Equipment In The Home Workshop

On Sun, 04 Mar 2007 14:14:56 +1000, The Real Andy
Gave us:

On Sat, 03 Mar 2007 19:21:44 -0800, MassiveProng
wrote:

On Sat, 3 Mar 2007 20:44:19 -0600, "Anthony Fremont"
Gave us:

MassiveProng wrote:
On 2 Mar 2007 15:09:30 -0800, "David L. Jones"
Gave us:

Which is why you do it for each range and then spot check it to see
that there is no funny business. Perfectly valid technique for home
calibration of a scope vertical scale.

Dave

It doesn't matter how many "places" you "spot check" it, you are not
going to get the accuracy of your comparison standard on the device
you intend to set with it. What you do is take the basic INaccuracy
of the device needing to be set, and add to it the basic INaccuracy of
the standard to which you are setting it. You CANNOT get any closer
than that. So, a 0.5% meter, and a 0.5% scope cannot be used together
to make the scope that accurate. You need a *finer* standard than the
accuracy level you wish to achieve.

You need to understand that as a basic fact, chucko.

The "basic fact" here is that we were talking about adjusting a 3% scope
with a .03% meter.

Nope. READ HIS replies. He was talking about using a 3% meter.

Now that the number are back where they belong, please
procede to restate your case.


**** you. Read HIS criteria, dip****, don't impose yours. Remeber,
it was ME that stated that the cal device had to be ten times more
accurate than the target to be cal'd. So **** off.

The scope's vertical sensitivity could easily
be adjusted to within 3% using said meter, now can't it? Just like Keith
says......


That is NOT what the retarded ******* said, you retarded *******.


Ahhh, who actually uses a scope to make accurate measurements?



I guess the same idiots that claim they can calibrate one with a 3%
meter.

Also, if you do NOT know how to make accurate measurements with
scopes, you should be in some other industry.
  #94
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 632
Default Calibration Of Electronic Equipment In The Home Workshop

On 3 Mar 2007 23:08:30 -0800, "David L. Jones"
Gave us:

On Mar 4, 12:38 pm, MassiveProng
wrote:
On 2 Mar 2007 15:09:30 -0800, "David L. Jones"
Gave us:

Which is why you do it for each range and then spot check it to see
that there is no funny business. Perfectly valid technique for home
calibration of a scope vertical scale.


Dave


It doesn't matter how many "places" you "spot check" it, you are not
going to get the accuracy of your comparison standard on the device
you intend to set with it. What you do is take the basic INaccuracy
of the device needing to be set, and add to it the basic INaccuracy of
the standard to which you are setting it. You CANNOT get any closer
than that. So, a 0.5% meter, and a 0.5% scope cannot be used together
to make the scope that accurate. You need a *finer* standard than the
accuracy level you wish to achieve.

You need to understand that as a basic fact, chucko.


LMAO!
If I use 0.5% accurate meter to adjust a something, then the accuracy
of that adjusted device at that point in time at that adjusted value
*becomes* 0.5%.


Absolutely incorrect!

If you do that, the MINIMUM error is 0.5%. The worst case is ALWAYS
that value plus the error of the device you think you set.

How can you not understand that basic fact?

The device that was adjusted only gets it's accuracy
figure of 0.5% *after* the adjustment.


Absolutely INCORRECT!

The error of a device is NOT tied to how it got set or what it got
set with, dip****, it is tied to the precision of the circuits the
device is based upon.

The 0.5% of the device does NOT
get added to the 0.5% of the meter in this particular case!


Wanna bet?
  #95
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 99
Default Calibration Of Electronic Equipment In The Home Workshop

Anthony Fremont wrote:
ehsjr wrote:

Too_Many_Tools wrote:

I have a well stocked test bench at home containing a range of
analog, digital and RF test equipment as I am sure most of you also
do.

Well the question I have is how do you handle the calibration of your
equipment? What do you use for calibration standards for resistance,
voltage, current and frequency?

Links to recommended circuits, pictures and sources would be
appreciated.

Since this is a need for anyone who has test equipment, I hope to see
a good discussion on this subject.

Thanks

TMT


The real question is how much precision do you
really need in the home "lab"? How often have
you needed to use your DMM with how many
*accurate* significant digits? 100 minus some
*very* small percent of the time, 2 significant
digits is all you need. Do you _really_ care
if your 5.055 volt reading is really 5.06 or 5.04?



Oh hell yes, I want to puff out my chest like everyone
else and think I have *accurate* equipment.

But I'm curious as to what home circuits need meters
that can read voltage accurately to 3 decimal places?
2 decimal places? The question for current measurement:
in what home brew circuit design/troubleshooting do you
need accuracy below the tens of mA digit ? *Need*, not



You surely didn't mean tens of _mA_, did you?


I surely meant tens of mA.

I build stuff with PICs as
you know, and some of it is designed to run on batteries and needs to go for
long periods of time unattended. The current draw for a 12F683 running at
31kHz is 11uA, sleep current is 50nA. If I could only measure current to
"tens of mA", I'd never know if the PIC was set up right for low current draw
and I certainly couldn't have any idea of expected battery life. I wouldn't
even know if it was sleeping until it ate thru some batteries in a few days
instead of six or eight months. I think I have a need to measure fractions
of a uA.


You may, but not accuracy below the tens of _mA_ digit.
When you need accuracy below tens of mA, you measure
voltage across a resistance. It doesn't make a lot of
sense to look for your meter to be accurate to 8 decimal
places for your .00000005 amp reading.

Here's how you do it with accuracy at the tens of _mV_ digit:

For 11 uA, put a 10K .01% resistor in series with
the supply and measure .11 volts across it. The voltage
would range from 0.109989 to 0.110011. Keep only
2 decimal places. Your computed current, worst case,
would be off by 1 uA.

For 50 nA, use a 2 meg 1% resistor and measure .10
volts across it. The voltage would range from .099
to .101 taking the 1% into account. Throw out the
last digit. Your current computation would be off
worst case, by 5 nA.

With a voltmeter accurate to 2 decimal places.
I don't know why you would
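The shunt arithmetic above can be checked numerically. A sketch using the same values (10k at 0.01% for 11 uA, 2M at 1% for 50 nA), with the meter's error folded in as an explicit term — the helper name and the 0.005 V meter-error figure are my assumptions:

```python
def current_bounds(v_reading, r_nominal, r_tol_pct, v_err):
    """Worst-case current range from a voltage measured across a
    shunt resistor, combining resistor tolerance and meter error."""
    r_lo = r_nominal * (1 - r_tol_pct / 100)
    r_hi = r_nominal * (1 + r_tol_pct / 100)
    return (v_reading - v_err) / r_hi, (v_reading + v_err) / r_lo

# 11 uA sensed across a 10k 0.01% resistor, meter good to +/-0.005 V:
lo, hi = current_bounds(0.11, 10e3, 0.01, 0.005)
print(lo * 1e6, hi * 1e6)   # about 10.5 to 11.5 uA

# 50 nA sensed across a 2M 1% resistor, same meter:
lo, hi = current_bounds(0.10, 2e6, 1.0, 0.005)
print(lo * 1e9, hi * 1e9)   # about 47 to 53 nA
```

With the meter term set to zero, the first case collapses to the 0.109989 to 0.110011 V window given above for the resistor tolerance alone.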



*want*. Do you even trust your DMM on an amps setting
for those measurements, or do you measure the current
indirectly? How about ohms? Would you trust any
DMM, regardless of who calibrated it, to measure
down in the milliohm numbers?

To me, the design of the circuit being measured has
to take care of all of that crap. If it is so
poorly designed that a 10 mV departure from nominal
(that is missed by my inaccurate meter) will keep
it from working, that suggests other problems.
Yes, the home "lab" person wants extreme accuracy
to as many decimal places as he can get. But when does
he ever really need it?



When he needs it he needs it, what can I say?


I asked, looking for concrete cases. Your case
with the PIC is an excellent example of when a
person needs to know about really small currents.
It definitely fits into the difference I had in mind
between "needs" and "wants". But it does not mean he
needs accuracy out to 8 decimal places. He needs it to
2 decimal places, as was shown. Three decimal places
would be nice. :-)

Do I really "need" a new DSO?


I have no opinion on that, and it would be irrelevant
if I did. I don't know what your situation is.

Well I've managed to get by all this time without one, so maybe you think I
don't really "need" one. I see it like this though: I don't get a lot of
time to tinker anymore, and I'd like to spend it more productively, instead
of fumbling around and trying to devise silly methods to make my existing
equipment do something it wasn't designed to do (like going off on a tangent
to build a PIC circuit that will trigger my scope early so I can try to see
some pre-trigger history).


None of this is to argue against having the best
instrumentation you can afford, or references to
check it against, or paying for calibration and so



I don't know if I really agree with that. ;-)


Well, you're free to argue against having the best
instrumentation you can afford, or having references
to check it against or getting it calibrated or
whatever, if that's how you feel. I tend to err on
the side of wanting the best even when it is
not the best fit for what I really need.

Ed



forth. But for myself, I need a dose of reality
from time to time when I start drooling over some
accuracy specs that I will never need at home. My
bet is that most of us are seduced by that same muse.

Ed






  #96
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 632
Default Calibration Of Electronic Equipment In The Home Workshop

On Sun, 04 Mar 2007 08:25:08 GMT, ehsjr Gave
us:


You may, but not accuracy below the tens of _mA_ digit.
When you need accuracy below tens of mA, you measure
voltage across a resistance. It doesn't make a lot of
sense to look for your meter to be accurate to 8 decimal
places for your .00000005 amp reading.



ALL handheld meters use voltage read across a precision shunt
resistor for current readings. I am not talking about inductive
probes. Standard current.
  #97
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 40
Default Calibration Of Electronic Equipment In The Home Workshop

On Sun, 04 Mar 2007 00:04:08 -0800, MassiveProng
wrote:

On Sun, 04 Mar 2007 14:14:56 +1000, The Real Andy
Gave us:

On Sat, 03 Mar 2007 19:21:44 -0800, MassiveProng
wrote:

On Sat, 3 Mar 2007 20:44:19 -0600, "Anthony Fremont"
Gave us:

MassiveProng wrote:
On 2 Mar 2007 15:09:30 -0800, "David L. Jones"
Gave us:

Which is why you do it for each range and then spot check it to see
that there is no funny business. Perfectly valid technique for home
calibration of a scope vertical scale.

Dave

It doesn't matter how many "places" you "spot check" it, you are not
going to get the accuracy of your comparison standard on the device
you intend to set with it. What you do is take the basic INaccuracy
of the device needing to be set, and add to it the basic INaccuracy of
the standard to which you are setting it. You CANNOT get any closer
than that. So, a 0.5% meter, and a 0.5% scope cannot be used together
to make the scope that accurate. You need a *finer* standard than the
accuracy level you wish to achieve.

You need to understand that as a basic fact, chucko.

The "basic fact" here is that we were talking about adjusting a 3% scope
with a .03% meter.
Nope. READ HIS replies. He was talking about using a 3% meter.

Now that the number are back where they belong, please
procede to restate your case.

**** you. Read HIS criteria, dip****, don't impose yours. Remeber,
it was ME that stated that the cal device had to be ten times more
accurate than the target to be cal'd. So **** off.

The scope's vertical sensitivity could easily
be adjusted to within 3% using said meter, now can't it? Just like Keith
says......

That is NOT what the retarded ******* said, you retarded *******.


Ahhh, who actually uses a scope to make accurate measurements?



I guess the same idiots that claim they can calibrate one with a 3%
meter.


My point exactly.


Also, if you do NOT know how to make accurate measurements with
scopes, you should be in some other industry.


More to the point, if you don't understand the concept of error then
you should be in another industry.
  #98
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 632
Default Calibration Of Electronic Equipment In The Home Workshop

On Sun, 04 Mar 2007 18:32:57 +1000, The Real Andy
Gave us:

On Sun, 04 Mar 2007 00:04:08 -0800, MassiveProng
wrote:

On Sun, 04 Mar 2007 14:14:56 +1000, The Real Andy
Gave us:

On Sat, 03 Mar 2007 19:21:44 -0800, MassiveProng
wrote:

On Sat, 3 Mar 2007 20:44:19 -0600, "Anthony Fremont"
Gave us:

MassiveProng wrote:
On 2 Mar 2007 15:09:30 -0800, "David L. Jones"
Gave us:

Which is why you do it for each range and then spot check it to see
that there is no funny business. Perfectly valid technique for home
calibration of a scope vertical scale.

Dave

It doesn't matter how many "places" you "spot check" it, you are not
going to get the accuracy of your comparison standard on the device
you intend to set with it. What you do is take the basic INaccuracy
of the device needing to be set, and add to it the basic INaccuracy of
the standard to which you are setting it. You CANNOT get any closer
than that. So, a 0.5% meter, and a 0.5% scope cannot be used together
to make the scope that accurate. You need a *finer* standard than the
accuracy level you wish to achieve.

You need to understand that as a basic fact, chucko.

The "basic fact" here is that we were talking about adjusting a 3% scope
with a .03% meter.
Nope. READ HIS replies. He was talking about using a 3% meter.

Now that the number are back where they belong, please
procede to restate your case.

**** you. Read HIS criteria, dip****, don't impose yours. Remeber,
it was ME that stated that the cal device had to be ten times more
accurate than the target to be cal'd. So **** off.

The scope's vertical sensitivity could easily
be adjusted to within 3% using said meter, now can't it? Just like Keith
says......

That is NOT what the retarded ******* said, you retarded *******.

Ahhh, who actually uses a scope to make accurate measurements?



I guess the same idiots that claim they can calibrate one with a 3%
meter.


My point exactly.


Good thing I never made that claim.

Also, if you do NOT know how to make accurate measurements with
scopes, you should be in some other industry.


More to the point, if you dont understand the concept of error then
you should be in another industry.


That is about the gist of what I have been trying to tell them.

Some dope thinking he can adjust his meter accurately with a damned
drifty voltage reference chip should have his head examined, not his
instruments!
  #99
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 40
Default Calibration Of Electronic Equipment In The Home Workshop

On 2 Mar 2007 12:14:20 -0800, "Too_Many_Tools"
wrote:

On Mar 2, 1:46 am, MassiveProng
wrote:
On 1 Mar 2007 19:28:47 -0800, "Too_Many_Tools"
Gave us:

Laugh....laugh....laugh....emptying trash cans is NOT working in cal
labs and QA.


Said the utter retard that needed to ask in a BASIC electronics
groups about something which he should already know if he planned to
attempt such a procedure.

Nice try, retard boy. Too bad you are wrong.... again.

Hey MiniPrick....you done with the homework assignment yet?


That of calling you the retarded ****head that you are? Sure...
done.

TMT, the total Usenet retard


Yep... that'd be you. Your nym is more correct than you'll ever
know. You're a jack-of-no-trades.

You're a real piece of ****... errr... work, there, bub.

My first advice was spot on. To make a proper cal, the source has
to be ten times better than the accuracy you wish to claim for the
instrument.

NONE of the circuits given in this thread are good enough. ALL of
those IC chips drift with T so much that calling them a cal source is
ludicrous. So are you if you think I don't know quality assurance, and
proper procedure.

You ain't it.


So it sounds like you are having a problem finding two brain cells
MiniPrick...try harder.

No more of your excuses....SHOW us how great you are.

Laugh...laugh...laugh....

TMT


At the end of the day, why do you need to calibrate your instruments?
Do you need to do it, or is it just for self satisfaction? Are you
trying to prove a point, or do you need traceable calibration? What
are you trying to achieve?
  #100
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 203
Default Calibration Of Electronic Equipment In The Home Workshop

MassiveProng wrote:
On Sat, 3 Mar 2007 20:44:19 -0600, "Anthony Fremont"
Gave us:

MassiveProng wrote:
On 2 Mar 2007 15:09:30 -0800, "David L. Jones"
Gave us:

Which is why you do it for each range and then spot check it to see
that there is no funny business. Perfectly valid technique for home
calibration of a scope vertical scale.

Dave

It doesn't matter how many "places" you "spot check" it, you are
not going to get the accuracy of your comparison standard on the
device you intend to set with it. What you do is take the basic
INaccuracy of the device needing to be set, and add to it the basic
INaccuracy of the standard to which you are setting it. You CANNOT
get any closer than that. So, a 0.5% meter, and a 0.5% scope cannot
be used together to make the scope that accurate. You need a
*finer* standard than the accuracy level you wish to achieve.

You need to understand that as a basic fact, chucko.


The "basic fact" here is that we were talking about adjusting a 3%
scope with a .03% meter.

Nope. READ HIS replies. He was talking about using a 3% meter.


I believe we were talking about scopes only being about 3% accurate in the
vertical. You are the one that pulled that garbage out of the air about
leaving the scope 6% off. You should try reading what people write instead
of what you wish they wrote.

Now that the numbers are back where they belong, please
proceed to restate your case.


**** you. Read HIS criteria, dip****, don't impose yours. Remember,
it was ME that stated that the cal device had to be ten times more
accurate than the target to be cal'd. So **** off.


Are you really that incompetent? I AM THE ONE that stated that I could use
a .03% meter to adjust it. It was in my very first post in this thread. Now
stop lying.


The scope's vertical sensitivity could easily
be adjusted to within 3% using said meter, now can't it? Just like
Keith says......


That is NOT what the retarded ******* said, you retarded *******.





  #101   Report Post  
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 203
Default Calibration Of Electronic Equipment In The Home Workshop

MassiveProng wrote:
On Sat, 3 Mar 2007 23:56:14 +0000, Clint Sharp
Gave us:

I think, you just rant. Please get it right. Maybe you could use that
DVD to calibrate your anger response, maybe you could eBay it and
your home audio system to pay for some anger management?



**** you, you ****ing retard. Meet up with me, and I'll show you
how I manage it.


Man does that ever sound like a personal threat.


  #102   Report Post  
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 632
Default Calibration Of Electronic Equipment In The Home Workshop

On Sun, 4 Mar 2007 03:10:34 -0600, "Anthony Fremont"
Gave us:

I believe we were talking about scopes only being about 3% accurate in the
vertical.


I think you should read a ****ing thread before you mouth off,
dip****. You ain't too bright, boy.

QUOTED:

Me:
Your micronta? Bwuahahahahahah!



The ditz:
Since 3% accuracy is considered good in the scope world, I think it
would do
fine.
  #103   Report Post  
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 632
Default Calibration Of Electronic Equipment In The Home Workshop

On Sun, 4 Mar 2007 03:16:04 -0600, "Anthony Fremont"
Gave us:

MassiveProng wrote:
On Sat, 3 Mar 2007 23:56:14 +0000, Clint Sharp
Gave us:

I think, you just rant. Please get it right. Maybe you could use that
DVD to calibrate your anger response, maybe you could eBay it and
your home audio system to pay for some anger management?



**** you, you ****ing retard. Meet up with me, and I'll show you
how I manage it.


Man does that ever sound like a personal threat.

You're an idiot.
  #104   Report Post  
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 69
Default Calibration Of Electronic Equipment In The Home Workshop

Anthony Fremont wrote:

ehsjr wrote:

Too_Many_Tools wrote:

I have a well stocked test bench at home containing a range of
analog, digital and RF test equipment as I am sure most of you also
do. Well the question I have is how do you handle the calibration of your
equipment? What do you use for calibration standards for resistance,
voltage, current and frequency?

Links to recommended circuits, pictures and sources would be
appreciated.

Since this is a need for anyone who has test equipment, I hope to see
a good discussion on this subject.

Thanks

TMT


The real question is how much precision do you
really need in the home "lab"? How often have
you needed to use your DMM with how many
*accurate* significant digits? 100 minus some
*very* small percent of the time, 2 significant
digits is all you need. Do you _really_ care
if your 5.055 volt reading is really 5.06 or 5.04?



Oh hell yes, I want to puff out my chest like everyone
else and think I have *accurate* equipment.

But I'm curious as to what home circuits need meters
that can read voltage accurately to 3 decimal places?
2 decimal places? The question for current measurement:
in what home brew circuit design/troubleshooting do you
need accuracy below the tens of mA digit ? *Need*, not



You surely didn't mean tens of _mA_, did you? I build stuff with PICs as
you know, and some of it is designed to run on batteries and needs to go for
long periods of time unattended. The current draw for a 12F683 running at
31kHz is 11uA, sleep current is 50nA. If I could only measure current to
"tens of mA", I'd never know if the PIC was setup right for low current draw
and I certainly couldn't have any idea of expected battery life. I wouldn't
even know if it was sleeping until it ate thru some batteries in a few days
instead of six or eight months. I think I have a need to measure fractions
of a uA.


*want*. Do you even trust your DMM on an amps setting
for those measurements, or do you measure the current
indirectly? How about ohms? Would you trust any
DMM, regardless of who calibrated it, to measure
down in the milliohm numbers?

To me, the design of the circuit being measured has
to take care of all of that crap. If it is so
poorly designed that a 10 mV departure from nominal
(that is missed by my inaccurate meter) will keep
it from working, that suggests other problems.
Yes, the home "lab" person wants extreme accuracy
to as many decimal places as he can get. But when does
he ever really need it?



When he needs it he needs it, what can I say? Do I really "need" a new DSO?
Well I've managed to get by all this time without one, so maybe you think I
don't really "need" one. I see it like this though, I don't get a lot of
time to tinker anymore. I'd like to spend it more productively. Instead of
fumbling around and trying to devise silly methods to make my existing
equipment do something it wasn't designed to (like going off on a tangent to
build a PIC circuit that will trigger my scope early so I can try to see
some pre-trigger history).


None of this is to argue against having the best
instrumentation you can afford, or references to
check it against, or paying for calibration and so



I don't know if I really agree with that. ;-)


forth. But for myself, I need a dose of reality
from time to time when I start drooling over some
accuracy specs that I will never need at home. My
bet is that most of us are seduced by that same muse.

Ed




Here is a good "trick" to measure low currents with your DVM.
Make a switchable shunt box with (at least) the following full scale
ranges: 200nA (shunt resistor 1.11 megs), 2uA (shunt resistor 101K),
20uA (shunt resistor 10.0K), 200uA (shunt resistor 1.00K).
Put a twisted pair of leads (red, black) with banana plugs (red,
black) running out of the box via a small grommet, to plug into your DVM
set to the 200mV scale; a pair of (red, black) banana jacks with 0.75"
spacing is mounted on the box for your test leads.
Hint: add to the legend the parallel resistance of the system
(200nA/1M, 2uA/100K, etc) as a reminder of the resistance of this
current meter scheme.
Added hint: the 200mV scale is good for 20nA full scale, just
remember the meter resistance is 10 megs.
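The arithmetic behind those ranges can be sanity-checked numerically. A minimal sketch (Python), taking the shunt resistor values listed above and assuming the hinted 10 meg DVM input impedance on the 200mV scale:

```python
# Sanity-check the shunt-box ranges: each shunt, in parallel with the
# DVM's 10 Mohm input impedance on the 200 mV scale, should give the
# stated full-scale current and (roughly) the round-number net resistance.

DVM_INPUT = 10e6       # ohms, assumed DVM input impedance on 200 mV scale
FULL_SCALE_V = 0.200   # volts, 200 mV range

ranges = {             # target full-scale current -> shunt resistor from the post
    200e-9: 1.11e6,
    2e-6:   101e3,
    20e-6:  10.0e3,
    200e-6: 1.00e3,
}

for i_fs, r_shunt in ranges.items():
    r_net = r_shunt * DVM_INPUT / (r_shunt + DVM_INPUT)  # parallel combination
    i_actual = FULL_SCALE_V / r_net                      # current at full-scale volts
    print(f"shunt {r_shunt:>10.3g} ohm -> net {r_net:,.0f} ohm, "
          f"full scale {i_actual * 1e6:.4g} uA (target {i_fs * 1e6:.4g} uA)")
```

This shows why the odd-looking 1.11 meg and 101K values appear: the meter's own 10 megs in parallel pulls them down to the round 1 meg and 100K figures given in the legend hint.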
  #105   Report Post  
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 677
Default Calibration Of Electronic Equipment In The Home Workshop

In message , MassiveProng
writes
On Sat, 3 Mar 2007 23:56:14 +0000, Clint Sharp
Gave us:

So you don't know how to access the service menu and make changes to the
setup of your boob tube.



Sorry, you dumb****, but you assuming that all TVs have this
capacity proves even further how little you know about it.

So your TV doesn't have a service mode and it doesn't have any pots to
tweak? How, exactly, does it get adjusted in the factory or by a service
tech then? Just because *you* don't have access to it doesn't mean it
doesn't have a service mode where adjustments can be made via menu.

--
Clint Sharp


  #106   Report Post  
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 203
Default Calibration Of Electronic Equipment In The Home Workshop

ehsjr wrote:
Anthony Fremont wrote:
ehsjr wrote:


But I'm curious as to what home circuits need meters
that can read voltage accurately to 3 decimal places?
2 decimal places? The question for current measurement:
in what home brew circuit design/troubleshooting do you
need accuracy below the tens of mA digit ? *Need*, not



You surely didn't mean tens of _mA_, did you?


I surely meant tens of mA.

I build stuff with PICs as
you know, and some of it is designed to run on batteries and needs
to go for long periods of time unattended. The current draw for a
12F683 running at 31kHz is 11uA, sleep current is 50nA. If I could
only measure current to "tens of mA", I'd never know if the PIC was
setup right for low current draw and I certainly couldn't have any
idea of expected battery life. I wouldn't even know if it was
sleeping until it ate thru some batteries in a few days instead of
six or eight months. I think I have a need to measure fractions of
a uA.


You may, but not accuracy below the tens of _mA_ digit.
When you need accuracy below tens of mA, you measure
voltage across a resistance. It doesn't make a lot of


Isn't that exactly how my DMM does it?

sense to look for your meter to be accurate to 8 decimal
places for your .00000005 amp reading.


Now come on, the "8 decimal places" only applies if the scale is in an
Amps range. The meter would be in the 500uA full scale range where 50nA is
only 2 decimal places.

Here's how you do it with accuracy at the tens of _mV_ digit:

For 11 uA, put a 10K .01% resistor in series with
the supply and measure .11 volts across it. The voltage
would range from 0.109989 to 0.110011. Keep only
2 decimal places. Your computed current, worst case,
would be off by 1 uA


For 50 nA, use a 2 meg 1% resistor and measure .10
volts across it. The voltage would range from .099
to .101 taking the 1% into account. Throw out the
last digit. Your current computation would be off
worst case, by 5 nA.
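Both of those worked examples amount to bounding I = V/R by the sense resistor's tolerance. A minimal sketch (Python; `current_bounds` is an illustrative helper name, the values are the ones quoted above):

```python
# Worst-case current bounds when measuring voltage across a sense
# resistor of known tolerance: I = V / R, with R off by +/- tol.

def current_bounds(v_measured, r_nominal, tol):
    """Return (low, high) current for a resistor tolerance tol (0.01 = 1%)."""
    return (v_measured / (r_nominal * (1 + tol)),
            v_measured / (r_nominal * (1 - tol)))

# 11 uA through a 10K 0.01% resistor: ~0.11 V across it
lo, hi = current_bounds(0.11, 10e3, 0.0001)
print(f"10K 0.01%: {lo * 1e6:.4f} .. {hi * 1e6:.4f} uA")   # tight around 11 uA

# 50 nA through a 2 meg 1% resistor: ~0.10 V across it
lo, hi = current_bounds(0.10, 2e6, 0.01)
print(f"2M 1%:     {lo * 1e9:.2f} .. {hi * 1e9:.2f} nA")   # roughly 49.5 .. 50.5 nA
```

Note this only accounts for the resistor tolerance; the voltmeter's own error still has to be added on top, per the rest of the thread.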


Those are fine ways to measuring static current levels, but they will not
work for me. Until the PIC goes to sleep, the current draw is much higher.
So much so that it would never power up thru a 2M resistor.

With a voltmeter accurate to 2 decimal places.
I don't know why you would


If your volt meter has a 1V maximum at full scale and one can live with 10%
error, then I agree. If it has a 100V range, then you need .01% accuracy on
your equipment to make your measurements, right?



*want*. Do you even trust your DMM on an amps setting
for those measurements, or do you measure the current
indirectly? How about ohms? Would you trust any
DMM, regardless of who calibrated it, to measure
down in the milliohm numbers?

To me, the design of the circuit being measured has
to take care of all of that crap. If it is so
poorly designed that a 10 mV departure from nominal
(that is missed by my inaccurate meter) will keep
it from working, that suggests other problems.
Yes, the home "lab" person wants extreme accuracy
to as many decimal places as he can get. But when does
he ever really need it?



When he needs it he needs it, what can I say?


I asked, looking for concrete cases. Your case
with the PIC is an excellent example of when a
person needs to know about really small currents.
It definitely fits into the difference I had in mind
between "needs" and "wants". But it does not mean he
needs accuracy out to 8 decimal places. He needs it to
2 decimal places, as was shown. Three decimal places
would be nice. :-)


Aren't you arbitrarily relocating your base measurement scale to uA or nA
and then claiming that you're only being accurate to two decimal places?
You are still measuring current to the same "8 decimal places" in terms of
whole Amps, you just moved the decimal around.

IMO, it's not about decimal places at all, that's just a matter of scale.
It's about accuracy. 10% ain't good enough, and that's only accounting for
the error in your shunt resistors. :-)

Do I really "need" a new DSO?


I have no opinion on that, and it would be irrelevant
if I did. I don't know what your situation is.

Well I've managed to get by all this time without one, so maybe you
think I don't really "need" one. I see it like this though, I don't
get allot of time to tinker anymore. I'd like to spend it more
productively. Instead of fumbling around and trying to devise silly
methods to make my existing equipment do something it wasn't
designed to (like going off on a tangent to build a PIC circuit that
will trigger my scope early so I can try to see some pre-trigger
history).
None of this is to argue against having the best
instrumentation you can afford, or references to
check it against, or paying for calibration and so



I don't know if I really agree with that. ;-)


Well, you're free to argue against having the best
instrumentation you can afford, or having references
to check it against or getting it calibrated or
whatever, if that's how you feel. I tend to err on


Actually, I'm all for that part.

the side of wanting the best even when it is
not the best fit for what I really need.


And this is what I do as well. I'd rather have a margin of overkill than to
be constantly living with sacrifice by saving a couple of bucks on an NRE.
What I wasn't "sure" about was whether _you_ really felt that way. ;-)


Ed



forth. But for myself, I need a dose of reality
from time to time when I start drooling over some
accuracy specs that I will never need at home. My
bet is that most of us are seduced by that same muse.

Ed



  #107   Report Post  
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 677
Default Calibration Of Electronic Equipment In The Home Workshop

In message , MassiveProng
writes
On Sat, 3 Mar 2007 23:56:14 +0000, Clint Sharp
Gave us:

Good for you, please explain how the OP was going to use a DVD to
calibrate his test equipment.



Stupid ****. The suggestion I posed mine against was some twit
suggesting WWV and a 1kHz tone, which is about as old hat as it gets.

You should really learn to read ENTIRE threads before you mouth off,
jackass.

And I asked, does the stability of the clock in a DVD player affect the
accuracy of the tones replayed? You still haven't given a proper answer
to that. Maybe because you know it does and that blows down your house
of cards.
--
Clint Sharp
  #108   Report Post  
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 677
Default Calibration Of Electronic Equipment In The Home Workshop

In message , MassiveProng
writes
On Sat, 3 Mar 2007 23:56:14 +0000, Clint Sharp
Gave us:

I think, you just rant. Please get it right. Maybe you could use that
DVD to calibrate your anger response, maybe you could eBay it and your
home audio system to pay for some anger management?



**** you, you ****ing retard. Meet up with me, and I'll show you
how I manage it.

For the moment I really think you need to avoid situations where your
'intellect' and experience could be challenged as it seems to provoke an
anger response which must be detrimental to whichever course of therapy
you are in. If you're not in therapy, you should consider it. Life's too
short to be that angry all the time.
--
Clint Sharp
  #109   Report Post  
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 203
Default Calibration Of Electronic Equipment In The Home Workshop

MassiveProng wrote:
On Sun, 4 Mar 2007 03:10:34 -0600, "Anthony Fremont"
Gave us:

I believe we were talking about scopes only being about 3% accurate
in the vertical.


I think you should read a ****ing thread before you mouth off,
dip****. You ain't too bright, boy.

QUOTED:

Me:
Your micronta? Bwuahahahahahah!



The ditz:
Since 3% accuracy is considered good in the scope world, I think it
would do
fine.


That's not the part I had in mind, but that will do for now. You really
didn't understand the statement did you? I'm talking about scope accuracy,
nothing about the meter at this point because I already brought that up IN
MY FIRST POST IN THIS THREAD.

QUOTED:
"Are you suggesting that I should drag it across town, spend $200 and be
without it for 2 weeks just to get it adjusted by some obstinate, E-1 grade
line tech, instead of using a brand new DMM w .03% accuracy to tweak it
myself? I'm quite sure that my Micronta is up to the task to be honest."

What part of that don't you understand? PLEASE READ MY POSTS BEFORE ****ING
YOURSELF!!!!! If you don't understand something just ask for help. A .03%
METER CAN BE USED, ESPECIALLY WHEN THAT METER IS AVAILABLE WITH NIST CERTS,
NOW CAN'T IT????

Now, go off, change your nym again and see how many posts you can make
before I recognize you.

Now, please FOAD unless you can learn to control yourself better.



  #110   Report Post  
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 69
Default Calibration Of Electronic Equipment In The Home Workshop

MassiveProng wrote:

On 2 Mar 2007 15:09:30 -0800, "David L. Jones"
Gave us:


Which is why you do it for each range and then spot check it to see
that there is no funny business. Perfectly valid technique for home
calibration of a scope vertical scale.

Dave



It doesn't matter how many "places" you "spot check" it, you are not
going to get the accuracy of your comparison standard on the device
you intend to set with it. What you do is take the basic INaccuracy
of the device needing to be set, and add to it the basic INaccuracy of
the standard to which you are setting it. You CANNOT get any closer
than that. So, a 0.5% meter, and a 0.5% scope cannot be used together
to make the scope that accurate. You need a *finer* standard than the
accuracy level you wish to achieve.

You need to understand that as a basic fact, chucko.

Furthermore, an analog scope cannot measure better than 1% (ie one
part in 100 of what is on the scope face).
Now one can "cheat" by using a precision offset differenced with an
input and that difference amplified to *display* (part of) that
difference: note the "Z", the "W", and the more modern "7A13" type plugins.
But *on the screen*, I defy anyone to consistently "read" better than
one part in 100 (ie if 10 divisions on screen, read to better than 0.1
division on a consistent basis).
Thus, for a scope, one might use standards good to 5 or more places,
but the result will be no better than what has been called "slide rule
accuracy".
Do you believe all 15 digits of each and every number in a computer
printout?


  #111   Report Post  
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 69
Default Calibration Of Electronic Equipment In The Home Workshop

MassiveProng wrote:

On Sat, 03 Mar 2007 05:22:58 GMT, ehsjr Gave
us:


Too_Many_Tools wrote:

I have a well stocked test bench at home containing a range of analog,
digital and RF test equipment as I am sure most of you also do.

Well the question I have is how do you handle the calibration of your
equipment? What do you use for calibration standards for resistance,
voltage, current and frequency?

Links to recommended circuits, pictures and sources would be
appreciated.

Since this is a need for anyone who has test equipment, I hope to see
a good discussion on this subject.

Thanks

TMT


The real question is how much precision do you
really need in the home "lab"? How often have
you needed to use your DMM with how many
*accurate* significant digits? 100 minus some
*very* small percent of the time, 2 significant
digits is all you need. Do you _really_ care
if your 5.055 volt reading is really 5.06 or 5.04?

Oh hell yes, I want to puff out my chest like everyone
else and think I have *accurate* equipment.

But I'm curious as to what home circuits need meters
that can read voltage accurately to 3 decimal places?
2 decimal places? The question for current measurement:
in what home brew circuit design/troubleshooting do you
need accuracy below the tens of mA digit ? *Need*, not
*want*. Do you even trust your DMM on an amps setting
for those measurements, or do you measure the current
indirectly? How about ohms? Would you trust any
DMM, regardless of who calibrated it, to measure
down in the milliohm numbers?

To me, the design of the circuit being measured has
to take care of all of that crap. If it is so
poorly designed that a 10 mV departure from nominal
(that is missed by my inaccurate meter) will keep
it from working, that suggests other problems.
Yes, the home "lab" person wants extreme accuracy
to as many decimal places as he can get. But when does
he ever really need it?

None of this is to argue against having the best
instrumentation you can afford, or references to
check it against, or paying for calibration and so
forth. But for myself, I need a dose of reality


from time to time when I start drooling over some


accuracy specs that I will never need at home. My
bet is that most of us are seduced by that same muse.




Modern instrument accuracies are so good, and they hold their setup so
well, that opening one up and tweaking it without a professional
calibration standard on hand is ludicrous in the extreme.

No matter how smart one is, if one has an instrument, and wants to
test accuracy, one should make an appearance somewhere where an
already recently calibrated instrument is available to EXAMINE your
instrument against.

NONE should be "adjusted" at all ever if the variance is too small
to warrant it, and even pro calibrators follow this creed. If at all
possible, their main task is to VERIFY an instrument's accuracy
WITHOUT making ANY adjustment. ANY that DO need adjustments are
typically marked "defective" and require a factory inspection/repair.

I speak from experience, so I don't care what the ToolTard thinks
about his capacity for the task, he is a ****ing retard if he tries it
without first checking his gear against known good gear.

It really is THAT SIMPLE.

Check, and check mate; end game.
  #112   Report Post  
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 677
Default Calibration Of Electronic Equipment In The Home Workshop

In message , Anthony Fremont
writes
MassiveProng wrote:
On Sat, 3 Mar 2007 23:56:14 +0000, Clint Sharp
Gave us:

I think, you just rant. Please get it right. Maybe you could use that
DVD to calibrate your anger response, maybe you could eBay it and
your home audio system to pay for some anger management?



**** you, you ****ing retard. Meet up with me, and I'll show you
how I manage it.


Man does that ever sound like a personal threat.


I'm not the least worried by him. He's just an interesting diversion,
provoking an anger response from him is a bit like shooting fish in a
barrel but it's starting to get boring as it's so easy and his
vocabulary is pretty small really.
--
Clint Sharp
  #113   Report Post  
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 69
Default Calibration Of Electronic Equipment In The Home Workshop

MassiveProng wrote:

On 3 Mar 2007 23:08:30 -0800, "David L. Jones"
Gave us:


On Mar 4, 12:38 pm, MassiveProng
wrote:

On 2 Mar 2007 15:09:30 -0800, "David L. Jones"
Gave us:


Which is why you do it for each range and then spot check it to see
that there is no funny business. Perfectly valid technique for home
calibration of a scope vertical scale.

Dave

It doesn't matter how many "places" you "spot check" it, you are not
going to get the accuracy of your comparison standard on the device
you intend to set with it. What you do is take the basic INaccuracy
of the device needing to be set, and add to it the basic INaccuracy of
the standard to which you are setting it. You CANNOT get any closer
than that. So, a 0.5% meter, and a 0.5% scope cannot be used together
to make the scope that accurate. You need a *finer* standard than the
accuracy level you wish to achieve.

You need to understand that as a basic fact, chucko.


LMAO!
If I use 0.5% accurate meter to adjust a something, then the accuracy
of that adjusted device at that point in time at that adjusted value
*becomes* 0.5%.



Absolutely incorrect!

If you do that, the MINIMUM error is 0.5%. It is ALWAYS greater
than that value by that value plus the error of the device you think
you set.

How can you not understand that basic fact?


The device that was adjusted only gets its accuracy
figure of 0.5% *after* the adjustment.



Absolutely INCORRECT!

The error of a device is NOT tied to how it got set or what it got
set with, dip****, it is tied to the precision of the circuits the
device is based upon.


The 0.5% of the device does NOT
get added to the 0.5% of the meter in this particular case!



Wanna bet?

Furthermore, if one did this procedure using thousands of meters to
"calibrate" thousands of other meters, the net resulting error is *NOT*
the sum; it is the square root of the sum of the squares!
But taking only *one* reference ("standard") and using it to
"calibrate" only one device, the result is technically indeterminate but
may be bounded by the sum of the (instrument) errors - and could be
*worse* (anybody hear of "cockpit errors"?).
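The two combination rules being contrasted here, worst-case summation versus root-sum-square for independent errors, can be compared directly. A minimal sketch (Python; the function names are illustrative), using the thread's 0.5% meter and 0.5% scope:

```python
import math

# Combining the 0.5% meter and 0.5% scope errors from the thread:
# worst-case (errors simply add) vs. RSS (independent random errors).

def worst_case(*errs):
    # all errors conspire in the same direction
    return sum(errs)

def rss(*errs):
    # root-sum-square: statistically independent errors
    return math.sqrt(sum(e * e for e in errs))

meter, scope = 0.5, 0.5   # percent
print(f"worst case: {worst_case(meter, scope):.3f}%")  # 1.000%
print(f"RSS:        {rss(meter, scope):.3f}%")         # 0.707%
```

As the post says, RSS only applies to large populations of independent errors; for a single calibration of one instrument against one reference, the worst-case sum is the bound you have to assume.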
  #114   Report Post  
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 69
Default Calibration Of Electronic Equipment In The Home Workshop

ehsjr wrote:

Anthony Fremont wrote:

ehsjr wrote:

Too_Many_Tools wrote:

I have a well stocked test bench at home containing a range of
analog, digital and RF test equipment as I am sure most of you also
do. Well the question I have is how do you handle the calibration of
your
equipment? What do you use for calibration standards for resistance,
voltage, current and frequency?

Links to recommended circuits, pictures and sources would be
appreciated.

Since this is a need for anyone who has test equipment, I hope to see
a good discussion on this subject.

Thanks

TMT


The real question is how much precision do you
really need in the home "lab"? How often have
you needed to use your DMM with how many
*accurate* significant digits? 100 minus some
*very* small percent of the time, 2 significant
digits is all you need. Do you _really_ care
if your 5.055 volt reading is really 5.06 or 5.04?




Oh hell yes, I want to puff out my chest like everyone
else and think I have *accurate* equipment.

But I'm curious as to what home circuits need meters
that can read voltage accurately to 3 decimal places?
2 decimal places? The question for current measurement:
in what home brew circuit design/troubleshooting do you
need accuracy below the tens of mA digit ? *Need*, not




You surely didn't mean tens of _mA_, did you?



I surely meant tens of mA.

I build stuff with PICs as

you know, and some of it is designed to run on batteries and needs to
go for long periods of time unattended. The current draw for a 12F683
running at 31kHz is 11uA, sleep current is 50nA. If I could only
measure current to "tens of mA", I'd never know if the PIC was setup
right for low current draw and I certainly couldn't have any idea of
expected battery life. I wouldn't even know if it was sleeping until
it ate thru some batteries in a few days instead of six or eight
months. I think I have a need to measure fractions of a uA.



You may, but not accuracy below the tens of _mA_ digit.
When you need accuracy below tens of mA, you measure
voltage across a resistance. It doesn't make a lot of
sense to look for your meter to be accurate to 8 decimal
places for your .00000005 amp reading.

Here's how you do it with accuracy at the tens of _mV_ digit:

For 11 uA, put a 10K .01% resistor in series with
the supply and measure .11 volts across it. The voltage
would range from 0.109989 to 0.110011. Keep only
2 decimal places. Your computed current, worst case,
would be off by 1 uA

For 50 nA, use a 2 meg 1% resistor and measure .10
volts across it. The voltage would range from .099
to .101 taking the 1% into account. Throw out the
last digit. Your current computation would be off
worst case, by 5 nA.

With a voltmeter accurate to 2 decimal places.
I don't know why you would



*want*. Do you even trust your DMM on an amps setting
for those measurements, or do you measure the current
indirectly? How about ohms? Would you trust any
DMM, regardless of who calibrated it, to measure
down in the milliohm numbers?

To me, the design of the circuit being measured has
to take care of all of that crap. If it is so
poorly designed that a 10 mV departure from nominal
(that is missed by my inaccurate meter) will keep
it from working, that suggests other problems.
Yes, the home "lab" person wants extreme accuracy
to as many decimal places as he can get. But when does
he ever really need it?




When he needs it he needs it, what can I say?



I asked, looking for concrete cases. Your case
with the PIC is an excellent example of when a
person needs to know about really small currents.
It definitely fits into the difference I had in mind
between "needs" and "wants". But it does not mean he
needs accuracy out to 8 decimal places. He needs it to
2 decimal places, as was shown. Three decimal places
would be nice. :-)

Do I really "need" a new DSO?



I have no opinion on that, and it would be irrelevant
if I did. I don't know what your situation is.

Well I've managed to get by all this time without one, so maybe you
think I don't really "need" one. I see it like this though, I don't
get a lot of time to tinker anymore. I'd like to spend it more
productively. Instead of fumbling around and trying to devise silly
methods to make my existing equipment do something it wasn't designed
to (like going off on a tangent to build a PIC circuit that will
trigger my scope early so I can try to see some pre-trigger history).


None of this is to argue against having the best
instrumentation you can afford, or references to
check it against, or paying for calibration and so




I don't know if I really agree with that. ;-)



Well, you're free to argue against having the best
instrumentation you can afford, or having references
to check it against or getting it calibrated or
whatever, if that's how you feel. I tend to err on
the side of wanting the best even when it is
not the best fit for what I really need.

Ed



forth. But for myself, I need a dose of reality
from time to time when I start drooling over some
accuracy specs that I will never need at home. My
bet is that most of us are seduced by that same muse.

Ed





Please see my earlier post regarding the use of a shunt box for a DVM.
  #115   Report Post  
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 69
Default Calibration Of Electronic Equipment In The Home Workshop

The Real Andy wrote:

On Sun, 04 Mar 2007 00:04:08 -0800, MassiveProng
wrote:


On Sun, 04 Mar 2007 14:14:56 +1000, The Real Andy
Gave us:


On Sat, 03 Mar 2007 19:21:44 -0800, MassiveProng
wrote:


On Sat, 3 Mar 2007 20:44:19 -0600, "Anthony Fremont"
Gave us:


MassiveProng wrote:

On 2 Mar 2007 15:09:30 -0800, "David L. Jones"
Gave us:


Which is why you do it for each range and then spot check it to see
that there is no funny business. Perfectly valid technique for home
calibration of a scope vertical scale.

Dave

It doesn't matter how many "places" you "spot check" it, you are not
going to get the accuracy of your comparison standard on the device
you intend to set with it. What you do is take the basic INaccuracy
of the device needing to be set, and add to it the basic INaccuracy of
the standard to which you are setting it. You CANNOT get any closer
than that. So, a 0.5% meter, and a 0.5% scope cannot be used together
to make the scope that accurate. You need a *finer* standard than the
accuracy level you wish to achieve.

You need to understand that as a basic fact, chucko.

The "basic fact" here is that we were talking about adjusting a 3% scope
with a .03% meter.

Nope. READ HIS replies. He was talking about using a 3% meter.


Now that the numbers are back where they belong, please
proceed to restate your case.

**** you. Read HIS criteria, dip****, don't impose yours. Remember,
it was ME that stated that the cal device had to be ten times more
accurate than the target to be cal'd. So **** off.


The scope's vertical sensitivity could easily
be adjusted to within 3% using said meter, now can't it? Just like Keith
says......

That is NOT what the retarded ******* said, you retarded *******.

Ahhh, who actually uses a scope to make accurate measurements?



I guess the same idiots that claim they can calibrate one with a 3%
meter.



My point exactly.


Also, if you do NOT know how to make accurate measurements with
scopes, you should be in some other industry.



More to the point, if you don't understand the concept of error then
you should be in another industry.

*POLITICS*! Oops, I did not mean to swear!


  #116   Report Post  
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 69
Default Calibration Of Electronic Equipment In The Home Workshop

MassiveProng wrote:

On Sun, 04 Mar 2007 18:32:57 +1000, The Real Andy
Gave us:


On Sun, 04 Mar 2007 00:04:08 -0800, MassiveProng
wrote:


On Sun, 04 Mar 2007 14:14:56 +1000, The Real Andy
Gave us:


On Sat, 03 Mar 2007 19:21:44 -0800, MassiveProng
g wrote:


On Sat, 3 Mar 2007 20:44:19 -0600, "Anthony Fremont"
Gave us:


MassiveProng wrote:

On 2 Mar 2007 15:09:30 -0800, "David L. Jones"
Gave us:


Which is why you do it for each range and then spot check it to see
that there is no funny business. Perfectly valid technique for home
calibration of a scope vertical scale.

Dave

It doesn't matter how many "places" you "spot check" it, you are not
going to get the accuracy of your comparison standard on the device
you intend to set with it. What you do is take the basic INaccuracy
of the device needing to be set, and add to it the basic INaccuracy of
the standard to which you are setting it. You CANNOT get any closer
than that. So, a 0.5% meter, and a 0.5% scope cannot be used together
to make the scope that accurate. You need a *finer* standard than the
accuracy level you wish to achieve.

You need to understand that as a basic fact, chucko.

The "basic fact" here is that we were talking about adjusting a 3% scope
with a .03% meter.

Nope. READ HIS replies. He was talking about using a 3% meter.


Now that the numbers are back where they belong, please
proceed to restate your case.

**** you. Read HIS criteria, dip****, don't impose yours. Remember,
it was ME that stated that the cal device had to be ten times more
accurate than the target to be cal'd. So **** off.


The scope's vertical sensitivity could easily
be adjusted to within 3% using said meter, now can't it? Just like Keith
says......

That is NOT what the retarded ******* said, you retarded *******.

Ahhh, who actually uses a scope to make accurate measurements?


I guess the same idiots that claim they can calibrate one with a 3%
meter.


My point exactly.



Good thing I never made that claim.


Also, if you do NOT know how to make accurate measurements with
scopes, you should be in some other industry.


More to the point, if you don't understand the concept of error then
you should be in another industry.



That is about the gist of what I have been trying to tell them.

Some dope thinking he can adjust his meter accurately with a damned
drifty voltage reference chip should have his head examined, not his
instruments!

Shoot, he could have a precision, very stable voltage reference, and
still bollix up the works!
  #117   Report Post  
Posted to sci.electronics.repair
external usenet poster
 
Posts: 203
Default Calibration Of Electronic Equipment In The Home Workshop

Jim Yanik wrote:
"Anthony Fremont" wrote in
:

MassiveProng wrote:
On 2 Mar 2007 15:09:30 -0800, "David L. Jones"
Gave us:

Which is why you do it for each range and then spot check it to see
that there is no funny business. Perfectly valid technique for home
calibration of a scope vertical scale.

Dave

It doesn't matter how many "places" you "spot check" it, you are
not going to get the accuracy of your comparison standard on the
device you intend to set with it. What you do is take the basic
INaccuracy of the device needing to be set, and add to it the basic
INaccuracy of the standard to which you are setting it. You CANNOT
get any closer than that. So, a 0.5% meter, and a 0.5% scope cannot
be used together to make the scope that accurate. You need a
*finer* standard than the accuracy level you wish to achieve.

You need to understand that as a basic fact, chucko.


The "basic fact" here is that we were talking about adjusting a 3%
scope with a .03% meter. Now that the number are back where they
belong, please procede to restate your case. The scope's vertical
sensitivity could easily be adjusted to within 3% using said meter,
now can't it? Just like Keith says......




Actually, one CAN calibrate an instrument to a greater accuracy than
its specified accuracy (for a short time); it's called a transfer
standard.
Of course, there are limits to how much greater accuracy you can
achieve, based on resolution and repeatability.

For ordinary cals, your standard should be at least 4x better than the
DUT. 10x is great.
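The 4x/10x rule above is just worst-case error arithmetic; a minimal Python sketch of it follows (the percentages are illustrative examples, not specs from any particular instrument):

```python
# When a device under test (DUT) is adjusted against a reference, the
# guaranteed result is the sum of the two worst-case errors; the
# root-sum-square figure is the more optimistic statistical estimate.

def worst_case(dut_pct, ref_pct):
    return dut_pct + ref_pct

def rss(dut_pct, ref_pct):
    return (dut_pct ** 2 + ref_pct ** 2) ** 0.5

# Two 0.5% instruments checked against each other: 1.0% worst case,
# so neither can be claimed to be set to 0.5%.
print(worst_case(0.5, 0.5))    # 1.0
# A 3% scope adjusted with a 0.03% meter (a 100:1 ratio): the
# reference contributes almost nothing to the total.
print(worst_case(3.0, 0.03))   # 3.03
# The common 4:1 test-uncertainty-ratio case:
print(worst_case(1.0, 0.25))   # 1.25
```

The worst-case sum is why the standard needs to be several times better than the DUT: at 4:1 the reference adds only a quarter of the DUT's own tolerance to the result.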


Thanks Jim. It's nice to read something that makes me feel like I still
retain some semblance of sanity. That all seems completely reasonable, and
you didn't even have to curse or make threats. ;-)


  #118   Report Post  
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 632
Default Calibration Of Electronic Equipment In The Home Workshop

On Sun, 04 Mar 2007 10:11:09 GMT, Robert Baer
Gave us:

Here is a good "trick" to measure low currents with your DVM.
Make a switchable shunt box with (at least) the following full scale
ranges: 200nA (shunt resistor 1.11 megs), 2uA (shunt resistor 101K),
20uA (shunt resistor 10.0K), 200uA (shunt resistor 1.00K).
Put a twisted pair of leads (red, black) with banana plugs (red,
black) running out of the box via a small grommet, to plug into your DVM
set to the 200mV scale; a pair of (red, black) banana jacks with 0.75"
spacing is mounted on the box for your test leads.
Hint: add to the legend the parallel resistance of the system
(200nA/1M, 2uA/100K, etc) as a reminder of the resistance of this
current meter scheme.
Added hint: the 200mV scale is good for 20nA full scale, just
remember the meter resistance is 10 megs.
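The shunt values above drop straight out of Ohm's law once the DVM's input resistance is accounted for; a short Python sketch (assuming the 200 mV full-scale reading and the 10-megohm DVM input resistance mentioned in the post):

```python
# Each range needs a total resistance (external shunt in parallel with
# the DVM's 10-Mohm input) that drops 200 mV at full-scale current.

V_FS = 0.200      # DVM full-scale voltage on the 200mV range
R_METER = 10e6    # typical DVM input resistance, ohms

def shunt_for(i_full_scale):
    r_total = V_FS / i_full_scale                  # parallel combination needed
    return 1.0 / (1.0 / r_total - 1.0 / R_METER)   # external resistor value

for amps, label in [(200e-9, "200nA"), (2e-6, "2uA"),
                    (20e-6, "20uA"), (200e-6, "200uA")]:
    print(f"{label}: {shunt_for(amps):,.0f} ohms")
# 200nA -> ~1.11M, 2uA -> ~101K, 20uA -> ~10.0K, 200uA -> ~1.00K,
# matching the values given in the post.
```

The same arithmetic confirms the "added hint": with no external shunt at all, 200 mV across the 10-megohm input is 20 nA full scale.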



Tell us, oh master... what does placing a 1.1 meg resistor in
series with a circuit's power source do to the voltage presented to
the circuit?

Shunt resistors are typically less than an ohm. Show me where
ANYONE uses a 1.1 meg resistor as a current shunt.
  #119   Report Post  
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 632
Default Calibration Of Electronic Equipment In The Home Workshop

On Sun, 4 Mar 2007 10:12:09 +0000, Clint Sharp
Gave us:



Sorry, you dumb****, but your assuming that all TVs have this
capacity proves even further how little you know about it.

So your TV doesn't have a service mode and it doesn't have any pots to
tweak? How, exactly, does it get adjusted in the factory or by a service
tech then? Just because *you* don't have access to it doesn't mean it
doesn't have a service mode where adjustments can be made via menu.


More proof that you know very little if anything at all about FPDs.
  #120   Report Post  
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
external usenet poster
 
Posts: 632
Default Calibration Of Electronic Equipment In The Home Workshop

On Sun, 4 Mar 2007 10:14:47 +0000, Clint Sharp
Gave us:

In message , MassiveProng
writes
On Sat, 3 Mar 2007 23:56:14 +0000, Clint Sharp
Gave us:

Good for you, please explain how the OP was going to use a DVD to
calibrate his test equipment.



Stupid ****. The suggestion I posed mine against was some twit
suggesting WWV and a 1kHz tone, which is about as old hat as it gets.

You should really learn to read ENTIRE threads before you mouth off,
jackass.

And I asked: does the stability of the clock in a DVD player affect the
accuracy of the tones replayed? You still haven't given a proper answer
to that. Maybe because you know it does, and that blows down your house
of cards.



You're a ****ing retard. I could take a hundred DVD players of
different brands and quality, and that one set-up disc, and all 100 of
the players would produce the EXACT same tone with less than 1Hz error.

Try again, you totally retarded ****!
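Setting the insults aside, the sub-1Hz claim is just crystal-tolerance arithmetic; a Python sketch (the ppm figures are typical consumer-crystal tolerances assumed for illustration, not specs of any actual player):

```python
# A DVD player derives its audio sample clock from a quartz crystal,
# so a replayed tone's frequency error scales with the crystal's
# parts-per-million error.

def tone_error_hz(nominal_hz, clock_ppm):
    return nominal_hz * clock_ppm / 1e6

# 1 kHz test tone from a +/-100 ppm crystal:
print(tone_error_hz(1000, 100))   # 0.1 (Hz)
# Even a sloppy +/-500 ppm clock keeps a 1 kHz tone within 1 Hz:
print(tone_error_hz(1000, 500))   # 0.5 (Hz)
```

So any player whose clock is within a few hundred ppm reproduces a 1 kHz setup-disc tone well inside 1 Hz, which is the arithmetic behind the claim.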