Electronics Repair (sci.electronics.repair) Discussion of repairing electronic equipment. Topics include requests for assistance, where to obtain servicing information and parts, techniques for diagnosis and repair, and anecdotes about successes, failures and problems.
#1
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
MassiveProng wrote:
On Thu, 01 Mar 2007 06:09:35 GMT, Robert Baer Gave us:

David L. Jones wrote:

On Mar 1, 12:27 pm, "Too_Many_Tools" wrote: I have a well stocked test bench at home containing a range of analog, digital and RF test equipment, as I am sure most of you also do. Well, the question I have is how do you handle the calibration of your equipment? What do you use for calibration standards for resistance, voltage, current and frequency? Links to recommended circuits, pictures and sources would be appreciated. Since this is a need for anyone who has test equipment, I hope to see a good discussion on this subject.

If you've got that sort of gear at home then usually you have better (and calibrated) gear at work as well, in which case most of us would simply bring our gear in from home and spot-check it against the good gear. In the absence of this gear, you can simply use precision components. Voltage reference chips with 0.05% or better are cheap and readily available. 0.01% resistors are available too. If you have multiple meters, for example, you can also keep an eye on them by comparison: using any old component, if all three meters read the same then you can be pretty confident they haven't drifted. Checking scope horizontal timebases is easy with a crystal oscillator and divider. There are various methods for getting an accurate frequency standard, but one of the newest is a GPS-derived reference. Second-hand rubidium standards can also be had on eBay. Generally, though, good quality test gear does not drift out of spec, so the need for regular calibration is minimal.

Dave

True, 0.01% resistors are available, *but* they are extremely expensive (over $100 each) and they are made when and if the manufacturer sees fit to do so. Or if the order is large enough.

Ahhh yesss... the Golden Rule.
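Dave's suggestion of keeping an eye on multiple meters by comparison can be reduced to a quick interval-overlap check: each meter's reading plus its accuracy spec defines an interval, and if all the intervals overlap, none of the meters has obviously drifted. This is a minimal sketch; the specs and readings below are invented for illustration, not taken from any particular meter.

```python
# Sanity-check several meters against one another: if every pair of
# uncertainty intervals overlaps, the meters are mutually consistent.

def uncertainty(reading, pct_of_reading, counts, resolution):
    """Worst-case uncertainty for a DMM spec of +/-(% of reading + counts)."""
    return abs(reading) * pct_of_reading / 100.0 + counts * resolution

def meters_consistent(readings_and_specs):
    """True if every pair of (reading +/- uncertainty) intervals overlaps."""
    intervals = []
    for reading, pct, counts, res in readings_and_specs:
        u = uncertainty(reading, pct, counts, res)
        intervals.append((reading - u, reading + u))
    return all(lo1 <= hi2 and lo2 <= hi1
               for i, (lo1, hi1) in enumerate(intervals)
               for (lo2, hi2) in intervals[i + 1:])

# Three meters reading the same nominal 5 V source (hypothetical specs):
meters = [
    (5.002, 0.5, 2, 0.001),   # 3.5-digit handheld, +/-(0.5% + 2 counts)
    (4.998, 0.1, 2, 0.0001),  # 4.5-digit bench,    +/-(0.1% + 2 counts)
    (5.001, 0.5, 2, 0.001),   # another handheld
]
print(meters_consistent(meters))  # True: all intervals overlap
```

Note this only catches a meter that has drifted relative to the others; if all of them drifted together (unlikely, but possible with a shared cause) the check passes anyway, which is why it is a sanity check and not a calibration.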
#2
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
On Feb 28, 8:27 pm, "Too_Many_Tools" wrote:
I have a well stocked test bench at home containing a range of analog, digital and RF test equipment, as I am sure most of you also do. Well, the question I have is how do you handle the calibration of your equipment? What do you use for calibration standards for resistance, voltage, current and frequency? Links to recommended circuits, pictures and sources would be appreciated. Since this is a need for anyone who has test equipment, I hope to see a good discussion on this subject. Thanks, TMT

Thanks for the (positive) comments so far. I look forward to any more you might want to offer. Any circuits or examples others have done? Any cal boxes anyone has built?

TMT
#3
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
![]() "Too_Many_Tools" wrote in message oups.com... I look forward to any more you might want to offer. If it was me, I would start by reading Scroggie's Radio Laboratory Handbook. -- .. -- .. .. .. .. .. .. .. .. -- |
#4
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
On 28 Feb 2007 21:50:49 -0800 "Too_Many_Tools"
wrote in Message id: .com:

I look forward to any more you might want to offer. Any circuits or examples others have done?

For a cheap voltage reference, I would look into Analog Devices' AD780 series. The AD780BN has an initial error of ±1 mV and is available in a plastic 8-pin DIP for easy assembly. Download the data sheet and you'll find sample circuit diagrams. Be sure to use a nice clean power supply and good decoupling practices around the device.

For resistance, see my post with a link to Digikey.
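A reference like the AD780 still needs an error budget before you trust it. The ±1 mV initial-error figure is from the post above; the temperature coefficient and the temperature swing in this sketch are placeholder assumptions — check the actual data sheet for the grade you buy.

```python
# Rough worst-case error budget for a precision voltage reference.
# Only the +/-1 mV initial error comes from the post; tempco and
# temperature swing are assumed values for illustration.

V_NOM = 2.5          # nominal output, volts (AD780 2.5 V option)
INITIAL_ERR = 1e-3   # +/-1 mV initial error (AD780BN, per the post)
TEMPCO_PPM = 7       # ppm/degC -- assumed, verify against the data sheet
DELTA_T = 15         # degC swing around room temperature -- assumed

drift = V_NOM * TEMPCO_PPM * 1e-6 * DELTA_T   # drift over the swing, volts
worst_case = INITIAL_ERR + drift               # simple worst-case sum
print(f"worst-case error: {worst_case * 1e3:.3f} mV "
      f"({worst_case / V_NOM * 1e6:.0f} ppm)")
```

Even with these optimistic numbers the total lands around 500 ppm of the nominal output, which is plenty for checking a 3.5-digit handheld but marginal for verifying a good bench meter.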
#5
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
On 28 Feb 2007 21:50:49 -0800, "Too_Many_Tools"
Gave us:

Thanks for the (positive) comments so far. I look forward to any more you might want to offer. Any circuits or examples others have done? Any cal boxes that anyone have built? TMT

All the circuits suggested here, while appearing good on the surface, are all likely to introduce more error into your instruments than they correct. THAT IS WHY REAL calibration services are used, and why I suggested that if you do not intend to have it professionally calibrated, you... YOU IN PARTICULAR, should just leave the gear alone, as you are too ****ing stupid to do it without introducing the aforementioned error. In other words, they are better off UNcalibrated and still reliable, than after any futzing around a twit like you will do with them. You LACK the competence.

You want to attack my adulthood, fine, little boy. You are BRAINLESS for this task. I have worked in cal labs and in QA for years. You lose, sonny.
#6
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
You want to attack my adulthood, fine, little boy. You are BRAINLESS for this task.

I wasn't attacking your adulthood, MiniPrick... an adult would not behave in the manner you are. Now a child... yes, a child could easily be acting this way... so from now on we will call you MiniPrick.

I have worked in cal labs and in QA for years. You lose, sonny.

Laugh... laugh... laugh... emptying trash cans is NOT working in cal labs and QA.

Hey MiniPrick... you done with the homework assignment yet?

TMT

On Mar 1, 6:23 am, MassiveProng wrote: [snip]
#7
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
On 1 Mar 2007 19:28:47 -0800, "Too_Many_Tools"
Gave us:

Laugh... laugh... laugh... emptying trash cans is NOT working in cal labs and QA.

Said the utter retard that needed to ask in a BASIC electronics group about something which he should already know if he planned to attempt such a procedure. Nice try, retard boy. Too bad you are wrong... again.

Hey MiniPrick... you done with the homework assignment yet?

That of calling you the retarded ****head that you are? Sure... done.

TMT, the total Usenet retard

Yep... that'd be you. Your nym is more correct than you'll ever know. You're a jack-of-no-trades. You're a real piece of ****... errr... work, there, bub.

My first advice was spot on. To make a proper cal, the source has to be ten times better than the accuracy you wish to claim for the instrument. NONE of the circuits given in this thread are good enough. ALL of those IC chips drift with temperature so much that calling them a cal source is ludicrous. So are you, if you think I don't know quality assurance and proper procedure. You ain't it.
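Stripped of the invective, the "ten times better" requirement in the post above is the classic test-uncertainty-ratio (TUR) rule used by cal labs: the reference's uncertainty should be no more than a tenth of the tolerance you are trying to verify. A minimal sketch, with illustrative numbers:

```python
# The 10:1 rule as a test-uncertainty-ratio (TUR) check.

def tur(instrument_tolerance, reference_uncertainty):
    """Test uncertainty ratio: tolerance under test / reference uncertainty."""
    return instrument_tolerance / reference_uncertainty

def reference_adequate(instrument_tolerance, reference_uncertainty,
                       minimum=10.0):
    """True if the reference meets the required minimum TUR."""
    return tur(instrument_tolerance, reference_uncertainty) >= minimum

# Verifying a 0.1% meter on a 10 V range (tolerance = 10 mV):
print(reference_adequate(10e-3, 0.5e-3))  # 0.5 mV reference, TUR 20: adequate
print(reference_adequate(10e-3, 2e-3))    # 2 mV reference, TUR 5: not adequate
```

In practice many labs accept 4:1 rather than 10:1, which is why `minimum` is a parameter; the 10:1 figure here is the one the poster insists on.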
#8
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
On Mar 2, 1:46 am, MassiveProng
wrote: [snip]

So it sounds like you are having a problem finding two brain cells, MiniPrick... try harder. No more of your excuses... SHOW us how great you are.

Laugh... laugh... laugh...

TMT
#9
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
Too_Many_Tools wrote:
I have a well stocked test bench at home containing a range of analog, digital and RF test equipment as I am sure most of you also do. Well the question I have is how do you handle the calibration of your equipment? What do you use for calibration standards for resistance, voltage, current and frequency? Links to recommended circuits, pictures and sources would be appreciated. Since this is a need for anyone who has test equipment, I hope to see a good discussion on this subject. Thanks TMT

You left many things unsaid: (a) traceable to NBS/NIST or not; (b) if not, how many reliable digits; (c) at what cost.

You can make a 5 V "standard" that any loading will not damage, with error better than 0.5 mV, that runs for at least 6 months with no observable change - and the cost is only a few dollars (it uses off-the-shelf parts).

You can buy, through DigiKey, resistors rated at 0.05% and at 0.1% - rather decent as references.

You can buy a Fluke bench meter rated at 6.5 digits, and even pay a bit more for traceability.
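The "how many reliable digits" question above has a quick answer for the home-made 5 V standard: 0.5 mV on 5 V is a relative error of 0.01%, which supports roughly four significant digits. A small sketch of that arithmetic:

```python
import math

# Reliable significant digits from an absolute error figure:
# roughly -log10(relative error), rounded down.

def reliable_digits(value, abs_error):
    # tiny epsilon guards against floating-point rounding at exact powers of 10
    return math.floor(-math.log10(abs_error / value) + 1e-9)

rel = 0.5e-3 / 5.0
print(f"relative error: {rel * 100:.2f}%")   # 0.01%
print(reliable_digits(5.0, 0.5e-3))          # about 4 significant digits
```

So the few-dollar standard is a genuine match for a 3.5-digit or 4.5-digit handheld, but it cannot verify a 6.5-digit bench meter to anywhere near the meter's own spec.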
#11
Posted to sci.electronics.repair
Dr. Anton T. Squeegee wrote in
: In article m, (known to some as Too_Many_Tools) scribed...

I have a well stocked test bench at home containing a range of analog, digital and RF test equipment as I am sure most of you also do.

snippety

I could post pictures... ;-)

Well the question I have is how do you handle the calibration of your equipment? What do you use for calibration standards for resistance, voltage, current and frequency?

Hmm. Excellent question.

For frequency, I actually have three different references, all GPS-locked. One is my primary reference, an HP Z3801, as retired from a cellphone site. The second and third ones are both combination clocks and frequency references, one from Trak Systems (now Trak Microwave) and the other from Odetics/Zypher. All three use a very stable OCXO that is constantly disciplined by the GPS receiver. Long-term accuracy is on the order of 1E-11 or so. In other words, about as good as you can get without being NIST certified.

I don't have good primary voltage or current references as yet. That's on the 'Acquire' list for scrounging this year. For resistance, simple Pomona plugs with 0.01% tolerance resistors work pretty well for 2-wire. For anything more, I will probably have to rent one of the Fluke all-in-ones.

I'm just beginning to gather the goodies I need for calibrating my O-scope collection. That will eventually consist of Tektronix leveled sine-wave generators, and one of their CG5xxx series calibration generators.

Keep the peace(es).

TEK sold their TM500/5000 line to TEGAM years ago; they may still make some calibration products. www.tegam.com

-- Jim Yanik jyanik at kua.net
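To put the 1E-11 long-term accuracy of a GPS-disciplined OCXO in perspective, it helps to convert fractional frequency error into accumulated time error. The comparison figure for an ordinary uncalibrated crystal (20 ppm) is a typical assumption, not a quote from the post:

```python
# Accumulated time error per day from a fractional frequency error.

SECONDS_PER_DAY = 86400

def time_error_per_day(fractional_frequency_error):
    return SECONDS_PER_DAY * fractional_frequency_error

gpsdo = time_error_per_day(1e-11)   # GPS-disciplined OCXO, per the post
xtal = time_error_per_day(20e-6)    # ordinary 20 ppm crystal -- assumed

print(f"GPSDO:   {gpsdo * 1e6:.2f} microseconds/day")
print(f"crystal: {xtal:.2f} seconds/day")
```

That is under a microsecond per day for the GPSDO versus well over a second per day for a garden-variety crystal, which is why a GPS-derived reference is such a popular home frequency standard.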
#12
Posted to sci.electronics.repair
In article ,
(known to some as Jim Yanik) scribed...

snippety

www.tegam.com

Unfortunately, according to their product list, they have discontinued ALL their O-scope calibration products. However, that's not necessarily the end of the world, as it were. This simply means that there is a better chance of such gear showing up on the surplus market, which consequently creates a much better chance of my finding something useful. ;-) Thanks.

-- Dr. Anton T. Squeegee, Director, Dutch Surrealist Plumbing Institute (Known to some as Bruce Lane, KC7GR) http://www.bluefeathertech.com -- kyrrin a/t bluefeathertech d-o=t calm "Salvadore Dali's computer has surreal ports..."
#13
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
On Feb 28, 9:27 pm, "Too_Many_Tools" wrote:
I have a well stocked test bench at home containing a range of analog, digital and RF test equipment as I am sure most of you also do. Well the question I have is how do you handle the calibration of your equipment? What do you use for calibration standards for resistance, voltage, current and frequency?

It depends entirely on what you need the equipment for. If for any legal reason you need NBS traceability, then the question of how and how often is already answered by your regulatory agencies.

If you don't, then I cannot imagine that a couple of off-the-shelf precision resistors, voltage references, and frequency references (total cost: $10) would not be good enough for sanity checking for almost any pedestrian uses.

If you're the sort who keeps equipment on your bench just to calibrate equipment on your bench just to calibrate equipment on your bench, then any rational argument about traceability is pointless, because you've already set yourself up in an infinite circular loop.

Tim.
#14
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
Tim Shoppa wrote:
On Feb 28, 9:27 pm, "Too_Many_Tools" wrote: [snip]

If you're the sort who keeps equipment on your bench just to calibrate equipment on your bench just to calibrate equipment on your bench, then any rational argument about traceability is pointless because you've already set yourself up in an infinite circular loop. Tim.

A lot of good points have been made already, so I'll just add a small one. Don't mess with the calibration of quality equipment unless you have reason to believe the calibration is off AND THAT IS ADVERSELY AFFECTING YOUR WORK PRODUCTS. An amazing amount of electronics work has been done using equipment with non-current calibration stickers, some of which was out of calibration.

If metrology is something that interests you as a hobby, then jump into it and have fun.

Tim's last paragraph ought to be printed and framed.

Chuck
#15
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
chuck wrote in
: Tim Shoppa wrote: [snip]

If you're the sort who keeps equipment on your bench just to calibrate equipment on your bench just to calibrate equipment on your bench, then any rational argument about traceability is pointless because you've already set yourself up in an infinite circular loop. Tim.

Reminds me of the local TV station techs who insisted that the video gear of theirs I serviced and calibrated was off, and it turned out their 75 ohm termination was 87 ohms. Other techs double-terminated monitors, complained of low brightness, tried to tweak it in, and screwed it all up. Or they would have a "reference" generator at the end of hundreds of feet of coax and complain it was a few percent off.

A lot of good points have been made already so I'll just add a small one. Don't mess with calibration of quality equipment unless you have reason to believe the calibration is off AND THAT IS ADVERSELY AFFECTING YOUR WORK PRODUCTS. An amazing amount of electronics work has been done using equipment with non-current calibration stickers, some of which was out of calibration. If metrology is something that interests you as a hobby, then jump into it and have fun. Tim's last paragraph ought to be printed and framed.

Chuck

This is good advice, because without a service manual and cal procedure you have no way of knowing which adjustments INTERACT with others. Adjust a power supply, and gain and timing go out the window. Frequency-response tweaks can affect more than one area of the signal. For example, TEK 475s have multiple vertical gain adjustments, and different adjustments for the 2/5-10 mV ranges. And the gain affects F-response.

-- Jim Yanik jyanik at kua.net
#17
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
Jim Yanik wrote:
chuck wrote in : [snip]

Reminds me of the local TV station techs who insisted that the video gear of theirs I serviced and calibrated was off, and it turned out their 75 ohm termination was 87 ohms. Other techs double-terminated monitors, complained of low brightness, tried to tweak it in, and screwed it all up.

[snip]

Some TV stations had so much RFI that even a VOM had trouble reading properly on any scale!
#18
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
On Mar 1, 10:40 am, Jim Yanik wrote:
chuck wrote: [snip]

This is good advice, because without a service manual and cal procedure you have no way of knowing which adjustments INTERACT with others. Adjust a power supply, and gain and timing go out the window. Frequency-response tweaks can affect more than one area of the signal. For example, TEK 475s have multiple vertical gain adjustments, and different adjustments for the 2/5-10 mV ranges. And the gain affects F-response.

-- Jim Yanik jyanik at kua.net

I just got done calibrating an AM503 & A6302 current probe / amp somebody took a screwdriver to. Without the manual, all the required gear (PG506), and the cal fixtures, it would never have worked properly again. I work in a cal lab, and the best part about ISO 9002 was requiring the sealing stickers (cal void if seal is broken). We never used them prior to ISO certification.
#19
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
On Mar 2, 6:57 am, "carneyke" wrote:
On Mar 1, 10:40 am, Jim Yanik wrote: [snip]

I just got done calibrating an AM503 & A6302 current probe / amp somebody took a screwdriver to. Without the manual, all the required gear (PG506), and the cal fixtures, it would never have worked properly again. I work in a cal lab, and the best part about ISO 9002 was requiring the sealing stickers (cal void if seal is broken). We never used them prior to ISO certification.

Forgot to mention: if you want to cal your own gear, mark any pots / vari-caps and write down any software codes BEFORE changing anything. Do not adjust the compensation capacitors in any Tektronix attenuators without a PG506 and a procedure.

Jim Yanik - this note isn't for you, as you have seen the damage too.....
#20
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
Tim Shoppa wrote:
On Feb 28, 9:27 pm, "Too_Many_Tools" wrote: [snip]

If you're the sort who keeps equipment on your bench just to calibrate equipment on your bench just to calibrate equipment on your bench, then any rational argument about traceability is pointless because you've already set yourself up in an infinite circular loop. Tim.

Exactly and completely correct in all aspects.

I made a 0.1% resistance reference box - 100 ohms, 1K, 10K, 100K, 1M, 10M and 100M - that has been invaluable. I made a voltage reference box using an Intersil (was Xicor) 5 V FGA reference powered by a 9 V battery; good for source and sink, and that initial accuracy of 0.5 mV was hard to beat. My HP 5326B verifies the value within its accuracy as well as the 0.5 mV of the reference; both in the same region of fuzziness - so not too bad.

My handheld DVMs are "fair"; actually the 3.5-digit one is more stable and reliable in its readings than the 4.5-digit. That one can be set by only one pot, which either makes the resistance readings within spec or the DC readings within spec - but not both; I opted for DC reading accuracy. I hate it when I have to fiddle with what seems to be a perfectly good meter just to make it read correctly (based on two other references).

One of these daze, I may be rich enough to get a Fluke 8845A; and if *really* rich, will pay for a traceable meter!
#21
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
Too_Many_Tools wrote:
I have a well stocked test bench at home containing a range of analog, digital and RF test equipment as I am sure most of you also do. Well the question I have is how do you handle the calibration of your equipment? What do you use for calibration standards for resistance, voltage, current and frequency? Links to recommended circuits, pictures and sources would be appreciated. Since this is a need for anyone who has test equipment, I hope to see a good discussion on this subject. Thanks TMT

The real question is: how much precision do you really need in the home "lab"? How often have you needed to use your DMM with how many *accurate* significant digits? 100 minus some *very* small percent of the time, 2 significant digits is all you need. Do you _really_ care if your 5.055 volt reading is really 5.06 or 5.04?

Oh hell yes, I want to puff out my chest like everyone else and think I have *accurate* equipment. But I'm curious as to what home circuits need meters that can read voltage accurately to 3 decimal places? 2 decimal places?

The question for current measurement: in what home-brew circuit design/troubleshooting do you need accuracy below the tens-of-mA digit? *Need*, not *want*. Do you even trust your DMM on an amps setting for those measurements, or do you measure the current indirectly? How about ohms? Would you trust any DMM, regardless of who calibrated it, to measure down in the milliohm numbers?

To me, the design of the circuit being measured has to take care of all of that crap. If it is so poorly designed that a 10 mV departure from nominal (that is missed by my inaccurate meter) will keep it from working, that suggests other problems. Yes, the home "lab" person wants extreme accuracy to as many decimal places as he can get. But when does he ever really need it?

None of this is to argue against having the best instrumentation you can afford, or references to check it against, or paying for calibration and so forth.
But for myself, I need a dose of reality from time to time when I start drooling over some accuracy specs that I will never need at home. My bet is that most of us are seduced by that same muse. Ed |
#22
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
ehsjr wrote:
Too_Many_Tools wrote: I have a well stocked test bench at home containing a range of analog, digital and RF test equipment as I am sure most of you also do. Well the question I have is how do you handle the calibration of your equipment? What do you use for calibration standards for resistance, voltage, current and frequency? Links to recommended circuits, pictures and sources would be appreciated. Since this is a need for anyone who has test equipment, I hope to see a good discussion on this subject. Thanks TMT The real question is how much precision do you really need in the home "lab"? How often have you needed to use your DMM with how many *accurate* significant digits? 100 minus some *very* small percent of the time, 2 significant digits is all you need. Do you _really_ care if your 5.055 volt reading is really 5.06 or 5.04? Oh hell yes, I want to puff out my chest like everyone else and think I have *accurate* equipment. But I'm curious as to what home circuits need meters that can read voltage accurately to 3 decimal places? 2 decimal places? The question for current measurement: in what home brew circuit design/troubleshooting do you need accuracy below the tens of mA digit ? *Need*, not You surely didn't mean tens of _mA_, did you? I build stuff with PICs as you know, and some of it is designed to run on batteries and needs to go for long periods of time unattended. The current draw for a 12F683 running at 31kHz is 11uA, sleep current is 50nA. If I could only measure current to "tens of mA", I'd never know if the PIC was setup right for low current draw and I certainly couldn't have any idea of expected battery life. I wouldn't even know if it was sleeping until it ate thru some batteries in a few days instead of six or eight months. I think I have a need to measure fractions of a uA. *want*. Do you even trust your DMM on an amps setting for those measurements, or do you measure the current indirectly? How about ohms? 
Would you trust any DMM, regardless of who calibrated it, to measure down in the milliohm numbers? To me, the design of the circuit being measured has to take care of all of that crap. If it is so poorly designed that a 10 mV departure from nominal (that is missed by my inaccurate meter) will keep it from working, that suggests other problems. Yes, the home "lab" person wants extreme accuracy to as many decimal places as he can get. But when does he ever really need it? When he needs it he needs it, what can I say? Do I really "need" a new DSO? Well I've managed to get by all this time without one, so maybe you think I don't really "need" one. I see it like this though, I don't get a lot of time to tinker anymore. I'd like to spend it more productively. Instead of fumbling around and trying to devise silly methods to make my existing equipment do something it wasn't designed to (like going off on a tangent to build a PIC circuit that will trigger my scope early so I can try to see some pre-trigger history). None of this is to argue against having the best instrumentation you can afford, or references to check it against, or paying for calibration and so forth. I don't know if I really agree with that. ;-) But for myself, I need a dose of reality from time to time when I start drooling over some accuracy specs that I will never need at home. My bet is that most of us are seduced by that same muse. Ed
#23
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
Anthony Fremont wrote:
ehsjr wrote: Too_Many_Tools wrote: I have a well stocked test bench at home containing a range of analog, digital and RF test equipment as I am sure most of you also do. Well the question I have is how do you handle the calibration of your equipment? What do you use for calibration standards for resistance, voltage, current and frequency? Links to recommended circuits, pictures and sources would be appreciated. Since this is a need for anyone who has test equipment, I hope to see a good discussion on this subject. Thanks TMT The real question is how much precision do you really need in the home "lab"? How often have you needed to use your DMM with how many *accurate* significant digits? 100 minus some *very* small percent of the time, 2 significant digits is all you need. Do you _really_ care if your 5.055 volt reading is really 5.06 or 5.04? Oh hell yes, I want to puff out my chest like everyone else and think I have *accurate* equipment. But I'm curious as to what home circuits need meters that can read voltage accurately to 3 decimal places? 2 decimal places? The question for current measurement: in what home brew circuit design/troubleshooting do you need accuracy below the tens of mA digit ? *Need*, not You surely didn't mean tens of _mA_, did you? I surely meant tens of mA. I build stuff with PICs as you know, and some of it is designed to run on batteries and needs to go for long periods of time unattended. The current draw for a 12F683 running at 31kHz is 11uA, sleep current is 50nA. If I could only measure current to "tens of mA", I'd never know if the PIC was setup right for low current draw and I certainly couldn't have any idea of expected battery life. I wouldn't even know if it was sleeping until it ate thru some batteries in a few days instead of six or eight months. I think I have a need to measure fractions of a uA. You may, but not accuracy below the tens of _mA_ digit. 
When you need accuracy below tens of mA, you measure voltage across a resistance. It doesn't make a lot of sense to look for your meter to be accurate to 8 decimal places for your .00000005 amp reading. Here's how you do it with accuracy at the tens of _mV_ digit: For 11 uA, put a 10K .01% resistor in series with the supply and measure .11 volts across it. The voltage would range from 0.109989 to 0.110011. Keep only 2 decimal places. Your computed current, worst case, would be off by 1 uA. For 50 nA, use a 2 meg 1% resistor and measure .10 volts across it. The voltage would range from .099 to .101 taking the 1% into account. Throw out the last digit. Your current computation would be off worst case, by 5 nA. With a voltmeter accurate to 2 decimal places. I don't know why you would *want*. Do you even trust your DMM on an amps setting for those measurements, or do you measure the current indirectly? How about ohms? Would you trust any DMM, regardless of who calibrated it, to measure down in the milliohm numbers? To me, the design of the circuit being measured has to take care of all of that crap. If it is so poorly designed that a 10 mV departure from nominal (that is missed by my inaccurate meter) will keep it from working, that suggests other problems. Yes, the home "lab" person wants extreme accuracy to as many decimal places as he can get. But when does he ever really need it? When he needs it he needs it, what can I say? I asked, looking for concrete cases. Your case with the PIC is an excellent example of when a person needs to know about really small currents. It definitely fits into the difference I had in mind between "needs" and "wants". But it does not mean he needs accuracy out to 8 decimal places. He needs it to 2 decimal places, as was shown. Three decimal places would be nice. :-) Do I really "need" a new DSO? I have no opinion on that, and it would be irrelevant if I did. I don't know what your situation is.
Well I've managed to get by all this time without one, so maybe you think I don't really "need" one. I see it like this though, I don't get a lot of time to tinker anymore. I'd like to spend it more productively. Instead of fumbling around and trying to devise silly methods to make my existing equipment do something it wasn't designed to (like going off on a tangent to build a PIC circuit that will trigger my scope early so I can try to see some pre-trigger history). None of this is to argue against having the best instrumentation you can afford, or references to check it against, or paying for calibration and so forth. I don't know if I really agree with that. ;-) Well, you're free to argue against having the best instrumentation you can afford, or having references to check it against or getting it calibrated or whatever, if that's how you feel. I tend to err on the side of wanting the best even when it is not the best fit for what I really need. Ed But for myself, I need a dose of reality from time to time when I start drooling over some accuracy specs that I will never need at home. My bet is that most of us are seduced by that same muse. Ed
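ehsjr's indirect method in the post above (a precision series resistor plus a voltage reading) is easy to sanity-check numerically. A minimal sketch of his worst-case bounds, using the values from the post; only the resistor tolerance is modeled here, the voltmeter itself is assumed ideal, so this is an idealization rather than a full uncertainty budget:

```python
def current_bounds(v_reading, r_nominal, tol):
    """Worst-case current range (amps) for a voltage measured across a
    resistor of nominal value r_nominal with fractional tolerance tol."""
    i_min = v_reading / (r_nominal * (1 + tol))  # resistor at high end of tolerance
    i_max = v_reading / (r_nominal * (1 - tol))  # resistor at low end of tolerance
    return i_min, i_max

# 11 uA through a 10K 0.01% resistor: 0.11 V nominal reading
lo, hi = current_bounds(0.11, 10e3, 0.0001)
print(lo, hi)  # about 10.9989 uA .. 11.0011 uA

# 50 nA through a 2 meg 1% resistor: 0.10 V nominal reading
lo, hi = current_bounds(0.10, 2e6, 0.01)
print(lo, hi)  # about 49.5 nA .. 50.5 nA
```

The 1% resistor alone contributes roughly +/-0.5 nA here; the larger 5 nA figure in the post also folds in the truncation of the voltage reading to two decimal places.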
#24
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
On Sun, 04 Mar 2007 08:25:08 GMT, ehsjr Gave us:

> You may, but not accuracy below the tens of _mA_ digit. When you need accuracy below tens of mA, you measure voltage across a resistance. It doesn't make a lot of sense to look for your meter to be accurate to 8 decimal places for your .00000005 amp reading.

ALL handheld meters use voltage read across a precision shunt resistor for current readings. I am not talking about inductive probes. Standard current.
#25
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
ehsjr wrote:
Anthony Fremont wrote: ehsjr wrote: But I'm curious as to what home circuits need meters that can read voltage accurately to 3 decimal places? 2 decimal places? The question for current measurement: in what home brew circuit design/troubleshooting do you need accuracy below the tens of mA digit ? *Need*, not You surely didn't mean tens of _mA_, did you? I surely meant tens of mA. I build stuff with PICs as you know, and some of it is designed to run on batteries and needs to go for long periods of time unattended. The current draw for a 12F683 running at 31kHz is 11uA, sleep current is 50nA. If I could only measure current to "tens of mA", I'd never know if the PIC was setup right for low current draw and I certainly couldn't have any idea of expected battery life. I wouldn't even know if it was sleeping until it ate thru some batteries in a few days instead of six or eight months. I think I have a need to measure fractions of a uA. You may, but not accuracy below the tens of _mA_ digit. When you need accuracy below tens of mA, you measure voltage across a resistance. It doesn't make a lot of Isn't that exactly how my DMM does it? sense to look for your meter to be accurate to 8 decimal places for your .00000005 amp reading. Now come on, the 8 decimal places is only assuming that the scale is in an Amps range. The meter would be in the 500uA full scale range where 50nA is only 2 decimal places. Here's how you do it with accuracy at the tens of _mV_ digit: For 11 uA, put a 10K .01% resistor in series with the supply and measure .11 volts across it. The voltage would range from 0.109989 to 0.110011. Keep only 2 decimal places. Your computed current, worst case, would be off by 1 uA For 50 nA, use a 2 meg 1% resistor and measure .10 volts across it. The voltage would range from .099 to .101 taking the 1% into account. Throw out the last digit. Your current computation would be off worst case, by 5 nA. 
Those are fine ways of measuring static current levels, but they will not work for me. Until the PIC goes to sleep, the current draw is much higher. So much so that it would never power up thru a 2M resistor. With a voltmeter accurate to 2 decimal places. I don't know why you would If your volt meter has a 1V maximum at full scale and one can live with 10% error, then I agree. If it has a 100V range, then you need .01% accuracy on your equipment to make your measurements, right? *want*. Do you even trust your DMM on an amps setting for those measurements, or do you measure the current indirectly? How about ohms? Would you trust any DMM, regardless of who calibrated it, to measure down in the milliohm numbers? To me, the design of the circuit being measured has to take care of all of that crap. If it is so poorly designed that a 10 mV departure from nominal (that is missed by my inaccurate meter) will keep it from working, that suggests other problems. Yes, the home "lab" person wants extreme accuracy to as many decimal places as he can get. But when does he ever really need it? When he needs it he needs it, what can I say? I asked, looking for concrete cases. Your case with the PIC is an excellent example of when a person needs to know about really small currents. It definitely fits into the difference I had in mind between "needs" and "wants". But it does not mean he needs accuracy out to 8 decimal places. He needs it to 2 decimal places, as was shown. Three decimal places would be nice. :-) Aren't you arbitrarily relocating your base measurement scale to uA or nA and then claiming that you're only being accurate to two decimal places? You are still measuring current to the same "8 decimal places" in terms of whole Amps, you just moved the decimal around. IMO, it's not about decimal places at all, that's just a matter of scale. It's about accuracy. 10% ain't good enough, and that's only accounting for the error in your shunt resistors.
:-) Do I really "need" a new DSO? I have no opinion on that, and it would be irrelevant if I did. I don't know what your situation is. Well I've managed to get by all this time without one, so maybe you think I don't really "need" one. I see it like this though, I don't get a lot of time to tinker anymore. I'd like to spend it more productively. Instead of fumbling around and trying to devise silly methods to make my existing equipment do something it wasn't designed to (like going off on a tangent to build a PIC circuit that will trigger my scope early so I can try to see some pre-trigger history). None of this is to argue against having the best instrumentation you can afford, or references to check it against, or paying for calibration and so forth. I don't know if I really agree with that. ;-) Well, you're free to argue against having the best instrumentation you can afford, or having references to check it against or getting it calibrated or whatever, if that's how you feel. Actually, I'm all for that part. I tend to err on the side of wanting the best even when it is not the best fit for what I really need. And this is what I do as well. I'd rather have a margin of overkill than to be constantly living with sacrifice by saving a couple of bucks on an NRE. What I wasn't "sure" about was whether _you_ really felt that way. ;-) Ed But for myself, I need a dose of reality from time to time when I start drooling over some accuracy specs that I will never need at home. My bet is that most of us are seduced by that same muse. Ed
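Anthony's distinction between decimal places and accuracy comes down to one line of arithmetic: the same absolute error is negligible or fatal depending on the reading it rides on. A small illustrative sketch; the 5 nA worst-case figure is taken from ehsjr's 1% resistor example earlier in the thread:

```python
def pct_of_reading(abs_err, reading):
    """Express an absolute error as a percentage of the quantity measured."""
    return 100.0 * abs_err / reading

# 5 nA of error on the 50 nA sleep current: 10% of the reading
print(pct_of_reading(5e-9, 50e-9))

# the same 5 nA against the 11 uA run current: under 0.05% of the reading
print(pct_of_reading(5e-9, 11e-6))
```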
#26
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
Anthony Fremont wrote:
ehsjr wrote: Anthony Fremont wrote: ehsjr wrote: But I'm curious as to what home circuits need meters that can read voltage accurately to 3 decimal places? 2 decimal places? The question for current measurement: in what home brew circuit design/troubleshooting do you need accuracy below the tens of mA digit? *Need*, not You surely didn't mean tens of _mA_, did you? I surely meant tens of mA. I build stuff with PICs as you know, and some of it is designed to run on batteries and needs to go for long periods of time unattended. The current draw for a 12F683 running at 31kHz is 11uA, sleep current is 50nA. If I could only measure current to "tens of mA", I'd never know if the PIC was setup right for low current draw and I certainly couldn't have any idea of expected battery life. I wouldn't even know if it was sleeping until it ate thru some batteries in a few days instead of six or eight months. I think I have a need to measure fractions of a uA. You may, but not accuracy below the tens of _mA_ digit. When you need accuracy below tens of mA, you measure voltage across a resistance. It doesn't make a lot of Isn't that exactly how my DMM does it? I don't know what your meter does. I assume it's like any other. If so, it uses a shunt and develops a voltage across the shunt so it is the same principle as what I'm talking about, but not the same values. AFAIK, they don't use a megohm neighborhood shunt for low current - but then, I don't have any meters with an nA scale. sense to look for your meter to be accurate to 8 decimal places for your .00000005 amp reading. Now come on, the 8 decimal places is only assuming that the scale is in an Amps range. The meter would be in the 500uA full scale range where 50nA is only 2 decimal places. Perhaps I did not make the point clearly.
When you are using your DMM and measuring something in the neighborhood of 8 decimal places, like tens of nA, your meter, regardless of scale, will be less accurate than when it is measuring something in the 2 decimal place neighborhood. The meter itself is more susceptible to uncertainty the lower you go. AFAIK, the current shunt even for low current scales has a much lower resistance than the 2meg or 100 k I mentioned. That means that the meter has to work with a lower level than the 110 mV those resistors produce. Regarding scaling - DMM's have tens of mV in 2 decimal places. Most DMM's do not have tens of nA in 2 decimal places. To get an 8th decimal point current reading into the 2 decimal point range, convert it to mV with a resistor. To put it in another perspective, consider a Fluke 187. It will give .01 uA resolution (2 digits after the decimal) on the 500 uA scale at a claimed accuracy of +/- .25%. We'll ignore the further 20 count uncertainty. That's a +/- 1.25 uA error. That measurement is useless for the 55 nA current measurement you need. The meter could show 500.00 or 500.05 or 501.25 or whatever and you would not know whether you had 55 nA or not. On that scale, the meter cannot be accurate to 2 decimal places. And you cannot throw away the third digit after the decimal - it doesn't exist on the meter, the resolution is too poor. The same meter, on the 3 volt (3000mV) scale is accurate to within +/- .025% which is +/- 75 uV - again, ignoring the further 5 count uncertainty. On the 3 volt scale with the technique I mentioned where you throw away the third digit after the decimal, the error is meaningless. That digit happens to be accurate on this meter and scale, so the error is meaningless, even if you keep it.
Your computed current, worst case, would be off by 1 uA. For 50 nA, use a 2 meg 1% resistor and measure .10 volts across it. The voltage would range from .099 to .101 taking the 1% into account. Throw out the last digit. Your current computation would be off worst case, by 5 nA. Those are fine ways of measuring static current levels, but they will not work for me. Until the PIC goes to sleep, the current draw is much higher. So much so that it would never power up thru a 2M resistor. So I guess you're stuck with a need that the fancy Fluke mentioned above cannot meet. How _do_ you measure the 55 nA? What I would do is bypass the resistor with a switch so the PIC can power up and run, and monitor it while it is active by whatever technique you choose, so that you know it is active. When it goes inactive, open the switch to measure the voltage across the resistor. With a voltmeter accurate to 2 decimal places. I don't know why you would If your volt meter has a 1V maximum at full scale and one can live with 10% error, then I agree. If it has a 100V range, then you need .01% accuracy on your equipment to make your measurements, right? Anyone who is not smart enough to turn his meter range down from the 100V scale to measure mV is not smart enough to need nA measurements. Measuring mV with the range set to 100 is stupid. And 10% error for a DMM is stupid. I know you are *not* stupid. So what is your point? Ed
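ehsjr's Fluke 187 arithmetic above is worth making explicit: an error quoted as a fraction of a current range swamps a reading far below that range, which is why the nA measurement gets pushed onto the voltage scale instead. A quick sketch using the figures as quoted in the post (the spec numbers are taken at face value from the thread, and the extra counts are ignored there too):

```python
def range_error(full_scale, pct):
    """Absolute error when accuracy is quoted as a percentage of the range."""
    return full_scale * pct / 100.0

# 500 uA range at +/-0.25%: an error band of +/-1.25 uA
err = range_error(500e-6, 0.25)
print(err)  # about 1.25e-06 A

# the 55 nA target is far smaller than the error band itself
print(55e-9 < err)  # True
```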
#27
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
ehsjr wrote:
Anthony Fremont wrote: ehsjr wrote: Too_Many_Tools wrote: I have a well stocked test bench at home containing a range of analog, digital and RF test equipment as I am sure most of you also do. Well the question I have is how do you handle the calibration of your equipment? What do you use for calibration standards for resistance, voltage, current and frequency? Links to recommended circuits, pictures and sources would be appreciated. Since this is a need for anyone who has test equipment, I hope to see a good discussion on this subject. Thanks TMT The real question is how much precision do you really need in the home "lab"? How often have you needed to use your DMM with how many *accurate* significant digits? 100 minus some *very* small percent of the time, 2 significant digits is all you need. Do you _really_ care if your 5.055 volt reading is really 5.06 or 5.04? Oh hell yes, I want to puff out my chest like everyone else and think I have *accurate* equipment. But I'm curious as to what home circuits need meters that can read voltage accurately to 3 decimal places? 2 decimal places? The question for current measurement: in what home brew circuit design/troubleshooting do you need accuracy below the tens of mA digit ? *Need*, not You surely didn't mean tens of _mA_, did you? I surely meant tens of mA. I build stuff with PICs as you know, and some of it is designed to run on batteries and needs to go for long periods of time unattended. The current draw for a 12F683 running at 31kHz is 11uA, sleep current is 50nA. If I could only measure current to "tens of mA", I'd never know if the PIC was setup right for low current draw and I certainly couldn't have any idea of expected battery life. I wouldn't even know if it was sleeping until it ate thru some batteries in a few days instead of six or eight months. I think I have a need to measure fractions of a uA. You may, but not accuracy below the tens of _mA_ digit. 
When you need accuracy below tens of mA, you measure voltage across a resistance. It doesn't make a lot of sense to look for your meter to be accurate to 8 decimal places for your .00000005 amp reading. Here's how you do it with accuracy at the tens of _mV_ digit: For 11 uA, put a 10K .01% resistor in series with the supply and measure .11 volts across it. The voltage would range from 0.109989 to 0.110011. Keep only 2 decimal places. Your computed current, worst case, would be off by 1 uA For 50 nA, use a 2 meg 1% resistor and measure .10 volts across it. The voltage would range from .099 to .101 taking the 1% into account. Throw out the last digit. Your current computation would be off worst case, by 5 nA. With a voltmeter accurate to 2 decimal places. I don't know why you would *want*. Do you even trust your DMM on an amps setting for those measurements, or do you measure the current indirectly? How about ohms? Would you trust any DMM, regardless of who calibrated it, to measure down in the miliohm numbers? To me, the design of the circuit being mesured has to take care of all of that crap. If it is so poorly designed that a 10 mV departure from nominal (that is missed by my innaccurate meter) will keep it from working, that suggests other problems. Yes, the home "lab" person wants extreme accuracy to as many decimal places as he can get. But when does he ever really need it? When he needs it he needs it, what can I say? I asked, looking for concrete cases. Your case with the PIC is an excellent example of when a person needs to know about really small currents. It definitely fits into the difference I had in mind between "needs" and "wants". But it does not mean he needs accuracy out to 8 decimal places. He needs it to 2 decimal places, as was shown. Three decimal places would be nice. :-) Do I really "need" a new DSO? I have no opinion on that, and it would be irrelevant if I did. I don't know what your situation is. 
Well I've managed to get by all this time without one, so maybe you think I don't really "need" one. I see it like this though, I don't get allot of time to tinker anymore. I'd like to spend it more productively. Instead of fumbling around and trying to devise silly methods to make my existing equipment do something it wasn't designed to (like going off on a tangent to build a PIC circuit that will trigger my scope early so I can try to see some pre-trigger history). None of this is to argue against having the best instrumentation you can afford, or references to check it against, or paying for calibration and so I don't know if I really agree with that. ;-) Well, you're free to argue against having the best instrumentation you can afford, or having references to check it against or getting it calibrated or whatever, if that's how you feel. I tend to err on the side of wanting the best even when it is not the best fit for what I really need. Ed forth. But for myself, I need a dose of reality from time to time when I start drooling over some accuracy specs that I will never need at home. My bet is that most of us are seduced by that same muse. Ed Please see my earlier post regarding the use of a shunt box for a DVM. |
#28
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
On Sun, 04 Mar 2007 10:38:28 GMT, Robert Baer Gave us:

> Please see my earlier post regarding the use of a shunt box for a DVM.

What a joke that was. When is a 1.1M resistor EVER a current shunt?
#29
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
Anthony Fremont wrote:
ehsjr wrote: Too_Many_Tools wrote: I have a well stocked test bench at home containing a range of analog, digital and RF test equipment as I am sure most of you also do. Well the question I have is how do you handle the calibration of your equipment? What do you use for calibration standards for resistance, voltage, current and frequency? Links to recommended circuits, pictures and sources would be appreciated. Since this is a need for anyone who has test equipment, I hope to see a good discussion on this subject. Thanks TMT The real question is how much precision do you really need in the home "lab"? How often have you needed to use your DMM with how many *accurate* significant digits? 100 minus some *very* small percent of the time, 2 significant digits is all you need. Do you _really_ care if your 5.055 volt reading is really 5.06 or 5.04? Oh hell yes, I want to puff out my chest like everyone else and think I have *accurate* equipment. But I'm curious as to what home circuits need meters that can read voltage accurately to 3 decimal places? 2 decimal places? The question for current measurement: in what home brew circuit design/troubleshooting do you need accuracy below the tens of mA digit ? *Need*, not You surely didn't mean tens of _mA_, did you? I build stuff with PICs as you know, and some of it is designed to run on batteries and needs to go for long periods of time unattended. The current draw for a 12F683 running at 31kHz is 11uA, sleep current is 50nA. If I could only measure current to "tens of mA", I'd never know if the PIC was setup right for low current draw and I certainly couldn't have any idea of expected battery life. I wouldn't even know if it was sleeping until it ate thru some batteries in a few days instead of six or eight months. I think I have a need to measure fractions of a uA. *want*. Do you even trust your DMM on an amps setting for those measurements, or do you measure the current indirectly? How about ohms? 
Would you trust any DMM, regardless of who calibrated it, to measure down in the milliohm numbers? To me, the design of the circuit being measured has to take care of all of that crap. If it is so poorly designed that a 10 mV departure from nominal (that is missed by my inaccurate meter) will keep it from working, that suggests other problems. Yes, the home "lab" person wants extreme accuracy to as many decimal places as he can get. But when does he ever really need it? When he needs it he needs it, what can I say? Do I really "need" a new DSO? Well I've managed to get by all this time without one, so maybe you think I don't really "need" one. I see it like this though, I don't get a lot of time to tinker anymore. I'd like to spend it more productively. Instead of fumbling around and trying to devise silly methods to make my existing equipment do something it wasn't designed to (like going off on a tangent to build a PIC circuit that will trigger my scope early so I can try to see some pre-trigger history). None of this is to argue against having the best instrumentation you can afford, or references to check it against, or paying for calibration and so forth. I don't know if I really agree with that. ;-) But for myself, I need a dose of reality from time to time when I start drooling over some accuracy specs that I will never need at home. My bet is that most of us are seduced by that same muse. Ed

Here is a good "trick" to measure low currents with your DVM. Make a switchable shunt box with (at least) the following full scale ranges: 200nA (shunt resistor 1.11 megs), 2uA (shunt resistor 101K), 20uA (shunt resistor 10.0K), 200uA (shunt resistor 1.00K). Put a twisted pair of leads (red, black) with banana plugs (red, black) running out of the box via a small grommet, to plug into your DVM set to the 200mV scale; a pair of (red, black) banana jacks with 0.75" spacing is mounted on the box for your test leads.
Hint: add to the legend the parallel resistance of the system (200nA/1M, 2uA/100K, etc) as a reminder of the resistance of this current meter scheme. Added hint: the 200mV scale is good for 20nA full scale, just remember the meter resistance is 10 megs.
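The legend values in Robert's shunt box fall out of the parallel combination of each shunt with the DVM's input resistance, with 200 mV full scale on the meter. A sketch checking that arithmetic; the 10 megohm input figure is the typical value cited in the post, not a universal constant:

```python
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

R_METER = 10e6       # typical handheld DVM input resistance on its voltage ranges
V_FULL_SCALE = 0.2   # 200 mV scale

for shunt in (1.11e6, 101e3, 10.0e3, 1.00e3):
    r_sys = parallel(shunt, R_METER)
    i_fs = V_FULL_SCALE / r_sys  # full-scale current through the combination
    print(f"shunt={shunt:.0f} ohm  system={r_sys:.0f} ohm  full scale={i_fs:.3e} A")
```

The loop yields system resistances of roughly 1M, 100K, 10K and 1K, and full-scale currents of roughly 200nA, 2uA, 20uA and 200uA, matching the legend in the post.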
#30
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
On Sun, 04 Mar 2007 10:11:09 GMT, Robert Baer Gave us:

> Here is a good "trick" to measure low currents with your DVM. Make a switchable shunt box with (at least) the following full scale ranges: 200nA (shunt resistor 1.11 megs), 2uA (shunt resistor 101K), 20uA (shunt resistor 10.0K), 200uA (shunt resistor 1.00K). Put a twisted pair of leads (red, black) with banana plugs (red, black) running out of the box via a small grommet, to plug into your DVM set to the 200mV scale; a pair of (red, black) banana jacks with 0.75" spacing is mounted on the box for your test leads. Hint: add to the legend the parallel resistance of the system (200nA/1M, 2uA/100K, etc) as a reminder of the resistance of this current meter scheme. Added hint: the 200mV scale is good for 20nA full scale, just remember the meter resistance is 10 megs.

Tell us, oh master... what does placing a 1.1 meg resistor in series with a circuit's power source do to the voltage presented to the circuit? Shunt resistors are typically less than an ohm. Show me where ANYONE uses a 1.1 meg resistor as a current shunt.
#31
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
MassiveProng wrote:
> On Sun, 04 Mar 2007 10:11:09 GMT, Robert Baer Gave us:
>
>> Here is a good "trick" to measure low currents with your DVM. Make a switchable shunt box with (at least) the following full scale ranges: 200nA (shunt resistor 1.11 megs), 2uA (shunt resistor 101K), 20uA (shunt resistor 10.0K), 200uA (shunt resistor 1.00K). Put a twisted pair of leads (red, black) with banana plugs (red, black) running out of the box via a small grommet, to plug into your DVM set to the 200mV scale; a pair of (red, black) banana jacks with 0.75" spacing is mounted on the box for your test leads. Hint: add to the legend the parallel resistance of the system (200nA/1M, 2uA/100K, etc) as a reminder of the resistance of this current meter scheme. Added hint: the 200mV scale is good for 20nA full scale, just remember the meter resistance is 10 megs.
>
> Tell us, oh master... what does placing a 1.1 meg resistor in series with a circuit's power source do to the voltage presented to the circuit?

http://www.google.com/search?hl=en&q=ohms+law It drops it like any other shunt would. If you were designing precision equipment, would you attempt to measure nA by using a .01Ohm resistor? Read Ed's post, he is right about being able to measure low currents that way, just as Robert is here. They just don't work if the current is varying over a wide range. But for static current measurements, THEY WILL WORK FINE.

> Shunt resistors are typically less than an ohm. Show me where ANYONE uses a 1.1 meg resistor as a current shunt.

You've got to be kidding! Don't you design/build HV supplies? Tell me that your feedback circuits sense the full output voltage. HA HA HA I guess you don't use any micros, huh? Or do you just use a 100Ohm and .01Ohm resistive divider to sense that 50KV output? I'd really like to see that.
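The disagreement above comes down to burden voltage: the drop across the sense resistor at the load's actual current. A sketch with the thread's own 2 meg example; the 1 mA run-current figure is an assumed, illustrative value (the thread only gives the 50 nA sleep figure), so treat the second case as an order-of-magnitude point, not a measured one:

```python
def burden_voltage(i_load, r_shunt):
    """Voltage dropped across a series sense resistor at a given load current."""
    return i_load * r_shunt

# sleeping PIC: 50 nA through 2 Mohm drops only ~0.1 V, harmless on a 3 V supply
print(burden_voltage(50e-9, 2e6))

# awake PIC (assumed ~1 mA): the same resistor would have to drop ~2000 V,
# so the supply collapses and the part never powers up through it
print(burden_voltage(1e-3, 2e6))
```

This is why the thread converges on a range switch or bypass: the high-value resistor is only in circuit while the current is tiny.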
#32
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
MassiveProng wrote:
On Sun, 04 Mar 2007 10:11:09 GMT, Robert Baer Gave us:

Here is a good "trick" to measure low currents with your DVM. Make a switchable shunt box with (at least) the following full-scale ranges: 200nA (shunt resistor 1.11 megs), 2uA (shunt resistor 101K), 20uA (shunt resistor 10.0K), 200uA (shunt resistor 1.00K). Put a twisted pair of leads (red, black) with banana plugs (red, black) running out of the box via a small grommet, to plug into your DVM set to the 200mV scale; a pair of (red, black) banana jacks with 0.75" spacing is mounted on the box for your test leads. Hint: add to the legend the parallel resistance of the system (200nA/1M, 2uA/100K, etc.) as a reminder of the resistance of this current meter scheme. Added hint: the 200mV scale is good for 20nA full scale; just remember the meter resistance is 10 megs.

Tell us, oh master... what does placing a 1.1 meg resistor in series with a circuit's power source do to the voltage presented to the circuit? Shunt resistors are typically less than an ohm. Show me where ANYONE uses a 1.1 meg resistor as a current shunt.

I think I said nothing about a series resistor. Take that handheld DVM and note (rare exceptions) that its input resistance is 10 megs on any of the voltage scales. Take further note that the most sensitive scale is (almost always) 200mV FS. So, by the simple application of Ohm's law, driving the meter to a full-scale reading, the current through the meter is 20nA FS. Now, if one places a 1.11 meg resistor in parallel with the DVM, the equivalent input resistance would then be 1.00 megs, and that would mean, by the simple application of Ohm's law, driving the meter to a full-scale reading, the current through the meter is 200nA FS. And so on.

Now, if you happen to have a *different* meter that has current scales more sensitive than 2mA FS, then this "trick" would not be needed. Or... if you have absolutely no need for 3.5 or 4.5 digit readings of low currents via your handheld DVM, then this is moot.
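Baer's resistor values follow directly from Ohm's law. Here is a short sanity-check sketch (plain Python, not from the thread) using the 10 meg input, 200 mV full-scale DVM he describes:

```python
# Parallel-shunt values for a 10 meg input, 200 mV full-scale DVM,
# as described in the post. Range values are taken from the thread.
R_METER = 10e6  # DVM input resistance, ohms
V_FS = 0.200    # full-scale voltage, volts

def shunt_for(i_fs):
    """Parallel resistor that makes the DVM read i_fs at full scale."""
    r_total = V_FS / i_fs  # required combined (meter || shunt) resistance
    return 1.0 / (1.0 / r_total - 1.0 / R_METER)

for i_fs, label in [(200e-9, "200nA"), (2e-6, "2uA"),
                    (20e-6, "20uA"), (200e-6, "200uA")]:
    print(f"{label} FS: shunt = {shunt_for(i_fs) / 1e3:.2f} k, "
          f"combined = {V_FS / i_fs / 1e3:.0f} k")
```

The computed shunts come out at about 1.111 meg, 101 k, 10.0 k and 1.00 k, matching the 1.11M/101K/10.0K/1.00K values in the post.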
#33
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
Good comments Ed.
I want to thank everyone else who has offered *positive* comments also. Like I said, I think this is a need for anyone who has equipment at home.

TMT

On Mar 2, 11:22 pm, ehsjr wrote:

Too_Many_Tools wrote: I have a well stocked test bench at home containing a range of analog, digital and RF test equipment, as I am sure most of you also do. Well, the question I have is how do you handle the calibration of your equipment? What do you use for calibration standards for resistance, voltage, current and frequency? Links to recommended circuits, pictures and sources would be appreciated. Since this is a need for anyone who has test equipment, I hope to see a good discussion on this subject. Thanks, TMT

The real question is how much precision do you really need in the home "lab"? How often have you needed to use your DMM with how many *accurate* significant digits? 100 minus some *very* small percent of the time, 2 significant digits is all you need. Do you _really_ care if your 5.055 volt reading is really 5.06 or 5.04? Oh hell yes, I want to puff out my chest like everyone else and think I have *accurate* equipment. But I'm curious as to what home circuits need meters that can read voltage accurately to 3 decimal places? 2 decimal places? The question for current measurement: in what home brew circuit design/troubleshooting do you need accuracy below the tens of mA digit? *Need*, not *want*. Do you even trust your DMM on an amps setting for those measurements, or do you measure the current indirectly? How about ohms? Would you trust any DMM, regardless of who calibrated it, to measure down in the milliohm numbers? To me, the design of the circuit being measured has to take care of all of that crap. If it is so poorly designed that a 10 mV departure from nominal (that is missed by my inaccurate meter) will keep it from working, that suggests other problems. Yes, the home "lab" person wants extreme accuracy to as many decimal places as he can get. But when does he ever really need it?

None of this is to argue against having the best instrumentation you can afford, or references to check it against, or paying for calibration and so forth. But for myself, I need a dose of reality from time to time when I start drooling over some accuracy specs that I will never need at home. My bet is that most of us are seduced by that same muse.

Ed
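Ed's 5.055 V example can be made concrete with a typical handheld DMM accuracy spec. The ±(0.5% of reading + 2 counts) figure below is a representative spec for this class of meter, assumed for illustration rather than quoted from the thread:

```python
# What a 5.055 V display can really mean on a handheld DMM with a
# representative +/-(0.5% of reading + 2 counts) accuracy spec
# (an assumed, typical spec -- not a figure from the thread).
reading = 5.055  # volts displayed
pct_err = 0.005  # 0.5% of reading
counts = 2       # least-significant-digit counts in the spec
lsd = 0.001      # one count = 1 mV at this resolution

err = reading * pct_err + counts * lsd
print(f"true value lies somewhere in {reading - err:.3f} .. {reading + err:.3f} V")
```

The band works out to roughly 5.028 to 5.082 V, so both of Ed's alternatives (5.04 and 5.06) are indistinguishable from 5.055: the displayed digits exceed the meter's real accuracy.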
#34
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
On 3 Mar 2007 17:20:54 -0800, "Too_Many_Tools" Gave us:

Good comments Ed. I want to thank everyone else who has offered *positive* comments also.

I want you to leave the group and never return, you top posting Usenet RETARD!

Like I said, I think this is a need for anyone who has equipment at home.

Like I said, anyone as dumb as you are, regardless of your "tool count", should not be futzing with perfectly good instruments. You are simply too ****ing stoopid to do it correctly.

TMT

Yes, YOU! Learn about top posting, asswipe, and how it is frowned upon in Usenet, or are you just another pants down past the asscrack, gang boy retard?
#35
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
On Sat, 03 Mar 2007 05:22:58 GMT, ehsjr Gave us:

Too_Many_Tools wrote: I have a well stocked test bench at home containing a range of analog, digital and RF test equipment, as I am sure most of you also do. Well, the question I have is how do you handle the calibration of your equipment? What do you use for calibration standards for resistance, voltage, current and frequency? Links to recommended circuits, pictures and sources would be appreciated. Since this is a need for anyone who has test equipment, I hope to see a good discussion on this subject. Thanks, TMT

The real question is how much precision do you really need in the home "lab"? How often have you needed to use your DMM with how many *accurate* significant digits? 100 minus some *very* small percent of the time, 2 significant digits is all you need. Do you _really_ care if your 5.055 volt reading is really 5.06 or 5.04? Oh hell yes, I want to puff out my chest like everyone else and think I have *accurate* equipment. But I'm curious as to what home circuits need meters that can read voltage accurately to 3 decimal places? 2 decimal places? The question for current measurement: in what home brew circuit design/troubleshooting do you need accuracy below the tens of mA digit? *Need*, not *want*. Do you even trust your DMM on an amps setting for those measurements, or do you measure the current indirectly? How about ohms? Would you trust any DMM, regardless of who calibrated it, to measure down in the milliohm numbers? To me, the design of the circuit being measured has to take care of all of that crap. If it is so poorly designed that a 10 mV departure from nominal (that is missed by my inaccurate meter) will keep it from working, that suggests other problems. Yes, the home "lab" person wants extreme accuracy to as many decimal places as he can get. But when does he ever really need it? None of this is to argue against having the best instrumentation you can afford, or references to check it against, or paying for calibration and so forth. But for myself, I need a dose of reality from time to time when I start drooling over some accuracy specs that I will never need at home. My bet is that most of us are seduced by that same muse.

Modern instruments hold their accuracy, and keep their adjustments, so well that opening one up and tweaking it with anything less than a professional calibration standard on hand is ludicrous in the extreme. No matter how smart one is, if one has an instrument and wants to test its accuracy, one should make an appearance somewhere where an already recently calibrated instrument is available to EXAMINE your instrument against.

NONE should ever be "adjusted" when the variance is too small to warrant it, and even pro calibrators follow this creed. If at all possible, their main task is to VERIFY an instrument's accuracy WITHOUT making ANY adjustment. ANY that DO need adjustments are typically marked "defective" and require a factory inspection/repair.

I speak from experience, so I don't care what the ToolTard thinks about his capacity for the task; he is a ****ing retard if he tries it without first checking his gear against known good gear. It really is THAT SIMPLE.
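The verify-before-adjust discipline described here reduces to a simple tolerance comparison against a known-good reference. A minimal sketch (the 0.5% tolerance and the readings are illustrative, not quoted specs):

```python
# Verify an instrument against a calibrated reference without touching
# its adjustments: pass it if the deviation is within its published
# tolerance, otherwise send it out. The tolerance here is illustrative.
def verify(dut_reading, ref_value, tol_pct=0.5):
    """Return 'pass' if the reading is within tol_pct% of the reference."""
    limit = abs(ref_value) * tol_pct / 100.0
    return "pass" if abs(dut_reading - ref_value) <= limit else "refer to cal lab"

print(verify(5.012, 5.000))  # 0.24% deviation, within tolerance
print(verify(5.060, 5.000))  # 1.2% deviation, outside tolerance
```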
#36
Posted to sci.electronics.design,sci.electronics.repair,sci.electronics.equipment
MassiveProng wrote:
On Sat, 03 Mar 2007 05:22:58 GMT, ehsjr Gave us:

Too_Many_Tools wrote: I have a well stocked test bench at home containing a range of analog, digital and RF test equipment, as I am sure most of you also do. Well, the question I have is how do you handle the calibration of your equipment? What do you use for calibration standards for resistance, voltage, current and frequency? Links to recommended circuits, pictures and sources would be appreciated. Since this is a need for anyone who has test equipment, I hope to see a good discussion on this subject. Thanks, TMT

The real question is how much precision do you really need in the home "lab"? How often have you needed to use your DMM with how many *accurate* significant digits? 100 minus some *very* small percent of the time, 2 significant digits is all you need. Do you _really_ care if your 5.055 volt reading is really 5.06 or 5.04? Oh hell yes, I want to puff out my chest like everyone else and think I have *accurate* equipment. But I'm curious as to what home circuits need meters that can read voltage accurately to 3 decimal places? 2 decimal places? The question for current measurement: in what home brew circuit design/troubleshooting do you need accuracy below the tens of mA digit? *Need*, not *want*. Do you even trust your DMM on an amps setting for those measurements, or do you measure the current indirectly? How about ohms? Would you trust any DMM, regardless of who calibrated it, to measure down in the milliohm numbers? To me, the design of the circuit being measured has to take care of all of that crap. If it is so poorly designed that a 10 mV departure from nominal (that is missed by my inaccurate meter) will keep it from working, that suggests other problems. Yes, the home "lab" person wants extreme accuracy to as many decimal places as he can get. But when does he ever really need it? None of this is to argue against having the best instrumentation you can afford, or references to check it against, or paying for calibration and so forth. But for myself, I need a dose of reality from time to time when I start drooling over some accuracy specs that I will never need at home. My bet is that most of us are seduced by that same muse.

Modern instruments hold their accuracy, and keep their adjustments, so well that opening one up and tweaking it with anything less than a professional calibration standard on hand is ludicrous in the extreme. No matter how smart one is, if one has an instrument and wants to test its accuracy, one should make an appearance somewhere where an already recently calibrated instrument is available to EXAMINE your instrument against. NONE should ever be "adjusted" when the variance is too small to warrant it, and even pro calibrators follow this creed. If at all possible, their main task is to VERIFY an instrument's accuracy WITHOUT making ANY adjustment. ANY that DO need adjustments are typically marked "defective" and require a factory inspection/repair. I speak from experience, so I don't care what the ToolTard thinks about his capacity for the task; he is a ****ing retard if he tries it without first checking his gear against known good gear. It really is THAT SIMPLE.

Check, and checkmate; end game.