Ian Stirling
 

T i m wrote:
On 06 Sep 2005 17:49:40 GMT, Ian Stirling wrote:



The current isn't real.


Doh!

The meter is applying a resistor across the leads so that at its nominal
200 mV full scale it'll read full-scale current, so the 2 mA range has a
100 ohm resistor across it.
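
For illustration, a quick back-of-envelope sketch (Python) of that idea,
assuming the usual DMM arrangement of a fixed 200 mV full-scale measurement
with a different shunt switched in per current range - the exact values
vary from meter to meter:

    # Assumed: the meter always measures 200 mV full scale across a shunt,
    # and picks the shunt so that full-scale current gives 200 mV across it.
    FULL_SCALE_V = 0.200  # volts across the shunt at full-scale reading

    for i_fs in (200e-6, 2e-3, 20e-3, 200e-3):  # full-scale current of each range
        r_shunt = FULL_SCALE_V / i_fs
        print(f"{i_fs * 1e3:7.3f} mA range -> shunt of about {r_shunt:6.0f} ohm")

    # The 2 mA range comes out at 100 ohm, matching the figure above.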


But if the thermocouple is generating a voltage then presumably (unless
its output impedance is infinite) there will be some current generated?


Of course.

As the thermocouple's resistance is way under 100 ohms, the meter is
effectively just reading out the thermocouple voltage across its own shunt.


So (confused here) the meter was supposed to be measuring the current
generated by the thermocouple (albeit into the input resistance of
the meter rather than a real load). The meter itself does this by
measuring the voltage applied across its internal shunt resistor and
using various scale-modifying resistors to calibrate the mV (uV?) display
to read (display) amps? Could you explain (and remember, I'm an aging
electronics engineer) how / why that would be different from
reading any current in a circuit (on a DMM) please?


Sorry, I was not being clear.
Of course it's not different.
My point was that it's not a 'real' measurement; it's affected by the
test instrument to a great degree.
On the 2 mA range it'll read about 0.25 mA, on the 200 uA range about
30 uA, and on the 20 mA range a couple of mA or so - the reading depends
on which shunt the meter switches in.
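
A minimal worked sketch of why the indicated current changes with the
range - the EMF (roughly 25 mV) and the thermocouple resistance here are
assumed figures, not measurements:

    # Assumed: the indicated current is EMF / (meter shunt + source resistance),
    # so it depends entirely on which shunt the chosen range switches in.
    emf = 0.025        # assumed thermocouple EMF, volts
    r_source = 0.5     # assumed thermocouple + lead resistance, ohms (well under 100)

    shunts = {"200 uA": 1000.0, "2 mA": 100.0, "20 mA": 10.0}  # assumed 200 mV shunts

    for range_name, r_shunt in shunts.items():
        i = emf / (r_shunt + r_source)
        print(f"{range_name} range: indicates about {i * 1e3:.2f} mA")

    # The 2 mA range gives ~0.25 mA, in line with the figures quoted above.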


Similarly, the voltage ranges can be handy for reading out very low
currents (nA).


Hmm .. I think even I would throw *that* thermocouple away!


True.
But I've found it really handy in the past to measure leakage currents,
when the uA range won't quite cut it.
A £3.99 meter with a 1 Mohm input impedance will resolve down to 100 pA
(0.1 mV per count across 1 Mohm) when set on a voltage range.
(Care does need to be taken that you're getting accurate measurements.)
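
And a quick sketch of that leakage-current trick, assuming a basic
3.5-digit meter with 0.1 mV resolution on its lowest (200 mV) voltage
range and a 1 Mohm input impedance - check your own meter's spec before
trusting the numbers:

    # Assumed: on a voltage range the meter's own input resistance acts as the
    # shunt, so leakage current = displayed voltage / input impedance.
    r_input = 1e6          # assumed input impedance, ohms
    v_resolution = 0.1e-3  # assumed resolution on the 200 mV range, volts

    print(f"resolution: {v_resolution / r_input * 1e12:.0f} pA")  # -> 100 pA

    # e.g. a displayed 12.3 mV implies roughly 12.3 nA of leakage
    v_reading = 12.3e-3
    print(f"example reading: {v_reading / r_input * 1e9:.1f} nA")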