Richard Clark
 

On Sun, 10 Apr 2005 21:54:26 -0500, Tom Ring
wrote:

So he's using decent equipment. Whether it's used correctly is another
matter. I'm betting he did a good job, given the results I've seen, and
what I know of him.


Hi Tom,

As I've offered, the test protocol is very precise, and the
instrumentation (as far as has been discussed or inferred) is up to
the resolution. However, many mistake what accuracy, precision, and
resolution mean.

Resolution is the number of digits in your reading. It usually
implies that you can read more digits than you report. So, to say you
have measured a voltage to be 1.5V implies an instrument that can
read to hundredths of a volt.

Precision is the repeatability of readings. High precision means your
measurements can all be reported as 1.5V because they vary by no more
than 4 hundredths of a volt around the reported value (or as
established by fancier regression techniques).

Accuracy is how far your report is from the actual value. It is
enough to say that resolution and precision are not accuracy, but
that they are necessary components of it.
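The three terms can be separated with a handful of simulated readings. This is a minimal sketch with invented values, not any particular instrument:

```python
# Hypothetical illustration of resolution, precision, and accuracy.
# All readings and the "true" value are invented for the example.
readings = [1.52, 1.49, 1.51, 1.50, 1.48]  # instrument resolves 0.01 V
true_value = 1.60                          # assumed actual voltage

# Report one fewer digit than the instrument can read.
reported = round(sum(readings) / len(readings), 1)

spread = max(readings) - min(readings)  # precision: scatter of repeats
bias = abs(reported - true_value)       # accuracy: distance from actual

print(f"reported: {reported} V")    # 1.5 V
print(f"spread:   {spread:.2f} V")  # 0.04 V - high precision
print(f"bias:     {bias:.2f} V")    # 0.10 V - poor accuracy regardless
```

Note that the readings here are tightly grouped (precise) yet biased a full tenth of a volt from the true value, which is the distinction being drawn.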

Insofar as the range goes, it remains to be seen whether it has been
calibrated in its own right. The test is not necessarily found in
absolutes, but rather in the range's response to perturbations. In
other words, inject a known variable and see whether the range can
produce a report that faithfully records the value of that variable;
that fidelity is evidence of its robustness. You have to perturb the
system with small changes as well as large ones to see if its
response is linear. This is not easy and makes great demands not only
on the instrumentation, but on the ingenuity of the tester. Then you
repeat the tests from a different angle to see if the response is
symmetric. Then you test for background contributions - noise
(actually this is probably best done first, as it sets the boundary
of your low end and defines part of the dynamic range).
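A toy version of that perturbation test might look like the sketch below. The system under test here is a hypothetical, slightly nonlinear response invented just to show small and large steps being tracked:

```python
# Sketch of the perturbation test: inject known steps into a system
# and check how faithfully the reported change tracks them, at both
# small and large scales. The response function is a stand-in for a
# real instrument or range; its compression term is invented.
def response(x):
    return x - 0.002 * x * x  # hypothetical, slightly compressive system

def linearity_error(baseline, step):
    """Apply a known perturbation and return the tracking error."""
    measured_step = response(baseline + step) - response(baseline)
    return abs(measured_step - step)

for step in (0.1, 1.0, 10.0):  # small and large perturbations
    err = linearity_error(50.0, step)
    print(f"step {step:5.1f}: error {err:.4f}")
```

A linear system would show errors that scale in strict proportion to the step; here the error grows faster than the step, exposing the nonlinearity.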

You do all the above, and then some, pool the results, and describe
your limits of error. Test results reported without known limits of
error are not very informative. Hence, when I hear that readings are
repeatable to 0.1dB for UHF and hear nothing of the limits of error
(which I must presume to be no greater than 0.033dB, a third of the
repeatability), I am more than skeptical, because 1% accuracy in
power determination is the extreme of very tightly controlled
laboratory conditions.
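The dB-to-percent arithmetic behind those figures can be checked in a few lines:

```python
import math

# Conversions between dB and percent of power. The values below tie
# the 0.1 dB repeatability claim, the presumed 0.033 dB limit of
# error, and the "1% accuracy" figure together.
def db_to_pct(db):
    return (10 ** (db / 10) - 1) * 100

def pct_to_db(pct):
    return 10 * math.log10(1 + pct / 100)

print(f"0.100 dB = {db_to_pct(0.100):.2f}% in power")  # ~2.33%
print(f"0.033 dB = {db_to_pct(0.033):.2f}% in power")  # ~0.76%, about 1%
print(f"1% power = {pct_to_db(1.0):.3f} dB")           # ~0.043 dB
```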

That measurements are repeated in the field to this level of
precision is suspect, because very little instrumentation, and very
few combinations of many pieces of gear, come close. It takes only
two pieces of 1% gear to create a situation that is at best 1.4%
accurate, and you are already crossing the 0.1dB threshold. For those
trying to balance the ledger, a 1% accurate determination requires a
method that is at least 3 times more accurate. The usual aggregation
of error arrives through RSS (root sum square); some may like to gild
their prospects and compute RMS (root mean square) instead, and if
they are lucky, this is not far off. Given enough results, luck
washes out to sea and RSS dominates. Given enough results that
conform to RMS, you find you have qualified your methods and
instrumentation to superlative standards.
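The two-instrument RSS case can be checked directly; a small sketch:

```python
import math

# RSS aggregation of independent instrument errors. Two pieces of 1%
# gear combine to about 1.4%, which in dB terms already exceeds the
# 0.033 dB limit of error needed to support a 0.1 dB claim.
def rss(errors_pct):
    """Root-sum-square combination of independent error terms."""
    return math.sqrt(sum(e * e for e in errors_pct))

combined = rss([1.0, 1.0])  # two 1% instruments in the chain
combined_db = 10 * math.log10(1 + combined / 100)

print(f"combined error: {combined:.2f}%")  # 1.41%
print(f"in power terms: {combined_db:.3f} dB")
```

Adding a third 1% instrument pushes the RSS total to about 1.73%, so the error budget erodes quickly as the measurement chain grows.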

73's
Richard Clark, KB7QHC