Thread: Digital Clocks
PeteS

> That's the difference between "good enough" and "excellence."

Part of the answer is cost.

Consumer devices are very cost sensitive, and how the clock is
generated is part of that. A typical 32.768kHz resonator or crystal
has a nominal error of up to 200ppm (parts per million), and that
gets worse with temperature and ageing. It's by far the most common
frequency for this function, because 32.768kHz divided through a
15-bit binary counter yields exactly 1Hz (32768 = 2^15). Many
processors instead just divide their main clock to the closest
approximation of 1Hz - that, in particular, gives larger errors.
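
FWIW, 200ppm sounds small until you do the sums. A quick
back-of-the-envelope in C (just arithmetic, nothing vendor-specific):

#include <stdio.h>

int main(void)
{
    double err = 200e-6;  /* 200ppm worst case, as a fraction */
    printf("drift per day:   %.1f s\n",   err * 86400.0);
    printf("drift per month: %.1f min\n", err * 86400.0 * 30.0 / 60.0);
    printf("drift per year:  %.1f h\n",   err * 86400.0 * 365.0 / 3600.0);
    return 0;
}

That works out to roughly 17 seconds a day, or the better part of
two hours a year - which is why cheap clocks need the occasional
manual nudge.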

The circuitry needed to auto-adjust would add unnecessary cost (at
least in the view of the manufacturers) for a non-essential feature
(absolute accuracy over time). Typically we would just buy a simple
RTC that may even have the crystal built in - see the list at Maxim
for a typical selection:
http://www.maxim-ic.com/products/tim...ime_clocks.cfm
(Most of the clocks are actually Dallas Semi parts.)
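
To give a feel for how little host code these parts need, here's a
rough sketch of reading the time from a DS1307-style device (one of
the Dallas parts: 7-bit I2C address 0x68, time kept in BCD
registers). The i2c_read_reg() helper is hypothetical - drop in
whatever I2C access your platform provides:

#include <stdint.h>

#define DS1307_ADDR 0x68   /* 7-bit I2C address of the DS1307 */

/* Hypothetical platform helper - reads one register over I2C. */
extern uint8_t i2c_read_reg(uint8_t dev, uint8_t reg);

static uint8_t bcd_to_bin(uint8_t b) { return (b >> 4) * 10 + (b & 0x0F); }

void rtc_read(uint8_t *hh, uint8_t *mm, uint8_t *ss)
{
    /* Registers 0x00..0x02 hold seconds, minutes, hours in BCD.
       Bit 7 of reg 0x00 is the clock-halt flag; bit 6 of reg 0x02
       selects 12/24h mode - both masked off here, assuming 24h mode. */
    *ss = bcd_to_bin(i2c_read_reg(DS1307_ADDR, 0x00) & 0x7F);
    *mm = bcd_to_bin(i2c_read_reg(DS1307_ADDR, 0x01));
    *hh = bcd_to_bin(i2c_read_reg(DS1307_ADDR, 0x02) & 0x3F);
}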

In higher-end and truly embedded equipment (where there are no
user-adjustable controls), much depends on whether there is a way to
synchronise the clock to an external reference so that the error
accumulated between syncs is negligible. If there is, there's no
point in using (relatively) expensive techniques.
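
Where you do have an external sync, the "auto-adjust" can be done
almost for free in software. A minimal sketch of one way to do it
(all names illustrative, not lifted from any particular part):

#include <stdint.h>

static uint32_t seconds;        /* local 1Hz count                  */
static int32_t  trim_interval;  /* >0: drop 1s every N s;
                                   <0: add 1s every |N| s; 0: off   */

/* Called at each external sync. offset_s > 0 means our clock ran
   ahead of the reference over the last elapsed_s seconds.          */
void update_trim(double offset_s, double elapsed_s)
{
    double frac_err = offset_s / elapsed_s;  /* ~2e-4 at 200ppm     */
    trim_interval = (frac_err != 0.0) ? (int32_t)(1.0 / frac_err) : 0;
}

/* 1Hz tick handler: count seconds, occasionally correcting by one. */
void tick_1hz(void)
{
    static int32_t n;
    int32_t limit = (trim_interval >= 0) ? trim_interval : -trim_interval;

    seconds++;
    if (trim_interval != 0 && ++n >= limit) {
        n = 0;
        if (trim_interval > 0) seconds--;  /* running fast: drop 1s */
        else                   seconds++;  /* running slow: add 1s  */
    }
}

The hardware cost of that is zero, which is the only flavour of
auto-adjust likely to survive a cost review.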

Can we auto-adjust? Sure. Do we? Only if we must.

Cheers

PeteS