David Maynard
 
Why aren't computer clocks as accurate as cheap quartz watches?

Woody Brison wrote:

After reading much of this thread, and a lot of it has been
quite insightful... I'd like to add 2 more cents.

w_tom wrote:

There are two ways to do as suggested. The first is to make
'Benjamins' part of the technical facts during design....
... the technical reason for high versus low accuracy
timers was provided. Computer motherboards don't have the
trimming capacitor and the oscillator is subject to wider
voltage variations. Why this technical decision was made was
not asked and would only be speculation.



So, two sides of the coin... then, there be the THIRD side of the coin.

Why do you have a clock on your computer? Can't afford a watch
or a desk clock or a wall clock?

The answer is that a clock on the computer is useful to record
creation/change time on files.

It doesn't really matter if the file was modified at 6:00:00.000000
or at 6:00:00.000035.

What matters is whether one file was created before another. You're
compiling, and you want to know whether the source has changed since the
last build, or whether the params file has been changed since X, Y, or Z...
that kind of thing.
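
Here's a rough sketch of that kind of check in Python. The file names are
made up for illustration; real build tools (make and friends) do the same
comparison in their own way.

import os

def needs_rebuild(target, sources):
    # Rebuild if the target is missing or any source is newer than it.
    # Only the ordering of the timestamps matters here, not their
    # absolute accuracy, which is exactly the point above.
    if not os.path.exists(target):
        return True
    target_mtime = os.path.getmtime(target)
    return any(os.path.getmtime(src) > target_mtime for src in sources)

# Hypothetical file names, just for illustration:
if needs_rebuild("program.o", ["program.c", "params.cfg"]):
    print("source or params changed since the last build; recompiling")
else:
    print("nothing changed; skipping the compile")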

On a computer, Approximate Time is almost always all that's really
needed: a clock that ***always runs forward*** and keeps time to within
a few minutes a day.
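
For what it's worth, modern systems do expose exactly that kind of
"always runs forward" clock, separate from the wall clock that gets
stamped on files. A small Python sketch, purely illustrative:

import time

# Wall-clock time: can jump backward or forward when someone resets
# the clock or NTP steps it.
wall_start = time.time()

# Monotonic time: guaranteed never to run backward, so it is the right
# choice for measuring elapsed intervals, even though its absolute
# value means nothing.
mono_start = time.monotonic()

time.sleep(1)

print("wall-clock elapsed:", time.time() - wall_start)
print("monotonic elapsed: ", time.monotonic() - mono_start)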

Even if Perry Mason drags you onto the witness stand and confronts
you with file dates and times, approximate is probably good enough
to acquit you or convict you. In the rare case it's not, bring in your
expert to explain that computer clocks are often not accurate.

Wood


Or keep your clock set to 1935 and even Perry Mason won't know when your
files were actually made.