
 wrote:
> 1/60th of a second is important because it is specific to that hardware
> and how it functions. It uses registers that change every 1/60th of a
> second to make certain occurrences "random". If one could react with an
> accuracy of 1/60th of a second, then these occurrences would follow a
> predictable pattern. But of course that kind of timing is not humanly
> possible with any kind of consistency.
>
> Anyway, to simplify what I'm doing, this involves a huge number of
> timed inputs (by a person) over the course of several hours. The timer
> will be the reference.
>
> If it is easier for me to get a set-up that involves frequent
> resets/corrections to get the needed accuracy at any 60th of a second
> over the course of several hours, then that is what I'll have to do.
>
> P.S.: The hardware itself is a videogame.


This strikes me as a very different definition of the problem from your
original post...

If your goal is to have an event 60 times per second with good accuracy,
that is trivial with most microcontrollers. Even a basic design could
get you 100,000 events per second with good accuracy.
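
For example, a minimal sketch of the "trivial" case, assuming an
ATmega328-class AVR with a 12 MHz crystal (my assumption; 12 MHz divides
down to exactly 60 Hz, which a 16 MHz part can't do with an integer
divider):

/* 60 Hz periodic interrupt, Timer1 in CTC mode.
 * 12 MHz / 8 (prescaler) = 1.5 MHz tick rate; 1.5 MHz / 25000 = 60 Hz.
 */
#include <stdint.h>
#include <avr/io.h>
#include <avr/interrupt.h>

volatile uint32_t ticks;              /* counts 1/60 s periods */

ISR(TIMER1_COMPA_vect)
{
    ticks++;                          /* fires every 1/60 s; do the event here */
}

int main(void)
{
    TCCR1A = 0;
    TCCR1B = (1 << WGM12) | (1 << CS11);   /* CTC mode, clk/8 prescaler     */
    OCR1A  = 24999;                        /* 1.5 MHz / (24999 + 1) = 60 Hz */
    TIMSK1 = (1 << OCIE1A);                /* enable compare-A interrupt    */
    sei();

    for (;;)
        ;                                  /* everything happens in the ISR */
}

The accuracy of those 60 Hz events is then simply the accuracy of the
crystal driving the part.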

But what you described in your original post was a requirement to finish
after 6 hours with a clock drift of no more than 1/60 second. That
problem is 21,600 times harder (6 hours is 21,600 seconds, so the same
error budget has to hold over a far longer interval), and requires
elaborate solutions.
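
To put a rough number on that (back-of-envelope only, using the 6-hour
and 1/60-second figures from your posts):

/* Required clock accuracy to drift no more than 1/60 s over 6 hours. */
#include <stdio.h>

int main(void)
{
    double budget  = 1.0 / 60.0;      /* allowed total drift, in seconds       */
    double runtime = 6.0 * 3600.0;    /* run length, in seconds (6 h = 21,600) */

    printf("required accuracy: %.2f ppm\n", budget / runtime * 1e6);
    return 0;
}

That works out to about 0.77 ppm. A garden-variety crystal is specified
in the tens of ppm, which is why this end of the requirement is where
the elaborate measures come in.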

i.e., it sounds like your requirement is for a timer that can:
a) trigger 60 times per second with "good" accuracy
b) count for 6 hours or more

In defining "good" accuracy, 1% equates to a margin of about +/- 0.00017
seconds per 1/60-second event (i.e., between roughly 0.0165 and 0.0168
seconds per event). These timings aren't likely to vary much on one
board (barring temperature changes), but would vary in this range from
one board to the next.
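
The flip side of the same arithmetic (again back-of-envelope, taking the
1% figure above):

/* What a 1% clock error means per event and over a 6-hour run. */
#include <stdio.h>

int main(void)
{
    double tol     = 0.01;            /* 1% frequency error            */
    double event   = 1.0 / 60.0;      /* nominal event period, seconds */
    double runtime = 6.0 * 3600.0;    /* 6-hour run, in seconds        */

    printf("per-event margin : +/- %.5f s\n", tol * event);    /* ~0.00017 s */
    printf("worst-case drift : %.0f s\n",     tol * runtime);  /* 216 s      */
    return 0;
}

In other words, a 1% clock is perfectly adequate for requirement (a),
but misses the 1/60-second-over-6-hours budget by roughly four orders
of magnitude.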

So, what degree of accuracy do you really need?

Cheers,
Richard