Richard H. wrote:
wrote:
1/60th of a second is important because it is specific to that hardware
and how it functions. It uses registers that change every 1/60th of a
second to make certain occurrences "random". If one could react with an
accuracy of 1/60th of a second, those occurrences would follow a
predictable pattern. But of course that kind of timing is not humanly
possible with any consistency.
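For illustration, here is a quick Python sketch of that scheme. The
actual console internals aren't specified here, so the register width
and the roll() rule are hypothetical — the point is only that the
"randomness" is just a counter advanced once per frame:

```python
# Hypothetical console RNG: outcomes are sampled from a free-running
# counter that increments once per 1/60 s video frame.
class FrameCounterRNG:
    def __init__(self):
        self.frame = 0                        # 8-bit frame counter

    def tick(self):
        # Called once per video frame (i.e. 60 times per second).
        self.frame = (self.frame + 1) & 0xFF  # register wraps at 256

    def roll(self, sides):
        # The "random" outcome is just the counter modulo the range, so
        # an input landed on the same frame always gives the same result.
        return self.frame % sides
```

If a player could consistently time a button press to the exact frame,
roll() would be fully predictable — which is the point made above.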

Anyway, to simplify what I'm doing: this involves a huge number of
timed inputs (by a person) over the course of several hours. The timer
will be the reference.

If it is easier for me to get a set-up that involves frequent
resets/corrections to stay accurate to within 1/60th of a second at
any point over the course of several hours, then that is what I'll
have to do.

P.S: The hardware itself is a videogame.


This strikes me as a very different definition of the problem from your
original post...


How? In my original post I said the following: "It must be accurate
to within 1/60th of a second over the course of 6 hours."

If your goal is to have an event 60 times per second with good accuracy,
that is trivial with most microcontrollers. Even a basic design could
get you 100,000 events per second with good accuracy.
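As a rough sketch of why the 60/s case is easy: on a typical
microcontroller you just pick a prescaler and compare value for a
hardware timer. The 8 MHz clock, 16-bit counter, and prescaler options
below are assumptions for illustration, not any particular part:

```python
# Sketch: choose prescaler/compare for a ~60 Hz timer interrupt on a
# hypothetical microcontroller (8 MHz clock, 16-bit timer).
F_CPU = 8_000_000            # assumed CPU clock, Hz
TARGET_HZ = 60

best = None
for prescaler in (1, 8, 64, 256, 1024):        # typical divider options
    compare = round(F_CPU / (prescaler * TARGET_HZ)) - 1
    if 0 <= compare <= 0xFFFF:                  # must fit 16-bit counter
        actual = F_CPU / (prescaler * (compare + 1))
        err = abs(actual - TARGET_HZ) / TARGET_HZ
        if best is None or err < best[3]:
            best = (prescaler, compare, actual, err)

prescaler, compare, actual, err = best
print(f"prescaler={prescaler}, compare={compare}, "
      f"actual={actual:.6f} Hz, error={err * 1e6:.1f} ppm")
```

With these assumed numbers the divider alone lands within about
20 ppm of 60 Hz — plenty for "good" per-event accuracy, though note
that the crystal's own tolerance still adds on top of that.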


But that is not my goal.

But what you described in your original post was a requirement to finish
after 6 hours with a clock drift of no more than 1/60 second. That
problem is 21,600 times harder, and requires elaborate solutions.
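The drift budget implied by that requirement works out like this:

```python
# Allowed error: at most 1/60 s of accumulated drift after 6 hours.
budget_s = 1 / 60
run_s = 6 * 3600                  # 21,600 seconds
required = budget_s / run_s       # allowed fractional frequency error
print(f"{required * 1e6:.3f} ppm")
```

That is roughly 0.77 ppm of clock accuracy — far tighter than an
ordinary crystal oscillator (tens of ppm) will hold over temperature,
which is why this version of the problem needs elaborate solutions.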

i.e., it sounds like your requirement is for a timer that can:
a) trigger 60 times per second with "good" accuracy
b) count for 6 hours or more


Still wrong. The timer will trigger nothing. All it needs is a display
so that I can see the seconds. (Though showing 1/60th-of-a-second
intervals would be great, it's just not required for this project,
which I have had to simplify greatly.)

In defining "good" accuracy, 1% equates to a +/- 0.00017 sec margin per
1/60 sec event (between 0.01650 and 0.01683 seconds per event). These
timings aren't likely to vary much on one board (barring temperature
changes), but would vary in this range from one board to the next.
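The margin arithmetic can be checked directly:

```python
# 1% tolerance on a nominal 1/60 s event period.
period = 1 / 60                  # nominal period, seconds
margin = 0.01 * period           # 1% of the period
lo, hi = period - margin, period + margin
print(f"+/- {margin:.5f} s -> between {lo:.5f} and {hi:.5f} s per event")
```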

So, what degree of accuracy are you really needing?


1/60th of a second...

(i.e., when the 2 hour, 53 minute, and 37 second point is reached, the
display should show exactly that time, to an accuracy of 1/60th of a
second from when the clock started running).
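The display side of that is the easy part — a sketch converting a
count of 1/60 s ticks into an hours:minutes:seconds readout (the
function name and output format here are just for illustration):

```python
def frames_to_hms(frames, fps=60):
    """Convert a count of 1/60 s ticks to h:mm:ss plus leftover ticks."""
    total_seconds, sub = divmod(frames, fps)
    hours, rem = divmod(total_seconds, 3600)
    minutes, seconds = divmod(rem, 60)
    return f"{hours}:{minutes:02d}:{seconds:02d} + {sub}/{fps}"

# 2 h 53 min 37 s, expressed as a tick count at 60 ticks/s:
print(frames_to_hms((2 * 3600 + 53 * 60 + 37) * 60))
```

The hard part remains keeping the tick source itself within 1/60 s of
true time over the whole run.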

Darren Harris
Staten Island, New York.