WTF with my computer clock?
On Sun, 16 Aug 2009 22:20:34 -0700, isw wrote:
In article , Jeff Liebermann wrote: On Sun, 16 Aug 2009 17:34:37 -0700, David Nebenzahl wrote: On 8/16/2009 2:51 PM Meat Plow spake thus: I agree that for most a minute per month is reasonable but I would expect the same accuracy as my $29.99 Timex wristwatch which is more like a second a month. So that kinda begs the question of why computer mfrs. can't (or won't) include clocks that are at *least* as accurate as a Timex, no? Wouldn't a computah be a more compelling reason for a more accurate clock? (I know, $$$ bottom line, right?) Because it's difficult. The right way to have done it would have been to do a function call from an RTC (real time clock) every time some application needs the actual time. I don't agree. That's ok. Nobody ever agrees with me. I'm used to it. NO CLOCK, running alone, can be really accurate over the long term. A much better way is to take the output from a crummy, inaccurate *but low cost* clock and using an external time reference, synthesize from it a local clock of simply amazing accuracy. Yep. However, the IBM PC was designed in 1981. At that time, there were 10 Phase 1 GPS birds, incomplete coverage, and $5,000 receivers. There were overpriced WWV and WWVB receivers, and no internet. The best you could do was something synced to the color burst frequency of a local TV station, assuming they were on 24 hours per day. Your brilliant hindsight is totally correct for a 21st century design, but would be impossibly expensive in 1981. My point was that in 1981 IBM had a reasonably accurate clock inside the IBM PC using a fairly large 14.31818MHz xtal which could have easily been temperature compensated. There were other computers at the time that had a separate stabilized RTC that did it the right way. However, the IBM PC was originally designed as a home computer, not a laboratory instrument. The cassette tape interface should be a clue.
Using it for industrial, scientific or navigation applications was probably never considered by the original architects. We're living with the results today. NTP solves the problem completely, and at a very low cost (processing cycles instead of expen$ive hardware). NTP works even if the computer it's running on has *no RTC* (in the hardware sense) at all. All it needs is some sort of interrupt generated every N cycles of the processor clock (N is any integer that produces regular interrupts a few times a second; the actual interval is not important). Isaac -- Jeff Liebermann 150 Felker St #D http://www.LearnByDestroying.com Santa Cruz CA 95060 http://802.11junk.com Skype: JeffLiebermann AE6KS 831-336-2558
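To put the figures in this exchange on a common footing: clock accuracy is usually quoted in parts per million (ppm). A quick sanity check (my arithmetic, not from the thread; assumes a 30-day month):

```python
# Rough frequency-error arithmetic for the figures quoted above:
# "a minute per month" vs. "a second a month", expressed in parts
# per million (ppm) of clock frequency error.

SECONDS_PER_MONTH = 30 * 24 * 3600  # 2,592,000 s in a 30-day month

def drift_ppm(seconds_lost_per_month: float) -> float:
    """Clock frequency error in ppm that produces the given monthly drift."""
    return seconds_lost_per_month / SECONDS_PER_MONTH * 1e6

pc_rtc = drift_ppm(60)  # "a minute per month"
timex = drift_ppm(1)    # "a second a month"
print(f"PC RTC: {pc_rtc:.1f} ppm, wristwatch: {timex:.2f} ppm")
```

So the complaint amounts to a roughly 23 ppm RTC versus a roughly 0.4 ppm watch crystal: about a factor of sixty.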
WTF with my computer clock?
On Sun, 16 Aug 2009 22:09:36 -0700, isw wrote:
You should stay away from NIST (and all other stratum one servers) to avoid overloading their server unless you have a real need for high precision -- REALLY high. Otherwise, find a good stratum two server to connect to; you'll never know the difference. There are a lot; just google. I use time.apple.com. Isaac You might want to look into: http://www.ntp.org http://www.pool.ntp.org http://www.pool.ntp.org/zone/us For NTP I use: us.pool.ntp.org 500 servers in the US pool and growing. -- Jeff Liebermann 150 Felker St #D http://www.LearnByDestroying.com Santa Cruz CA 95060 http://802.11junk.com Skype: JeffLiebermann AE6KS 831-336-2558
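For reference, a minimal ntpd configuration along the lines Jeff suggests might look like the sketch below (the `iburst` keyword and the driftfile path are conventional choices on my part, not something specified in the thread):

```
# /etc/ntp.conf -- minimal pool-based configuration (sketch)
driftfile /var/lib/ntp/ntp.drift
server 0.us.pool.ntp.org iburst
server 1.us.pool.ntp.org iburst
server 2.us.pool.ntp.org iburst
server 3.us.pool.ntp.org iburst
```

Using several numbered pool hostnames gives ntpd multiple independent servers to compare, without ever touching a stratum one machine directly.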
WTF with my computer clock?
In article ],
isw wrote: Functionally impossible. By adding money, you can reduce the drift rate but you can't make it zero. Period. Just use NTP. Or, if you prefer something stand-alone which will give you a good time reference if your network connection is down: use GPS. It's not hard to find a GPS receiver which has a decent "pulse per second" output on its serial port, as well as standard NMEA sentences. Software packages are available which will monitor the NMEA output and the PPS signal, and synchronize your PC's clock very accurately. The better receivers (those specifically intended for timing purposes) synchronize the PPS pulse edge with the start-of-second to within a small number of nanoseconds. The limiting factor in your PC's clock accuracy is likely to be the speed at which it can respond to the change in the PPS signal (which typically requires taking an interrupt). A good timekeeping package should be able to compare an averaged PPS timing (over a period of some minutes) with your system's inherent clock drift, and figure out how to jiggle the internal clock so that the drift averages down to zero. You should end up with time accuracy good to within a few milliseconds. On Linux, you can do this by running "gpsd" to monitor the GPS, and having it feed timing information into the NTP daemon. In effect, your GPS then serves as a new time source to the local NTP timing pool... it's very accurate in the long term but somewhat prone to short-term jitter. You can, if you wish, configure the ntp daemon to use both the local GPS time source, and one or more network time servers... this will give you redundancy in both directions. And *stay away* from the stratum one servers like NIST; they have better things to do than keep your computer's clock on time. Correct. Use "pool.ntp.org", or one of the regional subdomains thereof (e.g. "us.pool.ntp.org").
These domains point to a list of well-connected, relatively-high-stratum servers which have volunteered to serve as public NTP resources. -- Dave Platt AE6EO Friends of Jade Warrior home page: http://www.radagast.org/jade-warrior I do _not_ wish to receive unsolicited commercial email, and I will boycott any company which has the gall to send me such ads!
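For anyone wanting to reproduce Dave's gpsd-plus-ntpd setup, the glue is ntpd's shared-memory reference clock (driver type 28), which gpsd feeds. A sketch of the ntp.conf side (the `time1` offset is a placeholder you would calibrate for your own receiver, not a real value):

```
# ntp.conf fragment: accept time from gpsd via the SHM refclock (driver 28)
server 127.127.28.0 minpoll 4 maxpoll 4         # unit 0: NMEA time (jittery)
fudge  127.127.28.0 time1 0.420 refid GPS       # time1 is a placeholder offset
server 127.127.28.1 minpoll 4 maxpoll 4 prefer  # unit 1: PPS (precise)
fudge  127.127.28.1 refid PPS
# plus one or more network servers for redundancy
server us.pool.ntp.org iburst
```

gpsd writes the NMEA-derived time into SHM unit 0 and the PPS edge into unit 1; ntpd then treats them as local reference clocks alongside any network peers.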
WTF with my computer clock?
On 8/16/2009 10:12 PM isw spake thus:
Functionally impossible. By adding money, you can reduce the drift rate but you can't make it zero. Period. I don't care about zero. One minute a month is plenty accurate enough for me. Just use NTP. And *stay away* from the stratum one servers like NIST; they have better things to do than keep your computer's clock on time. You're admonishing me not to use NIST? Why? After all, they offer this service to me. See http://tf.nist.gov/service/its.htm: The NIST Internet Time Service (ITS) allows users to synchronize computer clocks via the Internet. The time information provided by the service is directly traceable to UTC(NIST). The service responds to time requests from any Internet client in several formats including the DAYTIME, TIME, and NTP protocols. So why shouldn't I use them? Keep in mind that I use this service *at most* 3 or 4 times a *year*. -- Found--the gene that causes belief in genetic determinism
WTF with my computer clock?
David Nebenzahl wrote:
You're admonishing me not to use NIST? Why? ..... Keep in mind that I use this service *at most* 3 or 4 times a *year*. David, it depends upon how you use it. If you use Windows' or MacOS's automatic time sync or *NIX's NTPDATE, you only access it occasionally. Windows and Mac access it once a week, NTPDATE does it whenever it is invoked, usually when you boot your computer. If you are running the *NIX NTP daemon (including MacOS's) or a third party Windows time sync program, your computer is in frequent contact with the time server. In that case, it would be a good idea not to use those servers as they are heavily loaded down. For a once-a-week sync of one computer, you can use just about any server without worrying about it being overloaded or adding any additional load. If you have multiple computers networked together, that is a different story. Geoff. -- Geoffrey S. Mendelson, Jerusalem, Israel N3OWJ/4X1GM
WTF with my computer clock?
On Mon, 17 Aug 2009 11:37:35 -0700, David Nebenzahl
wrote: So why shouldn't I use them? Keep in mind that I use this service *at most* 3 or 4 times a *year*. I use 8 hours, which guarantees at least one update during working hours. You're fine, but there are NTP abuse problems among application writers and device vendors: http://en.wikipedia.org/wiki/NTP_server_misuse_and_abuse Since there doesn't seem to be a way to stop someone from flooding the NTP servers with excessive requests, the SNTP group came up with the "kiss of death" packet, which tells the sender to shut up: http://tools.ietf.org/html/rfc4330#page-20
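The RFC 4330 mechanics are simple enough to sketch. Below is a minimal SNTP client outline in Python (my illustration, not code from the thread); the request framing, the timestamp epoch conversion, and the kiss-of-death check can all be exercised without touching the network:

```python
import socket
import struct

# Seconds from the NTP epoch (1900-01-01) to the Unix epoch (1970-01-01)
NTP_UNIX_EPOCH_DELTA = 2208988800

def build_sntp_request() -> bytes:
    """48-byte SNTP request: LI=0, VN=4, Mode=3 (client), per RFC 4330."""
    packet = bytearray(48)
    packet[0] = (0 << 6) | (4 << 3) | 3  # 0x23
    return bytes(packet)

def ntp_seconds_to_unix(ntp_seconds: int) -> int:
    """Convert an NTP timestamp seconds field to Unix time."""
    return ntp_seconds - NTP_UNIX_EPOCH_DELTA

def is_kiss_of_death(reply: bytes) -> bool:
    """RFC 4330 kiss-of-death: stratum field 0 in the reply; the
    reference ID then carries an ASCII code such as b'RATE' telling
    the client to back off."""
    return reply[1] == 0

def query(server: str = "us.pool.ntp.org") -> int:
    """Ask an NTP server for the time (network side; not exercised here)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.settimeout(5)
        s.sendto(build_sntp_request(), (server, 123))
        reply, _ = s.recvfrom(48)
    tx_seconds = struct.unpack("!I", reply[40:44])[0]  # transmit timestamp
    return ntp_seconds_to_unix(tx_seconds)
```

A well-behaved client honors the kiss-of-death packet by stopping its queries to that server, which is exactly the back-off mechanism Jeff describes.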
WTF with my computer clock?
On Sun, 16 Aug 2009 22:12:44 -0700, isw wrote:
Functionally impossible. By adding money, you can reduce the drift rate but you can't make it zero. Period. Just use NTP. And *stay away* from the stratum one servers like NIST; they have better things to do than keep your computer's clock on time. Why bother with an internet solution? It won't work for laptops, PDAs, stand alone PC devices, and such. A WWVB receiver is cheap enough that it's included inside weather stations, alarm clocks, wrist watches, and yes.... computahs: http://www.meinberg-usa.com/usb-radio-clocks/23-40/wwvb51usb---wwvb-radio-clock-for-the-universal-serial-bus--usb-.htm http://www.atomictimeclock.com/radsynhome.htm http://www.beaglesoft.com/radsynhome.htm http://www.timetools.co.uk/products/mps-time-server.htm etc... The only problem I can see with building one into a PC is the RF noise generated by the PC will probably trash the receiver. That's what long extension cords and external antennas are good for. You can also sync to the local CDMA cellular provider, although the prices are close to astronomical: http://www.beaglesoft.com/celsynhome.htm Got $10.70? Build your own: http://search.digikey.com/scripts/DkSearch/dksus.dll?Detail&name=561-1014-ND
WTF with my computer clock?
On 8/17/2009 6:44 PM Jeff Liebermann spake thus:
Got $10.70? Build your own: http://search.digikey.com/scripts/DkSearch/dksus.dll?Detail&name=561-1014-ND Interesting. But once you've picked up WWV, what do you do with that signal to derive a time base from it? (I guess you gots to know something about the signal, which I don't.) Pretty simple? -- Found--the gene that causes belief in genetic determinism
WTF with my computer clock?
In article ,
"Dave Plowman (News)" wrote: In article , Arfa Daily wrote: That's my feeling too. It definitely used to be much better here in the UK, than it is now. If a programme was billed to start at 8pm, then it pretty much did. Pretty much sums it up. But in those days few had dead accurate clocks which are so common now. Actually,they were very close. Western Union clocks all over the country were almost always all synched to within a second or so. The technique was to use clocks (those big things with the red sweep hand you may have seen in a broadcast studio) that were basically pretty good, and to synch them to a remote timebase from time to time. The clocks were pendulum timed and electrically wound (couple of big dry cells inside), and every one of them had a leased-line connection to the nearest WU office, and from there to a national site. Every 12 hours (AFAIR), Western Union sent a pulse down the wire that "jammed" the sweep hands of all those clocks to 12 (and illuminated a little red light behind the clock face so you could see that your time was being corrected). I don't think the minute and hour hands were controlled. It was up to the engineering personnel in each station to twiddle their clocks' pendulums so the clocks could run within a second or two in 12 hours -- not at all difficult for a good pendulum clock. So as long as the accounting department paid the WU bill, you could join your network or insert a local commercial with almost perfect accuracy. Isaac |
WTF with my computer clock?
In article ,
David Nebenzahl wrote: On 8/16/2009 10:12 PM isw spake thus: Functionally impossible. By adding money, you can reduce the drift rate but you can't make it zero. Period. I don't care about zero. One minute a month is plenty accurate enough for me. Just use NTP. And *stay away* from the stratum one servers like NIST; they have better things to do than keep your computer's clock on time. You're admonishing me not to use NIST? Why? After all, they offer this service to me. See http://tf.nist.gov/service/its.htm: The NIST Internet Time Service (ITS) allows users to synchronize computer clocks via the Internet. The time information provided by the service is directly traceable to UTC(NIST). The service responds to time requests from any Internet client in several formats including the DAYTIME, TIME, and NTP protocols. So why shouldn't I use them? If you need stratum one precision, NIST is not a good choice unless you live near them; you should choose a stratum one server near to you. If you don't need that precision but just want your computer's clock to be decently close all the time, why put an unnecessary load on any of the stratum one (i.e. high precision) servers? Leave them alone to serve folks who *do* need that accuracy. There's a decent number of lower stratum servers that sync to NIST and some of the other high precision servers. They are specifically intended to be used by the folks who don't need their computer to be within a fraction of a microsecond of "actual time". Lots more info here: http://www.pool.ntp.org/ Isaac
WTF with my computer clock?
In article ,
"Dave Plowman (News)" wrote: In article , David Nebenzahl wrote: I agree that for most a minute per month is reasonable but I would expect the same accuracy as my $29.99 Timex wris****ch which is more like a second a month. So that kinda begs the question of why computer mfrs. can't (or won't) include clocks that are at *least* as accurate as a Timex, no? Wouldn't a computah be a more compelling reason for a more accurate clock? (I know, $$$ bottom line, right?) Wonder if it's because a wrist watch is kept at a pretty constant temperature via the skin? Bingo! Isaac |
WTF with my computer clock?
isw wrote:
In article , "Dave Plowman (News)" wrote: In article , Arfa Daily wrote: That's my feeling too. It definitely used to be much better here in the UK, than it is now. If a programme was billed to start at 8pm, then it pretty much did. Pretty much sums it up. But in those days few had dead accurate clocks which are so common now. Actually,they were very close. Western Union clocks all over the country were almost always all synched to within a second or so. The technique was to use clocks (those big things with the red sweep hand you may have seen in a broadcast studio) that were basically pretty good, and to synch them to a remote timebase from time to time. The clocks were pendulum timed and electrically wound (couple of big dry cells inside), and every one of them had a leased-line connection to the nearest WU office, and from there to a national site. Every 12 hours (AFAIR), Western Union sent a pulse down the wire that "jammed" the sweep hands of all those clocks to 12 (and illuminated a little red light behind the clock face so you could see that your time was being corrected). I don't think the minute and hour hands were controlled. It was up to the engineering personnel in each station to twiddle their clocks' pendulums so the clocks could run within a second or two in 12 hours -- not at all difficult for a good pendulum clock. So as long as the accounting department paid the WU bill, you could join your network or insert a local commercial with almost perfect accuracy. Isaac In the UK, or in London at least, the mains frequency was maintained with a very accurate long term average, so that synchronous mains clocks just stayed correct. Sometimes, after short power cuts, the frequency was increased to bring such clocks back to the correct time. 
Which was actually a bit of a nuisance for us - we had clocks that weren't self starting (a reflection of the rareness of power outages in those days), so after a power cut, we'd set the clocks correctly and start them, only to find them gaining. It seems a backward step that now, forty or so years later, household wall clocks are less accurate than they were back then. Sylvia.
WTF with my computer clock?
On Mon, 17 Aug 2009 19:22:09 -0700, David Nebenzahl
wrote: On 8/17/2009 6:44 PM Jeff Liebermann spake thus: Got $10.70? Build your own: http://search.digikey.com/scripts/DkSearch/dksus.dll?Detail&name=561-1014-ND Interesting. But once you've picked up WWV, what do you do with that signal to derive a time base from it? (I guess you gots to know something about the signal, which I don't.) Pretty simple? Well, that's the basic receiver without that CPU. If you want to build a complete receiver, methinks this one: http://search.digikey.com/scripts/DkSearch/dksus.dll?Detail&name=561-1016-ND is more appropriate. However it costs $71. The full development system, with software costs about $250. I was thinking that it might be best to have the PC do the processing, and use the cheaper receiver. I don't know that much about the code used by WWVB. (I are not a programmist). I've gone as far as to disembowel an "atomic" clock (cheap LCD display), and sniff the time signals with a scope. It doesn't look too horrible. Some app notes on the CME8000 chip: http://www.c-max-time.com/downloads/search.php?search=CME8000 Software flow chart: http://www.c-max-time.com/downloads/getFile.php?id=525 Also, the demo board app note was misnamed. Download the file and rename it with a .PDF extension. Then, it will be readable. Start here: http://tf.nist.gov/stations/wwvb.htm http://tf.nist.gov/timefreq/stations/radioclocks.htm http://en.wikipedia.org/wiki/WWVB#Modulation_Format http://en.wikipedia.org/wiki/Radio_clock Basically, you're building a clock, but instead of driving a display, you're converting the UTC to something the PC will digest. There are plenty of projects on the internet, but none that I could find that included PC interface software.
PIC Based WWVB Decoder http://www.geocities.com/hagtronics/wwvb.html Parallax Basic Stamp II Atomic Time Clock Receiver http://marwww.in2p3.fr/~levansuu/projets_es2i/Parallax/parallax%20website/stamps/atomictimeclock.htm Build a WWVB Radio Controlled Nixie Clock http://www.amug.org/~jthomas/wwvb.html Decoding WWVB from a Sony atomic time radio controlled clock http://leapsecond.com/pages/sony-wwvb/ etc... One more... WWVH on a ISA or PCI card: http://www.beaglesoft.com/clcahome.htm Research: Chip-scale atomic clock: http://tf.nist.gov/timefreq/ofm/smallclock/CSAC.html
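To answer the "what do you do with the signal" question concretely: WWVB drops its carrier power at the top of every second, and the duration of the drop encodes one symbol per second (about 0.2 s for binary 0, 0.5 s for binary 1, 0.8 s for a position marker). A sketch of the decode step in Python (the thresholds and the minutes-field layout follow the NIST format description; the rest is my illustration):

```python
# WWVB symbol classification and minutes-of-hour decode (sketch).
# Each second the carrier power drops: ~0.2 s means binary 0,
# ~0.5 s means binary 1, ~0.8 s is a frame/position marker.

MARKER = "M"

def classify(drop_seconds: float) -> str:
    """Map a measured carrier-drop duration to a WWVB symbol."""
    if drop_seconds < 0.35:
        return "0"
    if drop_seconds < 0.65:
        return "1"
    return MARKER

def decode_minutes(symbols: list) -> int:
    """Decode the minutes field from the first ten symbols of a frame.
    Per the NIST format: seconds 1-3 carry BCD weights 40/20/10,
    second 4 is always 0, seconds 5-8 carry BCD weights 8/4/2/1."""
    weights = {1: 40, 2: 20, 3: 10, 5: 8, 6: 4, 7: 2, 8: 1}
    return sum(w for pos, w in weights.items() if symbols[pos] == "1")

# Example: the start of a frame encoding 37 minutes past the hour
frame = [MARKER, "0", "1", "1", "0", "0", "1", "1", "1", MARKER]
```

The rest of the frame carries hours, day-of-year, year, and DST/leap flags in the same BCD-with-markers scheme, so a full decoder is mostly more of the same table.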
WTF with my computer clock?
Jeff Liebermann wrote:
I don't know that much about the code used by WWVB. (I are not a programmist). It's a very simple system that is well documented. It's simple and slow enough that anyone used to pulling apart data streams would be able to decode it with a Z80 derived embedded processor, the ARM chips in dead iPods, WiFi routers, etc would be "overkill". Here in Jerusalem, we don't receive the signals of WWVB, or the German or UK equivalents here. Someone about 50 miles north and out of the mountains has a clock that syncs, but he never told me which station it uses (he may not know), or how often it syncs. This led me to research how one would do the opposite: devise a local transmitter with an ethernet port on one end for NTP sync and a 60kHz transmitter to sync a clock on the other. I gave up due to lack of a suitable design for the transmitter, no receiver and a lack of funds to obtain them. You probably could do it out of one of the Linux based routers, and blink one of the status LEDs to generate the 50/60kHz signal. I know by now you must be thinking "why would anyone even think of such a thing", but a discussion a few months ago about resurrecting a Heathkit Most Accurate Clock got me going. I think I also read a posting that the WWVB signals were being phased out. Geoff. -- Geoffrey S. Mendelson, Jerusalem, Israel N3OWJ/4X1GM
WTF with my computer clock?
In article ],
isw wrote: Pretty much sums it up. But in those days few had dead accurate clocks which are so common now. Actually, they were very close. Western Union clocks all over the country were almost always all synched to within a second or so. The technique was to use clocks (those big things with the red sweep hand you may have seen in a broadcast studio) that were basically pretty good, and to synch them to a remote timebase from time to time. Oh indeed. Master clock systems were common once. But not in the home, which is what I meant. -- *Do they ever shut up on your planet? Dave Plowman London SW To e-mail, change noise into sound.
WTF with my computer clock?
In article ,
Sylvia Else wrote: In the UK, or in London at least, the mains frequency was maintained with a very accurate long term average, so that synchronous mains clocks just stayed correct. The UK has had synchronised mains frequency for a very long time. Sometimes, after short power cuts, the frequency was increased to bring such clocks back to the correct time. Which was actually a bit of a nuisance for us - we had clocks that weren't self starting (a reflection of the rareness of power outages in those days), so after a power cut, we'd set the clocks correctly and start them, only to find them gaining. I can't remember the last power cut in this part of London. Many years ago. Of course the household RCD has taken over that function. ;-) It seems a backward step that now, forty or so years later, household wall clocks are less accurate than they were back then. 'Radio' controlled ones are cheap these days. -- *If Barbie is so popular, why do you have to buy her friends? * Dave Plowman London SW To e-mail, change noise into sound.
WTF with my computer clock?
"Geoffrey S. Mendelson" wrote: It's a very simple system that is well documented. It's simple and slow enough that anyone used to pulling apart data streams would be able to decode it with a Z80 derived embedded processor, the ARM chips in dead iPods, WiFi routers, etc would be "overkill". I did it 20 years ago with a VIC-20, in Commodore Basic. Here in Jerusalem, we don't receive the signals of WWVB, or the German or UK equivalents here. Someone about 50 miles north and out of the mountains has a clock that syncs, but he never told me which station it uses (he may not know), or how often it syncs. This lead me to research how one would do the opposite, devise a local transmitter with an ethernet port on one end for NTP sync and a 60kHz transmitter to sync a clock on the other. I gave up due to lack of a suitable design for the transmitter, no receiver and a lack of funds to obtain them. You probably could do it out of one of the Linux based routers, and blink one of the status LED's to generate the 50/60kHz signal. You need a good antenna system to transmit on 50 or 60 KHz. Here is a sat photo of the WWVB antenna farm: http://maps.google.com/maps?hl=en&rls=com.microsoft:en-us&q=wwvb%20time%20signals%20ft%20collins&um=1&ie= UTF-8&sa=N&tab=wl I know by now you must be thinking "why would anyone even think of such a thing", but a discussion a few months ago about resurecting a Heathkit Most Accurate Clock, got me going. I think I also read a posting that the WWVB signals were being phased out. They replaced all the transmitters and towers at WWVB a few years ago to improve service. It now reaches Central Florida without a long wire antenna & tuner. Why would they spend millions and take a couple years to do the update if they were planning to shut it down? -- You can't have a sense of humor, if you have no sense! |
WTF with my computer clock?
On Mon, 17 Aug 2009 07:41:19 +0100, "Dave Plowman (News)"
wrote: In article , David Nebenzahl wrote: I agree that for most a minute per month is reasonable but I would expect the same accuracy as my $29.99 Timex wristwatch which is more like a second a month. So that kinda begs the question of why computer mfrs. can't (or won't) include clocks that are at *least* as accurate as a Timex, no? Wouldn't a computah be a more compelling reason for a more accurate clock? (I know, $$$ bottom line, right?) Wonder if it's because a wrist watch is kept at a pretty constant temperature via the skin? Do you really expect people to wear a watch when they sleep just to maintain accuracy? There's quite a difference in temperature between skin temp (about 37C) and room temperature (about 25C). The same for a computah. When turned off or in standby, the clock is slightly above room temperature. When running, it might be as warm as 75C. -- Jeff Liebermann 150 Felker St #D http://www.LearnByDestroying.com Santa Cruz CA 95060 http://802.11junk.com Skype: JeffLiebermann AE6KS 831-336-2558
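The temperature argument can be quantified. The 32.768 kHz tuning-fork crystals used in both watches and PC RTCs typically run fastest at a turnover temperature near 25C and lose time parabolically away from it; a commonly quoted coefficient is about 0.034 ppm per degree-squared. (Both numbers are typical datasheet figures I am assuming here, not measurements from the thread.)

```python
# Parabolic temperature drift of a 32.768 kHz tuning-fork crystal.
# Model: f(T)/f0 - 1 = -k * (T - T0)^2, with typical k ~ 0.034 ppm/degC^2
# and turnover T0 ~ 25 degC. These constants are typical values, assumed.

K_PPM = 0.034  # ppm per degC^2 (typical tuning-fork coefficient)
T0 = 25.0      # turnover temperature in degC (typical)

SECONDS_PER_MONTH = 30 * 24 * 3600

def drift_ppm(temp_c: float) -> float:
    """Magnitude of the frequency error in ppm at a given temperature
    (a tuning-fork crystal always runs slow away from turnover)."""
    return K_PPM * (temp_c - T0) ** 2

def seconds_lost_per_month(temp_c: float) -> float:
    """Monthly time loss implied by the parabolic drift model."""
    return drift_ppm(temp_c) * 1e-6 * SECONDS_PER_MONTH

wrist = seconds_lost_per_month(37)   # watch at skin temperature
hot_pc = seconds_lost_per_month(75)  # RTC crystal near a hot running board
```

By this estimate an uncompensated crystal at 37C loses roughly 13 seconds a month and one at 75C about 220, which puts numbers on the temperature argument; watch makers also trim the crystal for wrist temperature, which is part of how a Timex still manages a second a month.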
WTF with my computer clock?
Jeff Liebermann wrote:
Building a store and forward repeater for WWVB (or the EU equivalent) 60Khz is a waste of time. The storage delay needed to regenerate the signal will result in the sync pulses arriving too late. However, a system that uses GPS, GLONASS, or Galileo as a reference, and generates a simulated time code format will work. The problem is that at 60Khz, the necessary antenna farm would be huge and the transmitters rather power hungry. WWVB runs at an EIRP of about 70kW. That's sort of what I was thinking of. Get the time from NTP, generate a fresh time code signal, which would not be accurate enough for someone who wanted truly accurate time code, but to keep a clock that displays to the minute, or even to the second on time, it would be good enough. As for the transmitter, how much power do you need to transmit a signal from a time code generator to a receiver next to it, connected via a coax cable? A microwatt? A milliwatt? limited abilities of a commodity router. Methinks you would be better off with a SBC (single board computer) or common PC (ITX, Mini-ITX, etc). It depends. A cheap router, such as the Linksys WRT54GL (note the L at the end; it's the enhanced model that runs Linux) would do it. It sells new for not much money, will be obsolete as the 802.11N routers come into general usage, has an ARM processor, two ethernet interfaces (one connected to a 4 port hub), a WiFi radio and a bunch of status LEDs. The advantage of it is that there are several alternate Linux packages for it and you can easily compile your own programs, build your own "flash" (firmware image) and load it. There are also distributions for other routers, I recently bought a $30 EDIMAX wired router that had a distribution for it. Actually, a 60KHz xmitter is fairly easy to design. The problem is that all the components would be huge. There's also the not so easy problem of getting Ministry of Communications approval.
It's not on the designated ham radio frequency list: If it is directly connected to the input of the receiver, it needs no license. I'm sure that nearby users that went through the trouble of obtaining large antennas, will not be thrilled with your transmissions. Even a "local" transmitter can carry a substantial distance at 60KHz. I also expect there are none. Around here the noise level is so high that they would never hear it until I got into the "real antenna" type system. A microwatt with true isotropic radiator antenna (a short wire) would not leave my apartment, let alone go anywhere. But if it is connected directly, then it is a moot point. I was thinking of something simple, such as flashing the LED at 60kHz, and wrapping a pickup loop around it. That's about the same power level and frequency of a TV remote control and no one from the MOC has come and complained about any of the ones I have. I'm talking about the RF leakage from it, not the optical signal. The GC-1000 used WWV at 5, 10, and 15 MHz, all of which are still on the air. I'm not sure what a 60KHz system would do for you. Not here. I have never heard them here, nor have I heard CHU (yes, I know it moved), any of the European stations, etc. I'm not talking about a cheap portable shortwave, I'm talking about a Kenwood R-5000 with a 75 foot random wire, or other equally as sensitive ham receivers either with a 20m resonant dipole or 1/4 wave vertical, or a 40m resonant dipole. I'm not actually familiar with the clock in question, the discussion (I think it was on this newsgroup) focused on them using WWVB (VLF) radios. I've given up asking "why". Some of the strange things I've seen on the internet defy logic and explanation. I'll agree with that. Geoff. -- Geoffrey S. Mendelson, Jerusalem, Israel N3OWJ/4X1GM
WTF with my computer clock?
Michael A. Terrell wrote:
They replaced all the transmitters and towers at WWVB a few years ago to improve service. It now reaches Central Florida without a long wire antenna & tuner. Why would they spend millions and take a couple years to do the update if they were planning to shut it down? They will shut it down eventually because of the cost. NTP servers cost almost nothing, GPS is "free" because there are no incremental costs for providing the time signals. Eventually someone will figure out that a 75kW transmitter has both a significant expense and a large "carbon footprint". The upgrade was thought to be needed because a satellite time system had been dropped in favor of GPS, and GPS was thought to be unable to fit the needs of the common user. Now GPS units are almost throw away "toys", being used in almost every cell phone, and for all sorts of SATNAV devices. Geoff. -- Geoffrey S. Mendelson, Jerusalem, Israel N3OWJ/4X1GM
WTF with my computer clock?
"Dave Plowman (News)" wrote in message ... In article , Arfa Daily wrote: "Dave Plowman (News)" wrote in message ... In article , Arfa Daily wrote: And now that "The Bill" has got a 9 o'clock slot, they've changed the shooting medium to something that looks altogether 'wrong', It's called HD. ;-) It's shot using progressive scan so you get the movement artifacts as on film. Thompson (Grass Valley) cameras recorded on Panasonic P2 using solid state memory cards. Are you sure that's what it is ? Any HD that I've seen is just that. A perfectly 'normal' looking picture, but with a higher resolution. Trouble is they use fog filters on the cameras to reduce the resolution - common on drama even in SD. And use long lenses most of the time to keep the backgrounds soft. Why should a higher res camera change the tonal composition of the picture ? (assuming that it is being shot on video). Looks more like they've changed from film to video, or the other way round perhaps. No - it's always been video. Or are maybe using a video mode that attempts to simulate film, something like that. I saw it before on the programme when they did a couple of 'specials'. Didn't like it then, don't like it now. Tend to agree. But most production people hate video and will do anything to make it look 'different'. They've also changed most if not all the Lighting Directors. The Bill used to be known for using available light - or making it look like it was. It now looks 'lit'. rest snipped So Dave, do you happen to know if "New Tricks" is shot available light ? I was watching last week's episode that I had recorded, and that looked just like "The Bill" used to. Do you also happen to know how Euston Films shot "Minder" ? I watched an old episode of that on ITV4 tonight, and again, it had a lovely 'untouched' look. Was that actually film, naturally lit, or video ? Arfa |
WTF with my computer clock?
"Geoffrey S. Mendelson" wrote: Michael A. Terrell wrote: They replaced all the transmitters and towers at WWVB a few years ago to improve service. It now reaches Central Florida without a long wire antenna & tuner. Why would they spend millions and take a couple years to do the update if they were planning to shut it down? They will shut it down eventually because of the cost. NTP servers cost almost nothing, GPS is "free" because there are no incremental costs for providing the time signals. Eventually someone will figure out that a 75kW transmitter has both a significant expense and a large "carbon footprint". The receivers run on one or two AA cells for over a year. A GPS system would use a lot more than 75 KW to power an equal number of GPS based clocks. The upgrade was thought to be needed because of a satellite time system that was dropped due to GPS was thought to be unable to fit the needs of the common user. Now GPS units are almost throw away "toys", being used in almost every cell phone, and for all sorts of SATNAV devices. Everything will die, some day including the earth itself. Shutting down WWVB, with the large number of people using the 'Atomic clocks' that sync to it would cause a real stink. You can buy 'Atomic clocks' for under $20. Try replacing 75 cents worth of parts in a WWVB receiver & decoder with a GPS based clock for the same price. VLF can be received under water, as well. Try using GPS for a submarine without surfacing, or raising a buoy with the antennas. Even better, try using that floating buoy in rough waters. -- You can't have a sense of humor, if you have no sense! |
In article ,
Jeff Liebermann wrote: On Mon, 17 Aug 2009 07:41:19 +0100, "Dave Plowman (News)" wrote: In article , David Nebenzahl wrote: I agree that for most a minute per month is reasonable but I would expect the same accuracy as my $29.99 Timex wristwatch which is more like a second a month. So that kinda begs the question of why computer mfrs. can't (or won't) include clocks that are at *least* as accurate as a Timex, no? Wouldn't a computah be a more compelling reason for a more accurate clock? (I know, $$$ bottom line, right?) Wonder if it's because a wrist watch is kept at a pretty constant temperature via the skin? Do you really expect people to wear a watch when they sleep just to maintain accuracy? There's quite a difference in temperature between skin temp (about 37C) and room temperature (about 25C). The same for a computah. When turned off or in standby, the clock is slightly above room temperature. When running, it might be as warm as 75C. Yup, but the long-term average will be pretty good -- gain a little in the daytime, lose a bit at night (or the other way around; could be either one depending on how the circuit was set up). Remember the old "Accutron" watches -- the ones with a tuning fork inside? You could adjust those by deciding which way to lay them on the table when you went to bed. "12 up" would run at a different rate than "12 down" because of the effects of gravity on the fork. Also, they ran noticeably fast on airplane trips, due to thinner air. Isaac |
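The skin-temperature point can be made quantitative. A minimal sketch, not from the thread: 32.768 kHz tuning-fork watch crystals typically follow a parabolic frequency-temperature curve around a turnover near 25 C, roughly df/f = k*(T - T0)^2 with k about -0.034 ppm/C^2. Both constants are assumed datasheet-typical values, not measured ones.

```python
# Assumed parabolic model for a 32.768 kHz tuning-fork watch crystal:
# turnover ~25 C, curvature ~ -0.034 ppm/C^2 (datasheet-typical, assumed).

SECONDS_PER_DAY = 86400.0

def drift_seconds_per_day(temp_c, turnover_c=25.0, k_ppm_per_c2=-0.034):
    """Clock error in seconds per day at a given crystal temperature."""
    ppm = k_ppm_per_c2 * (temp_c - turnover_c) ** 2
    return ppm * 1e-6 * SECONDS_PER_DAY

# On the wrist (~32 C case temperature) vs. on the nightstand (~25 C):
print(f"on wrist: {drift_seconds_per_day(32.0):.2f} s/day")
print(f"on table: {drift_seconds_per_day(25.0):.2f} s/day")
```

With these assumed numbers the watch only loses a fraction of a second per day at wrist temperature, since body heat sits close to the turnover point; a PC whose crystal swings from ~25 C idle to much hotter under load sits far out on the parabola for much of the time.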
In article ,
Jeff Liebermann wrote: -- snippage -- The GC-1000 used WWV at 5,10, and 15 MHz, all of which are still on the air. I'm not sure what a 60KHz system would do for you. Avoid a lot (but not all) of the problems caused by groundwave-skywave conflicts that occur in the HF bands. Isaac |
In article ,
"Geoffrey S. Mendelson" wrote: Jeff Liebermann wrote: Building a store and forward repeater for WWVB (or the EU equivalent) 60 kHz is a waste of time. The storage delay needed to regenerate the signal will result in the sync pulses arriving too late. However, a system that uses GPS, GLONASS, or Galileo as a reference, and generates a simulated time code format will work. The problem is that at 60 kHz, the necessary antenna farm would be huge and the transmitters rather power hungry. WWVB runs at an EIRP of about 70kW. That's sort of what I was thinking of. Get the time from NTP, generate a fresh time code signal, which would not be accurate enough for someone who wanted truly accurate time code, but to keep a clock that displays to the minute, or even to the second on time, it would be good enough. It would be *very good* because it would never drift. The rate (long-term) would be spot on, and the epoch (the name for "what time is it right now"?) would be only slightly in error. As for the transmitter, how much power do you need to transmit a signal from a time code generator to a receiver next to it, connected via a coax cable? A microwatt? A milliwatt? limited abilities of a commodity router. Methinks you would be better off with an SBC (single board computer) or common PC (ITX, Mini-ITX, etc). It depends. A cheap router, such as the Linksys WRT54GL (note the L at the end; it's the enhanced model that runs Linux) would do it. It sells new for not much money, will be obsolete as the 802.11N routers come into general usage, has a MIPS processor, two ethernet interfaces (one connected to a 4 port hub), a WiFi radio and a bunch of status LEDs. The advantage of it is that there are several alternate Linux packages for it and you can easily compile your own programs, build your own "flash" (firmware image) and load it. There are also distributions for other routers; I recently bought a $30 EDIMAX wired router that had a distribution for it. 
Actually, a 60 kHz xmitter is fairly easy to design. The problem is that all the components would be huge. You could synthesize the whole signal in software and drive a D-to-A converter to create the modulated carrier. Any commodity video D-A would be more than fast enough. I'm sure that nearby users that went through the trouble of obtaining large antennas will not be thrilled with your transmissions. Could even a "local" transmitter carry a substantial distance at 60 kHz? Very probably not, unless a very long antenna were attached to it. Isaac |
In article ,
"Geoffrey S. Mendelson" wrote: -- snippety-snip -- The GC-1000 used WWV at 5,10, and 15 MHz, all of which are still on the air. I'm not sure what a 60KHz system would do for you. Not here. I have never heard them here, nor have I heard CHU (yes, I know it moved), any of the European stations, etc. I'm not talking about a cheap portable shortwave, I'm talking about a Kenwood R-5000 with a 75 foot random wire, or other equally as sensitive ham receivers either with a 20m resonant dipole or 1/4 wave vertical, or a 40m resonant dipole. But even the "best" receiver still wouldn't solve the multipath problems that plague the MW bands. Isaac |
In article ,
Arfa Daily wrote: So Dave, do you happen to know if "New Tricks" is shot available light ? Not sure. I was watching last week's episode that I had recorded, and that looked just like "The Bill" used to. Do you also happen to know how Euston Films shot "Minder" ? I watched an old episode of that on ITV4 tonight, and again, it had a lovely 'untouched' look. Was that actually film, naturally lit, or video ? Euston Films never used video. Suitable location equipment simply wasn't available in those days. They mostly used 16mm - with some 8mm if they wanted a particular effect. I should make clear that making something look like it's shot by available light doesn't mean actually doing just that. Reflectors and flags etc are used to reduce the contrast to acceptable limits. To do it well can be more time consuming than actually lighting the thing. ;-) But things like Minder from Euston films were shot at breakneck speeds in normal film terms. ;-) -- *If a pig loses its voice, is it disgruntled? Dave Plowman London SW To e-mail, change noise into sound. |
On Tue, 18 Aug 2009 22:08:44 -0700, isw wrote:
In article , Jeff Liebermann wrote: On Mon, 17 Aug 2009 07:41:19 +0100, "Dave Plowman (News)" wrote: In article , David Nebenzahl wrote: I agree that for most a minute per month is reasonable but I would expect the same accuracy as my $29.99 Timex wristwatch which is more like a second a month. So that kinda begs the question of why computer mfrs. can't (or won't) include clocks that are at *least* as accurate as a Timex, no? Wouldn't a computah be a more compelling reason for a more accurate clock? (I know, $$$ bottom line, right?) Wonder if it's because a wrist watch is kept at a pretty constant temperature via the skin? Do you really expect people to wear a watch when they sleep just to maintain accuracy? There's quite a difference in temperature between skin temp (about 37C) and room temperature (about 25C). The same for a computah. When turned off or in standby, the clock is slightly above room temperature. When running, it might be as warm as 75C. Yup, but the long-term average will be pretty good -- gain a little in the daytime, lose a bit at night (or the other way around; could be either one depending on how the circuit was set up). Maybe, if the wearer maintains a regular schedule. That's a fair assumption, until the wearer changes their usage pattern, such as going on a ski trip. Also, please note that the original discussion was over the accuracy of a computah clock, not a wrist watch. Unless left on continuously, computers don't maintain a set schedule. Even so, their internal temperature is affected by the building environment. Remember the old "Accutron" watches -- the ones with a tuning fork inside? You could adjust those by deciding which way to lay them on the table when you went to bed. "12 up" would run at a different rate than "12 down" because of the effects of gravity on the fork. Also, they ran noticeably fast on airplane trips, due to thinner air. 
http://members.iinet.net.au/~fotoplot/acc.htm I have a 1965 Accutron 214 Space View wrist watch in poor condition. The specs offered 1 or 2 seconds per day, but only for the first year. After about 30 years (the last time it ran) and zero service, my guess is that it was off about 60 seconds per day. I forgot if it was a gain or loss. The mercury battery leaked inside and it's unfortunately not currently running. (Yet another project). You might also be referring to the problem caused by the original steel watch hands. When they were near the tuning fork coils, the frequency would lower slightly. The effect was not very big, but still an error. The position problem is also not 12 o'clock up versus down. It's 12 o'clock versus 90 degree rotation, which is 3 or 9 o'clock. The problem stems from the tuning fork being vertical or horizontal. The recommended solution is to lay the watch flat at night. I don't think it was ever a major problem, just an interesting curiosity for accuracy fanatics. A more interesting problem was mechanical vibrations in the 360 Hz range (the frequency of the tuning fork). When my watch was working, it would tend to run quite fast if I was working near big synchronous or induction motors driven by 60Hz such as in my father's clothing factory. It was not unusual to gain about a minute after spending an hour pushing cloth through an industrial sewing machine (with my hands on the table). I suspect (guess) that vibration was also the problem in airplanes, not thin air. Temperature is of course a problem: http://bmumford.com/mset/tech/accutron/index.html -- Jeff Liebermann 150 Felker St #D http://www.LearnByDestroying.com Santa Cruz CA 95060 http://802.11junk.com Skype: JeffLiebermann AE6KS 831-336-2558 |
On Tue, 18 Aug 2009 22:26:03 -0700, isw wrote:
But even the "best" receiver still wouldn't solve the multipath problems that plague the MW bands. How much accuracy are you looking for in a PC clock? I doubt that WWV will give you millisecond PC clock accuracy, but it's more than suitable for nailing it within one second. Averaged over even a fairly short period of time, the 2.5/5/10/15/20 MHz frequencies are quite accurate. For re-synchronizing the clock, the time ticks are also sufficiently accurate:
WWV frequency accuracy: 1 part in 100 billion as transmitted; 1 part in 10 million as received.
WWV time tick accuracy: 1 millisecond as received, plus propagation delay.
WWVB frequency accuracy: 1 part in 100 billion as transmitted; 1 part in 100 billion as received.
The major ionospheric multipath problem is the almost 180 degree phase reversals from constant path switching as the various incident and reflected signals fade in and out. Yeah, that's going to be a problem, but due to the limited accuracy required in a PC, it's not a big problem. -- Jeff Liebermann 150 Felker St #D http://www.LearnByDestroying.com Santa Cruz CA 95060 http://802.11junk.com Skype: JeffLiebermann AE6KS 831-336-2558 |
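Getting a PC within a second no longer needs a radio at all; the NTP point made earlier in the thread can be illustrated with one SNTP round trip. A minimal sketch, not production code: the packet layout follows RFC 4330, the server name is illustrative, and a real client would also measure round-trip delay instead of trusting the raw transmit timestamp.

```python
import struct

# NTP timestamps count seconds from 1900; Unix time counts from 1970.
NTP_TO_UNIX = 2208988800

def parse_transmit_time(reply: bytes) -> float:
    """Server transmit timestamp (Unix seconds) from a 48-byte SNTP
    reply: 32-bit seconds + 32-bit fraction at byte offset 40."""
    secs, frac = struct.unpack("!II", reply[40:48])
    return secs - NTP_TO_UNIX + frac / 2**32

def sntp_query(server="pool.ntp.org"):  # server name is illustrative
    """One UDP round trip to port 123; needs network access."""
    import socket
    pkt = b"\x1b" + 47 * b"\x00"  # LI=0, VN=3, Mode=3 (client request)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.settimeout(2.0)
        s.sendto(pkt, (server, 123))
        reply, _ = s.recvfrom(48)
    return parse_transmit_time(reply)
```

Even this crude one-shot query lands well inside the one-second target being discussed; ntpd gets to milliseconds by filtering many such exchanges.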
On Tue, 18 Aug 2009 18:49:05 +0000 (UTC), "Geoffrey S. Mendelson"
wrote: Michael A. Terrell wrote: They replaced all the transmitters and towers at WWVB a few years ago to improve service. It now reaches Central Florida without a long wire antenna & tuner. Why would they spend millions and take a couple years to do the update if they were planning to shut it down? They will shut it down eventually because of the cost. NTP servers cost almost nothing, GPS is "free" because there are no incremental costs for providing the time signals. NIST funding was increased from $719 million to $819 million. http://www.aip.org/fyi/2009/081.html National Institute of Standards and Technology: The FY 2009 appropriation was $819.0 million. The Administration's request was $846.1 million, an increase of 3.3 percent or $27.1 million. The House bill provides $781.1 million, a cut of 4.6 percent or $37.9 million. The Senate Appropriations Committee bill would provide $878.8 million, an increase of 7.3 percent or $59.8 million. http://www.nist.gov/public_affairs/releases/approps-summary2008-2010.htm http://www.nist.gov/public_affairs/releases/approps-summary2009.htm Operating costs of the various standards stations and labs seem so small as to not even be mentioned. I didn't see any mention of shutting down anything. -- Jeff Liebermann 150 Felker St #D http://www.LearnByDestroying.com Santa Cruz CA 95060 http://802.11junk.com Skype: JeffLiebermann AE6KS 831-336-2558 |
In article ,
Jeff Liebermann wrote: Yup, but the long-term average will be pretty good -- gain a little in the daytime, lose a bit at night (or the other way around; could be either one depending on how the circuit was set up). Maybe, if the wearer maintains a regular schedule. That's a fair assumption, until the wearer changes their usage pattern, such as going on a ski trip. Also, please note that the original discussion was over the accuracy of a computah clock, not a wrist watch. Unless left on continuously, computers don't maintain a set schedule. Even so, their internal temperature is affected by the building environment. But is there any real difference between a 'quartz' watch and a PC clock? They both rely on a low cost crystal? -- *A journey of a thousand sites begins with a single click * Dave Plowman London SW To e-mail, change noise into sound. |
Meat Plow wrote:
Size? Temp? Does a tiny watch xtal garner any more accuracy merely because of its size? Does a watch xtal have a different temperature coefficient? You are confusing the hardware clock and software clock in a computer. The hardware clock is crystal controlled. It is used at boot time to set the software clock. The software clock is incremented by the lowest priority interrupts, which causes it to wander off. There are various schemes to sync it with the hardware clock, but without an external source, e.g. NTP, they don't work very well as hardware clocks are not very accurate. Geoff. -- Geoffrey S. Mendelson, Jerusalem, Israel N3OWJ/4X1GM |
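The hardware-vs-software clock distinction above can be sketched with a toy model (not any real OS's code): the software clock advances one fixed increment per timer interrupt, so oscillator error and lost interrupts translate directly into clock error. The tick rate and error figures below are assumed for illustration.

```python
# Toy tick-counting software clock: counts interrupts at a nominal rate
# from an oscillator that is actually a little fast (values assumed).

TICK_HZ = 100      # nominal timer interrupt rate
PPM_ERROR = 50     # assumed crystal error: 50 ppm fast

def software_clock_error(true_seconds, lost_ticks=0):
    """Accumulated error (seconds) of a tick-counting software clock."""
    actual_rate = TICK_HZ * (1 + PPM_ERROR * 1e-6)
    ticks_counted = true_seconds * actual_rate - lost_ticks
    return ticks_counted / TICK_HZ - true_seconds

print(f"{software_clock_error(86400):+.2f} s/day, no lost ticks")
print(f"{software_clock_error(86400, lost_ticks=500):+.2f} s/day, 500 lost ticks")
```

A 50 ppm error alone is a few seconds per day, and every missed 100 Hz tick silently drops another 10 ms, which is why a free-running software clock "wanders off" until something like NTP disciplines it.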
On Wed, 19 Aug 2009 18:14:14 +0100, "Dave Plowman (News)"
wrote: But is there any real difference between a 'quartz' watch and a PC clock? They both rely on a low cost crystal? Oh yes. The original Accutron was a steel tuning fork oscillator. No crystal of any kind to drive it. It depended totally on mechanical stability. Watch crystals come in a few flavors. The original version used Statek type quartz tuning forks. They're really a mechanical tuning fork made out of quartz: http://www.statek.com/products.php They work nicely at low frequencies and do not require a large divider chain to drive the gears. 32.768 kHz was the most common. As IC technology progressed, it was more economical to use a big divider chain and a higher frequency crystal such as 3.579545 MHz. Meanwhile, someone figured out how to shrink the 32.768 kHz crystal, so the next generation went back to those. (This is a gross over simplification). The problem is that these relatively low frequency and small physical size crystals have a terrible temperature coefficient. Here's a typical data sheet: www.abracon.com/Resonators/AB26T.pdf The original IBM PC used a 14.31818 MHz AT cut crystal. It was much more stable, but there was no mechanism for adjusting the exact frequency. There was also no temperature compensation or even the use of temperature stable capacitors. This sorta explains how it works and includes a series of curves for AT and SC cut crystals. http://www.4timing.com/techcrystal.htm The IBM PC oscillator was somewhat of an improvement in stability over the typical watch crystal, but without an adjustment, it was nearly useless. Since 1981, I've looked inside literally hundreds of computahs and SBC's. Not a single one has a tunable clock oscillator. One or two used replaceable modular oscillators, which could be purchased as a TCXO, but which were usually supplied as a commodity clock oscillator. These daze, the way to stabilize a TCXO is to first pre-age (beat-up) the crystal to reduce long term drift. 
The crystal oscillator is then characterized over the required temperature range. A table of frequency versus temperature is generated and saved in a PROM. A PIC controller on the oscillator takes the measured temperature, reads the table, and applies the necessary correcting voltage to a varactor to stabilize the oscillator over a very wide temp range. With this method, you can take a really awful crystal, and compensate it to impressive accuracies. gotta run... -- Jeff Liebermann 150 Felker St #D http://www.LearnByDestroying.com Santa Cruz CA 95060 http://802.11junk.com Skype: JeffLiebermann AE6KS 831-336-2558 |
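The PROM-table compensation described above boils down to a lookup with interpolation. A minimal sketch: the calibration numbers are invented, and a real TCXO would convert the result to a varactor tuning voltage through a DAC rather than return ppm.

```python
from bisect import bisect_left

# Hypothetical characterization table for one crystal, playing the role
# of the PROM: (temperature C, measured frequency error ppm).
CAL_TABLE = [(-20.0, 9.5), (0.0, 3.1), (25.0, 0.0), (50.0, -4.2), (70.0, -11.0)]

def correction_ppm(temp_c):
    """Interpolate the table; return the ppm correction (negated error),
    clamping outside the characterized range."""
    temps = [t for t, _ in CAL_TABLE]
    if temp_c <= temps[0]:
        return -CAL_TABLE[0][1]
    if temp_c >= temps[-1]:
        return -CAL_TABLE[-1][1]
    i = bisect_left(temps, temp_c)
    (t0, e0), (t1, e1) = CAL_TABLE[i - 1], CAL_TABLE[i]
    measured = e0 + (e1 - e0) * (temp_c - t0) / (t1 - t0)
    return -measured
```

The point of characterizing each individual crystal is that the table absorbs whatever ugly curve that particular part has, which is why "a really awful crystal" can still come out impressively accurate.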
On 8/19/2009 12:49 PM Jeff Liebermann spake thus:
Since 1981, I've looked inside literally hundreds of computahs and SBC's. Not a single one has a tunable clock oscillator. One or two used replaceable modular oscillators, which could be purchased as a TCXO, but which were usually supplied as a commodity clock oscillator. So I wonder if the lowly SX28, one of my favorite little machines to program (a PIC-like li'l guy) is an exception to this seeming rule? I ask because, looking at the specs for this CPU, it has some configuration bits (marked IRCTRIM0-2) that trim the internal RC oscillator frequency, supposedly in steps of about 3%, up to a maximum of +/- 8% (yeah, I know, doesn't add up, but whatever). Is this what you would call a "tunable oscillator"? These daze, the way to stabilize a TCXO is to first pre-age (beat-up) the crystal to reduce long term drift. The crystal oscillator is then characterized over the required temperature range. A table of frequency versus temperature is generated and saved in a PROM. A PIC controller on the oscillator takes the measured temperature, reads the table, and applies the necessary correcting voltage to a varactor to stabilize the oscillator over a very wide temp range. With this method, you can take a really awful crystal, and compensate it to impressive accuracies. So presumably what I just described is a varactor built into the SX28. -- Found--the gene that causes belief in genetic determinism |
On Wed, 19 Aug 2009 13:15:30 -0700, David Nebenzahl
wrote: On 8/19/2009 12:49 PM Jeff Liebermann spake thus: Since 1981, I've looked inside literally hundreds of computahs and SBC's. Not a single one has a tunable clock oscillator. One or two used replaceable modular oscillators, which could be purchased as a TCXO, but which were usually supplied as a commodity clock oscillator. So I wonder if the lowly SX28, one of my favorite little machines to program (a PIC-like li'l guy) is an exception to this seeming rule? Is that the Ubicom or Parallax SX28 processor? Dunno, I've never worked with these. (Reminder: I are not a programmist). I ask because, looking at the specs for this CPU, it has some configuration bits (marked IRCTRIM0-2) that trim the internal RC oscillator frequency, supposedly in steps of about 3%, up to a maximum of +/- 8% (yeah, I know, doesn't add up, but whatever). Is this what you would call a "tunable oscillator"? I can't tell for sure: http://www.parallax.com/dl/docs/prod/datast/SX20AC-SX28AC-Data-v1.6.pdf See Section 9.0 I don't see any internal or external compensation for temperature drift. It does have a real time clock, but again, no stabilization. There is a section in the RC oscillator (FUSE register) which sets the divider ratio from the RC oscillator. This is really a coarse adjustment to set the divider ratio to generate an assortment of frequencies between 31KHz and 4MHz. No way is it intended for fine tuning for temp compensation. These daze, the way to stabilize a TCXO is to first pre-age (beat-up) the crystal to reduce long term drift. The crystal oscillator is then characterized over the required temperature range. A table of frequency versus temperature is generated and saved in a PROM. A PIC controller on the oscillator takes the measured temperature, reads the table, and applies the necessary correcting voltage to a varactor to stabilize the oscillator over a very wide temp range. 
So presumably what I just described is a varactor built into the SX28. I don't think so. I couldn't see such a feature on the data sheet. Varactors are also chip real estate hogs, and would usually require substantial documentation and explanation to implement. I don't see any of that in the data sheet. I sorta blundered across this: "NTP temperature compensation" http://www.ijs.si/time/temp-compensation/ |
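The linked temperature-compensation idea is to log the clock's measured frequency error against a temperature sensor and fit a simple model, then predict drift from temperature alone. A sketch with made-up sample data standing in for ntpd loopstats lines paired with sensor readings:

```python
# Made-up logged samples of (temperature C, clock frequency error ppm).
samples = [(22.0, 11.8), (25.0, 13.1), (28.5, 14.9), (31.0, 16.0), (34.0, 17.4)]

def fit_line(pts):
    """Ordinary least squares fit; returns (slope ppm/C, intercept ppm)."""
    n = len(pts)
    sx = sum(t for t, _ in pts)
    sy = sum(e for _, e in pts)
    sxx = sum(t * t for t, _ in pts)
    sxy = sum(t * e for t, e in pts)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return slope, (sy - slope * sx) / n

slope, intercept = fit_line(samples)
print(f"drift model: {slope:.3f} ppm/C * T + {intercept:.2f} ppm")
predicted_30c = slope * 30.0 + intercept
```

A linear fit is only a first approximation (crystal curves are parabolic or cubic over wide ranges), but over the narrow temperature swing of a room it tracks surprisingly well, which is the whole trick behind software temperature compensation.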
In article ,
Jeff Liebermann wrote: Accutron watches -- snippage -- A more interesting problem was mechanical vibrations in the 360Hz range. (the frequency of the tuning fork). When my watch was working, it would tend to run quite fast if I was working near big synchronous or induction motors driven by 60Hz such as in my fathers clothing factory. It was not unusual to gain about a minute, after spending an hour pushing cloth through an industrial sewing machine (with my hands on the table). I used mine for sports car rallies, and needed to synch it to WWV fairly often (weekend rallies), but it was damn hard to set to the nearest second even though I had the jeweler install the "hack" feature. I learned to adjust it to run just slow enough so that accidental knocks and so on would never put it ahead of time during the week. Then, simply by giving it a good "thump" on the edge, I could overdrive the fork briefly (it would do a three-tooth push on the driven gear instead of the usual two), which would make it gain a good fraction of a second. A few of those would get the thing spot on. I suspect (guess) that vibration was also the problem in airplanes, not thin air. Bulova said it was air density. Temperature is of course a problem: http://bmumford.com/mset/tech/accutron/index.html As well as every other momentum-transfer effect that plagues tuning forks. Interestingly, they also affect those 32,768 Hz. crystals because they are physically shaped like tuning forks (that's the only oscillatory mode that can run that slowly in such a small piece of quartz). John Harrison's marine chronometer, developed for the British navy in the mid-1700's, was good for about a minute a month, which was considered the lowest accuracy usable for navigation. The Accutron was the first "commercial" watch to have the same accuracy. Isaac |
In article ,
"Dave Plowman (News)" wrote: In article , Jeff Liebermann wrote: Yup, but the long-term average will be pretty good -- gain a little in the daytime, lose a bit at night (or the other way around; could be either one depending on how the circuit was set up). Maybe, if the wearer maintains a regular schedule. That's a fair assumption, until the wearer changes their usage pattern, such as going on a ski trip. Also, please note that the original discussion was over the accuracy of a computah clock, not a wrist watch. Unless left on continuously, computers don't maintain a set schedule. Even so, their internal temperature is affected by the building environment. But is there any real difference between a 'quartz' watch and a PC clock? They both rely on a low cost crystal? Most do. In many cases, the actual "CPU clock" of a couple of GHz. or so, is derived from that same crystal, upconverted by a digital phase-locked loop. Isaac |
In article ,
Jeff Liebermann wrote: On Wed, 19 Aug 2009 13:15:30 -0700, David Nebenzahl wrote: On 8/19/2009 12:49 PM Jeff Liebermann spake thus: Since 1981, I've looked inside literally hundreds of computahs and SBC's. Not a single one has a tunable clock oscillator. One or two used replaceable modular oscillators, which could be purchased as a TCXO, but which were usually supplied as a commodity clock oscillator. So I wonder if the lowly SX28, one of my favorite little machines to program (a PIC-like li'l guy) is an exception to this seeming rule? Is that the Ubicom or Parallax SX28 processor? Dunno, I've never worked with these. (Reminder: I are not a programmist). I ask because, looking at the specs for this CPU, it has some configuration bits (marked IRCTRIM0-2) that trim the internal RC oscillator frequency, supposedly in steps of about 3%, up to a maximum of +/- 8% (yeah, I know, doesn't add up, but whatever). Is this what you would call a "tunable oscillator"? I can't tell for sure: http://www.parallax.com/dl/docs/prod/datast/SX20AC-SX28AC-Data-v1.6.pdf See Section 9.0 I don't see any internal or external compensation for temperature drift. It does have a real time clock, but again, no stabilization. There is a section in the RC oscillator (FUSE register) which sets the divider ratio from the RC oscillator. This is really a coarse adjustment to set the divider ratio to generate an assortment of frequencies between 31KHz and 4MHz. No way is it intended for fine tuning for temp compensation. These daze, the way to stabilize a TCXO is to first pre-age (beat-up) the crystal to reduce long term drift. The crystal oscillator is then characterized over the required temperature range. A table of frequency versus temperature is generated and saved in a PROM. A PIC controller on the oscillator takes the measured temperature, reads the table, and applies the necessary correcting voltage to a varactor to stabilize the oscillator over a very wide temp range. 
With this method, you can take a really awful crystal, and compensate it to impressive accuracies. So presumably what I just described is a varactor built into the SX28. I don't think so. I couldn't see such a feature on the data sheet. Varactors are also chip real estate hogs, and would usually require substantial documentation and explanation to implement. I don't see any of that in the data sheet. I sorta blundered across this: "NTP temperature compensation" http://www.ijs.si/time/temp-compensation/ Putting a crystal in a temperature-stabilized "oven" is a well known technique for generating a stable frequency (the telco folks and the broadcast folks have been doing that for over 75 years, at least). I have thought for a long time that it would be "neat" to glue a resistor to the crystal case, and use heat to control the frequency. You'd pulse-width modulate the power going to the resistor... Isaac |
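The glue-a-resistor-on-the-crystal idea amounts to a tiny oven with PWM heating. A toy proportional-only controller and thermal model, with all numbers assumed for illustration rather than taken from any real oven design:

```python
# Toy oven: hold a setpoint above any likely ambient, set heater PWM duty
# from the temperature error, and let a first-order model settle.

SETPOINT_C = 45.0   # assumed hold temperature, above ambient
GAIN = 0.2          # duty cycle per degree C of error (assumed)

def heater_duty(measured_c):
    """PWM duty cycle in [0.0, 1.0] for the heater resistor."""
    return min(1.0, max(0.0, GAIN * (SETPOINT_C - measured_c)))

def simulate(ambient_c=25.0, steps=200, heat_per_duty=25.0, tau=0.1):
    """Crude first-order thermal response: temperature after `steps`."""
    temp = ambient_c
    for _ in range(steps):
        target = ambient_c + heat_per_duty * heater_duty(temp)
        temp += tau * (target - temp)
    return temp

print(f"settles at {simulate():.2f} C")
```

Note that the toy settles below the setpoint: proportional-only control always leaves a steady-state droop, which is one reason real ovens add an integral term, and why the setpoint is put at the crystal's turnover temperature, where small temperature errors cost the least frequency.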
In article ,
Meat Plow wrote: On Wed, 19 Aug 2009 18:14:14 +0100, "Dave Plowman (News)" wrote: In article , Jeff Liebermann wrote: Yup, but the long-term average will be pretty good -- gain a little in the daytime, lose a bit at night (or the other way around; could be either one depending on how the circuit was set up). Maybe, if the wearer maintains a regular schedule. That's a fair assumption, until the wearer changes their usage pattern, such as going on a ski trip. Also, please note that the original discussion was over the accuracy of a computah clock, not a wrist watch. Unless left on continuously, computers don't maintain a set schedule. Even so, their internal temperature is affected by the building environment. But is there any real difference between a 'quartz' watch and a PC clock? They both rely on a low cost crystal? Size? Temp? Does a tiny watch xtal garner any more accuracy merely because of its size? Because of the oscillatory mode, low-frequency watch crystals are notoriously inaccurate. Does a watch xtal have a different temperature coefficient? Yes; poor, for the same reason. Isaac |