After seeing this post, I looked into the specs of various GPS receivers.
I note that the Thunderbolt is rated at 1 part in 10^12 accuracy with respect to the 10 MHz signal it produces; you can't beat that unless you have your own cesium standard. Apparently the Thunderbolt has to average over about a day to reach that accuracy. However, the 1 pps signal is rated at much lower accuracy, within 20 ns. I believe this translates to an error of up to 0.2 Hz at 10 MHz. Is my analysis correct? If so, it seems to me that using the 1 pps signal as a real-time reference to adjust a rubidium oscillator is not helpful unless you know the Rb oscillator was off by more than 0.2 Hz to begin with.

I recently bought an HP 5335A, which can read out frequency up to 1.3+ GHz to 11 digits, and this has made me interested in an effective way to reference my rubidium frequency standard as well as the built-in 10 MHz standard inside the HP 5335A. Right now the difference between the two is around 0.01 Hz, which reflects ~1 part in 10^9... I guess I will also look for a Trimble Thunderbolt or equivalent.

Patrick Wong AK6C

--- In TekScopes@..., "denyhstk" <denyhstk@...> wrote:
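For what it's worth, the 20 ns → 0.2 Hz step above can be sketched numerically. This is just the post's own arithmetic written out; the variable names are mine, and the assumption is that the full 20 ns error lands within a single one-second 1 pps interval:

```python
# Assumed figures, taken from the post above.
pps_error_s = 20e-9   # 1 pps timing error: 20 ns
gate_time_s = 1.0     # worst case: full error over one 1-second interval
f_ref_hz = 10e6       # 10 MHz reference output

# Fractional frequency error implied by the timing error.
frac_error = pps_error_s / gate_time_s       # 2e-8

# Equivalent error at 10 MHz.
freq_error_hz = frac_error * f_ref_hz        # 0.2 Hz

# The measured Rb vs. 5335A offset, as a fraction of 10 MHz.
rb_offset_hz = 0.01
rb_frac = rb_offset_hz / f_ref_hz            # ~1 part in 10^9

print(freq_error_hz, rb_frac)
```

Note that this is the single-interval worst case; averaging the 1 pps edges over many seconds reduces the fractional error proportionally, which is presumably how the Thunderbolt reaches its long-term 10^-12 figure.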