Re: [time-nuts] Continuous timestamping reciprocal counter question
On 05/13/2011 04:56 PM, Tijd Dingen wrote:
> To calculate the frequency from these time stamps you have to do some
> slope fitting. If you use a least-squares matrix approach for that, I
> could see how the more random distribution could help prevent
> singularities. The only reason I can see now to really try harder to
> always get exactly the Nth edge is numerical solving. As in, should
> you choose a solver that only operates optimally on equidistant
> samples. Any thoughts?

You don't have to get exactly every Nth edge, but you do need to count the edges. A continuous time-stamping counter counts both time and edges, and each time-stamp contains both (except in some special conditions where it isn't needed).

There are a number of different approaches to extracting frequency from the dataset, but very few of them assume perfectly regular event-count spacing.

Cheers,
Magnus

___ time-nuts mailing list -- time-nuts@febo.com
To unsubscribe, go to https://www.febo.com/cgi-bin/mailman/listinfo/time-nuts and follow the instructions there.
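[Editor's note: a minimal sketch of the slope-fitting idea discussed above. The frequency is the least-squares slope of cumulative edge count versus time-stamp, which works for irregularly spaced samples ("roughly every Nth edge"). The function name and array layout are my own; this is an illustration, not the counter's actual firmware.]

```python
import numpy as np

def ls_frequency(timestamps, edge_counts):
    """Estimate frequency (Hz) as the least-squares slope of
    cumulative edge count versus time.

    timestamps  -- time-stamp of each captured edge, in seconds
    edge_counts -- cumulative input-edge count at each time-stamp;
                   the spacing does NOT need to be equidistant
    """
    t = np.asarray(timestamps, dtype=float)
    n = np.asarray(edge_counts, dtype=float)
    # Fit n = f*t + n0; the slope f is the frequency estimate.
    A = np.vstack([t, np.ones_like(t)]).T
    (f, n0), *_ = np.linalg.lstsq(A, n, rcond=None)
    return f
```

For equidistant, noise-free samples this reduces to the obvious edges-per-second ratio; its value is that it degrades gracefully when the captured edges are only approximately every Nth one.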
Re: [time-nuts] Limitations of Allan Variance applied to frequency divided signal?
On 05/13/2011 05:28 PM, Tijd Dingen wrote:
> In trying to put together a way to calculate Allan variance based on a
> series of timestamps of every Nth cycle, I ran into the following...
> Suppose you have an input signal, but it's a bit on the high side. So
> you use a prescaler to divide it down to a manageable frequency range,
> and now you want to use that signal to say something useful about the
> original high-frequency signal. Now take a look at the part about
> non-overlapped variable-tau estimators in the Wikipedia article here:
> http://en.wikipedia.org/wiki/Allan_variance#Non-overlapped_variable_.CF.84_estimators

Nice to see people actually read and use what I wrote.

> It seems to me that "divide by 4 and then measure all cycles
> back-to-back" is essentially the same as "measure all cycles of the
> undivided high-frequency signal back-to-back and decimate", or
> "skipping past n - 1 samples" as the wiki article puts it. And that is
> disregarding /extra/ jitter due to the divider, purely for the sake of
> simplicity.

If you use a prescaler of say 1/64, it takes 64 cycles of the original signal to produce one cycle at the counter core. These are then time-stamped, i.e. a time measure and an event-counter measure are taken. To transform the event-counter value into the properly scaled event value, you multiply the event counter by 64, since there were 64 times more input events than the counter saw. The time-stamps do not have to be modified.

Notice that the prescaler is only used for higher frequencies.

Plus, I strongly suspect that all these commercial counters that can handle 6 GHz and such are not timestamping every single cycle back-to-back either. Especially the models that have a few versions in the series: one cheaper one that can handle 300 MHz, for example, and a more expensive one that can handle 6 GHz. That reads like: all models share the same basic data-processing core and the same time interpolators, and for the more expensive model we just slapped on a high-bandwidth input plus a prescaler. You never time-stamp individual cycles anyway, so a prescaler doesn't make much difference. It does limit the granularity of the tau values you can use, but usually not in a significant way, since Allan variance is rarely used for taus shorter than 100 ms and, well... prescaling usually stays below 100 ns, so it isn't a big difference.

> Anyways, any drawbacks to calculating Allan Variance of a divided
> signal that I am overlooking here?

Nothing significant. It adds to the noise floor, but in practice the time-stamping and processing don't have big problems due to it.

Cheers,
Magnus
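[Editor's note: the prescaler bookkeeping Magnus describes is simple enough to sketch. Only the event counts are rescaled by the divide ratio; the time-stamps are left untouched. The function name is mine, for illustration.]

```python
def rescale_events(timestamps, core_counts, prescale=64):
    """Undo a 1/prescale front-end divider in the recorded data.

    Each counter-core event corresponds to `prescale` input cycles,
    so the event counts are multiplied by the ratio; the time-stamps
    themselves need no correction.
    """
    return [(t, n * prescale) for t, n in zip(timestamps, core_counts)]
```

After this rescaling, the dataset can be fed to the same frequency and stability estimators as un-prescaled data; the divider's only cost is coarser event granularity and a small addition to the noise floor.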
Re: [time-nuts] Limitations of Allan Variance applied to frequency divided signal?
On 05/13/2011 06:05 PM, Bob Camp wrote:
> Hi
>
> For AVAR you want a time record, not a frequency measure. Your time
> stamps are a direct phase estimate. They are what you would use
> directly for the AVAR calculation. If they are faster than your
> shortest tau, all is well. Divide, mix down, whatever; just stamp
> faster than the shortest tau.

You can use frequency measures, but there are a number of quirks hiding in there which can make a frequency-based analysis biased. Using time-stamps avoids those quirks, but naturally you can fluke those too... For instance, use of averaging can be a bad idea. It can be used, but it needs to be blended in so as not to bias the AVAR measures.

Cheers,
Magnus
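[Editor's note: to make Bob's point concrete, here is a minimal sketch of computing the overlapping Allan deviation directly from phase (time-stamp) data, which is the standard textbook estimator rather than any particular counter's implementation.]

```python
import numpy as np

def adev_from_phase(x, tau0, m=1):
    """Overlapping Allan deviation from phase samples.

    x    -- phase (time-error) samples in seconds, taken at a fixed
            interval tau0; these are exactly the raw time-stamps
            Bob describes
    tau0 -- sample interval in seconds
    m    -- averaging factor, giving tau = m * tau0
    """
    x = np.asarray(x, dtype=float)
    tau = m * tau0
    # Second differences of phase over span tau; no frequency
    # preprocessing or averaging is applied, avoiding the bias
    # Magnus warns about.
    d = x[2*m:] - 2.0 * x[m:-m] + x[:-2*m]
    return np.sqrt(np.mean(d**2) / (2.0 * tau**2))
```

A pure frequency offset (a linear phase ramp) cancels in the second difference and contributes nothing, which is exactly why phase data is the safe input for AVAR.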
Re: [time-nuts] Overheard from NASA
On 05/10/2011 02:21 AM, Jim Lux wrote:
> A USO (quartz oscillator in a temperature-controlled dewar) isn't in
> this class of performance (and is big and power hungry to boot). If
> you had a good onboard oscillator, you could do VLBI-type measurements
> to measure not only range but angle to a higher precision than is
> currently possible.

Would a CSAC-type oscillator be of use? Fairly small, fairly low-power.

Cheers,
Magnus
Re: [time-nuts] Continuous timestamping reciprocal counter question
Thanks for the sanity check! :) I was indeed hoping to be able to get away with not taking every Nth edge, since that simplifies things.

On the subject of extracting frequency out of the dataset: for now I use ordinary least squares. What other approaches do you know of that are used for this particular application?

regards,
Fred

----- Original Message -----
From: Magnus Danielson mag...@rubidium.dyndns.org
To: time-nuts@febo.com
Sent: Saturday, May 14, 2011 9:28 AM
Subject: Re: [time-nuts] Continuous timestamping reciprocal counter question

> You don't have to get exactly every Nth edge, but you do need to count
> the edges. A continuous time-stamping counter counts both time and
> edges, and each time-stamp contains both. [...] There are a number of
> different approaches to extracting frequency from the dataset, but
> very few of them assume perfectly regular event-count spacing.
Re: [time-nuts] Limitations of Allan Variance applied to frequency divided signal?
Magnus Danielson wrote:
>> http://en.wikipedia.org/wiki/Allan_variance#Non-overlapped_variable_.CF.84_estimators
>
> Nice to see people actually read and use what I wrote.

:-)

> If you use a prescaler of say 1/64, it takes 64 cycles of the original
> signal to produce one cycle at the counter core. These are then
> time-stamped, i.e. a time measure and an event-counter measure are
> taken. To transform the event-counter value into the properly scaled
> event value, you multiply the event counter by 64, since there were 64
> times more input events than the counter saw. The time-stamps do not
> have to be modified.

Okay, that clears things up. Thanks!

> Notice that the prescaler is only used for higher frequencies.

Understood. I was just using the prescaler as an example for the "what if I take every Nth edge" case.

> Plus, I strongly suspect that all these commercial counters that can
> handle 6 GHz and such are not timestamping every single cycle
> back-to-back either. [...] You never time-stamp individual cycles
> anyway, so a prescaler doesn't make much difference. It does limit the
> granularity of the tau values you can use, but usually not in a
> significant way, since Allan variance is rarely used for taus shorter
> than 100 ms and, well... prescaling usually stays below 100 ns, so it
> isn't a big difference.

Well, I can certainly /try/ to time-stamp individual cycles. ;) That way I can, for example, characterize oscillator startup and such. Right now I can only spit out a medium-resolution timestamp every cycle for frequencies up to about 400 MHz, and a high-resolution timestamp every cycle for frequencies up to about 20 MHz. Medium resolution being on the order of 100 ps, and high resolution on the order of 10 ps. The medium resolution is possibly even a little worse than that due to non-linearities, but there are still a few ways to improve that. It just requires an awful lot of design hand-holding to manually route parts of the FPGA design. I.e.: I will do that later. Much, much later. ;-)

But understood: for Allan variance you don't need timestamps for every individual cycle.

> > Anyways, any drawbacks to calculating Allan Variance of a divided
> > signal that I am overlooking here?
>
> Nothing significant. It adds to the noise floor, but in practice the
> time-stamping and processing don't have big problems due to it.

Precisely what I was hoping for, thanks! :)

regards,
Fred
Re: [time-nuts] Continuous timestamping reciprocal counter question
Hi Fred,

On 05/14/2011 12:12 PM, Tijd Dingen wrote:
> Thanks for the sanity check! :) I was indeed hoping to be able to get
> away with not taking every Nth edge, since that simplifies things.

There are many things you can get away with; how much trouble you want to go to verifying it, versus doing the proper thing, is another issue.

> On the subject of extracting frequency out of the dataset: for now I
> use ordinary least squares. What other approaches do you know of that
> are used for this particular application?

Linear regression is used, as well as block-averaged time-stamps. The latter method takes say 200 time-stamps and divides them into a first and a second half; you average the time-stamps of each half, subtract the first-half average from the second-half average, and divide by the time between the first samples of each half (or equivalent time).

Compare with this: from a time sequence of 200 time-stamps you get 199 back-to-back pairs, forming 199 frequency estimates. Sounds cool, now we can average those 199 frequency estimates... well... the sad thing is that the averaging essentially cancels 198 of the measures, and you end up using only the first and last samples. The sqrt(N) averaging gain of the block averager above is lost in this simple variant, and yet the two have about the same computing complexity. So frequency-estimation algorithms can look good until you find out that their degrees of freedom may vary greatly; these two algorithms differ by about N/2 in degrees of freedom. I wanted to illustrate how good or bad algorithms can be on the same amount of data.

I have not looked at a detailed performance comparison between these algorithms lately. However, they should not be used naively together with AVAR and friends, since they attempt to do the same thing, so the resulting filtering will become wrong and biased results will be produced.

Cheers,
Magnus
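[Editor's note: the two estimators Magnus contrasts are short enough to sketch side by side. The function names are mine; the telescoping cancellation in the naive pairwise average is exact only for equidistant time-stamps, which is the case Magnus's example assumes.]

```python
import numpy as np

def block_average_freq(t, n):
    """Block-averaged frequency estimate: split the record into two
    halves, average time and event count within each half, and take
    the ratio of the differences between the half averages."""
    t = np.asarray(t, dtype=float)
    n = np.asarray(n, dtype=float)
    h = len(t) // 2
    dt = t[h:2*h].mean() - t[:h].mean()
    dn = n[h:2*h].mean() - n[:h].mean()
    return dn / dt

def naive_pair_freq(t, n):
    """Average of the back-to-back pair frequency estimates. For
    equidistant stamps the inner terms telescope away, so only the
    first and last samples actually contribute."""
    t = np.asarray(t, dtype=float)
    n = np.asarray(n, dtype=float)
    pairs = (n[1:] - n[:-1]) / (t[1:] - t[:-1])
    return pairs.mean()
```

On a clean signal both return the same frequency; the difference Magnus points out shows up in the noise: the block average keeps roughly N/2 more degrees of freedom, so its estimate averages down while the naive pairwise mean does not.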
Re: [time-nuts] Continuous timestamping reciprocal counter question
Magnus Danielson wrote:
> There are many things you can get away with; how much trouble you want
> to go to verifying it, versus doing the proper thing, is another
> issue.

Define "proper thing". ;-) From what I understand, taking exactly every Nth edge and then doing linear regression is equivalent to taking roughly every Nth edge and then doing linear regression. Equivalent in the sense that the frequency estimates of the two will be the same, to within the usual numerical uncertainties. Or to put it another way: the first method is not inherently better or worse than the second. After all, that is the whole thing I am trying to be sure of right now.

Of course I can make sure that I take exactly every Nth edge. It is just that there are some considerable implementation advantages if that constraint does not have to be so strict. One advantage being that if this constraint can be fairly loose, then using the ISERDES2 in the Spartan-6 as part of the coarse counter is fairly simple. I did a couple of tests with that, and all looks good. The main advantage is that if I use the serdes, this translates into a higher input frequency without the need for a prescaler, which translates into better precision. Hence my current (over)focus on making absolutely sure that all the results are also valid if one takes almost every Nth edge, but not quite all the time... However, you still know which edge is which; you just don't know it early enough in the pipeline to use it as the basis for a triggering decision.

> I have not looked at a detailed performance comparison between these
> algorithms lately. However, they should not be used naively together
> with AVAR and friends, since they attempt to do the same thing, so the
> resulting filtering will become wrong and biased results will be
> produced.

Well, for the AVAR calculation I only use the raw time-stamps, so nothing preprocessed. Then I should not have to worry about this sort of bias, right?

regards,
Fred
Re: [time-nuts] Symmetricom TruTime XL-DC date error
Very good insight on the 1024 weeks. The Odetics units operate this way just fine.

Regards,
Paul

On Thu, May 12, 2011 at 3:05 PM, Jean-Louis Oneto jean-louis.on...@obs-azur.fr wrote:
> Hello
>
> If it's a rollover problem, you should try 26.09.1991 (which is 1024
> weeks before today).
>
> HTH,
> Jean-Louis
>
> ----- Original Message -----
> From: Peter Loron pet...@standingwave.org
> To: time-nuts@febo.com
> Sent: Thursday, May 12, 2011 5:46 PM
> Subject: [time-nuts] Symmetricom TruTime XL-DC date error
>
> Hello, group. I've got a Symmetricom TruTime XL-DC, which appears to
> be working OK now that I have the proper antenna. However, it thinks
> the date is 2031. Is this a known firmware issue for these boxes?
>
> Thanks.
> -Pete
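[Editor's note: the 1024-week figure comes from the 10-bit GPS week counter, which wraps every 1024 weeks. The date Jean-Louis suggests is simply the posting date minus one full rollover period, which is easy to check:]

```python
from datetime import date, timedelta

# The 10-bit GPS week number wraps every 1024 weeks, so a receiver
# confused by a rollover shows dates exactly 1024 weeks off.
GPS_ROLLOVER = timedelta(weeks=1024)

today = date(2011, 5, 12)       # the date of the original post
print(today - GPS_ROLLOVER)     # → 1991-09-26
```

This reproduces the 26.09.1991 date suggested in the thread; a unit reporting 2031 is plausibly showing dates one rollover period in the future instead.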
[time-nuts] Results parts selection + commercial assembly poll?
Incidentally, did anything ever come of these two polls? I was trying to find the conclusions/results, but could not find them on the list. It is entirely possible that I am blind, for which I apologize in advance. Anyone know what came of it?

regards,
Fred
Re: [time-nuts] Atomic clock on a chip
On 5/11/2011 9:24 PM, James Fournier wrote:
> http://www.smartertechnology.com/c/a/Technology-For-Change/Smarter-Atomic-Clock-on-a-Chip-Debuts/

NIST (a.k.a. the Bureau of Standards) has built an atomic clock movement the size of a grain of rice. That's almost good enough to fit in a self-setting wristwatch, if they can get the power needs down enough. A wristwatch with a REAL atomic clock inside would be awesome, as it would never be more than a few milliseconds off at any instant. The watch could even be designed so that the user sets the city and it compensates for radio propagation time (about 5 milliseconds from the WWVB transmitter to Chicago). A radio-controlled "atomic" watch as of now is no more than half a second off at any time, good enough for human affairs (but not for a good time nut!).
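[Editor's note: the ~5 ms figure quoted above is just the one-way light-travel time. A quick sanity check, using a roughly assumed 1500 km great-circle distance from WWVB in Fort Collins, CO to Chicago:]

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def propagation_delay_ms(distance_km):
    """One-way radio propagation delay in milliseconds for a
    straight-line path of the given length."""
    return distance_km * 1000.0 / C * 1000.0

# ~1500 km (assumed) from the WWVB transmitter to Chicago:
print(round(propagation_delay_ms(1500), 1))  # → 5.0
```

A city-based correction table in a watch would only need this kind of coarse distance figure, since sub-millisecond accuracy is far beyond what a wristwatch display requires.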