> To make this work somewhat smoothly, one would need at least 0.001C or
> better 0.0001C resolution.

Hi Said,

Can you provide some real data to help me believe this claim?

When you get below 1C or 0.1C resolution, it starts to matter from which
direction the temperature gradient is headed, or from which angle air is
flowing. Or which side is up. Or what the humidity is, etc. This is because
temperature gradients break any steady-state model you have based on a
point-source measurement (e.g., a temp sensor) or an average temperature
measurement (e.g., oven current).

Then there is the matter of the rate at which temperature changes; slow temp
changes and rapid temp changes affect an OCXO quite differently. This is due
to the different thermal time constants of all the metal and insulation
materials in and around the OCXO or GPSDO.

> To give you an example: a typical single-oven OCXO has about 1ppb per
> degree C change. This would mean the unit could only be adjusted by
> 2.5E-010 steps with the Dallas Temp sensor as a reference. So the average
> error would be 1.25E-010, which results in a massive 12.5 microseconds
> average drift error in 1000s intervals due to the temperature quantization
> error!

Are you assuming a design that takes a Dallas reading each second and stuffs
it into the EFC every second? No one would actually do that. Instead, if one
averages the temperature sensor over 10, 100, or more seconds, the large EFC
steps are avoided. Everything else in a GPSDO is about slow averaging;
there's no reason temperature adaptation cannot be treated the same way.

I suppose I should add a random temperature-cycle hold-over test to the
abuse I inflict on each GPSDO here. Are you saying the Fury would do much
better than others in this regard? Again, given this is time-nuts, some real
data would be nice.

/tvb

_______________________________________________
time-nuts mailing list -- [email protected]
To unsubscribe, go to https://www.febo.com/cgi-bin/mailman/listinfo/time-nuts
and follow the instructions there.
