The idea of using a standard stereo sound card interface to do Allan deviation 
measurements has been discussed on the list in the past (i.e., beat the two 
signals down to some convenient audio frequency, digitize, and find the zero 
crossings by curve fits to the sampled data).
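The zero-crossing step might be sketched something like this (my own illustration, not anyone's published implementation): detect sign changes in the sampled beat note and fit a line through the bracketing samples to get sub-sample timing.

```python
import numpy as np

def zero_crossing_times(samples, fs):
    """Estimate upward zero-crossing times of a sampled beat note.

    Fits a straight line through the two samples that bracket each
    negative-to-positive sign change and solves for the crossing
    instant, giving sub-sample time resolution.
    """
    s = np.asarray(samples, dtype=float)
    # indices i where s[i] < 0 and s[i+1] >= 0 (upward crossings)
    idx = np.where((s[:-1] < 0) & (s[1:] >= 0))[0]
    # linear interpolation: fraction of a sample past index i
    frac = -s[idx] / (s[idx + 1] - s[idx])
    return (idx + frac) / fs

# Example: a clean 100 Hz beat note sampled at 48 kS/s
fs = 48000.0
t = np.arange(4800) / fs
beat = np.sin(2 * np.pi * 100.0 * t)
crossings = zero_crossing_times(beat, fs)
```

In practice one would fit over more than two points (or fit a sinusoid) to average down the quantization and noise, but the two-point linear fit shows the idea.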

Several have commented that one needs good isolation between the channels of 
the digitizer to get good results, and that inexpensive interfaces (e.g., the 
one that comes on the motherboard) often don't have good isolation.

Here's a question: is that coupling deterministic and "calibrate-out-able"?  
It seems the factors leading to the lack of isolation are things like layout, 
capacitive coupling, shared ground paths, and the like.  If the card is in a 
constant environment, those shouldn't be changing, so in theory one could 
somehow measure the coupling and apply the inverse transformation.
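As a back-of-the-envelope sketch of what "apply the inverse transformation" could mean: if one assumes the coupling is linear and (as a simplification) frequency-independent, the observed channels are a 2x2 mix of the true signals, and the mix can be measured by driving one input at a time and then inverted. Real crosstalk is likely frequency-dependent, which would call for complex per-frequency coefficients instead of a single matrix; the function names and numbers below are made up for illustration.

```python
import numpy as np

def measure_mixing_matrix(gain_a, leak_b_from_a, gain_b, leak_a_from_b):
    """Build the assumed 2x2 mixing matrix from single-channel
    calibration runs: observed = M @ true.

    gain_a:        level seen on channel A when only A is driven
    leak_b_from_a: level leaking onto channel B in that same run
    (and symmetrically for channel B)
    """
    return np.array([[gain_a, leak_a_from_b],
                     [leak_b_from_a, gain_b]])

def decouple(observed, M):
    """Undo the mixing: solve M @ true = observed for the true signals.

    observed: shape (2, N) array of the two sampled channels.
    """
    return np.linalg.solve(M, observed)

# Example with made-up numbers: unity gain, ~1% (about -40 dB) crosstalk
M = measure_mixing_matrix(1.0, 0.01, 1.0, 0.01)
t = np.arange(1000) / 48000.0
true = np.vstack([np.sin(2 * np.pi * 440.0 * t),
                  np.sin(2 * np.pi * 1000.0 * t)])
observed = M @ true          # what the imperfect digitizer would record
recovered = decouple(observed, M)
```

Whether the real coupling is stable and linear enough for this to hold at the levels needed for ADEV work is exactly the open question.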

Jim
_______________________________________________
time-nuts mailing list -- [email protected]
To unsubscribe, go to https://www.febo.com/cgi-bin/mailman/listinfo/time-nuts
and follow the instructions there.