Hi Greg,

On 7/9/19 12:46 PM, Greg Beam wrote:
> Personally, I don't have a need for it either. If all I am after is QSO validation, grep + awk or a good-quality text editor is all that's needed :-)
Exactly my opinion!
> However, if one wants to do any sort of data analysis, the flat-file format is less than ideal. Normalizing the data would go a long way toward shrinking the footprint; however, it also adds a level of complexity some may not find very pleasant.
It depends on what we want to analyse. For example, if we want to analyse the dT values, a little parsing and extraction program followed by a statistics package such as R is sufficient. Storing the data in a database would give us no advantage, as long as the database management system is not itself coupled to such statistics software.
> I am aware, and have done a bit of parsing in the past regarding the varying data structures of the file. It's changed a good bit over the last few versions.
Exactly! I think that a little program in Perl is sufficient to extract the needed information. Of course, other languages are possible too.
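To make the idea concrete, here is a minimal sketch of such an extraction program, written in Python rather than Perl. The line layout it assumes (timestamp, dial frequency, Rx/Tx marker, mode, SNR, dT, rest of the decode) is only an illustration of an ALL.TXT-style decode line; as Greg points out, the format has changed across versions, so the regex would need adjusting for real files:

```python
import re
import statistics

# Assumed ALL.TXT-style decode layout; adjust the pattern for the
# format actually produced by your WSJT-X version.
DECODE_RE = re.compile(
    r"^\d{6}_\d{6}\s+"   # timestamp, e.g. 190709_124500 (assumed form)
    r"[\d.]+\s+"         # dial frequency in MHz
    r"\S+\s+\S+\s+"      # Rx/Tx marker and mode
    r"-?\d+\s+"          # SNR in dB
    r"(-?\d+\.\d+)\s+"   # dT in seconds  <-- the captured field
)

def extract_dt(lines):
    """Return the dT value (seconds) from every line that matches."""
    values = []
    for line in lines:
        m = DECODE_RE.match(line)
        if m:
            values.append(float(m.group(1)))
    return values

if __name__ == "__main__":
    # Hypothetical sample lines, not real log data.
    sample = [
        "190709_124500  14.074 Rx FT8    -12  0.3  1234 CQ DJ0OT JO40",
        "190709_124515  14.074 Rx FT8     -5 -0.1  2600 K1ABC DJ0OT -10",
        "some unrelated header line",
    ]
    dts = extract_dt(sample)
    print(dts)                              # the extracted dT values
    print(round(statistics.mean(dts), 2))   # quick sanity statistic
```

From there the extracted values could be written to a CSV and handed to R for the real statistics, which is exactly the division of labour described above.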
Best wishes,
Claude (DJ0OT)

_______________________________________________
wsjt-devel mailing list
wsjt-devel@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/wsjt-devel