Joe,

Sounds good. Regarding tuning the sfrsd2 erasure probabilities for the HF use 
case: I played around with that a bit and concluded that we should tune the 
algorithm using statistics derived from vectors that actually required 
soft-symbol decoding (as opposed to using the bulk of our data, which can be 
decoded with the hard-decision Berlekamp-Massey (BM) decoder). 
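
To make that concrete, here is a rough sketch (in Python) of the kind of 
tabulation I have in mind. The binning and the variable names are placeholders 
for illustration, not the actual sfrsd2 quantities; only the 8x8 shape matches 
the matrices mentioned below:

    # Sketch only: accumulate an empirical erasure-probability table from the
    # symbols of vectors that required soft-symbol decoding.
    # Assumptions (placeholders, not taken from sfrsd2 itself):
    #   - each symbol carries two quality measures already quantized to bins 0..7
    #   - after a successful soft decode we know which received symbols were wrong
    import numpy as np

    NBINS = 8

    def accumulate(symbols):
        """symbols: iterable of (bin1, bin2, was_wrong) triples, one per symbol."""
        wrong = np.zeros((NBINS, NBINS))
        total = np.zeros((NBINS, NBINS))
        for b1, b2, was_wrong in symbols:
            total[b1, b2] += 1
            if was_wrong:
                wrong[b1, b2] += 1
        # Empirical probability that a symbol in each cell is wrong; cells with
        # no data stay at zero (exactly where ~500 vectors run thin).
        prob = np.divide(wrong, total, out=np.zeros_like(wrong), where=total > 0)
        return prob, total

The total matrix also shows directly which cells are undersampled, which is the 
reason for pooling our two data sets below.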

Unfortunately, I have only about 500 such vectors in my data set, which is not 
enough to fill in the 8x8 probability matrices reliably. Thus, I propose the 
following: once you are satisfied with your algorithm for selecting candidates, 
let's each use that version to write an s3_hf.bin file from our HF data sets. 
We can then share those files and concatenate them to combine your data and 
mine. That should give us enough soft-symbol-decodable vectors on which to base 
the iterative self-tuning scheme. 
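
For the merge step itself, a minimal sketch, assuming s3_hf.bin is a flat 
stream of fixed-size records with no header so the files can be joined byte 
for byte (the record size and file names below are only placeholders):

    # Sketch only: concatenate s3_hf.bin files into one combined file.
    # RECORD_BYTES is a placeholder (e.g. 63 symbols x 64 bins of 4-byte floats);
    # adjust it to whatever the actual record layout is.
    RECORD_BYTES = 4 * 64 * 63

    def concatenate(inputs, output):
        with open(output, "wb") as out:
            for name in inputs:
                with open(name, "rb") as f:
                    data = f.read()
                if len(data) % RECORD_BYTES != 0:
                    raise ValueError(name + ": size is not a whole number of records")
                out.write(data)

    # Hypothetical file names:
    concatenate(["s3_hf_steve.bin", "s3_hf_joe.bin"], "s3_hf_combined.bin")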

Steve

> On Oct 5, 2015, at 4:35 PM, Joe Taylor <j...@princeton.edu> wrote:
> 
> Hi Steve,
> 
> I think I understand what's going on with the less-than-perfect 
> selection of candidate frequencies at which to attempt JT65 decoding.  I 
> hope to spend some time on it in the next couple of days.  If I don't 
> get it sorted out then, it may be delayed for about a week.  I'll be 
> away between Oct 8 and Oct 14.
> 
>       -- Joe, K1JT
> 

