> On 02/08/2012 01:06 PM, Ralph Giles wrote:
>
>> On 8 February 2012 10:00, Justin Ruggles <[email protected]> wrote:
>>
>>> 0.5% to 1.0% on average. That's with a fairly simple algorithm.
>>
>> Not very worthwhile. I imagine it's possible to do quite a bit more on
>> some files, but it would be pretty expensive to find the boundaries...

Yes, it likely can do better, and yes it can be quite expensive. IIRC
(it was several years ago) the algorithm I used in Flake is just a
simple threshold comparison of the sum of the 2nd-order LPC residual on
equally divided pieces of the input block to decide whether or not to
merge adjacent pieces. But something better could be done using smaller
pieces, a trellis search, and/or a more accurate cost function.

-Justin
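[Editorial note: the merge strategy Justin describes could be sketched roughly as follows. This is not Flake's actual code (Flake is written in C); the order-2 predictor fit, the absolute-residual cost, and the `threshold` tolerance are all invented stand-ins for whatever the real encoder uses.]

```python
def lpc2_coeffs(x):
    # Fit an order-2 predictor x[n] ~ a1*x[n-1] + a2*x[n-2] by solving
    # the 2x2 autocorrelation normal equations (a minimal stand-in for
    # a real LPC analysis).
    n = len(x)
    r0 = sum(v * v for v in x)
    r1 = sum(x[i] * x[i - 1] for i in range(1, n))
    r2 = sum(x[i] * x[i - 2] for i in range(2, n))
    det = r0 * r0 - r1 * r1
    if det == 0:
        return 0.0, 0.0
    return r1 * (r0 - r2) / det, (r0 * r2 - r1 * r1) / det

def residual_sum(x, a1, a2):
    # Sum of absolute prediction residuals (a crude cost function).
    return sum(abs(x[i] - a1 * x[i - 1] - a2 * x[i - 2])
               for i in range(2, len(x)))

def merge_pieces(x, piece_len, threshold=1.1):
    # Divide the block into equal pieces, then greedily merge a piece
    # into its left neighbour whenever one shared predictor over the
    # merged span costs no more than `threshold` times the two separate
    # predictors. `threshold` is a made-up tuning knob.
    bounds = list(range(0, len(x), piece_len)) + [len(x)]
    spans = [(s, e) for s, e in zip(bounds, bounds[1:]) if e > s]
    merged = [spans[0]]
    for s, e in spans[1:]:
        ps, pe = merged[-1]
        sep = (residual_sum(x[ps:pe], *lpc2_coeffs(x[ps:pe]))
               + residual_sum(x[s:e], *lpc2_coeffs(x[s:e])))
        joint = residual_sum(x[ps:e], *lpc2_coeffs(x[ps:e]))
        if joint <= threshold * sep:
            merged[-1] = (ps, e)
        else:
            merged.append((s, e))
    return merged
```

The greedy left-to-right merge is what makes this cheap; the trellis search Justin mentions would instead consider all partitions of the pieces and pick the globally cheapest one.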
One idea that might be faster would be to pick some standard block size
from which you compute the LPC coefficients, and then keep "growing" the
block as long as the residuals remain within some bound. You would start
a new block as soon as it becomes obvious that picking new coefficients
would be a better strategy. I'm not sure how effective it would be, and
it is certainly not optimal, but it might be faster. However, not using
the potentially "grown" samples in the calculation of the original LPC
coefficients might reduce its effectiveness.

-Ben Allison

_______________________________________________
flac-dev mailing list
[email protected]
http://lists.xiph.org/mailman/listinfo/flac-dev
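[Editorial note: Ben's block-growing idea could be sketched like this. It is a rough Python illustration only; the order-2 predictor fit, the chunk size, and the `bound` tolerance are invented stand-ins, and the coefficients deliberately stay fixed while the block grows, which is exactly the weakness Ben points out in his last sentence.]

```python
def lpc2_coeffs(x):
    # Fit an order-2 predictor x[n] ~ a1*x[n-1] + a2*x[n-2] by solving
    # the 2x2 autocorrelation normal equations.
    n = len(x)
    r0 = sum(v * v for v in x)
    r1 = sum(x[i] * x[i - 1] for i in range(1, n))
    r2 = sum(x[i] * x[i - 2] for i in range(2, n))
    det = r0 * r0 - r1 * r1
    if det == 0:
        return 0.0, 0.0
    return r1 * (r0 - r2) / det, (r0 * r2 - r1 * r1) / det

def residual_sum(x, a1, a2):
    return sum(abs(x[i] - a1 * x[i - 1] - a2 * x[i - 2])
               for i in range(2, len(x)))

def grow_blocks(x, base_len, chunk_len, bound=1.5):
    # Fit coefficients on a standard-size base block, then keep
    # appending chunks while their mean residual under the *old*
    # coefficients stays within `bound` times the base block's mean
    # residual; otherwise start a fresh block and refit.
    blocks = []
    start = 0
    while start < len(x):
        end = min(start + base_len, len(x))
        a1, a2 = lpc2_coeffs(x[start:end])
        base = residual_sum(x[start:end], a1, a2) / max(end - start - 2, 1)
        while end < len(x):
            nxt = min(end + chunk_len, len(x))
            # Include 2 history samples so the predictor has context.
            seg = x[end - 2:nxt]
            r = residual_sum(seg, a1, a2) / max(nxt - end, 1)
            if r > bound * max(base, 1e-9):
                break  # new coefficients look worthwhile; stop growing
            end = nxt
        blocks.append((start, end))
        start = end
    return blocks
```

Because each block only ever looks forward one chunk at a time, this is linear in the input, whereas searching over all candidate boundaries is what makes the optimal partition expensive.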
