Dear sklearn community,

1. The source code of the Block SBL algorithm is now available on Bitbucket:
https://bitbucket.org/liubenyuan/pybsbl
Any suggestions, optimizations, and tests of the code are welcome, as are
your success stories from applying our methods.

2. Block-OMP is an extension of the original OMP algorithm that can handle
the block-sparsity model. It was proposed by Eldar in 2010.
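To make the idea concrete, here is a minimal sketch of the Block-OMP greedy loop (an illustration only, not the pybsbl implementation; the function name and interface are my own). At each step it selects the block of columns most correlated with the residual, then re-fits all selected blocks by least squares:

```python
import numpy as np

def block_omp(A, y, block_size, n_blocks):
    """Greedy block-sparse recovery: repeatedly pick the block of columns
    most correlated with the residual, then re-fit by least squares."""
    n = A.shape[1]
    blocks = [np.arange(i, i + block_size) for i in range(0, n, block_size)]
    support = []                         # indices of selected blocks
    residual = y.astype(float).copy()
    for _ in range(n_blocks):
        # score each unselected block by the l2 norm of A_b^T r
        scores = [-np.inf if i in support
                  else np.linalg.norm(A[:, b].T @ residual)
                  for i, b in enumerate(blocks)]
        support.append(int(np.argmax(scores)))
        cols = np.concatenate([blocks[i] for i in support])
        # least-squares re-fit on all selected columns
        coef, *_ = np.linalg.lstsq(A[:, cols], y, rcond=None)
        residual = y - A[:, cols] @ coef
    x = np.zeros(n)
    x[cols] = coef
    return x
```

The only difference from plain OMP is that correlation is scored per block rather than per column, which is what lets the algorithm exploit the block-sparsity prior.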

3. A group-lasso implementation in Python is also very important. As shown
in the TSP 2012 paper by Zhang, block SBL can be viewed as an iteratively
re-weighted group-lasso algorithm, called BSBL-L1 in the original paper.
An implementation of BSBL-L1 therefore relies on a group-lasso solver. I
will contact Fabian (@fabianp on GitHub) ASAP.
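The work-horse inside any group-lasso solver is block soft-thresholding, the proximal operator of the group penalty; in the re-weighted view, each BSBL-L1 iteration amounts to running such a solver with per-block weights. A generic sketch (not tied to any particular solver; names are illustrative):

```python
import numpy as np

def block_soft_threshold(x, blocks, thresholds):
    """Proximal operator of the weighted group-lasso penalty:
    shrink each block toward zero by its threshold, zeroing any
    block whose l2 norm falls below its threshold entirely."""
    out = np.zeros_like(x, dtype=float)
    for b, t in zip(blocks, thresholds):
        norm = np.linalg.norm(x[b])
        if norm > t:
            out[b] = (1.0 - t / norm) * x[b]   # shrink the whole block
    return out
```

Whole blocks are kept or killed together, which is exactly the structured sparsity that both group lasso and block SBL encode.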

4. BSBL easily outperforms Block-OMP, group lasso, and Model-CoSaMP (by
Baraniuk) within a simple, intuitive Bayesian framework. The BSBL algorithm
has superior phase-transition performance; in addition to recovering
block-sparse signals at high sparsity levels and low indeterminacy levels,
the proposed BSBL-BO algorithm can also recover non-sparse signals. The
website of the author, Zhilin Zhang:
http://dsp.ucsd.edu/~zhilin/BSBL.html
provides many real-life examples; check it out.

5. The algorithm is not far from real applications. Applying it to
biomedical signal processing has already seen many successes. In real-life
applications, many signals tend to be block sparse and rich in intra-block
structure, so the algorithm has a promising future.

6. I will keep contributing implementations of further SBL algorithms to
the Python community, first on Bitbucket; if any of you are interested,
contact me!

Best wishes to the sklearn community.

Liu Benyuan


On Thu, Nov 29, 2012 at 1:22 AM, Gael Varoquaux <
[email protected]> wrote:

> Hi Liu,
>
> This work is really nice and very fancy, but it is also very recent and
> needs a bit more insight and benchmarking before it can enter
> scikit-learn: we have a rule not to integrate any new approach that is
> less than 2 years old. The reason is that if the approach is to be a
> massive success, it will be well studied, and thus it pays to wait a bit
> to understand the trade-offs better. A good approach is to put the
> implementation that you have on github in a separate repo and advertise
> it so that people use it and improve it. As it becomes more and more
> used, you'll get feedback on it, and we'll gather insight, so that we can
> plan a merge or not, depending on community pick-up.
>
> There are well-known and fundamental algorithms to deal with similar
> problems that are not in scikit-learn yet, such as the group lasso. From
> what I understand of block sparse Bayesian learning, it could benefit
> from a group lasso solver. Thus integrating a group lasso would be a
> useful step toward dealing with block-sparse problems. Fabian (@fabianp on
> github) has started work on a group lasso solver that is on his github
> account, but we never could find time to do the integration. I am not
> sure if it does overlapping groups, and thus if you can use it, but it
> would be nice to work on its integration in the scikit-learn.
> Documentation and examples need to be written.
>
> Thanks a lot for your interest.
>
> Gaƫl
>
> On Wed, Nov 28, 2012 at 03:08:23PM +0800, [email protected] wrote:
> > Dear scikit-learn community:
>
> > Block Sparse Bayesian Learning is a powerful CS algorithm for recovering
> block sparse signals with structure, and shows the additional benefit of
> reconstructing non-sparse signals; see Dr. Zhilin Zhang's website:
> > http://dsp.ucsd.edu/~zhilin/BSBL.html
>
> > I have implemented the BSBL-BO algorithm by Zhang, and a fast version
> of the BSBL algorithm recently proposed by us, called BSBL-FM, in Python,
> plus many demos using these two codes. Does the scikit-learn community
> welcome this type of code?
> > What is the procedure for submitting the code into the mainline of
> scikit-learn?
>
> > Thanks for the great project!
>
> > Liu Benyuan
>
> >
> ------------------------------------------------------------------------------
> > Keep yourself connected to Go Parallel:
> > INSIGHTS What's next for parallel hardware, programming and related
> areas?
> > Interviews and blogs by thought leaders keep you ahead of the curve.
> > http://goparallel.sourceforge.net
>
> > _______________________________________________
> > Scikit-learn-general mailing list
> > [email protected]
> > https://lists.sourceforge.net/lists/listinfo/scikit-learn-general
>
>
> --
>     Gael Varoquaux
>     Researcher, INRIA Parietal
>     Laboratoire de Neuro-Imagerie Assistee par Ordinateur
>     NeuroSpin/CEA Saclay , Bat 145, 91191 Gif-sur-Yvette France
>     Phone:  ++ 33-1-69-08-79-68
>     http://gael-varoquaux.info            http://twitter.com/GaelVaroquaux
>
>