On Mon, Aug 24, 2015 at 06:02:19PM -0400, Andreas Mueller wrote:
> I think the real solution is to provide backward-compatible ``__getattr__``
> and ``__setattr__``.
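As a rough sketch of what such backward-compatible attribute aliasing could look like (the class and the ``n_iter``/``max_iter`` rename are hypothetical, not actual scikit-learn names):

```python
class Estimator:
    """Sketch: alias a renamed attribute so old user code keeps working.

    Hypothetical scenario: ``n_iter`` was renamed to ``max_iter``.
    """

    _renamed = {"n_iter": "max_iter"}  # old name -> new name

    def __init__(self, max_iter=100):
        self.max_iter = max_iter

    def __getattr__(self, name):
        # Only called when normal attribute lookup fails,
        # so new-style access pays no overhead.
        new = type(self)._renamed.get(name)
        if new is not None:
            return getattr(self, new)
        raise AttributeError(name)

    def __setattr__(self, name, value):
        # Redirect writes to old names onto the new attribute.
        name = type(self)._renamed.get(name, name)
        super().__setattr__(name, value)
```

The QA burden Gael mentions is real: every rename has to be tracked in a table like ``_renamed``, and deprecation warnings would normally be emitted there too.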

It's a lot of work to support and QA this, and I am not sure we want to
add it to our plate.

I would personally rather support PMML I/O, as it has greater value and
is probably of the same order of complexity.

Anyhow, all this is for after 1.0.

G

> Theano seems able to do that (at least that is what I was told).
> It is unclear whether we want to do this. If we do, we probably
> only want it post 1.0.

> On 08/19/2015 02:35 AM, Joel Nothman wrote:

>     Frequently the suggestion of supporting PMML or similar is raised, but
>     it's not clear whether such models would be importable into scikit-learn,
>     or how to translate scikit-learn transformation pipelines into its
>     notation without going mad, etc. Still, even a library of exporters for
>     individual components would be welcome, IMO, if someone wanted to
>     construct it.
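To make the "exporters for individual components" idea concrete, here is a schematic sketch of exporting a fitted linear model's weights as PMML-style XML. The element names follow PMML's ``RegressionModel``, but this output is illustrative only and not validated against any PMML schema version:

```python
import xml.etree.ElementTree as ET

def linear_model_to_pmml_like(feature_names, coef, intercept):
    """Serialize linear-regression weights to a PMML-style XML string.

    Schematic only: element and attribute names mimic PMML's
    RegressionModel, but no schema validation is performed.
    """
    root = ET.Element("PMML", version="4.2")
    model = ET.SubElement(root, "RegressionModel",
                          functionName="regression")
    table = ET.SubElement(model, "RegressionTable",
                          intercept=str(intercept))
    for name, c in zip(feature_names, coef):
        # One predictor element per feature, weight stored as text.
        ET.SubElement(table, "NumericPredictor",
                      name=name, coefficient=str(c))
    return ET.tostring(root, encoding="unicode")
```

The hard part Joel alludes to is not a single model like this, but round-tripping whole transformation pipelines.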

>     On 19 August 2015 at 15:08, Sebastian Raschka <se.rasc...@gmail.com> wrote:

>         Oh wow, thanks for the link, I just skimmed over the code, but this
>         is an interesting idea and looks like the sort of thing that would
>         make my life easier in the future. I will dig into that! That’s
>         great, thanks!


>         > On Aug 19, 2015, at 12:58 AM, Stefan van der Walt <stef...@berkeley.edu> wrote:

>         > On 2015-08-18 21:37:41, Sebastian Raschka <se.rasc...@gmail.com>
>         > wrote:
>         >> I think for “simple” linear models, it would not be a bad
>         >> idea to save the weight coefficients in a log file or so.
>         >> Here, your model is really not that dependent on changes
>         >> in the scikit-learn code base (for example, imagine that
>         >> you trained a model 10 years ago and published the results
>         >> in a research paper, and today someone asked you about this
>         >> model). You know how a logistic regression, SVM, etc.
>         >> works; in the worst case you just use those weights to make
>         >> predictions on new data. In a typical “model persistence”
>         >> case you don’t “update” your model anyway, so “efficiency”
>         >> would not be that big of a deal in a typical “worst case
>         >> use case”.
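A minimal sketch of that "worst case": predicting with nothing but the saved weights of a binary logistic regression, using only NumPy (the function name and shapes here are illustrative):

```python
import numpy as np

def predict_from_weights(X, coef, intercept):
    """Binary logistic-regression prediction from saved weights alone.

    X: (n_samples, n_features); coef: (n_features,); intercept: scalar.
    Returns (predicted labels, predicted probabilities).
    """
    scores = X @ coef + intercept
    proba = 1.0 / (1.0 + np.exp(-scores))  # sigmoid of decision scores
    return (proba >= 0.5).astype(int), proba
```

No scikit-learn version, or scikit-learn at all, is needed to reuse the published weights this way.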

>         > Agreed—this is exactly the type of use case I want to support.
>         > Pickling won't work here, but using HDF5 like MNE does would
>         > probably be close to ideal (thanks to Chris Holdgraf for the
>         > heads-up):

>         > https://github.com/mne-tools/mne-python/blob/master/mne/_hdf5.py
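A sketch of what storing a linear model's weights in HDF5 could look like, assuming ``h5py`` is available. The file layout and function names are made up for illustration and are not MNE's actual format:

```python
import h5py
import numpy as np

def save_linear_model(path, coef, intercept):
    # HDF5 is self-describing, so the file stays readable
    # long after the training code has changed.
    with h5py.File(path, "w") as f:
        f.create_dataset("coef", data=np.asarray(coef))
        f.attrs["intercept"] = float(intercept)
        f.attrs["format_version"] = 1

def load_linear_model(path):
    with h5py.File(path, "r") as f:
        return f["coef"][:], float(f.attrs["intercept"])
```

Unlike pickling, this stores only data, so loading it never executes code and does not depend on scikit-learn's internal class layout.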

>         > Stéfan


-- 
    Gael Varoquaux
    Researcher, INRIA Parietal
    NeuroSpin/CEA Saclay , Bat 145, 91191 Gif-sur-Yvette France
    Phone:  ++ 33-1-69-08-79-68
    http://gael-varoquaux.info            http://twitter.com/GaelVaroquaux

------------------------------------------------------------------------------
_______________________________________________
Scikit-learn-general mailing list
Scikit-learn-general@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/scikit-learn-general
