Re: [scikit-learn] Model trained in 0.17 gives entirely different results in 0.15

2016-08-03 Thread Andreas Mueller
Hi Shi. In general, there is no guarantee that models built with one version will work in a different version. In particular, loading a model in an older version when it was built with a newer version seems particularly tricky to achieve. We might want to warn the user when doing this. The docs are not very …
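A warning of the sort described above is something a user can already layer on top of joblib persistence themselves. The sketch below is only an illustration under that assumption, not scikit-learn's own mechanism; the helper names and the side-car metadata file are made up for the example.

import json
import warnings

import joblib  # shipped as sklearn.externals.joblib in the 0.17 era; the standalone package works the same way
import sklearn

def dump_with_version(estimator, path):
    """Persist an estimator and record the scikit-learn version it was fit with."""
    joblib.dump(estimator, path)
    with open(path + ".meta.json", "w") as f:
        json.dump({"sklearn_version": sklearn.__version__}, f)

def load_with_version_check(path):
    """Load an estimator and warn if the running scikit-learn version differs."""
    with open(path + ".meta.json") as f:
        trained_with = json.load(f)["sklearn_version"]
    if trained_with != sklearn.__version__:
        warnings.warn("Model was trained with scikit-learn %s but is being loaded with %s; results may differ."
                      % (trained_with, sklearn.__version__))
    return joblib.load(path)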

Re: [scikit-learn] Model trained in 0.17 gives entirely different results in 0.15

2016-08-03 Thread Shi Yu
Hi Andy, Thanks for the feedback. Indeed, we think it would be a good idea to enforce version persistence, something like Java's serialVersionUID. We deployed models trained on our laptops onto our clusters, ran into this issue, and learned a serious lesson from it. Best, Shi
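A rough Python analogue of the serialVersionUID idea is to stamp the persisted payload with the training version and refuse to load on a mismatch. The sketch below is only an illustration with made-up helper names, the stricter counterpart of merely warning the user.

import pickle

import sklearn

def dump_stamped(estimator, path):
    # Bundle the training-time version into the payload itself, much as Java's
    # serialVersionUID travels with the serialized object.
    with open(path, "wb") as f:
        pickle.dump({"sklearn_version": sklearn.__version__, "model": estimator}, f)

def load_stamped(path):
    with open(path, "rb") as f:
        payload = pickle.load(f)  # may still fail before the check if internal class layouts changed
    if payload["sklearn_version"] != sklearn.__version__:
        raise RuntimeError("Model was trained with scikit-learn %s, but %s is running; refusing to load."
                           % (payload["sklearn_version"], sklearn.__version__))
    return payload["model"]

Note that unpickling itself can still fail before the check runs if the estimator's internals changed between versions, which is part of why a version stamp alone cannot guarantee compatibility.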

Re: [scikit-learn] Model trained in 0.17 gives entirely different results in 0.15

2016-08-03 Thread Matthieu Brucher
More often than not, forward compatibility is not possible. I don't think there are lots of companies doing so, as even backward compatibility is tricky to achieve. Even with serializing the version, if the previous version doesn't know about the additional data structures that have an impact on the …

Re: [scikit-learn] Model trained in 0.17 gives entirely different results in 0.15

2016-08-03 Thread Dale T Smith
Use conda or a virtualenv to handle compatibility issues. Then you can control when upgrades occur. I’ve used conda with good effect to handle version issues such as yours. Otherwise, use PMML. The Data Mining Group maintains a list of PMML producers and consumers. I think there is a Python wrapper …
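If the Python wrapper meant here is the third-party sklearn2pmml package (an assumption on my part), exporting a fitted pipeline to PMML might look roughly like the sketch below; it needs a Java runtime installed, and any PMML consumer can then score the file independently of the scikit-learn version used for training.

from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

# Third-party exporter: pip install sklearn2pmml (requires a Java runtime).
from sklearn2pmml import sklearn2pmml
from sklearn2pmml.pipeline import PMMLPipeline

iris = load_iris()

# Fit inside a PMMLPipeline so the exporter can capture the whole workflow.
pipeline = PMMLPipeline([("classifier", DecisionTreeClassifier())])
pipeline.fit(iris.data, iris.target)

# Write a PMML document that a PMML consumer can evaluate without scikit-learn.
sklearn2pmml(pipeline, "DecisionTreeIris.pmml")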

Re: [scikit-learn] Model trained in 0.17 gives entirely different results in 0.15

2016-08-03 Thread Andreas Mueller
On 08/03/2016 03:16 PM, Matthieu Brucher wrote: More often than not, forward compatibility is not possible. I don't think there are lots of companies doing so, as even backward compatibility is tricky to achieve. Even with serializing the version, if the previous version doesn't know about the …

Re: [scikit-learn] Model trained in 0.17 gives entirely different results in 0.15

2016-08-03 Thread Matthieu Brucher
True! 2016-08-03 20:38 GMT+01:00 Andreas Mueller: > On 08/03/2016 03:16 PM, Matthieu Brucher wrote: >> More often than not, forward compatibility is not possible. I don't think >> there are lots of companies doing so, as even backward compatibility is >> tricky to achieve. >> Even with serializing …

Re: [scikit-learn] Model trained in 0.17 gives entirely different results in 0.15

2016-08-03 Thread Luke Chang
1pm-ish -luke > On Aug 3, 2016, at 4:13 PM, Matthieu Brucher wrote: > > True! > > 2016-08-03 20:38 GMT+01:00 Andreas Mueller: >> On 08/03/2016 03:16 PM, Matthieu Brucher wrote: >>> More often than not, forward compatibility is not possible. I don't think >>> there are lots of companies …

[scikit-learn] StackOverflow Documentation

2016-08-03 Thread Joel Nothman
StackOverflow has introduced its Documentation space, where scikit-learn is a covered subject: http://stackoverflow.com/documentation/scikit-learn. The project is a little interesting, and otherwise somewhat exasperating/tiring, given the overlap with our own documentation efforts, which we would like …

Re: [scikit-learn] StackOverflow Documentation

2016-08-03 Thread Sebastian Raschka
Hm, that’s an “interesting” approach by SO. I guess their idea is to build a collection of code-and-example-based snippets for less well-documented libraries, especially libraries that want to keep their documentation lean. > But I assume that copying without attribution is actually plagiarism …

Re: [scikit-learn] StackOverflow Documentation

2016-08-03 Thread Gael Varoquaux
> In this scikit-learn case, it seems more like these users are merely > “farming” for SO points and rep by reposting scikit-learn documentation. In > my opinion, the polite way to go about it is to just comment as a > scikit-learn dev saying that these reposts are okay under the BSD license …