Hello community,
here is the log from the commit of package python-scikit-learn for
openSUSE:Factory checked in at 2019-07-29 17:28:32
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-scikit-learn (Old)
and /work/SRC/openSUSE:Factory/.python-scikit-learn.new.4126 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-scikit-learn"
Mon Jul 29 17:28:32 2019 rev:5 rq:718971 version:0.21.2
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-scikit-learn/python-scikit-learn.changes  2019-02-25 17:48:42.202825051 +0100
+++ /work/SRC/openSUSE:Factory/.python-scikit-learn.new.4126/python-scikit-learn.changes  2019-07-29 17:28:46.638249218 +0200
@@ -1,0 +2,546 @@
+Fri Jul 26 16:08:07 UTC 2019 - Todd R <[email protected]>
+
+- Update to Version 0.21.2
+ + sklearn.decomposition
+ * Fix: Fixed a bug in cross_decomposition.CCA improving numerical
+ stability when Y is close to zero.
+ + sklearn.metrics
+ * Fix: Fixed a bug in metrics.euclidean_distances where a part of the
+ distance matrix was left uninstantiated for sufficiently large float32
+ datasets (regression introduced in 0.21).
+ + sklearn.preprocessing
+ * Fix: Fixed a bug in preprocessing.OneHotEncoder where the new
+ drop parameter was not reflected in get_feature_names.
+ + sklearn.utils.sparsefuncs
+ * Fix: Fixed a bug where min_max_axis would fail on 32-bit systems
+ for certain large inputs. This affects preprocessing.MaxAbsScaler,
+ preprocessing.normalize and preprocessing.LabelBinarizer.
+- Update to Version 0.21.1
+ + sklearn.metrics
+ * Fix: Fixed a bug in metrics.pairwise_distances where it would raise
+ AttributeError for boolean metrics when X had a boolean dtype and
+ Y == None.
+ * Fix: Fixed two bugs in metrics.pairwise_distances when
+ n_jobs > 1. First, it used to return a distance matrix with the same dtype
+ as the input, even for integer dtypes. Second, the diagonal was not zero
+ for the euclidean metric when Y is X.
+ + sklearn.neighbors
+ * Fix: Fixed a bug in neighbors.KernelDensity which could not be
+ restored from a pickle if sample_weight had been used.
+- Update to Version 0.21.0
+ + Changed models
+ The following estimators and functions, when fit with the same data and
+ parameters, may produce different models from the previous version. This
+ often occurs due to changes in the modelling logic (bug fixes or
+ enhancements), or in random sampling procedures.
+ * discriminant_analysis.LinearDiscriminantAnalysis for multiclass
+ classification. |Fix|
+ * discriminant_analysis.LinearDiscriminantAnalysis with 'eigen'
+ solver. |Fix|
+ * linear_model.BayesianRidge |Fix|
+ * Decision trees and derived ensembles when both max_depth and
+ max_leaf_nodes are set. |Fix|
+ * linear_model.LogisticRegression and
+ linear_model.LogisticRegressionCV with 'saga' solver. |Fix|
+ * ensemble.GradientBoostingClassifier |Fix|
+ * sklearn.feature_extraction.text.HashingVectorizer,
+ sklearn.feature_extraction.text.TfidfVectorizer, and
+ sklearn.feature_extraction.text.CountVectorizer |Fix|
+ * neural_network.MLPClassifier |Fix|
+ * svm.SVC.decision_function and
+ multiclass.OneVsOneClassifier.decision_function. |Fix|
+ * linear_model.SGDClassifier and any derived classifiers. |Fix|
+ * Any model using the linear_model.sag.sag_solver function with a 0
+ seed, including linear_model.LogisticRegression,
+ linear_model.LogisticRegressionCV, linear_model.Ridge,
+ and linear_model.RidgeCV with 'sag' solver. |Fix|
+ * linear_model.RidgeCV when using generalized cross-validation
+ with sparse inputs. |Fix|
+ Details are listed in the changelog below.
+ (While we are trying to better inform users by providing this information,
+ we cannot assure that this list is complete.)
+ + Known Major Bugs
+ * The default max_iter for linear_model.LogisticRegression is too
+ small for many solvers given the default tol. In particular, we
+ accidentally changed the default max_iter for the liblinear solver from
+ 1000 to 100 iterations in a change released in version 0.16.
+ In a future release we hope to choose better default max_iter and tol
+ heuristically depending on the solver.
+ + Support for Python 3.4 and below has been officially dropped.
+ + sklearn.base
+ * API: The R2 score used when calling score on a regressor will use
+ multioutput='uniform_average' from version 0.23 to keep consistent with
+ metrics.r2_score. This will influence the score method of all
+ the multioutput regressors (except for
+ multioutput.MultiOutputRegressor).
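+ A minimal sketch of the future default, using the equivalent explicit
+ metric call (toy values, not taken from the upstream notes):
+ >>> from sklearn.metrics import r2_score
+ >>> y_true = [[0.5, 1.0], [1.0, 2.0], [2.0, 3.0]]
+ >>> y_pred = [[0.4, 1.1], [1.1, 1.9], [2.1, 2.9]]
+ >>> r2_score(y_true, y_pred, multioutput='uniform_average')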
+ + sklearn.calibration
+ * Enhancement: Added support to bin the data passed into
+ calibration.calibration_curve by quantiles instead of uniformly
+ between 0 and 1 (see the sketch below).
+ * Enhancement: Allow n-dimensional arrays as input for
+ calibration.CalibratedClassifierCV.
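+ A minimal sketch of the quantile binning mentioned above, assuming it is
+ selected through a strategy keyword (toy data for illustration):
+ >>> import numpy as np
+ >>> from sklearn.calibration import calibration_curve
+ >>> y_true = np.array([0, 0, 0, 1, 1, 1])
+ >>> y_prob = np.array([0.1, 0.35, 0.4, 0.6, 0.8, 0.9])
+ >>> prob_true, prob_pred = calibration_curve(
+ ...     y_true, y_prob, n_bins=3, strategy='quantile')  # strategy assumed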
+ + sklearn.cluster
+ * MajorFeature: A new clustering algorithm: cluster.OPTICS, an
+ algorithm related to cluster.DBSCAN that has hyperparameters which are
+ easier to set and that scales better.
+ * Fix: Fixed a bug where cluster.Birch could occasionally raise an
+ AttributeError.
+ * Fix: Fixed a bug in cluster.KMeans where empty clusters weren't
+ correctly relocated when using sample weights.
+ * API: The n_components_ attribute in cluster.AgglomerativeClustering
+ and cluster.FeatureAgglomeration has been renamed to
+ n_connected_components_.
+ * Enhancement: cluster.AgglomerativeClustering and
+ cluster.FeatureAgglomeration now accept a distance_threshold
+ parameter which can be used to find the clusters instead of n_clusters.
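+ A minimal sketch of distance_threshold, assuming n_clusters must be set
+ to None when the threshold is given (toy data for illustration):
+ >>> import numpy as np
+ >>> from sklearn.cluster import AgglomerativeClustering
+ >>> X = np.array([[0., 0.], [0., 1.], [10., 10.], [10., 11.]])
+ >>> model = AgglomerativeClustering(
+ ...     n_clusters=None, distance_threshold=5.0).fit(X)
+ >>> model.n_clusters_  # clusters found below the threshold (assumed attribute)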
+ + sklearn.compose
+ * API: compose.ColumnTransformer is no longer an experimental
+ feature.
+ + sklearn.datasets
+ * Fix: Added support for 64-bit group IDs and pointers in SVMLight files.
+ * Fix: datasets.load_sample_images returns images with a deterministic
+ order.
+ + sklearn.decomposition
+ * Enhancement: decomposition.KernelPCA now has deterministic output
+ (resolved sign ambiguity in eigenvalue decomposition of the kernel
+ matrix).
+ * Fix: Fixed a bug in decomposition.KernelPCA, fit().transform()
+ now produces the correct output (the same as fit_transform()) in case
+ of non-removed zero eigenvalues (remove_zero_eig=False).
+ fit_inverse_transform was also accelerated by using the same trick as
+ fit_transform to compute the transform of X.
+ * Fix: Fixed a bug in decomposition.NMF where init = 'nndsvd',
+ init = 'nndsvda', and init = 'nndsvdar' are allowed when
+ n_components < n_features instead of
+ n_components <= min(n_samples, n_features).
+ * API: The default value of the init argument in
+ decomposition.non_negative_factorization will change from
+ random to None in version 0.23 to make it consistent with
+ decomposition.NMF. A FutureWarning is raised when
+ the default value is used.
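+ A minimal sketch of avoiding the FutureWarning by passing init explicitly
+ (toy data for illustration):
+ >>> import numpy as np
+ >>> from sklearn.decomposition import non_negative_factorization
+ >>> X = np.abs(np.random.RandomState(0).randn(6, 4))
+ >>> W, H, n_iter = non_negative_factorization(
+ ...     X, n_components=2, init='nndsvda', random_state=0)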
+ + sklearn.discriminant_analysis
+ * Enhancement: discriminant_analysis.LinearDiscriminantAnalysis now
+ preserves float32 and float64 dtypes.
+ * Fix: A ChangedBehaviourWarning is now raised when
+ discriminant_analysis.LinearDiscriminantAnalysis is given
+ n_components > min(n_features, n_classes - 1) as a parameter, and
+ n_components is changed to min(n_features, n_classes - 1) in that case.
+ Previously the change was made silently (see the sketch below).
+ * Fix: Fixed a bug in discriminant_analysis.LinearDiscriminantAnalysis
+ where the predicted probabilities would be incorrectly computed in the
+ multiclass case.
+ * Fix: Fixed a bug in discriminant_analysis.LinearDiscriminantAnalysis
+ where the predicted probabilities would be incorrectly computed with the
+ eigen solver.
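+ A minimal sketch of the clamping described above (toy data; two classes
+ and two features, so at most one component is kept):
+ >>> import numpy as np
+ >>> from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
+ >>> X = np.array([[0., 0.], [1., 1.], [2., 0.], [3., 1.]])
+ >>> y = np.array([0, 0, 1, 1])
+ >>> lda = LinearDiscriminantAnalysis(n_components=5).fit(X, y)  # warns, keeps 1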
+ + sklearn.dummy
+ * Fix: Fixed a bug in dummy.DummyClassifier where the
+ predict_proba method was returning an int32 array instead of
+ float64 for the stratified strategy.
+ * Fix: Fixed a bug in dummy.DummyClassifier where it was throwing a
+ dimension mismatch error at prediction time if a column vector y with
+ shape=(n, 1) was given at fit time.
+ + sklearn.ensemble
+ * MajorFeature: Add two new implementations of
+ gradient boosting trees: ensemble.HistGradientBoostingClassifier
+ and ensemble.HistGradientBoostingRegressor. The implementation of
+ these estimators is inspired by
+ LightGBM and can be orders of
+ magnitude faster than ensemble.GradientBoostingRegressor and
+ ensemble.GradientBoostingClassifier when the number of samples is
+ larger than tens of thousands. The API of these new estimators
+ is slightly different, and some of the features from
+ ensemble.GradientBoostingClassifier and
+ ensemble.GradientBoostingRegressor are not yet supported.
+ These new estimators are experimental, which means that their results or
+ their API might change without any deprecation cycle. To use them, you
+ need to explicitly import enable_hist_gradient_boosting:
+ >>> # explicitly require this experimental feature
+ >>> from sklearn.experimental import enable_hist_gradient_boosting  # noqa
+ >>> # now you can import normally from sklearn.ensemble
+ >>> from sklearn.ensemble import HistGradientBoostingClassifier
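+ As a hedged usage sketch (toy data, default parameters, not from the
+ upstream notes), a fit/score round trip with the new estimator:
+ >>> from sklearn.datasets import make_classification
+ >>> X, y = make_classification(random_state=0)
+ >>> clf = HistGradientBoostingClassifier().fit(X, y)
+ >>> clf.score(X, y)  # mean accuracy on the training data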
+ * Feature: Add ensemble.VotingRegressor
+ which provides an equivalent of ensemble.VotingClassifier
+ for regression problems.
+ * Efficiency: Make ensemble.IsolationForest prefer threads over
+ processes when running with n_jobs > 1 as the underlying decision tree
+ fit calls do release the GIL. This change reduces memory usage and
+ communication overhead.
+ * Efficiency: Make ensemble.IsolationForest more memory efficient
+ by avoiding keeping each tree prediction in memory.
+ * Efficiency: ensemble.IsolationForest now uses chunks of data at
+ the prediction step, thus capping the memory usage.
+ * Efficiency: sklearn.ensemble.GradientBoostingClassifier and
+ sklearn.ensemble.GradientBoostingRegressor now keep the
+ input y as float64 to avoid it being copied internally by trees.
+ * Enhancement: Minimized the validation of X in
+ ensemble.AdaBoostClassifier and ensemble.AdaBoostRegressor.
+ * Enhancement: ensemble.IsolationForest now exposes a warm_start
+ parameter, allowing iterative addition of trees to an isolation
+ forest.
+ * Fix: The values of feature_importances_ in all random forest based
+ models (i.e.
+ ensemble.RandomForestClassifier,
+ ensemble.RandomForestRegressor,
+ ensemble.ExtraTreesClassifier,
+ ensemble.ExtraTreesRegressor,
+ ensemble.RandomTreesEmbedding,
+ ensemble.GradientBoostingClassifier, and
+ ensemble.GradientBoostingRegressor) now:
+ > sum up to 1,
+ > are computed ignoring all single-node trees, and
+ > are an array of all zeros in case all trees have only a single node
+ (i.e. a root node).
+ * Fix: Fixed a bug in ensemble.GradientBoostingClassifier and
+ ensemble.GradientBoostingRegressor, which didn't support
+ scikit-learn estimators as the initial estimator. Also added support for an
+ initial estimator which does not support sample weights.
+ * Fix: Fixed the output of the average path length computed in
+ ensemble.IsolationForest when the input is either 0, 1 or 2.
++++ 349 more lines (skipped)
++++ between /work/SRC/openSUSE:Factory/python-scikit-learn/python-scikit-learn.changes
++++ and /work/SRC/openSUSE:Factory/.python-scikit-learn.new.4126/python-scikit-learn.changes
Old:
----
scikit-learn-0.20.2.tar.gz
New:
----
scikit-learn-0.21.2.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-scikit-learn.spec ++++++
--- /var/tmp/diff_new_pack.G7H1ut/_old 2019-07-29 17:28:48.098248677 +0200
+++ /var/tmp/diff_new_pack.G7H1ut/_new 2019-07-29 17:28:48.102248676 +0200
@@ -17,46 +17,37 @@
%{?!python_module:%define python_module() python-%{**} python3-%{**}}
-%define oldpython python
-# test suite just doesn't work and upstream doesn't look like fixing it
-# anytime soon, gh#scikit-learn/scikit-learn#12369
-# %%ifarch %%{ix86} x86_64
-# %%bcond_without test
-# %%else
-%bcond_with test
-# %%endif
+%define skip_python2 1
Name: python-scikit-learn
-Version: 0.20.2
+Version: 0.21.2
Release: 0
Summary: Python modules for machine learning and data mining
License: BSD-3-Clause
Group: Development/Libraries/Python
URL: http://scikit-learn.org/
Source0:        https://files.pythonhosted.org/packages/source/s/scikit-learn/scikit-learn-%{version}.tar.gz
+BuildRequires: %{python_module Cython}
BuildRequires: %{python_module devel}
-BuildRequires: %{python_module matplotlib}
BuildRequires: %{python_module numpy-devel >= 1.8.2}
-BuildRequires: %{python_module pytest}
BuildRequires: %{python_module scipy >= 0.13.3}
BuildRequires: %{python_module setuptools}
-BuildRequires: %{python_module xml}
BuildRequires: fdupes
BuildRequires: gcc-c++
BuildRequires: gcc-fortran
BuildRequires: openblas-devel
BuildRequires: python-rpm-macros
+# SECTION test requirements
+BuildRequires: %{python_module joblib}
+BuildRequires: %{python_module matplotlib}
+BuildRequires: %{python_module nose}
+BuildRequires: %{python_module pytest}
+BuildRequires: %{python_module xml}
+# /SECTION
+Requires: python-joblib
Requires: python-matplotlib
Requires: python-numpy >= 1.8.2
Requires: python-scipy >= 0.13.3
Requires: python-xml
-%if %{with test}
-BuildRequires: %{python_module Cython}
-BuildRequires: %{python_module nose}
-%endif
-%ifpython2
-Provides: %{oldpython}-scikits-learn = %{version}
-Obsoletes: %{oldpython}-scikits-learn < %{version}
-%endif
%python_subpackages
%description
@@ -65,6 +56,7 @@
%prep
%setup -q -n scikit-learn-%{version}
+rm -rf sklearn/.pytest_cache
%build
%python_build
@@ -73,16 +65,20 @@
%python_install
%python_expand %fdupes %{buildroot}%{$python_sitearch}
-%if %{with test}
+# Precision-related errors on non-x86 platforms
+%ifarch %{ix86} x86_64
%check
export SKLEARN_SKIP_NETWORK_TESTS=1
NO_TESTS="test_feature_importance_regression or test_minibatch_with_many_reassignments"
NO_TESTS="$NO_TESTS or test_sparse_coder_parallel_mmap or test_explained_variances"
export NO_TESTS
+mv sklearn sklearn_temp
+rm -rf build _build.*
%{python_expand export PYTHONPATH=%{buildroot}%{$python_sitearch}
-# rm -v ensemble/tests/test_gradient_boosting.py tests/test_init.py
-py.test-%{$python_bin_suffix} -v -k "not ($NO_TESTS)" sklearn
+rm -rf build _build.*
+py.test-%{$python_bin_suffix} -p no:cacheprovider -v -k "not ($NO_TESTS)" %{buildroot}%{$python_sitearch}/sklearn
}
+mv sklearn_temp sklearn
%endif
%files %{python_files}
++++++ scikit-learn-0.20.2.tar.gz -> scikit-learn-0.21.2.tar.gz ++++++
/work/SRC/openSUSE:Factory/python-scikit-learn/scikit-learn-0.20.2.tar.gz
/work/SRC/openSUSE:Factory/.python-scikit-learn.new.4126/scikit-learn-0.21.2.tar.gz
differ: char 5, line 1