Matt Harrison <[email protected]> added the comment:
The ML world has converged on the names X and y (with that capitalization).
Moreover, most Python ML libraries follow the scikit-learn interface [0].
Training a model looks like this:
from sklearn.linear_model import LinearRegression

model = LinearRegression()
model.fit(X, y)
After that, the model instance has attributes that end in "_" that were learned
from fitting. For linear regression [1] you get:
model.coef_ # slope
model.intercept_ # intercept
To make predictions you call .predict:
y_hat = model.predict(X)
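Putting the pieces together, a minimal self-contained sketch looks roughly like
this (the synthetic data and variable names here are only placeholders for
illustration):

import numpy as np
from sklearn.linear_model import LinearRegression

# toy data: y is roughly 3*x + 1 plus a little noise
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))
y = 3 * X.ravel() + 1 + rng.normal(scale=0.5, size=100)

model = LinearRegression()
model.fit(X, y)

print(model.coef_)       # learned slope, close to 3
print(model.intercept_)  # learned intercept, close to 1

y_hat = model.predict(X) # predictions for the training data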
One bonus of leveraging the .fit/.predict interface (which other libraries such
as XGBoost have also adopted) is that if your data is in the correct layout,
you can trivially try different models.
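For example, swapping in a different estimator is essentially a one-line change
(a minimal sketch; Ridge is just one alternative that shares the same interface):

from sklearn.linear_model import Ridge

model = Ridge(alpha=1.0)  # same .fit/.predict contract as LinearRegression
model.fit(X, y)
y_hat = model.predict(X)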
[0] https://scikit-learn.org/stable/tutorial/basic/tutorial.html#learning-and-predicting
[1] https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LinearRegression.html#sklearn.linear_model.LinearRegression
----------
nosy: +matthewharrison
_______________________________________
Python tracker <[email protected]>
<https://bugs.python.org/issue44151>
_______________________________________