On Thu, Aug 9, 2012 at 9:11 AM, Zach Bastick <[email protected]> wrote:

>
> So, how do you do multivariate regression with higher degree polynomials?
>

In the multivariate case, the principle is the same as with np.vander: you
concatenate the higher-degree features. The difference is that, since your
data is multivariate, you can also add the feature cross-products.
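As a minimal sketch (a hypothetical illustration, not code from the PR below): for two features x1 and x2 at degree 2, the expanded design matrix holds a bias column, the original features, their squares, and the cross-product term:

```python
import numpy as np

# Toy data: two samples, two features.
X = np.array([[1.0, 2.0],
              [3.0, 4.0]])
x1, x2 = X[:, 0], X[:, 1]

# Concatenate a bias column, the original features, their squares,
# and the cross-product -- the multivariate analogue of np.vander.
X_poly = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x1 * x2, x2**2])
```

A linear regressor fit on X_poly then learns a degree-2 polynomial in the original features.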

This PR may serve as inspiration:
https://github.com/scikit-learn/scikit-learn/pull/476

But as I said before, the standard way to do what you want is to use a
regressor with a polynomial kernel. You can do that with SVR or with kernel
ridge regression (not supported in scikit-learn yet). SVR has one extra
hyper-parameter (epsilon), but unlike kernel ridge regression, its solution
is sparse.
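The SVR route might look like this (a sketch with made-up toy data; the degree, C and epsilon values are arbitrary and would normally be tuned by cross-validation):

```python
import numpy as np
from sklearn.svm import SVR

# Toy data: y is a degree-2 polynomial of two features plus a little noise.
rng = np.random.RandomState(0)
X = rng.uniform(-1, 1, size=(100, 2))
y = X[:, 0] ** 2 + X[:, 0] * X[:, 1] + 0.01 * rng.randn(100)

# SVR with a polynomial kernel implicitly works in the space of all
# monomials up to the given degree, so no explicit feature expansion
# is needed; coef0=1 includes the lower-degree terms.
model = SVR(kernel="poly", degree=2, coef0=1.0, C=10.0, epsilon=0.01)
model.fit(X, y)
pred = model.predict(X)
```

The sparsity shows up in model.support_vectors_: only a subset of the training points contributes to the prediction.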

HTH,
Mathieu
_______________________________________________
Scikit-learn-general mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/scikit-learn-general