On Thu, Aug 9, 2012 at 3:28 PM, Vlad Niculae wrote:
> Andy, Mathieu:
> The docs are lacking guidelines and examples on how to tune SVR
> parameters. IIUC, C, gamma, etc. should be used just as in SVC. The tricky
> part is epsilon: how should it be set? What are sensible defaults and a
> sensible grid to search over?
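A sketch of the kind of grid Vlad is asking about (the parameter ranges
below are illustrative assumptions, not recommended defaults):

from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

# Illustrative ranges only; sensible values depend on the data scale.
param_grid = {
    "C": [0.1, 1, 10, 100],
    "gamma": [1e-3, 1e-2, 1e-1],
    "epsilon": [0.01, 0.1, 0.5, 1.0],
}
search = GridSearchCV(SVR(kernel="rbf"), param_grid, cv=5)
# search.fit(X, y)  # X, y: your training data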
On Thu, Aug 9, 2012 at 1:30 PM, Andreas Müller wrote:
> Sorry for being unspecific.
> Using the kernel should be more efficient with higher degree polynomials
> and when having many features. The dimensionality of the explicit features
> grows very fast with the degree, while the cost of the kernel computation
> stays the same.
>
> Also SVMs work quite well in many settings.
>
> Cheers,
> Andy
>
> ----- Original Message -----
> From: "Paolo Losi"
> To: [email protected]
> Sent: Thursday, 9 August 2012, 11:53:40
> Subject: Re: [Scikit-learn-general] multivariate regression with higher
> degree polynomials
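To make Andy's point concrete, a small sketch (not from the thread) of how
the explicit feature count C(n + d, d) grows with the degree d, while one
polynomial kernel evaluation stays O(n) per pair of samples:

from scipy.special import comb

n_features = 100
for degree in (2, 3, 4, 5):
    # Number of monomials of total degree <= d in n variables: C(n + d, d).
    n_expanded = comb(n_features + degree, degree, exact=True)
    print("degree %d: %d explicit features" % (degree, n_expanded))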
Hi Andy,
On Thu, Aug 9, 2012 at 11:53 AM, Andreas Müller wrote:
> Also you might need to normalize the data and set the value of C.
> Still this should work better than doing the explicit expansion.
>
What do you mean exactly by "work better"?
Paolo
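A sketch of the normalization Andy mentions, standardizing the features
before the SVR (the C value here is a placeholder, not a recommendation):

from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Scale each feature to zero mean / unit variance, then fit the SVR.
model = make_pipeline(StandardScaler(), SVR(kernel="poly", degree=2, C=1.0))
# model.fit(X, y)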
On Thu, Aug 9, 2012 at 4:02 PM, Zach Bastick wrote:
> I'm going to manually stop it now by closing the python window. Am I
> doing something wrong?
>
It probably means that epsilon is not well tuned. You can try
SVR(kernel="linear") to see how it fares compared to least squares.
Mathieu
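A minimal sketch of the comparison Mathieu suggests, on synthetic data
(the data and names here are assumptions for illustration):

import numpy as np
from sklearn.svm import SVR
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.RandomState(0)
X = rng.randn(200, 5)
y = X.dot(rng.randn(5)) + 0.1 * rng.randn(200)

for model in (SVR(kernel="linear"), LinearRegression()):
    model.fit(X, y)
    # Training error only; use a held-out set for a real comparison.
    print(type(model).__name__, mean_squared_error(y, model.predict(X)))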
--
I ran:
>>> model = SVR(kernel="poly", degree=2)
but the % error of the prediction is worse than using simple Ordinary
Least Squares using:
>>> linear_model.LinearRegression()
It's also much slower. I changed the degree to 4 to see if the results
of the prediction got any better, but it's taking a very long time.
On Thu, Aug 9, 2012 at 9:11 AM, Zach Bastick wrote:
>
> So, how do you do multivariate regression with higher degree polynomials?
>
In the multivariate case, the principle is the same as np.vander. You just
need to concatenate the higher degree features. Only this time, since your
data is multivariate, you need to build those columns for each feature.
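One reading of Mathieu's suggestion, as a sketch (poly_expand is a
hypothetical helper, not a scikit-learn function, and it omits cross terms
between features):

import numpy as np
from sklearn.linear_model import LinearRegression

def poly_expand(X, degree):
    # Concatenate X, X**2, ..., X**degree column-wise.
    return np.hstack([X ** d for d in range(1, degree + 1)])

X = np.array([[1., 2.], [2., 3.], [3., 5.], [4., 4.]])
y = np.array([1., 2., 3., 4.])

model = LinearRegression().fit(poly_expand(X, 2), y)
print(model.predict(poly_expand(X, 2)))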
That works when there is only 1 feature / independent variable / x-value
for each case, but not when there are many (i.e. for multivariate
regression).
Since there are many independent variables, my variables look like this:
x = [[1,2,3,4,5], [2,2,4,4,5], [2,2,4,4,1]]
y = [1,2,3,4,5]
Another solution is to use SVR(kernel="poly", degree=2).
Mathieu
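For completeness, a minimal usage sketch of that suggestion (C and epsilon
are left at their defaults here, which, as discussed elsewhere in the
thread, usually need tuning):

import numpy as np
from sklearn.svm import SVR

X = np.array([[1., 2., 3.], [2., 2., 4.], [2., 4., 1.]])
y = np.array([1., 2., 3.])

model = SVR(kernel="poly", degree=2)
model.fit(X, y)
print(model.predict(X))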
--
The following example explains how to do it using the numpy.vander function:
http://scikit-learn.org/stable/auto_examples/linear_model/plot_polynomial_interpolation.html
Mathieu
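The univariate pattern the linked example builds on, sketched here with
made-up data: np.vander constructs the polynomial design matrix for a
single feature.

import numpy as np
from sklearn.linear_model import LinearRegression

x = np.array([1., 2., 3., 4., 5.])
y = np.array([1., 4.2, 8.9, 16.1, 24.8])

# Columns of np.vander(x, d + 1) are x**d, ..., x**1, x**0 (decreasing
# powers), so the constant column replaces the intercept.
X = np.vander(x, 3)
model = LinearRegression(fit_intercept=False).fit(X, y)
print(model.coef_)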
On Wed, Aug 8, 2012 at 11:27 AM, Zach Bastick wrote:
How can you increase the degree of the polynomial for multivariate
LinearRegression?
numpy.polyfit has a "deg" parameter, allowing you to choose the degree
of the fitting polynomial, but it doesn't work with multivariate data:
http://docs.scipy.org/doc/numpy/reference/generated/numpy.polyfit.html