Stochastic Gradient Descent is one of the most important optimisation algorithms in Machine Learning. So having it perform at least as well as Java matters for more widespread adoption.
On Sunday, April 27, 2014 2:03:28 PM UTC+8, Freddy Chua wrote:
>
> This code takes 60+ secs to execute on my machine. The Java equivalent
> takes only 0.2 secs!!! Please tell me how to optimise the following
> code.
>
> begin
>   N = 10000
>   K = 100
>   rate = 1e-2
>   ITERATIONS = 1
>
>   # generate y
>   y = rand(N)
>
>   # generate x
>   x = rand(K, N)
>
>   # generate w
>   w = zeros(Float64, K)
>
>   tic()
>   for i=1:ITERATIONS
>     for n=1:N
>       y_hat = 0.0
>       for k=1:K
>         y_hat += w[k] * x[k,n]
>       end
>
>       for k=1:K
>         w[k] += rate * (y[n] - y_hat) * x[k,n]
>       end
>     end
>   end
>   toc()
> end
>
> Sorry for repeated posting, I did so to properly indent the code..
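The usual first fix for code like this is not micro-optimisation but moving the loops out of global scope: Julia's compiler can only emit fast machine code when it can infer concrete types, and top-level globals defeat that. A minimal sketch of the same SGD loop wrapped in a function (using `@time` in place of the old `tic()`/`toc()`, which later Julia versions removed; the function name `sgd!` is my own choice):

```julia
# Same SGD-for-linear-regression loop as the quoted post, wrapped in a
# function so the compiler can infer concrete types for w, x, y and
# compile the inner loops to tight machine code.
function sgd!(w, x, y; rate=1e-2, iterations=1)
    K, N = size(x)
    for _ in 1:iterations
        for n in 1:N
            # prediction: y_hat = dot(w, x[:, n]), written out explicitly
            y_hat = 0.0
            for k in 1:K
                y_hat += w[k] * x[k, n]
            end
            # gradient step on each weight
            step = rate * (y[n] - y_hat)
            for k in 1:K
                w[k] += step * x[k, n]
            end
        end
    end
    return w
end

N, K = 10_000, 100
y = rand(N)
x = rand(K, N)          # K x N, so columns x[:, n] are contiguous
w = zeros(K)

@time sgd!(w, x, y)
```

Note the loop order is already cache-friendly: `x` is `K x N`, so the inner `k` loops walk down a contiguous column. Adding `@inbounds` in front of the inner loops can shave off the bounds checks once the code is known to be correct.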
