On Monday, 19 October 2015 at 12:05 -0700, Phil Tomson wrote:
> Several comments here about the need to de-vectorize code and use
> for-loops instead. However, vectorized code is a lot more compact and
> generally easier to read than lots of for-loops. I seem to recall
> that there was discussion in the past about speeding up vectorized
> code in Julia so that it could be on par with devectorized (loop-based)
> code - is this still something being worked on for future versions?
> 
> Otherwise, if we keep telling people that they need to convert their
> code to use for-loops, I think Julia isn't going to seem very
> compelling for people looking for alternatives to Matlab, R, etc.
There's a long discussion, but still no perfect solution:
https://github.com/JuliaLang/julia/issues/8450

But I'm confident something will eventually be done about this. :-)
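
In the meantime, to make the "write loops" advice concrete, here is
roughly what a devectorized version of the Jakes_Flat function from the
original post could look like. This is only a sketch I haven't
benchmarked (the Jakes_Flat_loop name and the loop details are mine),
and it returns h as a plain vector rather than a 1-by-Ns row, but it
computes the same sum without building the intermediate coswt matrix:

function Jakes_Flat_loop( fd, Ts, Ns, t0 = 0.0, E0 = 1.0, phi_N = 0.0 )
  N0 = 8                    # As suggested by Jakes
  N  = 4*N0 + 2             # An accurate approximation
  wd = 2*pi*fd              # Maximum Doppler frequency [rad]
  a  = E0/sqrt(2*N0 + 1)    # Overall amplitude
  h  = zeros(Complex{Float64}, Ns)
  for k in 1:Ns
    t = t0 + (k-1)*Ts
    # Maximum-Doppler sinusoid plus the N0 oscillators, accumulated in place
    acc = sqrt(2)*exp(im*phi_N)*cos(wd*t)
    for n in 1:N0
      acc += 2*exp(im*pi*n/(N0+1))*cos(wd*cos(2*pi*n/N)*t)
    end
    h[k] = a*acc
  end
  tf = t0 + Ns*Ts           # Final time
  return h, tf
end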


Regards

> On Sunday, October 18, 2015 at 6:41:54 AM UTC-7, Daniel Carrera
> wrote:
> > Hello,
> > 
> > Other people have already given advice on how to speed up the code.
> > I just want to comment that Julia really is faster than Matlab, but
> > the way that you make code faster in Julia is almost the opposite
> > of how you do it in Matlab. Specifically, in Matlab the advice is
> > that if you want the code to be fast, you need to eliminate every
> > loop you can and write vectorized code instead. This is because
> > Matlab loops are slow. But Julia loops are fast, and vectorized
> > code creates a lot of overhead in the form of temporary variables,
> > garbage collection, and extra loops. So in Julia you optimize code
> > by putting everything into loops. The upshot is that if you take a
> > Matlab-optimized program and just do a direct line-by-line
> > conversion to Julia, the Julia version can easily be slower. But by
> > the same token, if you took a Julia-optimized program and converted
> > it line-by-line to Matlab, the Matlab version would be ridiculously
> > slow.
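> > 
> > To make that concrete with a toy example of my own (not from your
> > code): summing squares. The vectorized form allocates a temporary
> > array for x.^2 before summing it; the loop form makes a single pass
> > with no temporaries.
> > 
> > # Vectorized: builds a temporary array holding x.^2, then sums it
> > sumsq_vec(x) = sum(x.^2)
> > 
> > # Loop: one pass over the data, no temporary array
> > # (assumes x holds Float64 values; it's only a sketch)
> > function sumsq_loop(x)
> >     s = 0.0
> >     for v in x
> >         s += v*v
> >     end
> >     return s
> > end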
> > 
> > Oh, and in Julia you also care about types. If the compiler can
> > correctly infer the types of your variables, it will generate more
> > efficient code.
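> > 
> > A quick illustration (the function names are just for the example):
> > 
> > # Type-unstable: s starts as an Int and becomes a Float64 inside
> > # the loop, so the compiler can't pin its type down
> > function f_unstable(n)
> >     s = 0
> >     for i in 1:n
> >         s += i/2
> >     end
> >     return s
> > end
> > 
> > # Type-stable: s is a Float64 from start to finish
> > function f_stable(n)
> >     s = 0.0
> >     for i in 1:n
> >         s += i/2
> >     end
> >     return s
> > end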
> > 
> > Cheers,
> > Daniel.
> > 
> > 
> > On Sunday, 18 October 2015 13:17:50 UTC+2, Vishnu Raj wrote:
> > > Although the Julia homepage shows Julia outperforming Matlab, my
> > > experience is quite the opposite.
> > > I was trying to simulate channel evolution using the Jakes model
> > > for a wireless communication system.
> > > 
> > > Matlab code is:
> > > function [ h, tf ] = Jakes_Flat( fd, Ts, Ns, t0, E0, phi_N )
> > > %JAKES_FLAT 
> > > %   Inputs:
> > > %       fd, Ts, Ns  : Doppler frequency, sampling time, number of samples
> > > %       t0, E0      : initial time, channel power
> > > %       phi_N       : initial phase of the maximum Doppler frequency sinusoid
> > > %
> > > %   Outputs:
> > > %       h, tf       : complex fading vector, current time
> > > 
> > >     if nargin < 6,  phi_N = 0;  end
> > >     if nargin < 5,  E0 = 1;     end
> > >     if nargin < 4,  t0 = 0;     end
> > >     
> > >     N0 = 8;         % As suggested by Jakes
> > >     N  = 4*N0 + 2;  % an accurate approximation
> > >     wd = 2*pi*fd;   % Maximum Doppler frequency[rad]
> > >     t  = t0 + [0:Ns-1]*Ts;  % Time vector
> > >     tf = t(end) + Ts;       % Final time
> > >     coswt = [ sqrt(2)*cos(wd*t); 2*cos(wd*cos(2*pi/N*[1:N0]')*t) ];
> > >     h  = E0/sqrt(2*N0+1)*exp(j*[phi_N pi/(N0+1)*[1:N0]])*coswt;
> > > end
> > > 
> > > My call results in :
> > > >> tic; Jakes_Flat( 926, 1E-6, 50000, 0, 1, 0 ); toc
> > > Elapsed time is 0.008357 seconds.
> > > 
> > > 
> > > My corresponding Julia code is
> > > function Jakes_Flat( fd, Ts, Ns, t0 = 0, E0 = 1, phi_N = 0 )
> > > # Inputs:
> > > #   fd, Ts, Ns  : Doppler frequency, sampling time, number of samples
> > > #   t0, E0      : initial time, channel power
> > > #   phi_N       : initial phase of the maximum Doppler frequency sinusoid
> > > # Outputs:
> > > #   h, tf       : complex fading vector, current time
> > >   N0  = 8;                  # As suggested by Jakes
> > >   N   = 4*N0+2;             # An accurate approximation
> > >   wd  = 2*pi*fd;            # Maximum Doppler frequency
> > >   t   = t0 + [0:Ns-1]*Ts;
> > >   tf  = t[end] + Ts;
> > >   coswt = [ sqrt(2)*cos(wd*t'); 2*cos(wd*cos(2*pi/N*[1:N0])*t') ]
> > >   h = E0/sqrt(2*N0+1)*exp(im*[ phi_N pi/(N0+1)*[1:N0]']) * coswt
> > >   return h, tf;
> > > end
> > > # Saved this as "jakes_model.jl"
> > > 
> > > 
> > > My first call results in 
> > > julia> include( "jakes_model.jl" )
> > > Jakes_Flat (generic function with 4 methods)
> > > 
> > > julia> @time Jakes_Flat( 926, 1e-6, 50000, 0, 1, 0 )
> > > elapsed time: 0.65922234 seconds (61018916 bytes allocated)
> > > 
> > > julia> @time Jakes_Flat( 926, 1e-6, 50000, 0, 1, 0 )
> > > elapsed time: 0.042468906 seconds (17204712 bytes allocated,
> > > 63.06% gc time)
> > > 
> > > For the first execution, Julia takes a huge amount of time. On the
> > > second call, even though Julia takes considerably less time
> > > (0.042468906 sec) than the first (0.65922234 sec), it's still much
> > > slower than Matlab (0.008357 sec).
> > > I'm using Matlab R2014b and Julia v0.3.10 on Mac OS X 10.10.
> > > 
> > > - vish
> > > 
