Thanks, Stefan, it's good to know this already exists! Seeing it mentioned
side-by-side with a Bloomberg API suggests that this feature request
probably came from the financial industry.
-Zhong
On Tuesday, August 2, 2016 at 11:50:51 PM UTC-5, Stefan Karpinski wrote:
Zhong, people chop wood and they play the piano, too. Fill in Julia or
Excel where you like.
Sisyphuss, VBA shows a 4% speed gain from making nsamples and y() public
variables. Try it out!
Julia Computing offers a product (JuliaInXL) which does exactly this.
On Tue, Aug 2, 2016 at 11:07 PM, Zhong Pan wrote:
Eric, hustf,
I think making Julia attractive to Excel/VBA users will be quite valuable.
Excel still rules in the business world for simple to moderately complex data
analysis. Strangely, even engineers love it - there is still a large group
of hardware/mechanical engineers who are not productive
Julia is not as "mature" as VBA, which prevents "analysts" at large firms
from adopting it.
In addition, they will be happier to continue using global variables.
On Monday, August 1, 2016 at 8:14:37 AM UTC+2, Eric Forgy wrote:
>
> I mentioned to Prof. Edelman (only half jokingly) at an event in
>
It is nice to have a little check on speed from time to time. I still use
VBA for easy cooperation with less programming-savvy colleagues.
Julia: 1.17 s.
VBA (Excel, Alt+F11): 12 s.
This is a bit unfair to neolithic man Joel Spolsky since no optimization
was performed:
Sub benchmark()
Since I contributed the Numba JIT timing earlier in the thread, it seems
only fair to note that the modified Julia version with the properly
preallocated data is now 17% faster than the Numba version on my computer.
Overall, this seems to support my thesis that good Julia code is on par with or
better than Numba-compiled Python.
Given the apparent interest in the topic and the decisions that people seem to
be making, it seems worth pointing out that folks are still using apples-to-
oranges comparisons on this benchmark.
There are at least two important differences:
- in the other languages, `linspace` allocates a full vector up front,
while Julia 0.5's `linspace` returns a lazy range that does not allocate.
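[Editorial aside: the allocation difference can be illustrated with a plain-Python analogy - not the benchmark's actual code. A lazy range object, like Julia 0.5's lazy `linspace`, stores only its endpoints and step, while materializing the values allocates storage proportional to the length.]

```python
import sys

# A lazy range stores only start/stop/step, so its footprint is constant;
# a materialized list allocates at least one pointer per element.
lazy_small = range(10)
lazy_big = range(1_000_000)
eager = list(range(1_000_000))

print(sys.getsizeof(lazy_small) == sys.getsizeof(lazy_big))  # True
print(sys.getsizeof(eager) > 1_000_000)                      # True
```

Timing a loop over `eager` versus `lazy_big` thus measures two different things: arithmetic plus a large allocation, or arithmetic alone.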
Agreed that while raw speed is important, in most situations it wouldn't be
the most important reason to choose one programming language over another.
I came at this from the angle of an engineer in a small company. For myself,
the main attraction of Julia was how easy it is to achieve decent speed
I haven't done any systematic benchmarking since Numba introduced the
ability to JIT-compile entire classes. In my experience, very well written
Julia code is usually equivalent to or better than Numba-JIT'd code (in
cases where @simd is helpful). The Python code is sometimes easier to
Interesting. Did you use the updated Julia code?
Have you done any comparisons between reading and writing Numba JIT
classes and Julia types in tight loops?
On Monday, July 25, 2016 at 10:41:48 AM UTC-4, dexto...@gmail.com wrote:
Just for the sake of comprehensiveness, I ran your Python benchmark through
the Numba JIT library (which uses the same underlying LLVM infrastructure
that Julia does) and on my computer the Python code is faster than Julia by
68%. Vanilla CPython is terrible for this kind of simple explicit loop.
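[Editorial aside: for readers who want to try this themselves, here is a minimal sketch of the pattern under discussion - a simple explicit loop, JIT-compiled with Numba when it is installed. The loop body is a stand-in for illustration, not the thread's actual benchmark code.]

```python
import time

def brute_force_loop(nsamples):
    # The kind of simple explicit loop that vanilla CPython runs slowly,
    # but a JIT (Numba, or Julia's compiler) turns into tight machine code.
    total = 0.0
    for i in range(nsamples):
        total += (i % 7) * 0.5
    return total

try:
    from numba import njit            # optional: JIT-compile if Numba is present
    fast_loop = njit(brute_force_loop)
    fast_loop(10)                     # warm-up call triggers compilation
except ImportError:
    fast_loop = brute_force_loop      # fall back to plain CPython

t0 = time.perf_counter()
result = fast_loop(1_000_000)
print(result, time.perf_counter() - t0)
```

Note the warm-up call: without it, the first timed call would include Numba's compilation time, another source of apples-to-oranges numbers.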
Dear Zhong,
I understand your points very well... as I said I also love and use Julia.
But at this point I understood that the *only* thing being discussed was
speed. And on that count, a factor of 1.6 is still a big difference. In my
case I'm willing to use it instead of Fortran right now,
Zhong, we would be pleased to increment the tally of Julia users with your
use.
On Tuesday, July 12, 2016 at 12:00:02 PM UTC-4, Zhong Pan wrote:
Ferran,
I can totally understand that Fortran is still the king of fast and efficient
numerical calculation in terms of computation time and memory usage. As to
"fast", I am comparing Julia mainly to high-level languages like Python, R,
and Matlab, which offer a similar level of fast prototyping.
just my two cents...
On Tuesday, July 12, 2016 at 5:28:24 AM UTC+2, Chris Rackauckas wrote:
>
>
> MATLAB really improved their JIT in 2015b, but as you can see, it cannot
> come close to Julia. The main reason is that, although it can do quite a
> bit because it ensures type stability, it has
btw, I use Julia also, and I really like it very much :)
Chris,
Thanks for the comments! I tried @inbounds and @fastmath, which reduced
Julia execution time from 2.55 sec to 2.46 sec. @simd wouldn't be
appropriate here as the point is to test brute-force loops that in reality
are employed in calculations that cannot be vectorized. (And yes, I agree
Also: it's "brute force", not "brutal-force". The connotation is quite
different :).
On Monday, July 11, 2016 at 9:21:34 AM UTC-7, Zhong Pan wrote:
You should add @inbounds and try adding @fastmath to the Julia code. Maybe
@simd, though the compiler should be doing that automatically. Make sure
Julia is compiling with -O3. I wouldn't be surprised if this gets nearly to
C++.
If you want to put the random number generation back in, you can
On Monday, July 11, 2016 at 6:38:13 PM UTC+2, Sisyphuss wrote:
Not to C; rather, directly to machine code. LLVM seems to be in use here
as well.
It's surprising to see Python so slow and Matlab so fast.
Python: maybe it will get much faster when using comprehension?
Matlab: does the JIT compiler translate the loops to C?
On Monday, July 11, 2016 at 2:09:28 PM UTC+2, David Barton wrote:
Thanks to all the helpful comments. Just for the sake of tidiness, I
attached rev 3 of the (now even simpler) benchmark result PDF.
Changes are:
* Removed random number generation, as it was pointed out that random
generators are relatively complex, and differences in implementation can
lead to unfair cross-language comparisons.
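[Editorial aside: a sketch of the alternative to removing the RNG entirely - in Python for illustration only; the variable names are mine. Generate the inputs outside the timed region, so the timing measures the loop rather than the generator.]

```python
import random
import time

nsamples = 100_000

# Input generation happens OUTSIDE the timed region, so differences
# between languages' random generators cannot skew the comparison.
random.seed(42)
data = [random.random() for _ in range(nsamples)]

t0 = time.perf_counter()
acc = 0.0
for x in data:                # timed: pure arithmetic loop
    acc += x * x
elapsed = time.perf_counter() - t0
print(acc, elapsed)
```

Seeding also makes the run reproducible, so two languages can even be fed identical inputs if their generators match.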
David,
Thanks for the test results and the correction - now I recall how it's done
in Matlab. Haven't been using it for a while. :-)
-Zhong
On Monday, July 11, 2016 at 7:09:28 AM UTC-5, David Barton wrote:
For reference, with Matlab 2016a: 4.97 sec; Julia 0.4.6: 2.76 sec; Python
3.5.1: 166.76 sec.
Note that there is a mistake in your Matlab code: zeros(n) returns an n-by-n
matrix of zeros (hence running out of memory). Instead you want zeros(1, n)
to get a vector.
David
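[Editorial aside: the zeros(n) vs zeros(1, n) distinction is easy to see in a plain-Python analogy - this is not Matlab code, and the names are mine. The accidental version allocates n*n elements; the intended one only n.]

```python
n = 1000

# Analogue of Matlab's zeros(n): an n-by-n matrix, n*n elements in total.
matrix = [[0.0] * n for _ in range(n)]
# Analogue of zeros(1, n): a length-n vector, which the benchmark wants.
vector = [0.0] * n

print(sum(len(row) for row in matrix))  # 1000000 elements
print(len(vector))                      # 1000 elements
```

At the benchmark's sample counts, the quadratic version exhausts memory long before the loop is ever timed.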
Hi Andreas,
Thanks for the comments.
* If someone has a more recent Matlab, it'll be interesting to try. The
license is so expensive and I don't have access to a newer version now.
* Yes, you are right; I also realized that I don't know how much the random
number generator implementation affects the comparison.
Two small things:
* A more recent Matlab should already be faster, especially in this kind of
loop.
* Random generators' runtime - depending on their complexity - really makes
a difference.
Hi Zhong,
you may want to check out Julia 0.5, on my box your benchmark is ~13%
faster with Julia 0.5: 3.367658 seconds with Julia 0.4.6 and 2.898068
seconds with Julia 0.5.
Bye,
Mosè
Hi,
>
> Sorry I have to post a revision so quickly. Just after I posted my
> previous benchmark, I found