By a process of elimination, I determined that the only variable whose
declaration affected the run time was vGridCapital. The variable is
declared to be of type Array{Float64,1}, but is initialized as
vGridCapital = 0.5*capitalSteadyState:0.00001:1.5*capitalSteadyState
which, unlike in Matlab, produces a Range object, rather than an array. If
the line above is modified to
vGridCapital = [0.5*capitalSteadyState:0.00001:1.5*capitalSteadyState]
then the type instability is eliminated, and all type declarations can be
removed with no effect on execution time.
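A minimal sketch of the type difference (placeholder value for
capitalSteadyState; note that in the 0.3-era Julia of this thread,
[a:s:b] concatenated the range into an array, which later versions spell
collect(a:s:b)):

```julia
# Sketch: a colon expression builds a lazy Range, not an Array.
capitalSteadyState = 1.0   # placeholder value for illustration

r = 0.5*capitalSteadyState:0.00001:1.5*capitalSteadyState
a = collect(r)             # materialize into Array{Float64,1}

println(typeof(r))   # a Range type, no element storage
println(typeof(a))   # Array{Float64,1}
```

Because the two objects have different types, a variable that holds first
one and then the other is type-unstable, which is what the declaration was
papering over.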
--Peter
On Monday, June 16, 2014 2:59:31 PM UTC-7, Jesus Villaverde wrote:
>
> Also, defining
>
> mylog(x::Float64) = ccall((:log, "libm"), Float64, (Float64,), x)
>
> made quite a bit of difference for me, from 1.92 to around 1.55. If I also
> add @inbounds, I go down to 1.45, making Julia only twice as slow as C++.
> Numba still beats Julia, which bothers me a bit.
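>
> A minimal sketch of how that shim might be exercised (the loop and input
> values are hypothetical, the ccall line is as above):

```julia
# Direct libm log via ccall, bypassing any extra checks in Base.log.
mylog(x::Float64) = ccall((:log, "libm"), Float64, (Float64,), x)

# Hypothetical hot loop of the kind that benefits from the shim.
function sumlog(v::Vector{Float64})
    s = 0.0
    for i in 1:length(v)
        s += mylog(v[i])
    end
    s
end

v = collect(1.0:0.001:2.0)
println(sumlog(v))
```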
>
>
> Thanks for the suggestions.
>
>
> On Monday, June 16, 2014 4:56:34 PM UTC-4, Jesus Villaverde wrote:
>>
>> Hi
>>
>> 1) Yes, we pre-compiled the function.
>>
>> 2) As I mentioned before, we tried the code with and without type
>> declaration, it makes a difference.
>>
>> 3) The variable names turn out to be quite useful because this code will
>> eventually be nested into a much larger project where it is convenient to
>> have very explicit names.
>>
>> Thanks
>>
>> On Monday, June 16, 2014 12:13:44 PM UTC-4, Dahua Lin wrote:
>>>
>>> First, I agree with John that you don't have to declare the types in
>>> general, as you would in a compiled language. It seems that Julia should be
>>> able to infer the types of most variables in your code.
>>>
>>> There are several ways that your code's efficiency may be improved:
>>>
>>> (1) You can use @inbounds to waive bounds checking in several places,
>>> such as lines 94 and 95 (in RBC_Julia.jl).
>>> (2) Lines 114 and 116 involve reallocating new arrays, which is probably
>>> unnecessary. Also note that Base.maxabs can compute the maximum of
>>> absolute values more efficiently than maximum(abs( ... ))
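>>>
>>> A small sketch of both suggestions (hypothetical data; maxabs existed in
>>> the 0.3-era Base and was later deprecated, so maximum(abs, ...) is shown
>>> as the equivalent spelling):

```julia
# Hypothetical vectors standing in for the arrays in RBC_Julia.jl.
x = rand(100)
y = rand(100)

# (1) @inbounds waives bounds checking inside the annotated block.
s = 0.0
@inbounds for i in 1:length(x)
    s += x[i] * y[i]
end

# (2) maximum of absolute values without a separate abs(...) array;
#     equivalent to Base.maxabs(x - y) in the Julia of this thread.
d = maximum(abs, x - y)

println(s, " ", d)
```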
>>>
>>> In terms of measurement, did you pre-compile the function before
>>> measuring the runtime?
>>>
>>> A side note about code style. It seems that it uses a lot of Java-ish
>>> descriptive names with camel case. Julia practice tends to encourage more
>>> concise naming.
>>>
>>> Dahua
>>>
>>>
>>>
>>> On Monday, June 16, 2014 10:55:50 AM UTC-5, John Myles White wrote:
>>>>
>>>> Maybe it would be good to verify the claim made at
>>>> https://github.com/jesusfv/Comparison-Programming-Languages-Economics/blob/master/RBC_Julia.jl#L9
>>>>
>>>>
>>>> I would think that specifying all those types wouldn’t matter much if
>>>> the code doesn’t have type-stability problems.
>>>>
>>>> — John
>>>>
>>>> On Jun 16, 2014, at 8:52 AM, Florian Oswald <[email protected]>
>>>> wrote:
>>>>
>>>> > Dear all,
>>>> >
>>>> > I thought you might find this paper interesting:
>>>> http://economics.sas.upenn.edu/~jesusfv/comparison_languages.pdf
>>>> >
>>>> > It takes a standard model from macroeconomics and computes its
>>>> > solution with an identical algorithm in several languages. Julia is
>>>> > roughly 2.6 times slower than the best C++ executable. I was a bit
>>>> > puzzled by the result, since in the benchmarks on
>>>> > http://julialang.org/, the slowest test is 1.66 times C. I realize
>>>> > that those benchmarks can't cover all possible situations. That said,
>>>> > I couldn't really find anything unusual in the Julia code; I did some
>>>> > profiling and removed the type declarations, but that's still as fast
>>>> > as I got it. That's not to say that I'm disappointed; I still think
>>>> > this is great. Did I miss something obvious here, or is there
>>>> > something specific to this algorithm?
>>>> >
>>>> > The codes are on github at
>>>> >
>>>> > https://github.com/jesusfv/Comparison-Programming-Languages-Economics
>>>> >
>>>> >
>>>>
>>>>