[julia-users] Re: Ancient Gfortran/GCC version in Travis-CI image

2016-10-04 Thread Helge Eichhorn
Thanks Tony, that worked!

Here is my new Travis config for the public record:
https://github.com/KosmosML/LLEA.jl/blob/a2ea6ffa80f8a8d8c2f3198ec94a26136281c3ee/.travis.yml

On Monday, October 3, 2016 at 23:44:50 UTC+2, Tony Kelman wrote:
>
> Check the Travis docs for the apt sources and packages addons to install 
> newer compiler versions from the Ubuntu toolchain PPA.



[julia-users] Ancient Gfortran/GCC version in Travis-CI image

2016-10-03 Thread Helge Eichhorn
Dear all,

I just ran into trouble when I tried to build a package with a 
Fortran 2008-based dependency on Travis. The most recent version of Gfortran 
I could install was 4.6, which cannot build the Fortran code. The package 
list 
(https://github.com/travis-ci/apt-package-whitelist/blob/master/ubuntu-precise) 
on the other hand says that Gfortran 6 should be available.

I have no clue how the base images for Travis's container 
infrastructure are generated, but does the above mean that the image for 
Julia needs to be updated?

Best regards,
Helge


[julia-users] Re: Any suggested mono font for julia with unicode?

2016-07-13 Thread Helge Eichhorn
I switched from Source Code Pro to Fira Mono because of the latter's better 
Unicode support.

On Wednesday, July 13, 2016 at 02:43:55 UTC+2, Po Choi wrote:
>
> I find that some mono fonts don't display distinguishable Greek letters 
> in Unicode.
> Any suggested mono font for Julia with Unicode?
>


[julia-users] Re: Wrap fortran 90 interface code for DDE

2016-06-24 Thread Helge Eichhorn
Oops, sent the message by accident while editing.

The code I posted is an example of a C interface from Dopri.jl.

On Wednesday, June 8, 2016 at 04:46:40 UTC-4, Dupont wrote:
>
> Dear users,
>
> I would like to wrap the code to solve delay 
> differential equations. I have been wrapping C code in the past, but I 
> don't know much about Fortran. 
>
> In the file *dde_solver_m.f90*, it is said that one must call the Fortran 
> function *DDE_SOLVER* through an interface. Hence, I compiled the code 
> as a library 
>
> gfortran -Wall -shared -o libdde_solver.dylib -lm -fPIC dde_solver_m.f90
>
> but when I looked at
>
> nm -a libdde_solver.dylib
>
> I could not find the function DDE_SOLVER. I know that this may seem more 
> like a Fortran question than a Julia one, but I would be grateful if 
> someone could give me a hint on how to call this library from Julia.
>
> Thank you for your help,
>
> Best regards
>
>
>
>
>

[julia-users] Re: Wrap fortran 90 interface code for DDE

2016-06-24 Thread Helge Eichhorn
You might want to have a look at my Dopri.jl package 
(https://github.com/helgee/Dopri.jl). In my experience the best way 
to wrap modern Fortran (90+) in Julia is to implement a C interface to the 
Fortran code via the ISO_C_BINDING intrinsic module. You can then call the 
interface easily from Julia.

subroutine c_dop853(n, cfcn, x, y, xend, rtol, atol,&
    itol, csolout, iout, work, lwork, iwork,&
    liwork, tnk, idid) bind(c)
    integer(c_int), intent(in) :: n
    type(c_funptr), intent(in), value :: cfcn
    real(c_double), intent(inout) :: x
    real(c_double), dimension(n), intent(inout) :: y
    real(c_double), intent(in) :: xend
    real(c_double), dimension(n), intent(in) :: rtol
    real(c_double), dimension(n), intent(in) :: atol
    integer(c_int), intent(in) :: itol
    type(c_funptr), intent(in), value :: csolout
    integer(c_int), intent(in) :: iout
    real(c_double), dimension(lwork), intent(inout) :: work
    integer(c_int), intent(in) :: lwork
    integer(c_int), dimension(liwork), intent(inout) :: iwork
    integer(c_int), intent(in) :: liwork
    type(c_ptr), intent(in) :: tnk
    integer(c_int), intent(out) :: idid
    procedure(c_fcn), pointer :: fcn
    procedure(c_solout), pointer :: solout

    call c_f_procpointer(cfcn, fcn)
    call c_f_procpointer(csolout, solout)
    call dop853(n, fcn, x, y, xend, rtol, atol,&
        itol, solout, iout, work, lwork, iwork,&
        liwork, tnk, idid)
end subroutine c_dop853
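
For reference, the Julia side of this pattern is a plain ccall. The snippet 
below is only a minimal sketch of the idea, using a hypothetical bind(c) 
routine c_add in a library named "libexample" rather than the full c_dop853 
signature above (the callback plumbing via cfunction is omitted):

# Hypothetical example, Julia 0.4/0.5-era syntax. Assumes a Fortran routine
#   subroutine c_add(n, x, y, s) bind(c)
#     integer(c_int), intent(in) :: n
#     real(c_double), dimension(n), intent(in) :: x, y
#     real(c_double), dimension(n), intent(out) :: s
# compiled into a shared library named "libexample".
x = [1.0, 2.0, 3.0]
y = [4.0, 5.0, 6.0]
s = similar(x)
ccall((:c_add, "libexample"), Void,
      (Ref{Cint}, Ptr{Cdouble}, Ptr{Cdouble}, Ptr{Cdouble}),
      length(x), x, y, s)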

On Wednesday, June 8, 2016 at 04:46:40 UTC-4, Dupont wrote:
>
> Dear users,
>
> I would like to wrap the code to solve delay 
> differential equations. I have been wrapping C code in the past, but I 
> don't know much about Fortran. 
>
> In the file *dde_solver_m.f90*, it is said that one must call the Fortran 
> function *DDE_SOLVER* through an interface. Hence, I compiled the code 
> as a library 
>
> gfortran -Wall -shared -o libdde_solver.dylib -lm -fPIC dde_solver_m.f90
>
> but when I looked at
>
> nm -a libdde_solver.dylib
>
> I could not find the function DDE_SOLVER. I know that this may seem more 
> like a Fortran question than a Julia one, but I would be grateful if 
> someone could give me a hint on how to call this library from Julia.
>
> Thank you for your help,
>
> Best regards
>
>
>
>
>

[julia-users] Constructors for types with Nullable fields

2016-06-10 Thread Helge Eichhorn
Hi,

let's say I have the following type with two Nullable fields:

type WithNulls
    a::Nullable{Float64}
    b::Nullable{Float64}
end


I now want the user to be able to create an instance of this type without 
caring about Nullables. For this I use a constructor with keyword arguments.


function WithNulls(;
    a = Nullable{Float64}(),
    b = Nullable{Float64}(),
)
    WithNulls(a, b)
end


This works for Float64 but not for the other leaf types of Real.


# Works
WithNulls(a=3.0)

# Does not work
WithNulls(a=pi)


This can be fixed by adding the following methods to convert:


Base.convert{T<:Real}(::Type{Nullable{T}}, v::T) = Nullable{T}(v)
Base.convert{T<:Real,S<:Real}(::Type{Nullable{T}}, v::S) = Nullable{T}(convert(T, v))


Finally the question:

Should the above convert methods not be part of Base? I think converting 
between different Nullable{T<:Real} values might be a common use case. Is 
there a more elegant way to do this?
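
One possible alternative (just a sketch, and not necessarily more elegant) is 
to keep the conversion local to the keyword constructor instead of extending 
Base.convert for all Nullable{T<:Real}:

# Sketch: convert non-Nullable arguments inside the keyword constructor itself,
# instead of adding global convert methods.
function WithNulls(;
    a = Nullable{Float64}(),
    b = Nullable{Float64}(),
)
    WithNulls(isa(a, Nullable) ? a : Nullable{Float64}(Float64(a)),
              isa(b, Nullable) ? b : Nullable{Float64}(Float64(b)))
end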


Re: [julia-users] Bizarre Segfault during ccall on OSX

2016-06-01 Thread Helge Eichhorn
jl_call_method_internal at 
/Users/helge/projects/julia/src/./julia_internal.h:88
jl_apply_generic at /Users/helge/projects/julia/src/gf.c:1595
typeinf_ext at ./inference.jl:1542
jl_call_method_internal at 
/Users/helge/projects/julia/src/./julia_internal.h:88
jl_apply_generic at /Users/helge/projects/julia/src/gf.c:1595
jl_apply at /Users/helge/projects/julia/src/./julia.h:1386
jl_type_infer at /Users/helge/projects/julia/src/gf.c:219
cache_method at /Users/helge/projects/julia/src/gf.c:678
jl_mt_assoc_by_type at /Users/helge/projects/julia/src/gf.c:711
jl_apply_generic at /Users/helge/projects/julia/src/gf.c:1574
do_test at /Users/helge/.julia/v0.5/BaseTestNext/src/BaseTestNext.jl:181
unknown function (ip: 0x3199d05ad)
jl_call_method_internal at 
/Users/helge/projects/julia/src/./julia_internal.h:88
jl_apply_generic at /Users/helge/projects/julia/src/gf.c:1585
macro expansion; at /Users/helge/.julia/v0.5/Dopri/test/lowlevel.jl:104
unknown function (ip: 0x3199bfc31)
jl_call_method_internal at 
/Users/helge/projects/julia/src/./julia_internal.h:88
jl_toplevel_eval_flex at /Users/helge/projects/julia/src/toplevel.c:556
jl_parse_eval_all at /Users/helge/projects/julia/src/ast.c:780
jl_load at /Users/helge/projects/julia/src/toplevel.c:585
jl_load_ at /Users/helge/projects/julia/src/toplevel.c:591
include_from_node1 at ./loading.jl:426
unknown function (ip: 0x10bebc07c)
jl_call_method_internal at 
/Users/helge/projects/julia/src/./julia_internal.h:88
jl_apply_generic at /Users/helge/projects/julia/src/gf.c:1595
do_call at /Users/helge/projects/julia/src/interpreter.c:58
eval at /Users/helge/projects/julia/src/interpreter.c:174
jl_interpret_toplevel_expr at 
/Users/helge/projects/julia/src/interpreter.c:25
jl_toplevel_eval_flex at /Users/helge/projects/julia/src/toplevel.c:545
jl_parse_eval_all at /Users/helge/projects/julia/src/ast.c:780
jl_load at /Users/helge/projects/julia/src/toplevel.c:585
jl_load_ at /Users/helge/projects/julia/src/toplevel.c:591
include_from_node1 at ./loading.jl:426
unknown function (ip: 0x10bebc07c)
jl_call_method_internal at 
/Users/helge/projects/julia/src/./julia_internal.h:88
jl_apply_generic at /Users/helge/projects/julia/src/gf.c:1595
process_options at ./client.jl:266
_start at ./client.jl:322
unknown function (ip: 0x10bf138f4)
jl_call_method_internal at 
/Users/helge/projects/julia/src/./julia_internal.h:88
jl_apply_generic at /Users/helge/projects/julia/src/gf.c:1595
jl_apply at /Users/helge/projects/julia/usr/bin/julia-debug (unknown line)
true_main at /Users/helge/projects/julia/usr/bin/julia-debug (unknown line)
main at /Users/helge/projects/julia/usr/bin/julia-debug (unknown line)
Allocations: 2928538 (Pool: 2927071; Big: 1467); GC: 6

I have run the test suite 100x on a different Mac and on Linux without any 
problems. Could this be a hardware issue?

On Monday, May 30, 2016 at 18:57:55 UTC+2, Helge Eichhorn wrote:
>
> Thanks for testing, Rob!
>
> It was a precompilation problem. I had switched on precompilation for a 
> package that depends on Dopri.jl, which itself does not use precompilation. 
> My bad...
>
> On Monday, May 30, 2016 at 18:25:29 UTC+2, Rob J Goedman wrote:
>>
>> Helge,
>>
>> Not sure if this helps, but below is the (successful) output on my machine. 
>> I’m on OS X 11.6-beta, but I don’t think that is important. I do remember 
>> installing an updated Xcode.
>>
>> Regards,
>> Rob
>>
>> On May 30, 2016, at 08:41, Helge Eichhorn <he...@helgeeichhorn.de> wrote:
>>
>>
>>
>> Hi!
>>
>> I am having a strange problem with a segfault that seemingly appeared out 
>> of nowhere while calling Fortran code on OSX.
>>
>> When I try to run the tests of my Dopri.jl 
>> <https://github.com/helgee/Dopri.jl> package on Julia 0.4.5 on OSX 
>> 10.11.5 the process consistently fails with this error message:
>> signal (11): Segmentation fault: 11
>> unknown function (ip: 0x0)
>>
>> Tests pass on master and 0.3.12. Could somebody try to reproduce this 
>> behavior on their machine? What puzzles me is that this did not happen 23 
>> days ago when I fixed the latest issue with the package. In the meantime I 
>> changed neither the code nor the Julia runtime. I did 
>> install an OS update, though.
>>
>> Any ideas?
>>
>> The only other data point I have is that the tests also segfault on 0.3 
>> on Travis, but not consistently. By the third or fourth re-run the build 
>> usually succeeds. I have attached the log.
>>
>> Cheers,
>> Helge
>> 
>>
>>
>>
>> julia> Pkg.add("Dopri")
>> INFO: No packages to install, update or remove
>> INFO: Package database updated
>>
>> julia> using Dopri
>>
>> julia> Pkg.test("Dopri"

[julia-users] Bizarre Segfault during ccall on OSX

2016-05-30 Thread Helge Eichhorn


Hi!

I am having a strange problem with a segfault that seemingly appeared out 
of nowhere while calling Fortran code on OSX.

When I try to run the tests of my Dopri.jl 
(https://github.com/helgee/Dopri.jl) package on Julia 0.4.5 on OS X 10.11.5 
the process consistently fails with this error message:
signal (11): Segmentation fault: 11
unknown function (ip: 0x0)

Tests pass on master and 0.3.12. Could somebody try to reproduce this 
behavior on their machine? What puzzles me is that this did not happen 23 
days ago when I fixed the latest issue with the package. In the meantime I 
changed neither the code nor the Julia runtime. I did 
install an OS update, though.

Any ideas?

The only other data point I have is that the tests also segfault on 0.3 on 
Travis, but not consistently. By the third or fourth re-run the build 
usually succeeds. I have attached the log.

Cheers,
Helge
travis_fold:start:worker_info
Worker information
hostname: worker-jupiter-brain:16379610-1a24-4cf8-a86c-f609d20857aa
version: v2.0.0 
https://github.com/travis-ci/worker/tree/ca6cb0c5d3920912b1c3acc87c44a5da2120a971
instance: 8788b526-6ab4-47ae-8e76-5dce8d508685:
startup: 1m5.751060889s
travis_fold:end:worker_info
travis_fold:start:system_info
Build system information

Build language: julia

Build group: stable

Build dist: precise

travis_fold:end:system_info


travis_fold:start:fix.CVE-2015-7547
$ export DEBIAN_FRONTEND=noninteractive

travis_fold:end:fix.CVE-2015-7547
Fix WWDRCA Certificate

travis_fold:start:git.checkout
travis_time:start:0f8b7642
$ git clone --depth=50 --branch=v0.1.2 
https://github.com/helgee/Dopri.jl.git helgee/Dopri.jl

Cloning into 'helgee/Dopri.jl'...

remote: Counting objects: 177, done.

remote: Compressing objects: 100% (5/5), done.

remote: Total 177 (delta 0), reused 5 (delta 0), pack-reused 172

[git clone progress output truncated]

[julia-users] Re: Implicit constructor for custom types through Base.convert

2016-02-17 Thread Helge Eichhorn
https://github.com/JuliaLang/julia/issues/15120

On Tuesday, February 16, 2016 at 22:54:54 UTC+1, Cedric St-Jean wrote:
>
> That looks like a bug to me. I couldn't find any GitHub issue covering it; 
> maybe you should submit one? 
>
> On Tuesday, February 16, 2016 at 10:01:05 AM UTC-5, Helge Eichhorn wrote:
>>
>> When I run the following code
>>
>> type A
>> v::Float64
>> end
>>
>> type B
>> v::Float64
>> end
>>
>> Base.convert(::Type{A}, b::B) = A(b.v/2)
>> b = B(12)
>> A(b)
>>
>> it fails with this error message because the implicit constructor is 
>> called.
>>
>> LoadError: MethodError: `convert` has no method matching 
>> convert(::Type{Float64}, ::B)
>> This may have arisen from a call to the constructor Float64(...),
>> since type constructors fall back to convert methods.
>> Closest candidates are:
>>   call{T}(::Type{T}, ::Any)
>>   convert(::Type{Float64}, !Matched::Int8)
>>   convert(::Type{Float64}, !Matched::Int16)
>>   ...
>> while loading In[1], in expression starting on line 11
>>
>>  in call at In[1]:2
>>
>>
>> Since the manual says that defining Base.convert(::Type{T}, args...) = ... 
>> automatically defines a constructor T(args...) = ..., I expected this to work.
>> If I define a convert method for a built-in type like Float64, it works. 
>> Am I doing something wrong or is this intended? If so, it should be clearly 
>> stated in the manual.
>>
>

[julia-users] Implicit constructor for custom types through Base.convert

2016-02-16 Thread Helge Eichhorn
When I run the following code

type A
v::Float64
end

type B
v::Float64
end

Base.convert(::Type{A}, b::B) = A(b.v/2)
b = B(12)
A(b)

it fails with this error message because the implicit constructor is called.

LoadError: MethodError: `convert` has no method matching 
convert(::Type{Float64}, ::B)
This may have arisen from a call to the constructor Float64(...),
since type constructors fall back to convert methods.
Closest candidates are:
  call{T}(::Type{T}, ::Any)
  convert(::Type{Float64}, !Matched::Int8)
  convert(::Type{Float64}, !Matched::Int16)
  ...
while loading In[1], in expression starting on line 11

 in call at In[1]:2


Since the manual says that defining Base.convert(::Type{T}, args...) = ... 
automatically defines a constructor T(args...) = ..., I expected this to work.
If I define a convert method for a built-in type like Float64, it works. Am 
I doing something wrong or is this intended? If so, it should be clearly 
stated in the manual.
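
As a workaround (a sketch based on the definitions above, independent of 
whether the convert fallback should apply here), one can define the outer 
constructor explicitly instead of relying on the fallback:

# Explicit outer constructor that forwards to the convert method defined above.
A(b::B) = convert(A, b)

b = B(12)
A(b)  # A(6.0)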


[julia-users] Re: BinDeps, Windows, and libraries with unclear licenses (NAIF SPICE)

2016-02-09 Thread Helge Eichhorn
I have a few additional gray hairs now but also working Windows binaries 
and NASA's permission to distribute them.

I am still curious about my first question, though. Does the 'os' parameter 
do anything?

On Wednesday, February 3, 2016 at 15:34:29 UTC+1, Tony Kelman wrote:
>
> Windows users are very rarely going to have a working build toolchain 
> installed that will be compatible with Julia. Providing DLLs is the only 
> sane path forward. If redistribution isn't allowed, try to get 
> upstream to provide a shared-library DLL alternative for the binaries.



[julia-users] BinDeps, Windows, and libraries with unclear licenses (NAIF SPICE)

2016-02-03 Thread Helge Eichhorn
Hi, 

I am developing a wrapper for the NAIF SPICE toolkit (
http://naif.jpl.nasa.gov/naif/) and I have run into some problems with 
BinDeps.jl along the way. My build.jl file is attached.

*1. How can I provide OS-specific sources?*
I tried the lines below but BinDeps downloads the OS X file regardless of 
platform.

provides(Sources,
    URI("http://naif.jpl.nasa.gov/pub/naif/toolkit/C/MacIntel_OSX_AppleC_64bit/packages/cspice.tar.Z"),
    cspice, os=:Darwin)
provides(Sources,
    URI("http://naif.jpl.nasa.gov/pub/naif/toolkit/C/PC_Linux_GCC_64bit/packages/cspice.tar.Z"),
    cspice, os=:Linux)
provides(Sources,
    URI("http://naif.jpl.nasa.gov/pub/naif/toolkit/C/PC_Windows_VisualC_64bit/packages/cspice.zip"),
    cspice, os=:Windows)

I only got it working correctly by wrapping those in @osx_only etc. blocks.
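
For concreteness, the workaround mentioned above looks roughly like this (a 
sketch using the Julia 0.4-era OS-specific macros; the actual build.jl is the 
one attached to this post, not this snippet):

# Sketch: restrict each source to its platform with @osx_only/@linux_only/@windows_only.
@osx_only provides(Sources,
    URI("http://naif.jpl.nasa.gov/pub/naif/toolkit/C/MacIntel_OSX_AppleC_64bit/packages/cspice.tar.Z"),
    cspice)
@linux_only provides(Sources,
    URI("http://naif.jpl.nasa.gov/pub/naif/toolkit/C/PC_Linux_GCC_64bit/packages/cspice.tar.Z"),
    cspice)
@windows_only provides(Sources,
    URI("http://naif.jpl.nasa.gov/pub/naif/toolkit/C/PC_Windows_VisualC_64bit/packages/cspice.zip"),
    cspice)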


*2. How do I define the build process for Windows?*
It is unfortunately not documented (or I overlooked it) that SimpleBuild and 
BuildProcess are not supposed to be used on Windows. Is the following the 
correct way to enable them regardless?

push!(BinDeps.defaults, SimpleBuild)

I realize that the recommended practice for Windows is to distribute 
binaries, but NASA only provides static libraries and it is unclear whether 
binary redistribution is allowed in this case.
The rules for the SPICE toolkit state the following:

*Toolkit Redistribution*
Simple redistribution of the complete Toolkit, such as from a mirror site, 
is prohibited without prior clearance from NAIF. However, including the 
SPICE Toolkit library modules and relevant SPICE Toolkit programs and 
allied User Guides as part of a package supporting a customer-built 
SPICE-based tool is entirely appropriate.

Any opinions on that? I will try to get clearance anyhow if distributing 
binaries is the only sensible way forward.

Many thanks in advance and best regards,
Helge


build.jl
Description: Binary data


[julia-users] ANN: Dopri.jl - A Julia wrapper for the DOPRI5 and DOP853 integrators.

2015-07-23 Thread Helge Eichhorn
Hello julia-users,

I have released my high-level Julia wrapper for the DOPRI5 and DOP853 
Fortran integrators: https://github.com/helgee/Dopri.jl

It features an (almost) ODE.jl-compatible API, integration step callbacks 
and dense output.
Bug reports, feature requests and PRs are always welcome.

Cheers,
Helge


[julia-users] Re: intel compiler builded julia

2015-03-20 Thread Helge Eichhorn
I reported the same issue here:
https://github.com/JuliaLang/julia/issues/9145

 On Thursday, March 19, 2015 at 20:45:48 UTC+1, Wen Ling wrote:

 Thanks for the info.

 On Thursday, March 19, 2015 at 2:10:25 PM UTC-4, Tony Kelman wrote:

 Looks like osxunwind has some inline assembly that the intel compiler 
 doesn't like.


 On Thursday, March 19, 2015 at 9:29:03 AM UTC-7, Wen Ling wrote:

 and I build on MacBook pro, osx10.10

 On Thursday, March 19, 2015 at 4:16:02 AM UTC-4, Tony Kelman wrote:

 What was the error? Are you trying to build master or a release 0.3 
 version? See https://github.com/JuliaLang/julia/issues/9656 for a 
 known issue on master.


 On Wednesday, March 18, 2015 at 1:24:53 PM UTC-7, Wen Ling wrote:

 Hi all,
 Has anyone tried to build Julia with the Intel compiler? I tried, but it 
 fails in the middle.
 Any ideas where I may find help on this matter? IRC or the developer 
 mailing list?

 best regards

 wen



Re: [julia-users] Re: Julia port of DOP853 50-70x slower than Fortran

2013-12-18 Thread Helge Eichhorn
Thanks Ivar and Tim, that did the trick, and the performance is within reach of
Fortran: ~9.5e-5s.

I will implement the missing functionality and do some cleanup so this can
go into ODE.jl. Thanks for the help, everybody!


2013/12/18 Ivar Nesje iva...@gmail.com

 I did not intend to suggest that it would be good for the interface to
 hard-code the function. I just noted that some overhead vanishes. There
 might be some metaprogramming tricks you can do to get your desired
 interface back.

 The usual idea in Julia for returning an array without allocation is the
 same as in C. You must allocate the array in the calling function and
 modify it inside the function. In Julia the convention is that such
 functions end in ! and modify their first argument.


 function gravity!(ret::Vector{Float64}, t::Float64, y::Vector{Float64},
                   mu::Float64)
     r = norm(y[1:3])
     ret[1:3] = y[4:6]
     ret[4:6] = -mu*y[1:3]/r/r/r
 end


 Unfortunately Julia does not support efficient slicing, so I was unsure if
 you could gain anything from this approach, because you use the returned
 array on multiple occasions later.

 On Wednesday, December 18, 2013 at 15:09:18 UTC+1, Helge Eichhorn wrote:

 I have updated the Gist: https://gist.github.com/helgee/8019521.


 2013/12/18 Helge Eichhorn he...@helgeeichhorn.de

 @Ivar: Using a regular function is only possible in this special case,
 because in general the user must supply the callback function.

 @Tim: I think that I can also rule out type problems. At least the
 output of code_typed is free of Unions. Also, I forgot to mention that
 I have already done a lot of profiling.

 What did help was re-writing the callback function like so:

 function gravity1(t::Float64, y::Vector{Float64}, mu::Float64)
     x, y, z, vx, vy, vz = y
     r = sqrt(x*x+y*y+z*z)
     r3 = r*r*r
     [vx, vy, vz, -mu*x/r3, -mu*y/r3, -mu*z/r3]
 end

 and especially reducing memory allocations by writing the intermediary
 solutions in the core integrator to a single large work array. This feels
 worse style-wise since the function now has side effects, but the runtime
 is down to 1.7e-4s.

 According to the sampling profiler the remaining bottleneck is the
 allocation of the array that is returned by the callback function. I was
 thinking about returning a tuple instead, but that seems rather impractical
 because the data will be written to the work array afterwards. Other ideas?
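
 (One possible way to remove that allocation, along the lines of Ivar's
 gravity! suggestion above; this is only a sketch, not the code that ended up
 in the port:)

 # Sketch: allocation-free callback that writes into a preallocated buffer.
 function gravity1!(f::Vector{Float64}, t::Float64, y::Vector{Float64}, mu::Float64)
     x, yy, z, vx, vy, vz = y
     r3 = sqrt(x*x + yy*yy + z*z)^3
     f[1] = vx; f[2] = vy; f[3] = vz
     f[4] = -mu*x/r3; f[5] = -mu*yy/r3; f[6] = -mu*z/r3
     return f
 end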


 2013/12/18 Tim Holy tim@gmail.com

  Also, I just added a new section
 http://docs.julialang.org/en/latest/manual/performance-tips/#tools
 that advertises the available tools for helping you diagnose performance
 problems.

 Without taking the time to look at your code, I'll just add that
 whenever I
 see an orders-of-magnitude discrepancy between C/Fortran and Julia, my
 first
 instinct is to suspect a type problem. The fact that a vectorized
 version is a
 bit faster than one written with loops might also support this
 diagnosis.

 Best,
 --Tim

 On Wednesday, December 18, 2013 03:26:11 AM Ivar Nesje wrote:
  My first suggestion to anyone trying to write fast Julia programs is to
  read http://docs.julialang.org/en/latest/manual/performance-tips/; those
  are all good tips that I do not think will become obsolete when Julia
  improves. It seems to me like you know those points.

  I think you get an important hint from the fact that devectorization does
  not matter. To me it seems like the current bottleneck is that you use an
  anonymous function instead of a regular function. When I replace f( by
  gravity( I get some improvement, and then your devectorization attempts
  make a significant difference. Further, you might want to try to reduce
  the amount of memory allocated, but that seems to complicate your code
  quite a bit.

  My improvements reduce the timings as follows for 1000 iterations.
  ivar@Ivar-ubuntu:~/tmp$ julia doptest.jl
  elapsed time: 0.878398771 seconds (513399840 bytes allocated)
  ivar@Ivar-ubuntu:~/tmp$ julia dopitest.jl
  elapsed time: 0.16916126 seconds (122423840 bytes allocated)
 
  On Wednesday, December 18, 2013 at 11:07:30 UTC+1, Helge Eichhorn wrote:
   Hi,

   I spent the last few days porting the well-known DOP853
   (http://www.unige.ch/~hairer/software.html) integrator to Julia. The
   process was quite smooth and I have implemented the core functionality.
   However, when I run my reference case, a numerical solution of the
   two-body problem, I get the following timings:

   - Fortran (gfortran 4.8.2, no optimizations): ~1.7e-5s
   - Julia (master, looped): ~1.3e-3s
   - Julia (master, vectorized): ~1e-3s (!)

   I have posted the Julia code and the Fortran reference in this Gist:
   https://gist.github.com/helgee/8019521. The computationally expensive
   part seems to be contained in the dopcore or dopcorevec function,
   respectively. What I really