JuMP won't be getting any faster here; it's entirely limited by the speed of
the MIP solver. Which one did you use?
On Tuesday, July 1, 2014 6:47:04 PM UTC-4, Iain Dunning wrote:
>
> I was unable to run Bench.jl (ERROR: varzm! not defined), but here are the
> results on my computer using just runtests.jl, a fixed seed, and the total
> time for 100 random puzzles:
>
> * Initial
> elapsed time: 1.641434988 seconds (282491732 bytes allocated, 5.99% gc time)
>
> * Change globals to const
> elapsed time: 1.563094028 seconds (261818132 bytes allocated, 6.61% gc time)
>
> * Changing from a Dict{Int64, *} for the Grid types to just a Vector{*}, as
> well as the const globals above
> elapsed time: 1.373703078 seconds (191864592 bytes allocated, 4.91% gc time)
>
> On Tuesday, July 1, 2014 6:27:15 PM UTC-4, andy hayden wrote:
>>
>> Bench.jl has a bench_compare function which returns a DataFrame of times (I
>> then divide the medians of the Python and Julia columns). I'll add this
>> output to the Bench script since it's useful to see (it would be nice to add
>> more stats, as it's just a DataFrame of all the solved puzzles in seconds).
>> By default it runs a hundred random sudokus on Julia, Python, and JuMP (the
>> same puzzles on each)...
>>
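>> The ratio itself is nothing fancy, roughly this sketch (illustrative column
>> names and placeholder data, not the exact Bench.jl code):
>>
>>     using DataFrames, Statistics
>>
>>     # one row per puzzle, one column per implementation, times in seconds
>>     times = DataFrame(julia = rand(100), python = rand(100))
>>
>>     # > 1 means the Julia solver is faster on the median puzzle
>>     ratio = median(times.python) / median(times.julia)
>>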
>> Thanks Steven: making those globals const makes a huge difference; Julia
>> now wins (going from 20% slower to 10% faster for me with just that change).
>> I'll have a play and see how your other suggestions work out.
>>
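>> For anyone following along, the const change is essentially this (made-up
>> names, not the actual Sudoku.jl globals):
>>
>>     # a plain global: its type can change at any time, so functions that
>>     # read it can't specialize and end up working with boxed values
>>     squares = collect(1:81)
>>
>>     # a const global: the binding's type is fixed, so code that uses it
>>     # is type-stable
>>     const SQUARES = collect(1:81)
>>
>>     count_squares() = length(SQUARES)
>>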
>> I was also very impressed with JuMP here... and it may be that the latest
>> is even faster (I'm using the version from the last release rather than
>> master, and it has changed since then).
>>
>>
>> On Tuesday, 1 July 2014 15:11:27 UTC-7, Iain Dunning wrote:
>>>
>>> I'm working on improving this, but I'm not sure how you're measuring that
>>> 20% slowdown - can you be more specific?
>>>
>>> On Tuesday, July 1, 2014 1:37:00 PM UTC-4, andy hayden wrote:
>>>>
>>>> I recently ported Norvig's Solve Every Sudoku Puzzle
>>>> <http://norvig.com/sudoku.html> to Julia:
>>>> https://github.com/hayd/Sudoku.jl
>>>>
>>>> Some simple benchmarks suggest my Julia implementation solves around
>>>> 20% slower* than the Python version, and about 3 times faster than the
>>>> JuMP implementation (vendored from the latest release), on random
>>>> puzzles. I tried to include the solver from attractivechaos/plb
>>>> <https://github.com/attractivechaos/plb/tree/master/sudoku> but
>>>> couldn't get it working for comparison...
>>>>
>>>> I'm new to Julia so would love to hear people's thoughts / any
>>>> performance tips!
>>>> I've not delved too deeply into profiling, but @time suggests around 10%
>>>> of the time is spent in GC.
>>>>
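>>>> (For reference, checking that looks roughly like this; solve_all and
>>>> random_puzzles are stand-ins, not the actual Sudoku.jl functions:)
>>>>
>>>>     puzzles = random_puzzles(100)
>>>>     @time solve_all(puzzles)      # reports time, allocations and % gc time
>>>>
>>>>     using Profile                 # built-in sampling profiler
>>>>     @profile solve_all(puzzles)
>>>>     Profile.print()
>>>>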
>>>> * I'm sure I've lost some performance in translation which could easily
>>>> be recovered...
>>>>
>>>> Best,
>>>> Andy
>>>>
>>>