Using a single BitArray increases performance by 10x! I've pushed that to 
GitHub. It also feels like a much neater solution.
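
For anyone following along, the core of it is roughly this (a sketch with 
illustrative names, not the exact code in the repo): one BitArray holds 
every remaining candidate digit for every square, so eliminating a digit is 
just clearing a bit.

    # vals[d, s] == true  <=>  digit d is still a candidate for square s
    vals = trues(9, 81)

    # drop digit d as a candidate for square s; false signals a contradiction
    # (the real eliminate! also propagates the change to the square's peers)
    function eliminate!(vals::BitArray{2}, s::Int, d::Int)
        vals[d, s] = false
        return any(vals[:, s])
    end

    # assign digit d to square s by eliminating every other digit
    function assign!(vals::BitArray{2}, s::Int, d::Int)
        ok = true
        for d2 in 1:9
            d2 != d && (ok &= eliminate!(vals, s, d2))
        end
        return ok
    end

Counting the candidates left for a square is then just sum(vals[:, s]), 
which I think is where most of the win over the Dict-of-strings version 
comes from.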

I still can't work out how to strip out the dictionary in solve()...

On Tuesday, 1 July 2014 17:28:06 UTC-7, andy hayden wrote:
>
> That could be what quinnj's port does: 
> https://github.com/attractivechaos/plb/blob/master/sudoku/sudoku_v1.jl. I 
> couldn't get it working to compare...
>
>
>
> On 1 July 2014 17:13, Iain Dunning <[email protected]> wrote:
>
>> Stripping out the dictionary stuff in search and doing it in a single 
>> array pass has knocked me down to 
>> elapsed time: 1.305257143 seconds (183884144 bytes allocated, 3.85% gc 
>> time)
>>
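>> (Roughly what I mean by a single array pass, as a sketch with 
>> illustrative names rather than the committed code: walk the candidate 
>> vector once and keep the unfilled square with the fewest possibilities, 
>> instead of building a Dict of counts first.)
>>
>>     # one pass to find the unfilled square with the fewest candidates
>>     function fewest_candidates(vals)
>>         best_s, best_n = 0, 10
>>         for s in 1:81
>>             n = length(vals[s])
>>             if 1 < n < best_n
>>                 best_s, best_n = s, n
>>             end
>>         end
>>         return best_s   # 0 means every square is already decided
>>     end
>>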
>> Changing to BitArrays would be the real fix, and I got halfway, but 
>> converting the code was so painful that I might just write my own solver 
>> based on the same brute-force idea.
>>
>> On Tuesday, July 1, 2014 7:58:44 PM UTC-4, andy hayden wrote:
>>>
>>> Ha, I had exactly the same issue (I pushed a perf update with a 
>>> commented-out implementation); I assumed it was something (very) wrong 
>>> in my understanding of control flow!
>>>
>>> I don't see how it would rely on anything, ordering maybe (?)... 
>>> perplexing.
>>>
>>> On Tuesday, 1 July 2014 16:45:04 UTC-7, Iain Dunning wrote:
>>>>
>>>> Yeah, we changed the example, so it's best to take the one from the 
>>>> release version...
>>>>
>>>> I removed the dictionary from search(), but it's now no longer solving 
>>>> all the problems(!) - does the algorithm rely somehow on the way the 
>>>> dictionary is constructed?
>>>>
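>>>> For reference, this is the shape of search() in Norvig's outline over 
>>>> a Vector rather than a Dict (just a sketch; the thing that silently 
>>>> breaks if you're not careful when dropping the Dict is that each guess 
>>>> needs its own copy of the candidates):
>>>>
>>>>     # fewest_candidates is the single-pass pick from earlier; assign!
>>>>     # eliminates every other digit and propagates, as in Norvig's code
>>>>     function search(vals)
>>>>         s = fewest_candidates(vals)
>>>>         s == 0 && return vals             # nothing left to guess: solved
>>>>         for d in vals[s]
>>>>             newvals = deepcopy(vals)      # each branch gets its own copy
>>>>             if assign!(newvals, s, d)
>>>>                 result = search(newvals)
>>>>                 result != nothing && return result
>>>>             end
>>>>         end
>>>>         return nothing                    # dead end, backtrack
>>>>     end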
>>>>
>>>> On Tuesday, July 1, 2014 6:59:02 PM UTC-4, andy hayden wrote:
>>>>>
>>>>> I was using Cbc.
>>>>>
>>>>> SolveModel is a copy-and-paste job from JuMP (from the last release 
>>>>> rather than master), so it may not work with JuMP from master - I 
>>>>> couldn't get the version from master working since it was incompatible 
>>>>> with the JuMP release I had! It'd be great to be able to just include 
>>>>> the file, but I couldn't get that working, so I pasted it in (I should 
>>>>> probably clean up Bench as I made quite a mess; apologies about 
>>>>> that)... so you may need to update SolveModel from JuMP master / your 
>>>>> version of JuMP to get Bench working.
>>>>>
>>>>> It's amazing how far some small tweaks like this go; there are a few 
>>>>> other bits that are obvious even to me (but that I just couldn't get 
>>>>> working).
>>>>>
>>>>>
>>>>> On 1 July 2014 15:47, Iain Dunning <[email protected]> wrote:
>>>>>
>>>>>> JuMP won't be getting any faster; it's entirely limited by the speed 
>>>>>> of the MIP solver. Which one did you use?
>>>>>>
>>>>>>
>>>>>> On Tuesday, July 1, 2014 6:47:04 PM UTC-4, Iain Dunning wrote:
>>>>>>>
>>>>>>> I was unable to run Bench.jl (ERROR: varzm! not defined), but on my 
>>>>>>> computer, just using runtests.jl and a fixed seed, the total time 
>>>>>>> for 100 random puzzles:
>>>>>>>
>>>>>>> * Initial
>>>>>>> elapsed time: 1.641434988 seconds (282491732 bytes allocated, 5.99% 
>>>>>>> gc time)
>>>>>>>
>>>>>>> * Change globals to const
>>>>>>> elapsed time: 1.563094028 seconds (261818132 bytes allocated, 6.61% 
>>>>>>> gc time)
>>>>>>>
>>>>>>> * Changing from a Dict{Int64, *} for the Grid types to just a 
>>>>>>> Vector{*}, as well as those other globals
>>>>>>> elapsed time: 1.373703078 seconds (191864592 bytes allocated, 4.91% 
>>>>>>> gc time)
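>>>>>>>
>>>>>>> (The const change is this sort of thing, with Norvig-style names and 
>>>>>>> the details only illustrative: a non-const global has no fixed type, 
>>>>>>> so every function that reads it falls back to slow, boxed code.)
>>>>>>>
>>>>>>>     # before: rows = "ABCDEFGHI" etc. as plain, untyped globals
>>>>>>>     const rows    = "ABCDEFGHI"
>>>>>>>     const cols    = "123456789"
>>>>>>>     const squares = [string(r, c) for r in rows, c in cols][:]
>>>>>>>
>>>>>>>     # and the Dict{Int64, T} keyed by square index becomes a plain
>>>>>>>     # Vector{T}, so lookups are vals[s] instead of a hash per access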
>>>>>>>
>>>>>>> On Tuesday, July 1, 2014 6:27:15 PM UTC-4, andy hayden wrote:
>>>>>>>>
>>>>>>>> Bench.jl has a bench_compare method which returns a DataFrame of 
>>>>>>>> times (I then divide the medians of the Python and Julia columns); 
>>>>>>>> I'll add this output to the Bench script as it's useful to see (it 
>>>>>>>> would be nice to add more stats, as it's just a DataFrame of solve 
>>>>>>>> times in seconds for all the puzzles). By default it runs a hundred 
>>>>>>>> random sudokus through Julia, Python, and JuMP (the same ones on 
>>>>>>>> each)...
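>>>>>>>>
>>>>>>>> The comparison itself is nothing fancier than this (the call and 
>>>>>>>> column names are illustrative; bench_compare is the real method):
>>>>>>>>
>>>>>>>>     df = bench_compare()   # 100 random puzzles by default, seconds
>>>>>>>>     println(median(df[:julia]) / median(df[:python]))   # >1: slower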
>>>>>>>>
>>>>>>>> Thanks Steven: making those const makes a huge difference; Julia 
>>>>>>>> wins now (from 20% slower to 10% faster for me with just that 
>>>>>>>> change).
>>>>>>>> I'll have a play and see how your other suggestions pan out.
>>>>>>>>
>>>>>>>> I was also very impressed with JuMP here... and it may be that the 
>>>>>>>> latest is even faster (I'm using the version from the last release 
>>>>>>>> rather than master, and it has changed since then).
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tuesday, 1 July 2014 15:11:27 UTC-7, Iain Dunning wrote:
>>>>>>>>>
>>>>>>>>> I'm working on improving this, but I'm not sure how you are 
>>>>>>>>> measuring that 20% slower - can you be more specific?
>>>>>>>>>
>>>>>>>>> On Tuesday, July 1, 2014 1:37:00 PM UTC-4, andy hayden wrote:
>>>>>>>>>>
>>>>>>>>>> I recently ported Norvig's Solve Every Sudoku Puzzle 
>>>>>>>>>> <http://norvig.com/sudoku.html> to Julia: https://github.com/hayd
>>>>>>>>>> /Sudoku.jl
>>>>>>>>>>
>>>>>>>>>> Some simple benchmarks suggest my Julia implementation is around 
>>>>>>>>>> 20% slower* than the Python version, and 3 times faster than the 
>>>>>>>>>> JuMP implementation (vendored from the latest release), on the 
>>>>>>>>>> random puzzles. I tried to include the solver from 
>>>>>>>>>> attractivechaos/plb 
>>>>>>>>>> <https://github.com/attractivechaos/plb/tree/master/sudoku> but 
>>>>>>>>>> couldn't get it working for comparison...
>>>>>>>>>>
>>>>>>>>>> I'm new to Julia, so I'd love to hear people's thoughts / any 
>>>>>>>>>> performance tips!
>>>>>>>>>> I've not delved too deeply into the profile, but @time suggests 
>>>>>>>>>> about 10% of the time is GC.
>>>>>>>>>>
>>>>>>>>>> *I'm sure I've lost some performance in translation, which could 
>>>>>>>>>> easily be sped up...
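>>>>>>>>>>
>>>>>>>>>> (By "simple benchmarks" I mean something along these lines, a 
>>>>>>>>>> sketch rather than the exact script, with solve()/random_puzzle() 
>>>>>>>>>> named after Norvig's functions: fix the seed and @time the same 
>>>>>>>>>> batch of random puzzles in each implementation.)
>>>>>>>>>>
>>>>>>>>>>     srand(42)                    # fixed seed, so runs compare
>>>>>>>>>>     @time for i in 1:100
>>>>>>>>>>         solve(random_puzzle())   # one of 100 random puzzles
>>>>>>>>>>     end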
>>>>>>>>>>
>>>>>>>>>> Best,
>>>>>>>>>> Andy
>>>>>>>>>>
>>>>>>>>>
>>>>>
>
