Bench.jl has a bench_compare method which returns a DataFrame of times (I then divide the median of the Python column by the median of the Julia column). I'll add this output to the Bench script since it's useful to see (it would be nice to add more stats, as at the moment it's just a DataFrame of every solved puzzle's time in seconds). By default it runs a hundred random sudokus on Julia, Python, and JuMP (the same puzzles for each)...
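Roughly what I mean by dividing the medians, in case it's useful (a sketch only: the column names and the extra summary stats here are illustrative, not necessarily what Bench.jl actually produces):

```julia
using DataFrames, Statistics

# Stand-in for the DataFrame bench_compare returns: one row per puzzle, one
# column of solve times (in seconds) per solver. Column names are made up.
times = DataFrame(julia  = rand(100) .* 0.05,
                  python = rand(100) .* 0.06,
                  jump   = rand(100) .* 0.15)

# Headline number: ratio of median solve times (> 1 means Python is slower).
println("Python/Julia median ratio: ",
        round(median(times.python) / median(times.julia), digits = 2))

# A bit more detail per solver than the raw DataFrame shows at a glance.
for col in names(times)
    v = times[!, col]
    println(rpad(col, 8),
            " median=", round(median(v), digits = 4), "s",
            "  mean=",  round(mean(v), digits = 4), "s",
            "  max=",   round(maximum(v), digits = 4), "s")
end
```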
Thanks Steven: making those globals const makes a huge difference; Julia now wins (from 20% slower to 10% faster for me with just that change; a rough sketch of the kind of change is below the quoted thread). I will have a play and see how your other suggestions pan out. I was also very impressed with JuMP here... and it may be that the latest is even faster (I'm using the version from the last release rather than master, and it has changed since then).

On Tuesday, 1 July 2014 15:11:27 UTC-7, Iain Dunning wrote:
>
> I'm working on improving this, but I'm not sure how you are measuring that
> 20% slower - can you be more specific?
>
> On Tuesday, July 1, 2014 1:37:00 PM UTC-4, andy hayden wrote:
>>
>> I recently ported Norvig's Solve Every Sudoku Puzzle
>> <http://norvig.com/sudoku.html> to Julia:
>> https://github.com/hayd/Sudoku.jl
>>
>> Some simple benchmarks suggest my Julia implementation solves around 20%
>> slower* than the Python version, and 3 times faster than the JuMP
>> implementation (vendorized from the latest release), against the random
>> puzzles. I tried to include the solver from attractivechaos/plb
>> <https://github.com/attractivechaos/plb/tree/master/sudoku> but couldn't
>> get it working for comparison...
>>
>> I'm new to Julia so would love to hear people's thoughts / any
>> performance tips!
>> I've not delved too deeply into the Profile, but @time suggests 10% of
>> the time is GC.
>>
>> *I'm sure I've lost some performance in translation which could be
>> easily sped up...*
>>
>> Best,
>> Andy
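For anyone following along, this is roughly the kind of change that made the difference (the names below are made up for illustration, not the actual Sudoku.jl globals): a non-const global can be rebound to a value of any type, so functions that read it can't be specialized, whereas a const global has a known concrete type.

```julia
# Illustrative only: not the real Sudoku.jl globals.
units_nonconst = [rand(1:9, 9) for _ in 1:27]   # type-unstable from the compiler's point of view
const UNITS    = [rand(1:9, 9) for _ in 1:27]   # type is fixed: Vector{Vector{Int}}

total_nonconst() = sum(sum(u) for u in units_nonconst)  # dynamic dispatch on every access
total_const()    = sum(sum(u) for u in UNITS)           # compiles to tight, specialized code

# In a hot solver loop the difference adds up, e.g.
#   @time for _ in 1:10^5; total_nonconst(); end
#   @time for _ in 1:10^5; total_const(); end
# typically shows the const version allocating far less and running much faster.
```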
