They are run daily and published to

From: Antoine Pitrou <>
Sent: Friday, April 13, 2018 4:28:11 AM
Subject: Re: Continuous benchmarking setup

Nice! Are the benchmark results published somewhere?

Le 13/04/2018 à 02:50, Tom Augspurger a écrit :
> is the setup for the projects currently running. Adding arrow to 
> might work. I'll have to redeploy with the update.
> ________________________________
> From: Wes McKinney <>
> Sent: Thursday, April 12, 2018 7:24:20 PM
> To:
> Subject: Re: Continuous benchmarking setup
> Hi Antoine,
> I have a bare metal machine at home (affectionately known as the
> "pandabox") that's available via SSH that we've been using for
> continuous benchmarking for other projects. Arrow is welcome to use
> it. I can give you access to the machine if you would like. Hopefully,
> we can suitably document the process of setting up a continuous
> benchmarking machine, so that migrating to a new machine later is not
> too much of a hardship.
> Thanks
> Wes
> On Wed, Apr 11, 2018 at 9:40 AM, Antoine Pitrou <> wrote:
>> Hello
>> With the following changes, it seems we might reach the point where
>> we're able to run the Python-based benchmark suite across multiple
>> commits (at least those no older than these changes):
>> To make this truly useful, we would need a dedicated host.  Ideally a
>> (Linux) OS running on bare metal, with SMT/HyperThreading disabled.
>> If running virtualized, the VM should have dedicated physical CPU cores.
>> That machine would run the benchmarks on a regular basis (perhaps once
>> per night) and publish the results in static HTML form somewhere.
>> (Note: a nice-to-have in the future would be access to NVIDIA
>> hardware, but right now there are no CUDA benchmarks in the Python
>> suite.)
>> What should be the procedure here?
>> Regards
>> Antoine.
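
Antoine's requirement of disabling SMT/HyperThreading can be met at runtime on modern Linux kernels without touching the BIOS; a minimal sketch under that assumption (the core numbers and the benchmark entry point are illustrative, not from this thread):

```shell
# Disable SMT at runtime (Linux 4.19+ exposes this sysfs knob; needs root).
echo off | sudo tee /sys/devices/system/cpu/smt/control

# Confirm the new state; "off" (or "forceoff") means sibling threads are parked.
cat /sys/devices/system/cpu/smt/control

# Pin the benchmark run to dedicated physical cores (2 and 3 here are
# illustrative) to reduce scheduler noise.
taskset -c 2,3 python run_benchmarks.py   # hypothetical entry point
```

On older kernels without the sysfs control file, the same effect requires disabling HyperThreading in firmware or offlining the sibling CPUs listed in `/sys/devices/system/cpu/cpu*/topology/thread_siblings_list`.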
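
The workflow Antoine describes (run the suite on a regular schedule, publish static HTML) maps closely onto what a tool like asv (airspeed velocity) provides out of the box; a sketch of a system cron entry under that assumption, with the user name, repository path, and web root all being illustrative:

```shell
# /etc/cron.d/arrow-benchmarks -- hypothetical nightly job.
# Assumes the Python benchmarks are driven by asv; "asv run NEW" benchmarks
# commits not yet covered, and "asv publish" renders static HTML results.
# m  h  dom mon dow  user   command
0    2  *   *   *    bench  cd /home/bench/arrow && asv run NEW && asv publish && rsync -a .asv/html/ /var/www/benchmarks/
```

The generated `.asv/html/` directory is plain static content, so serving it only needs a web server pointed at the rsync target.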
