alamb opened a new pull request, #6134:
URL: https://github.com/apache/arrow-datafusion/pull/6134

   # Which issue does this PR close?
   
   Part of https://github.com/apache/arrow-datafusion/issues/6127
   
   # Rationale for this change
   I want to be able to easily understand what is being compared and reported. 
Currently the reporting from `compare.py` is somewhat opaque: 
   1. It averages benchmark run timings (which increases the variance)
   2. The column headers are nonsensical
   
   # What changes are included in this PR?
   1. Use the `min` of the iteration times to report the benchmark results -- I think this minimizes run-to-run variance and is the fairest way to assess performance (see the sketch below)
   2. Use the filename as the column header
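   
   For illustration, here is a minimal sketch of the aggregation change, assuming each query's result is a list of per-iteration timings; the variable names are hypothetical and not the actual `compare.py` schema:
   
   ```python
   import statistics
   
   # Hypothetical per-iteration timings (ms) for one query in one benchmark run.
   iteration_ms = [512.3, 498.7, 530.1, 495.2, 501.8]
   
   # Old behaviour (illustrative): the mean folds run-to-run noise into the
   # reported number, inflating the variance between comparisons.
   mean_ms = statistics.mean(iteration_ms)
   
   # New behaviour (illustrative): the minimum is the least noisy estimate of
   # how fast the query can run, so it is what gets compared and reported.
   min_ms = min(iteration_ms)
   
   print(f"mean: {mean_ms:.1f} ms, min: {min_ms:.1f} ms")
   ```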
   
   # Are these changes tested?
   No, it is a tool-only change
   
   # Are there any user-facing changes?
   
   <!--
   If there are user-facing changes then we may require documentation to be 
updated before approving the PR.
   -->
   
   <!--
   If there are any breaking changes to public APIs, please add the `api 
change` label.
   -->

