Joshua Gatcomb [EMAIL PROTECTED] wrote:
If you would like to see any of these ideas
implemented, or you have some of your own - please
respond to this on the list.
I've another one. Parrot has some internal settings and tweakable magic
constants, mostly inside the garbage collector. It
Matt Diephouse [EMAIL PROTECTED] wrote:
Joshua Gatcomb and I have been working for a little under a week to set up
an automated build system for parrot that tracks performance (with
help from Dan's box). We have collected benchmark data for regular and
optimized builds with and without JIT from
At 11:08 PM -0800 11/2/04, Jeff Clites wrote:
On Nov 2, 2004, at 7:10 PM, Matt Diephouse wrote:
Joshua Gatcomb and I have been working for a little under a week to set up
an automated build system for parrot that tracks performance (with
help from Dan's box). We have collected benchmark data for
At 11:08 PM -0800 11/2/04, Jeff Clites wrote:
On Nov 2, 2004, at 7:10 PM, Matt Diephouse wrote:
Joshua Gatcomb and I have been working for a little under a week to set up
an automated build system for parrot that tracks performance (with
help from Dan's box). We have collected benchmark data
On 04/11/02 22:10 -0500, Matt Diephouse wrote:
We have collected benchmark data for regular and
optimized builds with and without JIT from June 1st through October.
What about comparing against perl*, python and ruby?
Nice work,
Jerome
--
[EMAIL PROTECTED]
On Wed, 3 Nov 2004 18:30:58 +0100, Jerome Quelin [EMAIL PROTECTED] wrote:
What about comparing against perl*, python and ruby?
What about it? Many of the benchmarks are parrot only: the gc tests,
for example. The others should remain mostly static, unless we do
daily checkouts, which is a lot of
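Jerome's cross-language suggestion could be driven by a small harness along these lines (a hedged sketch only; the interpreter names and benchmark path in the commented usage are illustrative, not part of the thread):

```python
# Illustrative sketch: time the same benchmark script under several
# interpreters, as suggested for comparing parrot against perl,
# python, and ruby. One run per interpreter, wall-clock time.
import subprocess
import time

def time_interpreter(interp, script):
    """Run `interp script` once and return elapsed wall-clock seconds."""
    start = time.perf_counter()
    subprocess.run([interp, script], check=True,
                   stdout=subprocess.DEVNULL)
    return time.perf_counter() - start

# Hypothetical usage (interpreter paths and script are assumptions):
# for interp in ["perl", "python", "ruby", "./parrot"]:
#     print(interp, time_interpreter(interp, "examples/benchmarks/fib.pir"))
```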
All:
Matt Diephouse and I spent the majority of our time
coming up with a flexible design and gathering
historical statistics. We didn't spend a lot of time
on how to present the data since everybody has their
own opinion (including us).
What we would like to do is determine if what we have
done
All~
I think it would be really cool if commits that had a significant
increase or decrease in speed would be flagged. Possibly just a
section of the page could be a table with commit dates and the percent
effect they had. This table would not contain all commits dates, but
only the most
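The flagging idea above could be sketched as a simple pass over consecutive benchmark runs, keeping only dates where timing moved past a threshold (a minimal sketch; the 5% threshold and the data are made up for illustration):

```python
# Illustrative sketch: flag benchmark runs whose timing changed
# significantly relative to the previous run, yielding rows for a
# "commit dates and percent effect" table.

def flag_significant(runs, threshold=0.05):
    """runs: list of (date, seconds) in chronological order.
    Returns (date, percent_change) rows where the relative change
    from the previous run is at least `threshold` in magnitude."""
    flagged = []
    for (_, prev_t), (date, t) in zip(runs, runs[1:]):
        change = (t - prev_t) / prev_t
        if abs(change) >= threshold:
            flagged.append((date, round(change * 100, 1)))
    return flagged

# Hypothetical data, not real parrot timings:
runs = [
    ("2004-06-01", 10.0),
    ("2004-06-02", 10.1),   # +1%: below threshold, not flagged
    ("2004-06-03", 8.5),    # about -16%: flagged (speedup)
    ("2004-06-04", 9.4),    # about +11%: flagged (slowdown)
]
print(flag_significant(runs))
# → [('2004-06-03', -15.8), ('2004-06-04', 10.6)]
```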
On Wed, 3 Nov 2004 16:04:38 -0500, Matt Fowles [EMAIL PROTECTED] wrote:
I think it would be really cool if commits that had a significant
increase or decrease in speed would be flagged. Possibly just a
section of the page could be a table with commit dates and the percent
effect they had.
At 5:25 PM -0500 11/3/04, Matt Diephouse wrote:
On Wed, 3 Nov 2004 16:04:38 -0500, Matt Fowles [EMAIL PROTECTED] wrote:
I think it would be really cool if commits that had a significant
increase or decrease in speed would be flagged. Possibly just a
section of the page could be a table with
On Wed, 3 Nov 2004 12:52:26 -0800 (PST), Joshua Gatcomb
[EMAIL PROTECTED] wrote:
[snip]
What we would like to do is determine if what we have
done so far is sufficient or, if not, what specifically
people would like to see. Some of our unimplemented
ideas so far are:
1. Include the computed
Joshua Gatcomb and I have been working for a little under a week to set up
an automated build system for parrot that tracks performance (with
help from Dan's box). We have collected benchmark data for regular and
optimized builds with and without JIT from June 1st through October.
With some help from
On Nov 2, 2004, at 7:10 PM, Matt Diephouse wrote:
Joshua Gatcomb and I have been working for a little under a week to set up
an automated build system for parrot that tracks performance (with
help from Dan's box). We have collected benchmark data for regular and
optimized builds with and without JIT