Hi,

This is a problem I saw early on while developing the no-pred-ty
branch, but it went away about a week ago, and I have not been able to
reproduce it since. (I'm on OS X x86_64 as well, FWIW.)

Nonetheless, given the nondeterministic nature of the perf tests, it
may still be caused by my patch; I did have to bump one test's memory
limit. If this test fails for anyone else, we should probably bump the
T3064 limit as well.

I did try to investigate GHC memory usage by following the
instructions at
http://hackage.haskell.org/trac/ghc/wiki/Debugging/ProfilingGhc, but
they didn't actually build a -prof GHC for me. Does anyone know if
those instructions are meant to work, or whether they are now out of date?

Max

On 10 September 2011 07:50, Manuel M T Chakravarty <[email protected]> wrote:
> I am seeing the following performance regression on OS X x86_64:
>
>  cd ./perf/compiler && $MAKE -s --no-print-directory T4007    </dev/null 
> >T4007.run.stdout 2>T4007.run.stderr
>  max_bytes_used 5428072 is more than maximum allowed 5000000
>  *** unexpected failure for T3064(normal)
>
> Is this specific to OS X x86_64?
>
> Manuel
> _______________________________________________
> Cvs-ghc mailing list
> [email protected]
> http://www.haskell.org/mailman/listinfo/cvs-ghc
>
