Hello, compiling 2.5.0 (on linux 4.0.1, 64 bit) with ghc 7.10.1,
/usr/bin/time -f %M gives values between 2.3 GB and 2.6 GB across multiple runs.
(2.4.1 was requesting more than 3 GB; I cannot say how much precisely.)
Thanks Sven
2015-05-03 16:45 GMT+02:00 George Colpitts george.colpi...@gmail.com:
2015-05-02 12:01 GMT+02:00 Paolino paolo.verone...@gmail.com:
Hello, I succeeded in compiling
https://hackage.haskell.org/package/OpenGLRaw-2.4.1.0/docs/src/Graphics-Rendering-OpenGL-Raw-Functions.html
on a 32 bit machine with 2GB of memory with ghc 7.10.1. O_O
To alleviate the pain a bit, I've
I think this helps quite a bit. Although it still peaks briefly at over 3
GB of memory usage on my Mac according to Activity Monitor, it seems to spend
much of its time using 400-800 MB. I can't be sure, as I never
tried to compile this before. I'm compiling by simply doing cabal install
HTH, paolino
2015-05-01 21:32 GMT+02:00 Michal Terepeta michal.terep...@gmail.com:
On Fri, May
From: Glasgow-haskell-users [mailto:glasgow-haskell-users-boun...@haskell.org] On Behalf Of Paolino
Sent: 01 May 2015 15:30
To: George Colpitts
Cc: glasgow-haskell-users@haskell.org
Subject: Re: Increased memory usage with GHC 7.10.1
here is another file, which is small, which cannot be compiled within 4GB
of memory:
https://raw.githubusercontent.com/benl23x5/gloss/master/gloss-examples/raster/Fluid/src-repa/Stage/Linear.hs
Thanks
Simon
From: Glasgow-haskell-users [mailto:glasgow-haskell-users-boun...@haskell.org] On Behalf Of Paolino
Sent: 01 May 2015 15:30
To: George Colpitts
Cc: glasgow-haskell-users@haskell.org
Subject: Re: Increased memory usage with GHC 7.10.1
here is another file, which is small, which cannot be compiled within 4GB
of memory:
https://raw.githubusercontent.com/benl23x5/gloss/master/gloss-examples/raster/Fluid/src-repa/Stage/Linear.hs
I'd just like to add that this problem is a nasty one if one doesn't set
the max heap: a remote machine
On Fri, May 1, 2015 at 5:05 PM Simon Peyton Jones simo...@microsoft.com
wrote:
It would be amazingly helpful if someone (anyone) could diagnose a bit.
It may be a bug in GHC but it may also be a bug in the pragmas in a
library. If someone can produce evidence for the former, I’ll gladly
Hello, I'm using ghc 7.10.1 to compile OpenGLRaw, which is now impossible
with -O1 and -O2 due to a ghc out-of-memory error on a 4GB linux host.
The file making memory explode is Graphics.Rendering.OpenGL.Raw.Functions
(both for 7.10 and prior versions) and showing the output
would be helpful.
Simon
From: Glasgow-haskell-users [mailto:glasgow-haskell-users-boun...@haskell.org] On Behalf Of Paolino
Sent: 01 May 2015 09:42
To: glasgow-haskell-users@haskell.org
Subject: Re: Increased memory usage with GHC 7.10.1
Hello, I'm using ghc 7.10.1 to compile OpenGLRaw, which is now impossible
with -O1 and -O2 due to a ghc out-of-memory error on a 4GB linux host.
The file making memory explode is
Graphics.Rendering.OpenGL.Raw.Functions
https://hackage.haskell.org
Should we recommend that all library developers compile their libraries
with a max heap of 4G (to pick an arbitrary number) so that we can catch
some of these issues earlier?
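For reference, a sketch of how such a cap could be applied when building. The 4G figure is the arbitrary number suggested above, and the file name is the one from this thread; the commands assume a standard GHC whose RTS accepts +RTS/-RTS on the compiler command line:

```shell
# Cap GHC's own heap at 4 GB: instead of thrashing the machine or
# triggering the OOM killer, the compile aborts early with a
# heap-exhausted error from the RTS.
ghc -O2 Functions.hs +RTS -M4G -RTS

# The same cap when building through cabal:
cabal install OpenGLRaw --ghc-options="+RTS -M4G -RTS"
```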
On Fri, May 1, 2015 at 5:42 AM, Paolino paolo.verone...@gmail.com wrote:
Hello, I'm using ghc 7.10.1 to compile
Hi Richard,
Thanks for updating the ticket! What I did was: build GHC HEAD/7.8.4 with build
flavour 'prof', then get the haskell-src-exts sources, install the
dependencies, and finally add +RTS -p -RTS to the cabal file and compile it;
the resulting ghc.prof contains the profiling information.
For
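The workflow just described might be sketched as commands like these. The paths and the build-flavour line are assumptions based on the description; passing --ghc-options here is equivalent to editing the cabal file as the message describes:

```shell
# 1. Build GHC itself with profiling enabled (in mk/build.mk set:
#      BuildFlavour = prof).
# 2. Fetch the package and install its dependencies with that compiler:
cabal get haskell-src-exts
cd haskell-src-exts-*
cabal install --only-dependencies -w /path/to/profiled/ghc
# 3. Compile with RTS profiling flags passed through to ghc itself;
#    the build leaves a ghc.prof with the compiler's own
#    cost-centre profile:
cabal configure -w /path/to/profiled/ghc --ghc-options="+RTS -p -RTS"
cabal build
```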
Hi all,
I've been playing around with profiling GHC recently, so I thought I'd
chime in with a pointer that might save people searching for the right docs
- you could configure a cabal sandbox to work with multiple versions of ghc,
which I've found useful:
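The message is truncated here; presumably the tip is the sandbox together with the -w/--with-compiler flag. A sketch under that assumption (compiler paths are illustrative):

```shell
# A sandbox keeps per-project package databases, and -w selects which
# compiler to build against, so one checkout can be tested against
# several GHC versions side by side:
cabal sandbox init
cabal install --only-dependencies -w /opt/ghc-7.8.4/bin/ghc
cabal configure -w /opt/ghc-7.8.4/bin/ghc
cabal build
# Re-run configure/build with -w /opt/ghc-7.10.1/bin/ghc to compare.
```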
I've pasted Michal's numbers in #9630, which seems like a good place to track
this. Michal, would you mind fleshing out a bit precisely what you did to get
those numbers? That would be helpful (though you've already been very helpful
indeed in getting the data together)!
Thanks,
Richard
On
Hi,
On Tuesday, 14.04.2015 at 21:54 +0200, Michal Terepeta wrote:
On Mon, Apr 13, 2015 at 10:34 PM, Christiaan Baaij
christiaan.ba...@gmail.com wrote:
Actually, I meant only with -fno-specialise.
Done. It helps quite a bit, but CallArity is still pretty expensive.
I’m on that, and I
Tue Apr 14 21:46 2015 Time and Allocation Profiling Report (Final)
ghc +RTS -p -RTS [...]
On Mon, Apr 13, 2015 at 12:20 PM, Christiaan Baaij
christiaan.ba...@gmail.com wrote:
Hi,
I wonder if this might be in any way related to the HUGE amount of new
terms/types/coercions created by the specialiser as documented in:
https://ghc.haskell.org/trac/ghc/ticket/9630#comment:10
I
Actually, I meant only with -fno-specialise.
On 13 April 2015 at 21:09, Michal Terepeta michal.terep...@gmail.com
wrote:
I don’t have a profiled version of GHC,
So I've tried to compile Idris/Agda with prof compilers but this
didn't quite work out due to deps not compiling (apparently it's not
possible to use template haskell with a profiled compiler).
Out of curiosity I had a look at compiling haskell-src-exts since that
takes quite a while. I've used
On a machine with an SSD instead of a hard disk, swapping greatly reduces
the lifespan of the storage device.
On Fri, Apr 3, 2015 at 10:14 AM, Bertram Felgenhauer
bertram.felgenha...@googlemail.com wrote:
George Colpitts wrote:
I'm curious why the amount of RAM is relevant, as all of our OSes have virtual
memory, so it is only the size of the heap and the amount of swap that
should be relevant for an Out Of Memory error, right?
The computer may not be your own. VPSs are essentially priced based on
I will. But I was curious whether this is only me or whether anyone else is seeing
similar behaviour. And
what about a performance comparison between 7.8.4 and 7.10.1? Do we have any
numbers?
Janek
On Thursday, 2 April 2015, Richard Eisenberg wrote:
Post a bug report! :)
On Apr 2, 2015, at
Post a bug report! :)
On Apr 2, 2015, at 8:19 AM, Jan Stolarek jan.stola...@p.lodz.pl wrote:
An update from my second machine, this time with 4GB of RAM. Compiling Agda
ran out of memory
(again the Agda.TypeChecking.Serialise module) and I had to kill the build. But
once I restarted the
build, the module was compiled successfully in a matter of minutes, using
around 50% of memory.
I'm curious why the amount of RAM is relevant, as all of our OSes have virtual
memory, so it is only the size of the heap and the amount of swap that
should be relevant for an Out Of Memory error, right? How big is your heap?
The amount of RAM should only affect speed (i.e. if there is excessive paging)
On a 64-bit machine with Ubuntu 12.04 and 4 GB of RAM, I can compile
Agda using GHC 7.10.1 without problems. Agda's compilation time
increased ~26% with respect to GHC 7.8.4.
On 2 April 2015 at 07:19, Jan Stolarek jan.stola...@p.lodz.pl wrote:
An update from my second machine, this time with
I have run out of memory before when compiling on small machines with GHC 7.8,
where "small" means 4GB of RAM and no swap, say a small dual-core Atom, an almost
embedded design. That forced me to compile on a laptop and mount the file systems
to run it. But since Ubuntu runs well on a NUC, it is nice
You aren't the only one. The vector test suite also has these kinds of
issues. In its case, it's hard for me to tell how big the code is, because
template haskell is being used to generate it. However, I don't think the
template haskell is what's responsible for the additional resource usage,
because the test