Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-04 Thread Felix
From: Peter Bex peter@xs4all.nl Subject: Re: [Chicken-users] Segfault with large data-structures (bug) Date: Mon, 4 Feb 2013 00:16:47 +0100 On Mon, Feb 04, 2013 at 12:10:16AM +0100, Felix wrote: But why not just use ulimit? It can be set per process, so I don't see the need to have
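
For readers unfamiliar with it, ulimit is a shell builtin, so a per-process cap needs no support from the CHICKEN runtime at all. A minimal sketch, assuming bash (ulimit -v takes kilobytes; the binary name is borrowed from Jim's list-partials run elsewhere in the thread):

  $ (ulimit -v 2097152; ./list-partials)

The limit applies only to that subshell and its children, so the parent shell and other processes are unaffected.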

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-04 Thread Jim Ursetto
On Feb 4, 2013, at 2:28 PM, Felix wrote: Perhaps, but I really don't see a problem of allowing a limit on heap allocation in the runtime system. I think a segfault is an appropriate response to OOM, but I wonder if it's possible to panic() instead if the heap size can't be increased as

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-03 Thread Peter Bex
On Sat, Feb 02, 2013 at 08:06:41PM -0600, Jim Ursetto wrote: (bug found -- tl;dr see end of message) Figured it out: you're exceeding the default maximal heap size, which is 2GB. Speaking of which, I wondered about this before: why do we even _have_ a maximum heap size? This is arbitrary and
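
For anyone who just wants to get past the 2GB default rather than debate it: the maximum heap size is a runtime parameter, so it can be raised per invocation without recompiling, as Jim does elsewhere in the thread with -:hm16G. A sketch, assuming the same compiled binary:

  $ ./list-partials -:d -:hm8G

Here -:hm sets the maximum heap size and -:d prints the [debug] messages that show each heap resize. This sidesteps the crash but not the question of why a fixed ceiling exists in the first place.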

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-03 Thread Arthur Maciel
Jim, that's great! Thank you so much! I've read that Facebook has reached billions of users. As I'm testing graph implementations to create a graph database, do you believe this code could handle billions of nodes, or would I need a lot more RAM to run it? I'm not experienced in programming, so I
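
A rough back-of-the-envelope answer, not from the thread: on a 64-bit CHICKEN build a cons pair occupies three machine words (24 bytes), so keeping a list of 1,000 edges per node costs about 24 KB per node before keys, edge payloads and hash-table buckets are counted. At that rate a billion nodes comes to roughly 24 TB, so at that scale the limiting factor is RAM (or the need for a disk-backed store), not the code. A sketch of the arithmetic in Scheme, with the per-pair size as the stated assumption:

  ;; Rough estimate only: 24 bytes per pair assumes a 64-bit build
  ;; (one header word plus car and cdr); real usage is higher.
  (define bytes-per-pair 24)
  (define edges-per-node 1000)
  (define (spine-bytes nodes) (* nodes edges-per-node bytes-per-pair))
  (print (spine-bytes 1000000000))   ; => 24000000000000, roughly 24 TB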

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-03 Thread Arthur Maciel
Oh, and just to add info from another language: #include <iostream> #include <boost/graph/adjacency_list.hpp> using namespace std; using namespace boost; typedef adjacency_list<vecS, vecS, directedS> Graph; int main() { const int VERTEXES = 250000; const int EDGES = 1000; Graph g(VERTEXES);

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-03 Thread Christian Kellermann
* Arthur Maciel arthurmac...@gmail.com [130203 14:11]: Oh, and just to add info from another language: #include <iostream> #include <boost/graph/adjacency_list.hpp> using namespace std; using namespace boost; typedef adjacency_list<vecS, vecS, directedS> Graph; int main() { const int

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-03 Thread John Cowan
Peter Bex scripsit: Speaking of which, I wondered about this before: why do we even _have_ a maximum heap size? This is arbitrary and awkward. For instance, on my trusty old G4 iBook, 2G was way more than I actually had (512 MB), while at work and on my new laptop it's a relatively small

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-03 Thread Peter Bex
On Sun, Feb 03, 2013 at 11:15:12AM -0500, John Cowan wrote: Peter Bex scripsit: Speaking of which, I wondered about this before: why do we even _have_ a maximum heap size? This is arbitrary and awkward. For instance, on my trusty old G4 iBook, 2G was way more than I actually had (512

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-03 Thread John Cowan
Blunderingly I wrote: On a 32-bit system, you can't by any means get more than 4G of memory for any single process, short of heroic measures in the kernel that allow you to assign the same virtual addresses to different physical addresses at the same time. I meant, of course, at different

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-03 Thread Felix
From: Peter Bex peter@xs4all.nl Subject: Re: [Chicken-users] Segfault with large data-structures (bug) Date: Sun, 3 Feb 2013 12:53:16 +0100 On Sat, Feb 02, 2013 at 08:06:41PM -0600, Jim Ursetto wrote: (bug found -- tl;dr see end of message) Figured it out: you're exceeding the default

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-03 Thread Peter Bex
On Sun, Feb 03, 2013 at 11:37:42PM +0100, Felix wrote: The intention is to provide some sort of soft ulimit at the application level, in case you want to make sure a certain maximum amount of memory is not exceeded. Or if you want to benchmark memory consumption, or do other whacky things. So
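
On the benchmarking use case specifically: an application can already watch its own heap from Scheme. A minimal sketch, assuming the memory-statistics procedure from the lolevel unit (which, as far as I recall, returns a vector of total heap size, bytes in use and nursery size; check the manual before relying on the exact layout):

  (use lolevel)  ; assumption: memory-statistics lives here in CHICKEN 4

  (define (report-heap label)
    (let ((stats (memory-statistics)))
      (print label ": heap " (vector-ref stats 0)
             " bytes, in use " (vector-ref stats 1) " bytes")))

  (report-heap "before")
  ;; ... build the large data structure here ...
  (report-heap "after")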

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-03 Thread John Cowan
Peter Bex scripsit: But why not just use ulimit? It can be set per process, so I don't see the need to have a second ulimit-like limit inside each process. +1 -- John Cowan co...@ccil.org http://www.ccil.org/~cowan Dievas dave dantis; Dievas duos duonos (God gave teeth; God will give bread) --Lithuanian proverb Deus

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-03 Thread Felix
From: Peter Bex peter@xs4all.nl Subject: Re: [Chicken-users] Segfault with large data-structures (bug) Date: Sun, 3 Feb 2013 23:47:39 +0100 On Sun, Feb 03, 2013 at 11:37:42PM +0100, Felix wrote: The intention is to provide some sort of soft ulimit at the application level, in case you want

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-03 Thread Peter Bex
On Mon, Feb 04, 2013 at 12:10:16AM +0100, Felix wrote: But why not just use ulimit? It can be set per process, so I don't see the need to have a second ulimit-like limit inside each process. Not everybody uses UNIX, you know. I keep forgetting not everybody is lucky enough to use it.

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-03 Thread John Cowan
Peter Bex scripsit: Not everybody uses UNIX, you know. I keep forgetting not everybody is lucky enough to use it. More seriously, do modern OSes not have some sort of sane limiting system? ulimit must be several decades old... Windows System Resource Manager is our friend here: it

[Chicken-users] Segfault with large data-structures

2013-02-02 Thread Arthur Maciel
Hello! I don't know if it is related to Ivan's problem, but when I compile and run this code: (use srfi-69) (define NODES 250000) (define EDGES 1000) (define graph (make-hash-table)) (define (insert-edges) (printf "~N Hash-tables - Inserting edges ~N") (do ((n 1 (+ n 1))) ((= n NODES))
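
The preview cuts the program off; a reconstruction of the whole thing, for anyone who wants to reproduce the crash, might look like the code below. The workload shape (250,000 hash-table entries holding 1,000 items each) is taken from Jim's description later in the thread; the inner loop, the use of hash-table-update!/default and the ~% in the printf string are guesses, not Arthur's exact code:

  (use srfi-69 extras)

  (define NODES 250000)
  (define EDGES 1000)
  (define graph (make-hash-table))

  (define (insert-edges)
    (printf "~% Hash-tables - Inserting edges ~%")
    (do ((n 1 (+ n 1)))
        ((= n NODES))
      (do ((e 1 (+ e 1)))
          ((= e EDGES))
        ;; one list of EDGES numbers per node, consed onto the entry
        (hash-table-update!/default graph n (lambda (edges) (cons e edges)) '()))))

  (insert-edges)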

Re: [Chicken-users] Segfault with large data-structures

2013-02-02 Thread Jim Ursetto
What version of Chicken? And if it's 4.8.0, for example, could you try 4.7? On Feb 2, 2013, at 11:51, Arthur Maciel arthurmac...@gmail.com wrote: Hello! I don't know if it is related to Ivan's problem, but when I compile and run this code: (use srfi-69) (define NODES 250000) (define EDGES

Re: [Chicken-users] Segfault with large data-structures

2013-02-02 Thread Kristian Lein-Mathisen
I'm getting the same result here, when I run it through csc. When I run it through csi, though, it never seems to finish - is the task that big? I had to kill it after 2-3 hours. [klm@kth ~]$ csi -version CHICKEN (c)2008-2012 The Chicken Team (c)2000-2007 Felix L. Winkelmann Version 4.8.1 (rev

Re: [Chicken-users] Segfault with large data-structures

2013-02-02 Thread Arthur Maciel
Jim, I was running 4.8.0.1, but I tried 4.7.0.6 and got the same results. Thanks for your attention. 2013/2/2 Jim Ursetto zbignie...@gmail.com What version of Chicken? And if it's 4.8.0, for example, could you try 4.7? On Feb 2, 2013, at 11:51, Arthur Maciel arthurmac...@gmail.com wrote: Hello! I

Re: [Chicken-users] Segfault with large data-structures

2013-02-02 Thread Ivan Raikov
I can also confirm experiencing the same kind of problems with 4.7.0. However, this was always in conjunction with some FFI code, and only recently did I begin to suspect that segfaults can occur in pure Scheme code. Ivan On Feb 3, 2013 9:11 AM, Arthur Maciel arthurmac...@gmail.com wrote: Jim, I

Re: [Chicken-users] Segfault with large data-structures

2013-02-02 Thread Jim Ursetto
On Feb 2, 2013, at 3:46 PM, Kristian Lein-Mathisen wrote: I'm getting the same result here, when I run it through csc. When I run it through csi, though, it never seems to finish - is the task that big? I had to kill it after 2-3 hours. It's a hash table with 250,000 entries and 1,000 items
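
That size is enough to account for the heap exhaustion on its own: 250,000 entries times 1,000 items is 250 million stored items, and as a rough estimate (not from the thread, assuming a 64-bit build where a cons pair is three words, 24 bytes) the list spines alone come to about 6 GB, before keys and hash-table buckets are counted. That is comfortably past the 2 GB default maximum heap, and also past the 4 GB boundary where the follow-up messages report a second failure.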

Re: [Chicken-users] Segfault with large data-structures

2013-02-02 Thread Arthur Maciel
Kristian, thanks for reporting that. I've been running it through csi for approximately 10 hours and it never seems to finish. I'm not sure this task is that big. 2013/2/2 Kristian Lein-Mathisen kristianl...@gmail.com I'm getting the same result here, when I run it

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-02 Thread Jim Ursetto
(bug found -- tl;dr see end of message) Figured it out: you're exceeding the default maximal heap size, which is 2GB. For whatever reason, Chicken doesn't terminate reliably and with an error in this situation; it just tries to continue. Simply run your program with -:d to see: $
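
The command cut off above is presumably just the compiled program with the flag appended, along the lines of

  $ ./list-partials -:d

(binary name borrowed from Jim's follow-up). With -:d the runtime prints [debug] lines such as the "heap resized to ... bytes" message quoted in that follow-up, so you can watch the heap grow toward the 2GB ceiling before the crash.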

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-02 Thread Jim Ursetto
On Feb 2, 2013, at 8:06 PM, Jim Ursetto wrote: Uh oh, we've hit an actual bug now. Although we can get nodes up to 85000 by increasing max heap size from 2GB to 8GB, it appears to bomb after the heap exceeds 4GB, maybe indicating some 32-bit sizes left lying around in the code. Hmm, could
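
For reference, the arithmetic behind that suspicion: 2^32 is 4294967296, so an unsigned 32-bit field can only represent byte counts up to 4 GB minus one and silently wraps beyond that, even on an otherwise 64-bit build. A size or offset stored in such a field would work up to 4 GB and then misbehave, which matches a failure that shows up only once the heap passes 4 GB despite 8 GB being nominally allowed.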

Re: [Chicken-users] Segfault with large data-structures (bug)

2013-02-02 Thread Jim Ursetto
OK, I patched the core and the program runs to completion. Patch forthcoming. $ ./list-partials -:d -:hm16G [debug] application startup... [debug] heap resized to 1048576 bytes [debug] stack bottom is 0x7fff6f80f4b0. [debug] entering toplevel toplevel... [debug] entering toplevel