Yes, this must be the problem. If I run things locally, accessing the files
on the server through sshfs, everything seems to work fine. I'd be
interested to know what the problem actually is, though, and whether there
are any workarounds.
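
My best guess, for anyone who finds this later: mmap needs virtual address
space for the whole mapping, even though pages are only faulted in on
demand, and the 'virtual memory (kbytes, -v) 4194304' line in the output
below caps the process's address space at exactly the 4 GB I requested. Any
mapping that pushes the process past that cap fails with ENOMEM ("Cannot
allocate memory"), no matter how little of the file is actually resident.
It presumably works over sshfs because my local machine has no '-v' cap.
So the workaround seems to be requesting a session whose limit comfortably
exceeds the file size, or, if the site allows it, raising the cap with
'ulimit -v unlimited' before starting Julia.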

Thanks
Marc

On Saturday, 11 April 2015 21:34:04 UTC+1, Marc Williams wrote:
>
> I'm working on a managed cluster that runs Scientific Linux. Here's what
> 'ulimit -a' says when I've requested a session with 4 GB:
>
> core file size          (blocks, -c) unlimited
> data seg size           (kbytes, -d) unlimited
> scheduling priority             (-e) 0
> file size               (blocks, -f) unlimited
> pending signals                 (-i) 191971
> max locked memory       (kbytes, -l) 137435659880
> max memory size         (kbytes, -m) unlimited
> open files                      (-n) 1024
> pipe size            (512 bytes, -p) 8
> POSIX message queues     (bytes, -q) 819200
> real-time priority              (-r) 0
> stack size              (kbytes, -s) unlimited
> cpu time               (seconds, -t) unlimited
> max user processes              (-u) 191971
> virtual memory          (kbytes, -v) 4194304
> file locks                      (-x) unlimited
>
> Thanks
> Marc
>
> On Saturday, 11 April 2015 21:01:20 UTC+1, Tim Holy wrote:
>>
>> I'll bet you're working within a constrained environment. If you're on a
>> Unix platform, what does 'ulimit -a' say?
>>
>> Best, 
>> --Tim 
>>
>> On Saturday, April 11, 2015 12:04:03 PM Marc Williams wrote: 
>> > So I get the error when I call mmap_array() as follows:
>> >
>> >     s = open("matrixfile.bin")
>> >     m = read(s, Float64)              # dimension, stored as a Float64 header
>> >     weight = mmap_array(Float64, (int64(m), int64(m)), s)
>> >     close(s)
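>> >
>> > In case it's useful, the same read with a sanity check of the stored
>> > dimension against the file size before mapping (a sketch; it assumes
>> > the single-Float64 header shown above):
>> >
>> >     s = open("matrixfile.bin")
>> >     m = int64(read(s, Float64))              # dimension from the header
>> >     @assert filesize("matrixfile.bin") == 8 + 8*m*m  # header + data bytes
>> >     weight = mmap_array(Float64, (m, m), s)  # maps from the current offset
>> >     close(s)                                 # the mapping outlives the stream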
>> > 
>> > When my "matrixfile.bin" is small everything works fine, but when I
>> > get to a stage where the file size is similar to the amount of RAM
>> > available I get the following error:
>> > 
>> > ERROR: memory mapping failed: Cannot allocate memory
>> >  in mmap at mmap.jl:35
>> >  in mmap_array at mmap.jl:110
>> >  in readinfile at none:4
>> > 
>> > On Friday, 10 April 2015 15:12:14 UTC+1, tshort wrote: 
>> > > More information would help, especially a concise reproducible example.
>> > >
>> > > On Fri, Apr 10, 2015 at 8:00 AM, Marc Williams <[email protected]> wrote:
>> > >> Hi,
>> > >>
>> > >> I'm doing some analysis where I need to compute some large matrices
>> > >> (up to about 50,000 x 50,000), so I quickly run out of RAM. It seems
>> > >> like using a memory-mapped array could be a useful approach, so I
>> > >> compute the matrices, save them to a binary file, and then read them
>> > >> back in using mmap_array(). For the smaller matrices everything works
>> > >> fine, but when the size of the binary file is greater than the amount
>> > >> of RAM available I get the following error:
>> > >> 
>> > >> ERROR: memory mapping failed: Cannot allocate memory 
>> > >> 
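>> > >> A stripped-down version of the write-then-map cycle I'm using looks
>> > >> like this (the file name and the small test size here are made up
>> > >> for illustration; the real matrices are ~50,000 x 50,000):
>> > >>
>> > >>     n = 1000
>> > >>     A = rand(n, n)
>> > >>     out = open("matrixfile.bin", "w")
>> > >>     write(out, float64(n))          # store the dimension as a header
>> > >>     write(out, A)                   # raw column-major Float64 data
>> > >>     close(out)
>> > >>
>> > >>     s = open("matrixfile.bin")
>> > >>     m = int64(read(s, Float64))
>> > >>     weight = mmap_array(Float64, (m, m), s)
>> > >>     close(s)
>> > >>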
>> > >> Once I've read the matrices in using mmap_array() I do some basic
>> > >> calculations, like computing the sum over all the elements and the
>> > >> sum over rows, and I access every element a couple of times.
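>> > >>
>> > >> Concretely, something like this (weight being the mapped array from
>> > >> above; the dims argument follows Julia's usual reduction convention):
>> > >>
>> > >>     total   = sum(weight)       # sum over all elements
>> > >>     rowsums = sum(weight, 2)    # m x 1: the sum along each row
>> > >>     colsums = sum(weight, 1)    # 1 x m: the sum down each column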
>> > >> 
>> > >> I've not used memory mapping before, so am I using it in the right
>> > >> way, and is there anything I'm missing that I need to do to make this
>> > >> a solution to my RAM issue?
>> > >> 
>> > >> Many Thanks 
>> > >> Marc 
>>
>>
