I played around with that a bit, but it gets much worse: from ~1s to ~6s, 
with allocation as shown:

 153710487     mat = Array{Complex64}(dims...)
   4722450     file = Mmap.mmap(filename, Array{Complex64,2}, (dims[2],length(counter1)))
      9568     for i = 1:dims[2]
      4000         for j = 1:length(counter1)
1690462534             mat[counter1[j],i,counter2[j]] = file[i,j]
         -         end
         -     end

I swapped the for loops around here, but that didn't matter. I can gain a 
little bit by indexing i into the first dimension of mat, but it still lags 
far behind.
Any other ideas?
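For completeness, here is the function-wrapped version I would test next (a sketch only; `fillmat!` is a name I'm making up, and it assumes the loop above runs at global scope, where untyped globals are a common cause of exactly this kind of per-iteration allocation):

```julia
# Hypothetical sketch: wrapping the copy loop in a function lets the
# compiler infer concrete types for mat, file and the counters, which
# usually removes the per-iteration allocation seen at global scope.
function fillmat!(mat, file, counter1, counter2)
    @inbounds for j = 1:length(counter1)
        c1 = counter1[j]   # hoist the counter lookups out of the inner loop
        c2 = counter2[j]
        for i = 1:size(file, 1)
            mat[c1, i, c2] = file[i, j]
        end
    end
    return mat
end
```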

On Saturday, 12 March 2016 03:15:33 UTC+1, Greg Plowman wrote:
>
> I think array slices (on right hand side of assignment) create new arrays, 
> hence the allocation.
> Try writing an explicit loop instead, something like:
>
> for j = 1:length(counter1)
>    for i = 1:size(file,1)
>        mat[counter1[j],i,counter2[j]] = file[i,j]
>    end
> end
>
>
> On Saturday, March 12, 2016 at 12:25:00 PM UTC+11, Tim Loderhose wrote:
>
>> Hi,
>>
>> I have a question about some allocation in my code that I would like to 
>> get rid of.
>> I am memory-mapping a file (which could be very large) that holds part of 
>> a complex 3D matrix, and then putting its contents into the preallocated 
>> matrix along the second dimension. I need the counters because the 
>> contents of the file are only a subset of the full matrix.
>>
>> Here's a profiled snippet, where the file which is loaded has 120619520 
>> bytes.
>>
>> 153705063     mat = Array{Complex64}(dims...)
>>   4721282     file = Mmap.mmap(filename, Array{Complex64,2}, (dims[2],length(counter1)))
>>        16     for i = 1:length(counter1)
>> 148179531         mat[counter1[i],:,counter2[i]] = file[:,i]
>>         -     end
>>
>> Why does the code allocate so much memory inside the for loop (even more 
>> bytes than the contents of the file)?
>> It seems like this should be a trivial matter; right now I just can't get 
>> my head around it. Any help is appreciated :)
>>
>> Thanks,
>> Tim
>>
>
