Re: [HCP-Users] FSL FIX no memory with hcp_fix_multi_run

2018-09-05 Thread Glasser, Matthew
Do you have a movement regressor file for all runs of all subjects? That is 
what the error suggests, and it is probably not related to memory.
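
For example, a quick shell check along these lines (a sketch only; the study path, 
subject IDs, and run names are placeholders, and the layout assumes the standard HCP 
convention of a Movement_Regressors.txt file in each run's Results folder) will list 
any runs that are missing their movement regressors:

# Sketch: report runs that lack a movement regressor file.
StudyFolder=/path/to/study                      # placeholder
for Subject in SUBJ01 SUBJ02; do                # placeholder subject IDs
  for Run in rfMRI_REST1_AP rfMRI_REST1_PA; do  # placeholder run names
    f=$StudyFolder/$Subject/MNINonLinear/Results/$Run/Movement_Regressors.txt
    [ -s "$f" ] || echo "MISSING: $f"
  done
done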

Matt.

From: Timothy Hendrickson <hendr...@umn.edu>
Date: Thursday, September 6, 2018 at 4:55 AM
To: Matt Glasser <glass...@wustl.edu>
Cc: Timothy Coalson <tsc...@mst.edu>, "hcp-users@humanconnectome.org" <HCP-Users@humanconnectome.org>
Subject: Re: FW: [HCP-Users] FSL FIX no memory with hcp_fix_multi_run

Sort of continuing this conversation, I am occasionally running into another 
error with hcp_fix_multi_run. I've noticed that within a few concatenation 
folders I get an error log file named errorLog.txt with the following message:

Error Time: 09/05/2018 12:42:16
Error using load
Unable to read file 'mc/prefiltered_func_data_mcf.par': no such file or 
directory.

As mentioned, I only notice this error for a few participants, whereas for others 
hcp_fix_multi_run completes without issue. Is this a well-known error? 
I am inclined to think that it may be a memory issue.

-Tim


Timothy Hendrickson
Neuroimaging Analyst/Staff Scientist
University of Minnesota Informatics Institute
University of Minnesota
Bioinformatics M.S. Candidate
Office: 612-624-0783
Mobile: 507-259-3434 (texts okay)

On Tue, Aug 21, 2018 at 6:47 PM, Glasser, Matthew <glass...@wustl.edu> wrote:
Sorry, that is actually an FSL bug that I have already reported to them.

You can work around it with (/bin/sh -c '. /usr/local/fsl/etc/fslconf/fsl.sh; 
fslmaths filtered_func_data -sub filtered_func_data -add 
filtered_func_data_clean filtered_func_data_clean')
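
For reference, a minimal sketch of applying that workaround by hand (the directory 
path is an assumption; use the FIX working folder that holds filtered_func_data and 
filtered_func_data_clean). fslmaths writes its output with the geometry of the first 
input, so "A - A + B" leaves the cleaned data unchanged while giving it the original 
file's header, which is the effect the failing fslcpgeom call was meant to have:

# Sketch only: run inside the FIX working directory for the concatenated run.
cd /path/to/fix_working_dir
. /usr/local/fsl/etc/fslconf/fsl.sh
# Output inherits the header/geometry of the first input (filtered_func_data),
# while the voxel values reduce to those of filtered_func_data_clean.
fslmaths filtered_func_data -sub filtered_func_data -add filtered_func_data_clean filtered_func_data_clean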

Matt.

From: Timothy Hendrickson <hendr...@umn.edu>
Date: Tuesday, August 21, 2018 at 6:36 PM
To: Matt Glasser <glass...@wustl.edu>

Subject: Re: [HCP-Users] FSL FIX no memory with hcp_fix_multi_run

Hmm, interesting. The total swap space available is approximately 130 GB, and the 
total amount of RAM is about the same; that seems like it should be plenty.

What other suggestions do you have?

How important is the FSL call (/bin/sh -c '. /usr/local/fsl/etc/fslconf/fsl.sh; 
fslcpgeom filtered_func_data filtered_func_data_clean')? Even though the 
command does not complete, I can still open the output with fsleyes and fslinfo 
without issue. From my brief research into fslutils, fslcpgeom copies header information.
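
One way to check whether the failed step actually matters here (a sketch, assuming 
the two images sit in the current FIX working directory) is to compare their 
geometry fields directly:

. /usr/local/fsl/etc/fslconf/fsl.sh
# Print dimensions, voxel sizes, and orientation matrices of both images;
# if these already match, the skipped fslcpgeom call did no harm.
for f in filtered_func_data filtered_func_data_clean; do
  echo "== $f"
  fslhd "$f" | grep -E '^(dim[0-4]|pixdim[0-4]|qto_xyz|sto_xyz)'
done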

Can I perhaps comment out that command so that the following lines of code can run 
and the cleaned dtseries file can still be generated?

-Tim

Timothy Hendrickson
Neuroimaging Analyst/Staff Scientist
University of Minnesota Informatics Institute
University of Minnesota
Bioinformatics M.S. Candidate
Office: 612-624-0783
Mobile: 507-259-3434 (texts okay)


Re: [HCP-Users] FSL FIX no memory with hcp_fix_multi_run

2018-08-21 Thread Glasser, Matthew
You might need to configure swap space or increase the amount.

Matt.


Re: [HCP-Users] FSL FIX no memory with hcp_fix_multi_run

2018-08-21 Thread Timothy Hendrickson
Okay, I have gotten past the previous issue with read_avw; however, I am
experiencing a new error.

See below:

Error using call_fsl (line 36)
FSL call (/bin/sh -c '. /usr/local/fsl/etc/fslconf/fsl.sh; fslcpgeom
filtered_func_data filtered_func_data_clean') failed, Unable to allocate
memory for copy: Cannot allocate memory

Error in fix_3_clean (line 112)


Re: [HCP-Users] FSL FIX no memory with hcp_fix_multi_run

2018-08-17 Thread Timothy Coalson
In other words, if you have matlab installed, set the matlab mode to 1 (in
settings.sh).  If not, you will need the compiled matlab to be up to date
with the matlab scripts, and use mode 0.
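
For illustration, the relevant lines of FIX's settings.sh look roughly like this 
(a sketch; the variable names and MATLAB path are assumptions to check against your 
own copy of the file):

# 0 = compiled MATLAB binaries shipped with FIX (needs the matching MCR),
# 1 = full MATLAB installation, 2 = Octave
FSL_FIX_MATLAB_MODE=1
FSL_FIX_MATLAB_ROOT=/usr/local/MATLAB/R2017b   # assumption: your MATLAB install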

The alternative, figuring out how to build octave to handle larger
matrices, sounds like an adventure.

Tim



Re: [HCP-Users] FSL FIX no memory with hcp_fix_multi_run

2018-08-17 Thread Timothy Coalson
It looks like default builds of octave have an unfortunate limit to the
possible number of elements in any matrix:

octave:1> test=zeros(6);
error: out of memory or dimension too large for Octave's index type

The limit appears to be 2 billion elements total (signed 32-bit).
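
A quick way to see which limit a given Octave build has (a sketch; sizemax() reports 
the largest allowed array size, roughly 2^31 for 32-bit indexing and 2^63 for a 
64-bit-index build):

octave --no-gui --eval 'printf("largest allowed array size: %d\n", sizemax())'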

Tim




Re: [HCP-Users] FSL FIX no memory with hcp_fix_multi_run

2018-08-17 Thread Timothy Hendrickson
Everything is 64-bit. I was testing on my local workstation. I'll shift it
over to a more powerful system (128 GB of memory) if you think it is purely a
memory issue.

Timothy Hendrickson
Neuroimaging Analyst/Staff Scientist
University of Minnesota Informatics Institute
University of Minnesota
Bioinformatics M.S. Candidate
Office: 612-624-0783
Mobile: 507-259-3434 (texts okay)



Re: [HCP-Users] FSL FIX no memory with hcp_fix_multi_run

2018-08-17 Thread Glasser, Matthew
Is everything 64-bit? Do you have swap space?
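
On a Linux machine, both questions can be answered quickly with something like (a sketch):

uname -m          # x86_64 indicates a 64-bit kernel and userland
free -h           # totals for RAM and swap
swapon --show     # configured swap devices; empty output means no swap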

Matt.



Re: [HCP-Users] FSL FIX no memory with hcp_fix_multi_run

2018-08-17 Thread Timothy Hendrickson
It is one participant, with 2950 timepoints and 2 mm isotropic voxels. The
machine has 32 GB of memory.
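
For a rough sense of scale, a back-of-envelope estimate (a sketch assuming the 
standard 2 mm MNI152 grid of 91 x 109 x 91 = 902,629 voxels; the actual matrix size 
may differ):

voxels=$((91 * 109 * 91))   # ~9.0e5 voxels per 2 mm MNI volume (assumed grid)
timepoints=2950
bytes=8                     # MATLAB/Octave hold the data as doubles by default
total=$((voxels * timepoints * bytes))
echo "one copy of the concatenated timeseries: $(( total / (1024*1024*1024) )) GiB"
# ~19 GiB per copy, so a few working copies exceed 32 GB of RAM, and the
# ~2.7e9 elements also exceed the 2^31-1 limit of a 32-bit-index Octave build.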

-Tim

Timothy Hendrickson
Neuroimaging Analyst/Staff Scientist
University of Minnesota Informatics Institute
University of Minnesota
Bioinformatics M.S. Candidate
Office: 612-624-0783
Mobile: 507-259-3434 (texts okay)



Re: [HCP-Users] FSL FIX no memory with hcp_fix_multi_run

2018-08-17 Thread Timothy Coalson
How much memory does the machine you are running this on have?  What is the
number of timepoints and number of voxels of your concatenated input?  Are
you trying to run more than one subject at once on the machine?

I believe the way we are approaching multi-run fix is that we only
concatenate scans that were taken on the same day.

Tim




Re: [HCP-Users] FSL FIX no memory with hcp_fix_multi_run

2018-08-17 Thread Timothy Hendrickson
Hmm, now I am getting a different error...

Elapsed time is 1.67516 seconds.
Elapsed time is 2.38414 seconds.

error: out of memory or dimension too large for Octave's index type

Timothy Hendrickson
Neuroimaging Analyst/Staff Scientist
University of Minnesota Informatics Institute
University of Minnesota
Bioinformatics M.S. Candidate
Office: 612-624-0783
Mobile: 507-259-3434 (texts okay)



Re: [HCP-Users] FSL FIX no memory with hcp_fix_multi_run

2018-08-17 Thread Glasser, Matthew
I have attached an updated version of read_avw_img.m that forces matlab to 
retain the same precision as the input file rather than converting everything 
to doubles.

Matt.



[Attachment: read_avw_img.m]


[HCP-Users] FSL FIX no memory with hcp_fix_multi_run

2018-08-17 Thread Timothy Hendrickson
Hello,

I am attempting to run FIX cleanup on a rather large dataset (several fMRI
scans concatenated together via hcp_fix_multi_run) and am running into the
following error:

  In fix_3_clean at 45
Elapsed time is 1.153074 seconds.
Elapsed time is 1.077840 seconds.
Error using fread
Out of memory. Type HELP MEMORY for your options.

Error in read_avw_img (line 24)

Error in read_avw (line 34)

Error in fix_3_clean (line 63)

MATLAB:nomem


How much memory does MATLAB require for this, and is there a way to change
this via an argument?

-Tim

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users