Re: [HCP-Users] hcp_fix bug

2016-10-19 Thread Glasser, Matthew
You also need the GIFTI toolbox installed for Matlab:

http://www.artefact.tk/software/matlab/gifti/

Peace,

Matt.

From: nailin yao
Date: Wednesday, October 19, 2016 at 6:58 PM
To: Matt Glasser
Cc: "hcp-users@humanconnectome.org"
Subject: Re: [HCP-Users] hcp_fix bug

Hi Matt,

The "..." in my last email refers to
/home/fas/glahn/ny87/software/workbench/bin_linux64/

So before running fix_3_clean, I set
WBC='/home/fas/glahn/ny87/software/workbench/bin_linux64/wb_command' and
CIFTI='/home/fas/glahn/ny87/software/workbench', which is where ciftiopen.m
is stored. However, I still hit the same problem as before. Do you know what
could be wrong? Thank you!

Best,
Nailin




2016-10-19 19:29 GMT-04:00 Glasser, Matthew:
I don’t think three dots is valid.

Peace,

Matt.

From: "ynai...@gmail.com"
Date: Wednesday, October 19, 2016 at 5:04 PM

To: Matt Glasser
Cc: "hcp-users@humanconnectome.org"
Subject: Re: [HCP-Users] hcp_fix bug

Hi Matt,

I added all the paths needed in fix_3_clean but still get the error. Could it
be because I set WBC to .../wb_command ?

Best,
Nailin

Sent from my iPhone

On Oct 19, 2016, at 5:46 PM, Glasser, Matthew wrote:

You may also need to run the addpath parts of the command.

Peace,

Matt.

From: nailin yao
Date: Wednesday, October 19, 2016 at 4:02 PM
To: Matt Glasser
Cc: "hcp-users@humanconnectome.org"
Subject: Re: [HCP-Users] hcp_fix bug

Hi Matt,


I tried running fix_3_clean in MATLAB directly and got the error message below:


fix_3_clean('.fix',0,1,2000)

TR =

0.7200

/bin/bash: -cifti-convert: command not found

Undefined function 'gifti' for input arguments of type 'char'.

Error in ciftiopen (line 31)

cifti = gifti([tmpfile '.gii']);

Error in fix_3_clean (line 40)

  BO=ciftiopen('Atlas.dtseries.nii',WBC);


Do you know why? Thank you very much!
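The `/bin/bash: -cifti-convert: command not found` line above is a useful clue: ciftiopen runs `$WBC -cifti-convert ...` through the shell, so if WBC is empty or unset inside MATLAB, the shell sees `-cifti-convert` as the command name itself. A minimal shell sketch of that failure mode (the exact ciftiopen internals are an assumption):

```shell
# Reproduce the failure mode: with WBC empty, the first word of the
# command line is "-cifti-convert" itself, which the shell then
# reports as "command not found".
WBC=""
cmd="$WBC -cifti-convert -to-gifti-ext Atlas.dtseries.nii tmp.gii"
set -- $cmd        # word-split the command line as the shell would
echo "shell would execute: $1"
```

So the "command not found" and the subsequent "Undefined function 'gifti'" are likely two separate problems: the first suggests WBC is not reaching the shell, the second that the GIFTI toolbox is not on the MATLAB path.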

Best,
Nailin

2016-10-17 20:12 GMT-04:00 Glasser, Matthew:
Is it a reproducible error?  What happens if you try running the fix_3_clean 
function with the parameters mentioned below?

Peace,

Matt.

From: nailin yao
Date: Monday, October 17, 2016 at 2:58 PM
To: Matt Glasser
Cc: "hcp-users@humanconnectome.org"
Subject: Re: [HCP-Users] hcp_fix bug

Hi Matt,

It's not compiled MATLAB.

Best,
Nailin

2016-10-17 13:53 GMT-04:00 Glasser, Matthew:
Is this compiled MATLAB or not compiled?

Peace,

Matt.

From: on behalf of nailin yao
Date: Monday, October 17, 2016 at 11:13 AM
To: "hcp-users@humanconnectome.org"
Subject: [HCP-Users] hcp_fix bug

Hi,

I got the error report below when running hcp_fix, at the last step:


running FIX

FIX Feature extraction for Melodic output directory: 
rfMRI_REST1_LR_nonlin_norm_hp2000.ica

 create edge masks

 run FAST

 registration of standard space masks

 extract features

FIX Classifying components in Melodic directory: 
rfMRI_REST1_LR_nonlin_norm_hp2000.ica using training file: 
/home/fas/glahn/ny87/Pipelines/fix1.06/training_files/HCP_hp2000.RData and 
threshold 10

FIX Applying cleanup using cleanup file: 
rfMRI_REST1_LR_nonlin_norm_hp2000.ica/fix4melview_HCP_hp2000_thr10.txt and 
motion cleanup set to 1

sh: line 1: 24221 Bus error   (core dumped) 
/gpfs/apps/hpc/Apps/Matlab/R2015b/bin/matlab -nojvm -nodisplay -nodesktop 
-nosplash -r "addpath('/home/fas/glahn/ny87/Pipelines/fix1.06'); 
addpath('/gpfs/apps/hpc/Apps/FSL/5.0.6/fsl/etc/matlab'); 
fix_3_clean('.fix',0,1,2000)" >> .fix.log 2>&1

Does anyone know what the problem is? Thank you very much!
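For reference, the failing call can be reconstructed with the GIFTI toolbox and the directory containing ciftiopen.m added to the MATLAB path, which is what Matt's suggestions amount to. The two extra directories below are placeholders, not paths from this thread:

```shell
# Sketch of the same MATLAB invocation with two extra addpath calls.
# GIFTI and CIFTIDIR are hypothetical locations -- point them at your
# own installs of the GIFTI toolbox and of ciftiopen.m.
GIFTI=/path/to/gifti-toolbox
CIFTIDIR=/path/to/workbench

MATLAB_CMD="addpath('/home/fas/glahn/ny87/Pipelines/fix1.06'); \
addpath('/gpfs/apps/hpc/Apps/FSL/5.0.6/fsl/etc/matlab'); \
addpath('$GIFTI'); addpath('$CIFTIDIR'); fix_3_clean('.fix',0,1,2000)"

# Print rather than run, since MATLAB may not be on PATH here:
echo matlab -nojvm -nodisplay -nodesktop -nosplash -r "$MATLAB_CMD"
```

The printed line can be pasted into a job script once the placeholder paths are filled in.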

Best,
Nailin

--
Nailin Yao,  PhD

Postdoctoral Associate
Department of Psychiatry, Yale University


___
HCP-Users mailing list
HCP-Users@humanconnectome.org

Re: [HCP-Users] hcp_fix bug

2016-10-19 Thread Glasser, Matthew
I don’t think three dots is valid.

Peace,

Matt.


Re: [HCP-Users] hcp_fix bug

2016-10-19 Thread ynailin
Hi Matt,

I added all the paths needed in fix_3_clean but still get the error. Could it
be because I set WBC to .../wb_command ?

Best,
Nailin

Sent from my iPhone


Re: [HCP-Users] hcp_fix bug

2016-10-19 Thread Glasser, Matthew
You may also need to run the addpath parts of the command.

Peace,

Matt.


Re: [HCP-Users] gradient nonlinearity correction question

2016-10-19 Thread Glasser, Matthew
Some vendors do, for some sequences, but because the correction is not
uniformly applied, and because they are likely not to use optimal
interpolation algorithms, we prefer to do the correction offline.

Peace,

Matt.

From: on behalf of Antonin Skoch
Date: Wednesday, October 19, 2016 at 4:27 PM
To: "hcp-users@humanconnectome.org"
Subject: [HCP-Users] gradient nonlinearity correction question

Dear experts,

while setting up the gradunwarp scripts, it occurred to me to ask: why do
scanner vendors not routinely perform gradient nonlinearity correction
directly on the scanner, as part of the online image reconstruction system
(e.g. ICE on Siemens)?

Regards,

Antonin Skoch



___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.



[HCP-Users] gradient nonlinearity correction question

2016-10-19 Thread Antonin Skoch
Dear experts,

while setting up the gradunwarp scripts, it occurred to me to ask: why do
scanner vendors not routinely perform gradient nonlinearity correction
directly on the scanner, as part of the online image reconstruction system
(e.g. ICE on Siemens)?

Regards,

Antonin Skoch





Re: [HCP-Users] hcp_fix bug

2016-10-19 Thread nailin yao
Hi Matt,

I tried running fix_3_clean in MATLAB directly and got the error message below:


fix_3_clean('.fix',0,1,2000)

TR =

0.7200

/bin/bash: -cifti-convert: command not found

Undefined function 'gifti' for input arguments of type 'char'.

Error in ciftiopen (line 31)

cifti = gifti([tmpfile '.gii']);

Error in fix_3_clean (line 40)

  BO=ciftiopen('Atlas.dtseries.nii',WBC);


Do you know why? Thank you very much!

Best,
Nailin




-- 
Nailin Yao,  PhD

Postdoctoral Associate
Department of Psychiatry, Yale University



Re: [HCP-Users] mounting the HCP data on an ec2 instance instead of s3 access

2016-10-19 Thread Denis-Alexander Engemann
This is absolutely essential information, thank you Tim. My standard use
case is a Linux Ubuntu worker that I have pre-configured using my own AMI,
launched in the same region in which the HCP data are provided (us-east-1).
I then download the data from there, which can take some time depending on
the bandwidth of the chosen instance, and push my results to my own S3
repositories. I was thinking that mounting the data could be a good idea:
it would potentially cut down time and solve the capacity problems of
workers with limited disk space. I personally would not have a problem
switching to a NITRC AMI instead of a standard Linux worker.

Thank you once more,
Denis


Re: [HCP-Users] mounting the HCP data on an ec2 instance instead of s3 access

2016-10-19 Thread Timothy B. Brown

Hello Denis,

I understand that Robert Oostenveld is planning to send you some 
materials from the latest HCP Course that illustrate how to mount the 
HCP OpenAccess S3 bucket as a directory accessible from a running EC2 
instance.


However, I'd like to clarify a few things.

First, the materials you will receive from Robert assume that you are 
using an Amazon EC2 instance (virtual machine) /that is based on an AMI 
supplied by NITRC/ (analogous to a DVD of software supplied and 
configured by NITRC to be loaded on your virtual machine). In fact the 
instructions show you how to create a new EC2 instance based on that 
NITRC AMI.


The folks at NITRC have done a lot of the work for you (like including 
the necessary software to mount an S3 bucket) and provided a web 
interface for you to specify your credentials for accessing the HCP 
OpenAccess S3 bucket. If you want to create an EC2 instance based on the 
NITRC AMI, then things should work well for you and the materials Robert 
sends to you should hopefully be helpful.


But this will not be particularly useful to you if you are using an EC2 
instance that is /not/ based upon the NITRC AMI. If that is the case, 
you will have to do a bit more work. You will need to install a tool 
called /s3fs/ ("S3 File System") on your instance and then configure 
s3fs to mount the HCP OpenAccess S3 bucket. This configuration will 
include storing your AWS access key information in a secure file on your 
running instance.


A good starting point for instructions for doing this can be found at: 
https://forums.aws.amazon.com/message.jspa?messageID=313009


This may not cover all the issues you encounter and you may have to 
search for other documentation on using s3fs under Linux to get things 
fully configured. The information at: 
https://rameshpalanisamy.wordpress.com/aws/adding-s3-bucket-and-mounting-it-to-linux/ 
may also be helpful.
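As a concrete sketch of the s3fs route described above: the bucket name and passwd-file location below are assumptions (check the linked instructions for the authoritative details), and s3fs must already be installed with your HCP OpenAccess keys in the passwd file.

```shell
# Sketch: mount the HCP OpenAccess bucket read-only via s3fs.
# BUCKET and PASSWD are assumptions, not taken from official docs here.
BUCKET=hcp-openaccess
MNT="$HOME/hcp-openaccess"
PASSWD="$HOME/.passwd-s3fs"   # contains "ACCESS_KEY:SECRET_KEY", chmod 600

mkdir -p "$MNT"
if command -v s3fs >/dev/null 2>&1; then
  s3fs "$BUCKET" "$MNT" -o ro -o passwd_file="$PASSWD"
else
  echo "s3fs is not installed; install it first"
fi
```

The `-o ro` option makes the read-only nature of the mount explicit, matching the point below.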


Second, once you get the S3 bucket mounted, it is very important to 
realize that it is *read-only* from your system. By mounting the S3 
bucket using s3fs, you have not created an actual EBS volume on your 
system that contains the HCP OpenAccess data, just a mount point where 
you can /read/ the files in the S3 bucket.


You will likely want to create a separate EBS volume on which you will 
run pipelines, generate new files, and do any further analysis that you 
want to do. To work with the data, you will want the HCP OpenAccess S3 
bucket data to at least /appear/ to be on that separate EBS volume. One 
approach would be to selectively copy data files from the mounted S3 
data onto your EBS volume. However, this would be duplicating a lot of 
data onto the EBS volume, taking a long time and costing you money for 
storage of data that is already in the S3 bucket. I think a better 
approach is to create a directory structure on your EBS volume that 
contains files which are actually symbolic links to the read-only data 
that is accessible via your S3 mount point.
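The symbolic-link approach can be sketched as follows. Toy directories stand in for the s3fs mount point and the EBS volume, and the subject layout is illustrative; the actual script mentioned below may differ.

```shell
# Mirror a read-only source tree into a writable working tree:
# directories are recreated, files become symlinks back to the source.
SRC=$(mktemp -d)    # stands in for the read-only s3fs mount point
DST=$(mktemp -d)    # stands in for the writable EBS volume

# Toy data in place of the real HCP subject directories:
mkdir -p "$SRC/100307/T1w"
touch "$SRC/100307/T1w/T1w_acpc_dc_restore.nii.gz"

# Recreate the directory structure, then link every file:
( cd "$SRC" && find . -type d ) | while read -r d; do
  mkdir -p "$DST/$d"
done
( cd "$SRC" && find . -type f ) | while read -r f; do
  ln -s "$SRC/$f" "$DST/$f"
done

[ -L "$DST/100307/T1w/T1w_acpc_dc_restore.nii.gz" ] && echo "symlink tree built"
```

New outputs written under the working tree land on the writable volume, while the input files stay in the S3 bucket and cost nothing extra to store.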


The materials that Robert sent (or will send) you contain instructions 
for how to get and use a script that I've written that will create such 
a directory structure of symbolic links. After looking over those 
instructions, if it is not obvious to you what script I'm referring to 
and how to use it, feel free to send a follow up question to me.


Hope that's helpful,

  Tim

On 10/18/2016 10:51 AM, Denis-Alexander Engemann wrote:

Dear HCPers,

I recently had a conversation with Robert who suggested to me that it 
should be possible to directly mount the HCP data like an EBS volume 
instead of using the s3 tools for copying the data file by file.

Any hint would be appreciated.

Cheers,
Denis




--
Timothy B. Brown
Business & Technology Application Analyst III
Pipeline Developer (Human Connectome Project)
tbbrown(at)wustl.edu





Re: [HCP-Users] alignment issues with diffusion data

2016-10-19 Thread Maarten Vaessen
The data is from a 2d EPI sequence converted from DICOM with mrtrix
routines.
From mri_info I get:

ras xform present
xform info: x_r = -1.0000, y_r =  0.0000, z_r =  0.0000, c_r =   3.8554
          : x_a =  0.0000, y_a = -0.9285, z_a = -0.3714, c_a =  -7.2492
          : x_s =  0.0000, y_s = -0.3714, z_s =  0.9285, c_s =  15.6468
Orientation   : LPS
Primary Slice Direction: axial

Is the issue then with the DICOM conversion?

Thx,

-Maarten
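An aside on how that orientation string is derived: for each voxel axis, the label letter comes from the anatomical direction with the largest absolute direction cosine, with the sign choosing between the letter pair (R/L, A/P, S/I). A minimal sketch in Python (the function name is mine, not FreeSurfer's), using the cosines from the mri_info output above:

```python
def orientation_label(cosines):
    """Derive an orientation code (e.g. 'LPS') from direction
    cosines: one (r, a, s) tuple per voxel axis."""
    # Positive component -> first letter of the pair, negative -> second.
    letters = (("R", "L"), ("A", "P"), ("S", "I"))
    label = ""
    for vec in cosines:
        # Pick the anatomical direction this axis mostly points along.
        idx = max(range(3), key=lambda i: abs(vec[i]))
        label += letters[idx][0] if vec[idx] > 0 else letters[idx][1]
    return label

# Direction cosines from the mri_info output above.
x = (-1.0000, 0.0000, 0.0000)
y = (0.0000, -0.9285, -0.3714)
z = (0.0000, -0.3714, 0.9285)
print(orientation_label((x, y, z)))  # -> LPS
```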





On Wed, Oct 19, 2016 at 3:39 PM, Harms, Michael  wrote:

>
> Is your DWI data by any chance saved with an orientation other than LAS or
> RAS?
>
> --
> Michael Harms, Ph.D.
> ---
> Conte Center for the Neuroscience of Mental Disorders
> Washington University School of Medicine
> Department of Psychiatry, Box 8134
> 660 South Euclid Ave. Tel: 314-747-6173
> St. Louis, MO  63110 Email: mha...@wustl.edu
>
> From:  on behalf of Maarten
> Vaessen 
> Date: Wednesday, October 19, 2016 at 6:50 AM
> To: "hcp-users@humanconnectome.org" 
> Subject: [HCP-Users] alignment issues with diffusion data
>
> Hello experts,
>
> I have been trying to use the HCP Diffusion preprocessing pipeline on my
> own diffusion data. However, I run into a lot of issues at the final stage
> of the pipeline, where the DWI data is registered to the T1w_acpc space
> (the FreeSurfer part of the pipeline runs fine on the structural images,
> and the eddy correction also seems to run fine).
> So far I have not had a single subject where this alignment went smoothly
> and was accurate.
> It seems to be a problem with flirt. In the DiffusionToStructural.sh
> subscript there are several calls to flirt (or epi_reg_dof) with the T1 as
> reference, and for some reason flirt gives very wrong results, like AP
> inverted or the cerebellum matched to the frontal lobe.
> I have managed to solve some cases by calling flirt directly instead of
> epi_reg_dof and adding -usesqform to the flirt options, but some I only
> managed by doing a rough manual alignment first. When the initial
> registration to T1 is ok-ish, bbregister and the later parts all work
> very well.
> I don't think my DWI data is very special (1.5 mm isotropic, 90
> directions, 2 shells), so I have no idea what might be the issue here or
> how to solve it (in a consistent way).
>
> -Maarten
>



[HCP-Users] alignment issues with diffusion data

2016-10-19 Thread Maarten Vaessen
Hello experts,

I have been trying to use the HCP Diffusion preprocessing pipeline on my
own diffusion data. However, I run into a lot of issues at the final stage
of the pipeline, where the DWI data is registered to the T1w_acpc space
(the FreeSurfer part of the pipeline runs fine on the structural images,
and the eddy correction also seems to run fine).
So far I have not had a single subject where this alignment went smoothly
and was accurate.
It seems to be a problem with flirt. In the DiffusionToStructural.sh
subscript there are several calls to flirt (or epi_reg_dof) with the T1 as
reference, and for some reason flirt gives very wrong results, like AP
inverted or the cerebellum matched to the frontal lobe.
I have managed to solve some cases by calling flirt directly instead of
epi_reg_dof and adding -usesqform to the flirt options, but some I only
managed by doing a rough manual alignment first. When the initial
registration to T1 is ok-ish, bbregister and the later parts all work
very well.
I don't think my DWI data is very special (1.5 mm isotropic, 90
directions, 2 shells), so I have no idea what might be the issue here or
how to solve it (in a consistent way).

-Maarten
