Re: [HCP-Users] Troubleshooting error with gradunwarp

2019-06-27 Thread Harms, Michael

Hi,
That sounds like this issue, which we haven’t patched quite yet:
https://github.com/Washington-University/HCPpipelines/issues/119

Incidentally, you must be running code from the current master branch, rather 
than the formally tagged “v4.0.0” release 
(https://github.com/Washington-University/HCPpipelines/releases), since this 
isn’t an issue in the tagged release.
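
If it helps, here is one way to check which code you are running and to switch 
to the tagged release (the clone path is just a placeholder):

cd /path/to/HCPpipelines    # wherever your clone lives
git describe --tags         # reports the nearest tag, i.e. whether you are on (or ahead of) v4.0.0
git checkout v4.0.0         # switch to the formally tagged release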

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.  Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of "Jayasekera, Dinal" 

Date: Thursday, June 27, 2019 at 12:05 PM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] Troubleshooting error with gradunwarp

Dear all,

I'm trying to troubleshoot an error that I'm getting when running the v4.0 
PreFreeSurferPipelineBatch. I initially thought the error arose from 
gradunwarp, but I don't think that is the case anymore. This is the error I am 
receiving:

'/media/functionalspinelab/RAID/Data/Dinal/mystudy/NSI_21/T2w/T2wToT1wDistortionCorrectAndReg/FieldMap/TopupField.nii.gz'
 and 
'/media/functionalspinelab/RAID/Data/Dinal/mystudy/NSI_21/T2w/T2wToT1wDistortionCorrectAndReg/FieldMap/TopupField.nii.gz'
 are the same file

I have attached the full output to stdout. Any insights?

Kind regards,
Dinal Jayasekera

PhD Candidate | InSITE Fellow
Ammar Hawasli Lab
Department of Biomedical Engineering 
| Washington University in St. Louis

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users




Re: [HCP-Users] A question about scene file

2019-06-21 Thread Harms, Michael

As part of some other updates to the StructuralQC scenes themselves, I recently 
created a more sophisticated generation script that should use relative paths 
within the scene.

Please try the just updated master branch of
https://github.com/Washington-University/StructuralQC

We have tested this internally, but it would be good to get feedback from an 
outside user before I formally tag a new release.
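
For reference, one way to pick up the updated master branch (the local path is 
a placeholder):

git clone https://github.com/Washington-University/StructuralQC.git
# or, if you already have a clone:
cd /path/to/StructuralQC && git pull origin master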

Thanks,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.  Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Aaron C 

Date: Friday, June 21, 2019 at 12:23 PM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] A question about scene file

Dear HCP experts,

The scene file (a structural processing QC scene) I generated on a Linux 
computer doesn't work on a Windows computer. It seems that the file paths were 
hard-coded in the scene file. Is there a way to make it more portable? Thank 
you.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users




Re: [HCP-Users] film_gls error pinv() svd()

2019-06-18 Thread Harms, Michael

Not really.  You could ask on the FSL list.

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.  Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From: Moataz Assem 
Date: Tuesday, June 18, 2019 at 8:00 AM
To: "Glasser, Matthew" , "Harms, Michael" 
, "hcp-users@humanconnectome.org" 

Subject: RE: [HCP-Users] film_gls error pinv() svd()

Nope, Matt, no empty EVs, and the estimation worked fine for the subcortex with 
the exact same design.

Removing ‘--sa --ms=15 --epith=5’ did make it work. Any idea what this would 
indicate?

Moataz

From: Glasser, Matthew [mailto:glass...@wustl.edu]
Sent: 18 June 2019 02:54
To: Harms, Michael ; Moataz Assem 
; hcp-users@humanconnectome.org
Subject: Re: [HCP-Users] film_gls error pinv() svd()

Is there anything weird about your design like empty EVs?

Matt.

From: <hcp-users-boun...@humanconnectome.org> on behalf of "Harms, Michael" <mha...@wustl.edu>
Date: Monday, June 17, 2019 at 12:55 PM
To: Moataz Assem <moataz.as...@mrc-cbu.cam.ac.uk>, "hcp-users@humanconnectome.org"
Subject: Re: [HCP-Users] film_gls error pinv() svd()


Hmmm.  Assuming the problem is reproducible, I think you’ll have to report 
this to the FSL list.  To provide additional information, it might be helpful 
to hack the TaskfMRILevel1.sh script to not use the ‘--sa --ms=15 --epith=5’ 
flags in the call to ‘film_gls’ (or, perhaps easier, just try running the 
modified film_gls call directly from the command line).  And if that still 
fails, try turning off the autocorrelation estimation entirely with the 
‘--noest’ flag.

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.  Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From: <hcp-users-boun...@humanconnectome.org> on behalf of Moataz Assem <moataz.as...@mrc-cbu.cam.ac.uk>
Date: Monday, June 17, 2019 at 9:32 AM
To: "hcp-users@humanconnectome.org"
Subject: [HCP-Users] film_gls error pinv() svd()

Hi,

I get the following fsl error related to the beta estimations (while running 
film_gls):

Prewhitening and Computing PEs...
Percentage done:
1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,
error: pinv(): svd failed

pinv(): svd failed

This only happens for one run (in one subject), and only while estimating the 
left surface (the subcortical estimation ran fine). The design matrix doesn’t 
contain any NaNs, nor does the timeseries for that hemisphere (and 
std(timeseries) did not give any zeros).

I am wondering if you can suggest other problems to check for?

Thanks

Moataz



___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users



Re: [HCP-Users] film_gls error pinv() svd()

2019-06-17 Thread Harms, Michael

Hmmm.  Assuming the problem is reproducible, I think you’ll have to report 
this to the FSL list.  To provide additional information, it might be helpful 
to hack the TaskfMRILevel1.sh script to not use the ‘--sa --ms=15 --epith=5’ 
flags in the call to ‘film_gls’ (or, perhaps easier, just try running the 
modified film_gls call directly from the command line).  And if that still 
fails, try turning off the autocorrelation estimation entirely with the 
‘--noest’ flag.
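
For example, one way to find and hand-modify that call (the script path below 
assumes the standard HCPpipelines layout; adjust to your installation):

grep -n "film_gls" ${HCPPIPEDIR}/TaskfMRIAnalysis/scripts/TaskfMRILevel1.sh
# copy the printed command, drop the '--sa --ms=15 --epith=5' options (or
# append --noest), and re-run it from the command line against the same inputs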

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.  Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Moataz Assem 

Date: Monday, June 17, 2019 at 9:32 AM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] film_gls error pinv() svd()

Hi,

I get the following fsl error related to the beta estimations (while running 
film_gls):

Prewhitening and Computing PEs...
Percentage done:
1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,
error: pinv(): svd failed

pinv(): svd failed

This only happens for one run (in one subject), and only while estimating the 
left surface (the subcortical estimation ran fine). The design matrix doesn’t 
contain any NaNs, nor does the timeseries for that hemisphere (and 
std(timeseries) did not give any zeros).

I am wondering if you can suggest other problems to check for?

Thanks

Moataz



___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users




Re: [HCP-Users] Training data set for multi-run fix

2019-06-14 Thread Harms, Michael

Hi,
I’m not sure if the new training file is ready for public release yet, but just 
to clarify, to clean short fMRI runs, you really need to use multi-run FIX to 
get better separation of the signal/noise components.  The new training file 
will help around the margins, but using MR-FIX is critical.
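
For reference, multi-run FIX is run via the hcp_fix_multi_run script that ships 
with the pipelines; rather than guess its full argument list here, note that 
running it without arguments should print its usage (the path assumes the 
standard HCPpipelines layout):

${HCPPIPEDIR}/ICAFIX/hcp_fix_multi_run
# the fMRI runs are passed as a single '@'-separated list so that they are
# concatenated and cleaned together, which is what improves the separation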

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.  Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of ASHISH SAHIB 

Date: Friday, June 14, 2019 at 12:04 PM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] Training data set for multi-run fix

Hello
 We are an HCP disease connectome site.
At the HCP-Investigators meeting it was mentioned that a new training data set 
for multi-run FIX would be available that could be used to clean short (~6 min) 
fMRI runs. If this training data for FIX is already available, could anyone 
provide the link to download it? We are in the process of running the 
preprocessing for our fMRI data, and it would be of great help to have the new 
training data, or to know whether the previous HCP_hp2000.RData is good enough 
to perform the cleaning.


Thanks
Ashish Sahib

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users




Re: [HCP-Users] RE-POSTING: Duplicate subjects in Connectome in a Box?

2019-06-07 Thread Harms, Michael

Are you sure that there is duplication of actual *data* on the different 
drives?  Just because a subject ID appears on multiple drives doesn’t mean that 
data under that subject ID is the same on the drives.
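
One way to check, sketched with hypothetical mount points and an example 
subject ID:

sub=100307                                   # example subject ID
diff -rq /mnt/sde1/${sub} /mnt/sdi1/${sub}   # lists files that differ or exist on only one drive
# or compare checksums file-by-file:
( cd /mnt/sde1/${sub} && find . -type f -exec md5sum {} + | sort -k 2 ) > /tmp/sde1_${sub}.md5
( cd /mnt/sdi1/${sub} && find . -type f -exec md5sum {} + | sort -k 2 ) > /tmp/sdi1_${sub}.md5
diff /tmp/sde1_${sub}.md5 /tmp/sdi1_${sub}.md5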

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.  Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of "Glasser, Matthew" 

Date: Friday, June 7, 2019 at 6:01 PM
To: "Fales, Christina L" , 
"hcp-users@humanconnectome.org" 
Cc: "Cler, Eileen" 
Subject: Re: [HCP-Users] RE-POSTING: Duplicate subjects in Connectome in a Box?


  1.  I don’t know, but perhaps someone from NRG can answer.
  2.  Those are different smoothing levels.  I do not recommend using more than 
s4, and personally I don't use anything but s2.  See Coalson et al. 2018 PNAS 
(https://www.pnas.org/content/115/27/E6356.short) for the deleterious effects 
of spatial smoothing, which are worst for smoothing in the 3D volume but still 
problematic on the surface when large kernels are used.

Matt.

From:  on behalf of "Fales, Christina L" 

Date: Friday, June 7, 2019 at 5:40 PM
To: "hcp-users@humanconnectome.org" 
Cc: "Fales, Christina L" 
Subject: [HCP-Users] RE-POSTING: Duplicate subjects in Connectome in a Box?

Hi HCP gurus:

I’m reposting the following question(s) in hopes that one of you knows the 
answer.

(1)
Looking through the data, it appears that there is duplication among the drives 
delivered. Is that intentional? We received 12 drives, but four of them appear 
to be duplicates. Specifically, in the pairs below, every subject on the first 
drive (e.g., “sde1”) also occurs on the second (“sdi1”), and the files look the 
same (i.e., they have the same file sizes). What is the difference between the 
subjects on the corresponding drives in each pair?

sde1, sdi1
sdd1, sdh1
sdc1, sdg1
sdf1, sdk1

Subjects on drives sdb1, sdj1, sdbl1, and sdbm1 appear to be unique.

(2)
I cannot find anywhere an explanation of the differences between 
“analysis_s12”, “analysis_s8”, “analysis_s4”, and “analysis_s2”. What is the 
difference between these?

Thanks very much….
-Christina Fales

Christina Fales, PhD
Research Scientist
Division of Psychiatry Research
Zucker Hillside Hospital
Feinstein Institute for Medical Research
Glen Oaks, NY 11004

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users



Re: [HCP-Users] A minor question regarding HCP 7T movie data

2019-06-07 Thread Harms, Michael

Hi,
It isn’t surprising that data will be missing in some subjects, due either to 
problems at the scanner, or problems during processing.

Cheers,
-MH


--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.  Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Reza Rajimehr 

Date: Friday, June 7, 2019 at 11:00 AM
To: Keith Jamison , hcp-users 
Subject: Re: [HCP-Users] A minor question regarding HCP 7T movie data

Thanks Keith! I am currently working with the Movie Task fMRI 2mm/32k 
FIX-Denoised (Compact) dataset, and I noticed the issues below:

1) Two subjects (126931 and 74) do not have MSMAll-registered time-series 
data.

2) Five subjects (181636, 473952, 536647, 552241, and 973770) have two runs 
(MOVIE1 and MOVIE2).

3) One subject (585256) has three runs (MOVIE1, MOVIE3, and MOVIE4).

Best,
Reza

On Mon, Jun 3, 2019 at 8:33 PM Keith Jamison <kjami...@umn.edu> wrote:
Hi Reza,

Your interpretation of the timing is correct. The validation segment in each 
movie scan was shifted by 40-200ms for the "v2" version. This was done in order 
to make that final 83 second clip begin precisely at the start of the next TR 
for all 4 movie sessions.

Given HRF variability, ignoring this change will probably not adversely impact 
most analyses.

-Keith


On Fri, May 31, 2019 at 9:05 PM Reza Rajimehr <rajim...@gmail.com> wrote:
Hi,

The HCP S1200 Reference Manual says that there are two versions of 4 movie 
files. Some subjects have been scanned with one version, and the remaining 
subjects have been scanned with another version. Version 2 includes these 
changes:

7T_MOVIE1_CC1_v2 remains unchanged.

7T_MOVIE2_HO1_v2 removed 1 frame of rest before the validation clip.

7T_MOVIE3_CC2_v2 removed 5 frames of rest before the validation clip.

7T_MOVIE4_HO2_v2 added 4 frames of rest before the validation clip.

Here *frame* is a frame of the movie, right?

Assuming that the movies are ~25 frames per second, the deviations are between 
0 and ~200 ms in each movie file. Considering the TR of one second and the slow 
hemodynamic BOLD responses, these deviations are negligible, and we can 
possibly ignore them when concatenating time-series data across all subjects. 
Would you agree?
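
As a quick check of that arithmetic (using the ~25 fps assumption above; the 
true frame rate may differ slightly):

for frames in 1 4 5; do echo "${frames} frame(s) -> $(( frames * 1000 / 25 )) ms"; done
# -> 40, 160, 200 ms, i.e. a small fraction of the 1 s TR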

Thanks,
Reza

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users



Re: [HCP-Users] Error while running "GenericfMRIVolumeProcessingPipeline.sh"

2019-06-05 Thread Harms, Michael

Try the very latest master, which should resolve this issue.  (I was just 
working on this 30 min ago).  Please let us know if it works.

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.  Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Joseph Orr 

Date: Wednesday, June 5, 2019 at 1:52 PM
To: Simon Wein 
Cc: HCP Users , Wilhelm Malloni 

Subject: Re: [HCP-Users] Error while running 
"GenericfMRIVolumeProcessingPipeline.sh"

We ran into the same problem with flirt. In FSL 6.0.1, flirt can't take a 4D 
file as a reference, but it could in FSL v5. Adding a line before line 289 to 
extract the first volume of PhaseTwo_gdc, and modifying the flirt call 
accordingly, fixes the problem:

${FSLDIR}/bin/fslroi ${WD}/PhaseTwo_gdc ${WD}/PhaseTwo_gdc_slice -1 -1 -1 -1 -1 -1 0 1
${FSLDIR}/bin/flirt -dof 6 -interp spline -in ${WD}/SBRef.nii.gz -ref ${WD}/PhaseTwo_gdc_slice -omat ${WD}/SBRef2PhaseTwo_gdc.mat -out ${WD}/SBRef2PhaseTwo_gdc
--
Joseph M. Orr, Ph.D.
Assistant Professor
Department of Psychological and Brain Sciences
Texas A&M Institute for Neuroscience
Texas A&M University
College Station, TX


On Sat, May 25, 2019 at 8:48 AM Simon Wein <simon.w...@psychologie.uni-regensburg.de> wrote:
Thank you very much for your support.
In our pipeline we have set $GradientDistortionCoeffs = "NONE", so line 189 in

https://github.com/Washington-University/HCPpipelines/blob/master/global/scripts/TopupPreprocessingAll.sh

was unfortunately skipped. But we could avoid the error by modifying line 196 
to:

fslmaths ${WD}/BothPhases.nii.gz -mul 0 -add 1 ${WD}/Mask


Mask.nii.gz then has properties:

filename         Mask.nii.gz
size of header   348
data_type        FLOAT32
dim0             4
dim1             104
dim2             104
dim3             72
dim4             6
dim5             1
dim6             1
dim7             1
vox_units        mm
time_units       s
datatype         16
nbyper           4
bitpix           32
pixdim0          1.00
pixdim1          2.00
pixdim2          2.00
pixdim3          2.00
pixdim4          7.70
pixdim5          0.00
pixdim6          0.00
pixdim7          0.00
vox_offset       352
cal_max          0.00
cal_min          0.00
scl_slope        1.00
scl_inter        0.00
phase_dim        0
freq_dim         0
slice_dim        0
slice_name       Unknown
slice_code       0
slice_start      0
slice_end        0
slice_duration   0.00
toffset          0.00
intent           Unknown
intent_code      0
intent_name
intent_p1        0.00
intent_p2        0.00
intent_p3        0.00
qform_name       Scanner Anat
qform_code       1
qto_xyz:1        -1.995593 0.092717 0.094922 93.702583
qto_xyz:2        -0.105543 -1.976261 -0.288537 118.937881
qto_xyz:3        0.080419 -0.292910 1.976799 -43.901993
qto_xyz:4        0.00 0.00 0.00 1.00
qform_xorient    Right-to-Left
qform_yorient    Anterior-to-Posterior
qform_zorient    Inferior-to-Superior
sform_name       Scanner Anat
sform_code       1
sto_xyz:1        -1.995594 0.092714 0.094922 93.702583
sto_xyz:2        -0.105540 -1.976261 -0.288537 118.937881
sto_xyz:3        0.080420 -0.292910 1.976799 -43.901993
sto_xyz:4        0.00 0.00 0.00 1.00
sform_xorient    Right-to-Left
sform_yorient    Anterior-to-Posterior
sform_zorient    Inferior-to-Superior
file_type        NIFTI-1+
file_code        1
descrip          6.0.1
aux_file
This avoids the previously mentioned problem with fslmaths in FSL 6.0+.
Would you consider this a good workaround, or would you recommend something 
else?

But at a later stage, we got another error:

Image Exception : #75 :: 3D only method called by higher-dimensional volume.
3D only method called by higher-dimensional volume.
Could not open matrix file 
/loctmp/CUDA/DATA/VP_101/rfMRI_REST1_PA/DistortionCorrectionAndEPIToT1wReg_FLIRTBBRAndFreeSurferBBRbased/FieldMap/SBRef2PhaseTwo_gdc.mat
Cannot read input-matrix

The file "SBRef2PhaseTwo_gdc.mat" was not created, so the problem might be in 
line 289 in "TopupPreprocessingAll.sh":

${FSLDIR}/bin/flirt -dof 6 -interp spline -in ${WD}/SBRef.nii.gz -ref ${WD}/PhaseTwo_gdc -omat ${WD}/SBRef2PhaseTwo_gdc.mat -out ${WD}/SBRef2PhaseTwo_gdc

Properties of SBRef.nii.gz are:

filename         SBRef.nii.gz
size of header   348
data_type        UINT16
dim0             3
dim1             104
dim2             104
dim3             72
dim4             1
dim5             0
dim6             0
dim7             0
vox_units        mm
time_units       s
datatype         512
nbyper           2
bitpix           16
pixdim0          1.00
pixdim1          2.00
pixdim2          2.00
pixdim3          2.00
pixdim4

Re: [HCP-Users] Inquiry about data

2019-05-30 Thread Harms, Michael

Hi,
You need to complete the process for “Restricted Access” in ConnectomeDB, and 
then you’ll have access to the more granular age information.

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.  Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From: 温景熙 
Date: Thursday, May 30, 2019 at 9:24 PM
To: "Glasser, Matthew" , "Harms, Michael" 
, "hcp-users@humanconnectome.org" 

Cc: "tang...@csu.edu.cn" 
Subject: Inquiry about data

Dear Prof.
I am from Central South University. Thank you very much for providing the 
Connectome Coordination Facility platform and for collecting so much data and 
making it public. This gives brain researchers great convenience and a large 
amount of data, which is very important and meaningful. I am honored to be 
able to use these data (HCP Young Adult, 1200 Subjects, Age 22-35). In my 
previous work, I have worked with these data and published a paper ("Brain 
Differences Between Men and Women: Evidence From Deep Learning", Front. 
Neurosci., 08 March 2019).

Next, I want to further study differences in the brain across ages. However, 
the HCP database only provides age-range information, such as 22-25. What I 
need is a specific age, such as whether someone is 23 or 32 years old. Could 
you provide me with a table of specific ages? I guarantee that the age 
information will not be used in a way that infringes on privacy, and I will 
keep these data absolutely confidential. I look forward to hearing from you. 
Wish you a happy life!

Yours Sincerely

Wen Jingxi



___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Group-averaging of task fMRI data

2019-05-30 Thread Harms, Michael

Ah yes, that was probably indeed the case upon further reflection.  Thanks for 
the correction.

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.  Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From: "Glasser, Matthew" 
Date: Thursday, May 30, 2019 at 3:52 PM
To: Reza Rajimehr , "Harms, Michael" , 
Nooshin Abbasi 
Cc: hcp-users 
Subject: Re: [HCP-Users] Group-averaging of task fMRI data

Hi Mike,

My recollection was that the unnamed .dscalar.nii files were zstats, not beta 
maps.  I added the beta maps later when it became clear that using statistical 
significance maps was inappropriate for parcellation.

Matt.

From:  on behalf of Reza Rajimehr 

Date: Thursday, May 30, 2019 at 12:30 PM
To: "Harms, Michael" , Nooshin Abbasi 

Cc: hcp-users 
Subject: Re: [HCP-Users] Group-averaging of task fMRI data

Thanks Michael for your detailed and helpful answers.

Best,
Reza


On Thu, May 30, 2019 at 6:10 PM Harms, Michael <mha...@wustl.edu> wrote:

Hi Reza,

1) We’ve already generated Cohen’s d-style effect size maps for all contrasts, 
using all subjects, as part of the “Group Average Dataset” available at 
https://db.humanconnectome.org/data/projects/HCP_1200.  If you need it computed 
for a specific subset of subjects, then yes, you can use the approach that you 
outlined.  Note that the ensuing “effect size” does not account for the family 
structure in the data (i.e., to the extent that the estimate of the std across 
subjects is biased by the family structure, then the estimate of the effect 
size is biased as well).

2) A .dtseries.nii file is still a “spatial map”.  We just didn’t bother to 
formally convert those particular outputs to a .dscalar.nii (e.g., via 
-cifti-change-mapping).  A dscalar version of all the copes (merged into a 
single file) for a given task and subject are available in the root level of 
the .feat directory containing the Level2 task analysis results for that task 
and subject.  In newer pipeline versions, we create separate merged files for 
both the “zstat” and “cope” files of the individual contrasts.  However, at the 
time of the processing of the HCP-YA data, only a single merged dscalar was 
created, and that was for the copes (and it does not unfortunately have “cope” 
as part of its filename).

3) We recommend using PALM for group statistical analysis.  You can find a 
tutorial in the “tfMRI and PALM” practical available as part of the HCP Course: 
https://store.humanconnectome.org/courses/2018/exploring-the-human-connectome.php.
  And no, you generally do *not* want to use the individual subject “zstat1” 
maps as inputs to a statistical computation, which would be “computing 
statistics of a statistic” (rather than the statistic of an effect size).

4) The outputs produced are simply the same as those produced by FSL’s FLAMEO, 
albeit in CIFTI rather than NIFTI format.  So see FSL’s FLAMEO documentation.

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.  Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From: <hcp-users-boun...@humanconnectome.org> on behalf of Reza Rajimehr <rajim...@gmail.com>
Date: Wednesday, May 29, 2019 at 7:36 PM
To: hcp-users <hcp-users@humanconnectome.org>
Subject: [HCP-Users] Group-averaging of task fMRI data

Hi,

For a group of subjects (e.g. 100 subjects in HCP S1200), we want to generate a 
group-average Cohen’s d map for a particular contrast in the working memory 
task. For this, we take level2 “cope1.dtseries.nii” file in cope20.feat folder 
of all those subjects, merge them using -cifti-merge, then -cifti-reduce mean, 
-cifti-reduce stdev, and -cifti-math mean/stdev.

Questions:

1) Is the above procedure correct? Or you recommend other commands?

2) Why the file name is cope1.dtseries.nii when it is not a time-series data? 
Why not naming it cope1.dscalar.nii, as it is a spatial map?

3) If we want to generate a group-average zstat map, what should we do? I guess 
it should involve using the “zstat1.dtseries.nii” files, but don’t know how.

4) Is there any documentation somewhere describing all the files within the 
cope folder? E.g. these files:

mean_random_effects_var1.dtseries.nii
pe1.dtseries.nii
res4d.dtseries.nii
tdof_t1.dtseries.nii
tstat1.dtseries.nii
varcope1.dtseries.nii
weights1.dtseries.nii

Thanks,
Reza


Re: [HCP-Users] Group-averaging of task fMRI data

2019-05-30 Thread Harms, Michael

Hi Reza,

1) We’ve already generated Cohen’s d-style effect size maps for all contrasts, 
using all subjects, as part of the “Group Average Dataset” available at 
https://db.humanconnectome.org/data/projects/HCP_1200.  If you need it computed 
for a specific subset of subjects, then yes, you can use the approach that you 
outlined.  Note that the ensuing “effect size” does not account for the family 
structure in the data (i.e., to the extent that the estimate of the std across 
subjects is biased by the family structure, then the estimate of the effect 
size is biased as well).

2) A .dtseries.nii file is still a “spatial map”.  We just didn’t bother to 
formally convert those particular outputs to a .dscalar.nii (e.g., via 
-cifti-change-mapping).  A dscalar version of all the copes (merged into a 
single file) for a given task and subject are available in the root level of 
the .feat directory containing the Level2 task analysis results for that task 
and subject.  In newer pipeline versions, we create separate merged files for 
both the “zstat” and “cope” files of the individual contrasts.  However, at the 
time of the processing of the HCP-YA data, only a single merged dscalar was 
created, and that was for the copes (and it does not unfortunately have “cope” 
as part of its filename).

3) We recommend using PALM for group statistical analysis.  You can find a 
tutorial in the “tfMRI and PALM” practical available as part of the HCP Course: 
https://store.humanconnectome.org/courses/2018/exploring-the-human-connectome.php.
  And no, you generally do *not* want to use the individual subject “zstat1” 
maps as inputs to a statistical computation, which would be “computing 
statistics of a statistic” (rather than the statistic of an effect size).

4) The outputs produced are simply the same as those produced by FSL’s FLAMEO, 
albeit in CIFTI rather than NIFTI format.  So see FSL’s FLAMEO documentation.
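
As a concrete illustration of the merge/reduce/math recipe from point 1, here 
is a minimal sketch; the input list is a placeholder for however you gather 
each subject's level-2 cope1.dtseries.nii for the contrast of interest:

copefiles=(subj1_cope1.dtseries.nii subj2_cope1.dtseries.nii subj3_cope1.dtseries.nii)
args=(); for f in "${copefiles[@]}"; do args+=(-cifti "$f"); done
wb_command -cifti-merge all_copes.dtseries.nii "${args[@]}"
wb_command -cifti-reduce all_copes.dtseries.nii MEAN mean.dscalar.nii
wb_command -cifti-reduce all_copes.dtseries.nii STDEV stdev.dscalar.nii
wb_command -cifti-math 'mean / stdev' cohens_d.dscalar.nii -var mean mean.dscalar.nii -var stdev stdev.dscalar.nii
# (see the help for wb_command -cifti-reduce for the full list of reduction operations)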

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.  Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Reza Rajimehr 

Date: Wednesday, May 29, 2019 at 7:36 PM
To: hcp-users 
Subject: [HCP-Users] Group-averaging of task fMRI data

Hi,

For a group of subjects (e.g. 100 subjects in HCP S1200), we want to generate a 
group-average Cohen’s d map for a particular contrast in the working memory 
task. For this, we take level2 “cope1.dtseries.nii” file in cope20.feat folder 
of all those subjects, merge them using -cifti-merge, then -cifti-reduce mean, 
-cifti-reduce stdev, and -cifti-math mean/stdev.

Questions:

1) Is the above procedure correct? Or you recommend other commands?

2) Why the file name is cope1.dtseries.nii when it is not a time-series data? 
Why not naming it cope1.dscalar.nii, as it is a spatial map?

3) If we want to generate a group-average zstat map, what should we do? I guess 
it should involve using the “zstat1.dtseries.nii” files, but don’t know how.

4) Is there any documentation somewhere describing all the files within the 
cope folder? E.g. these files:

mean_random_effects_var1.dtseries.nii
pe1.dtseries.nii
res4d.dtseries.nii
tdof_t1.dtseries.nii
tstat1.dtseries.nii
varcope1.dtseries.nii
weights1.dtseries.nii

Thanks,
Reza

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users




Re: [HCP-Users] PRISM DWI Parameter Request

2019-05-30 Thread Harms, Michael

Hi,
All those details, including an importable VE11C protocol, are available at 
http://protocols.humanconnectome.org/

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.  Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of "Rosso, Isabelle M." 

Date: Thursday, May 30, 2019 at 10:08 AM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] PRISM DWI Parameter Request

Hello,

I am emailing you, as recommended by the HCP_S1200 User Manual, to request the 
diffusion parameters and gradient table that we would need to run the HCP 
diffusion sequence on a 3T PRISMA.
Could you kindly tell me what changes in TE, resolution, and b-values would be 
needed for the PRISMA?

Thank you in advance.

Isabelle Rosso

-- -- -- -- -- --
Isabelle M Rosso, PhD
Associate Professor of Psychology, Harvard Medical School
Director, Anxiety and Traumatic Stress Disorders Laboratory
Licensed Psychologist and Health Service Provider

McLean Hospital
Center for Depression, Anxiety and Stress Research
115 Mill Street
Belmont, MA 02478
Tel. 617-855-2607
Fax 617-855-4231
iro...@hms.harvard.edu
iro...@partners.org



___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users




Re: [HCP-Users] Downloading movement regressors from HYA release

2019-05-29 Thread Harms, Michael

Which specific files (i.e., which runs) is curl not finding for those remaining 
subjects?

Mike Hodge will likely need to investigate further, because I’m able to find 
the expected files on our local file systems for 101006 at least.  It may be 
something specific to how REST works, with perhaps a small number of files not 
“in sync” between what exists on the file systems vs. what the database thinks 
is available (just guessing…).

You could simply download the MPP packages for the small number of remaining 
subjects through ConnectomeDB.

Cheers,
-MH


--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.  Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From: Leonardo Tozzi 
Date: Wednesday, May 29, 2019 at 1:44 PM
To: "Harms, Michael" , "hcp-users@humanconnectome.org" 

Subject: Re: [HCP-Users] Downloading movement regressors from HYA release


I did and actually some subjects worked, but not these ones: 101006 159441 
926862 927359 942658
I still get a “cannot find file” error (10.4.5).

Leonardo Tozzi, MD, PhD
Williams PanLab | Postdoctoral Researcher
Stanford University | 401 Quarry Rd
lto...@stanford.edu | (650) 5615738


From: "Harms, Michael" 
Date: Wednesday, May 29, 2019 at 10:39 AM
To: Leonardo Tozzi , "hcp-users@humanconnectome.org" 

Subject: Re: [HCP-Users] Downloading movement regressors from HYA release


Did you try downloading again, just in case there was a transfer glitch?  
Because the Movement_RelativeRMS.txt files appear to be present in the database 
for all 4 rfMRI runs of 101006 (the one subject I checked).

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.  Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From: Leonardo Tozzi 
Date: Wednesday, May 29, 2019 at 11:40 AM
To: "Harms, Michael" , "hcp-users@humanconnectome.org" 

Subject: Re: [HCP-Users] Downloading movement regressors from HYA release

Dear Michael,

I have now downloaded the majority of subjects with:

curl -o …/${sub}_${run}_Movement_RelativeRMS.txt -u USER:PASS 
https://db.humanconnectome.org/data/projects/HCP_1200/subjects/${sub}/experiments/${sub}_CREST/files/MNINonLinear/Results/${run}/Movement_RelativeRMS.txt


However, these subjects still seem to be missing: 101006 127226 148335 159441 
188145 204016 285345 316835 368551 926862 927359 942658



Is that normal?

Thank you,

Leonardo Tozzi, MD, PhD
Williams PanLab | Postdoctoral Researcher
Stanford University | 401 Quarry Rd
lto...@stanford.edu | (650) 5615738


From: "Harms, Michael" 
Date: Wednesday, May 15, 2019 at 9:31 AM
To: Leonardo Tozzi , "hcp-users@humanconnectome.org" 

Subject: Re: [HCP-Users] Downloading movement regressors from HYA release


Hi,
The Movement_RelativeRMS.txt files aren’t part of the specific “${run}_FIX” 
resource that you tried to reference.

If you have access to outputs in our standard file/directory organization 
(e.g., Amazon S3, Connectome-in-a-Box, or our downloaded “packages”), the 
Movement_RelativeRMS.txt files can be found at 
${subj}/MNINonLinear/Results/${run}.

If not, the “CREST” resource (search the previous HCP-Users emails) is what we 
intend users to use; it mimics the standard file/directory organization in 
terms of where files are located.

See FAQ 15 here
https://wiki.humanconnectome.org/display/PublicData/HCP+Users+FAQ
and the archived user list posts referenced therein.

Regarding your item (2), I just confirmed that Movement_Regressors.txt exists 
for all 4 REST runs for all 6 of those subjects at MNINonLinear/Results/${run} 
in our unpacked packages, and thus should exist in Amazon S3, 
Connectome-in-a-Box, and via the CREST resource as well.

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.  Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Leonardo Tozzi 

Date: Tuesday, May 14, 2019 at 3:50 PM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] Downloading movement regressors from HYA release

Dear Experts,

Similarly to what

Re: [HCP-Users] Downloading movement regressors from HYA release

2019-05-29 Thread Harms, Michael

Did you try downloading again, just in case there was a transfer glitch?  
Because the Movement_RelativeRMS.txt files appear to be present in the database 
for all 4 rfMRI runs of 101006 (the one subject I checked).

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.  Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From: Leonardo Tozzi 
Date: Wednesday, May 29, 2019 at 11:40 AM
To: "Harms, Michael" , "hcp-users@humanconnectome.org" 

Subject: Re: [HCP-Users] Downloading movement regressors from HYA release

Dear Michael,

I have now downloaded the majority of subjects with:

curl -o …/${sub}_${run}_Movement_RelativeRMS.txt -u USER:PASS 
https://db.humanconnectome.org/data/projects/HCP_1200/subjects/${sub}/experiments/${sub}_CREST/files/MNINonLinear/Results/${run}/Movement_RelativeRMS.txt


However, these subjects still seem to be missing: 101006 127226 148335 159441 
188145 204016 285345 316835 368551 926862 927359 942658



Is that normal?

Thank you,

Leonardo Tozzi, MD, PhD
Williams PanLab | Postdoctoral Researcher
Stanford University | 401 Quarry Rd
lto...@stanford.edu | (650) 5615738


From: "Harms, Michael" 
Date: Wednesday, May 15, 2019 at 9:31 AM
To: Leonardo Tozzi , "hcp-users@humanconnectome.org" 

Subject: Re: [HCP-Users] Downloading movement regressors from HYA release


Hi,
The Movement_RelativeRMS.txt files aren’t part of the specific “${run}_FIX” 
resource that you tried to reference.

If you have access to outputs in our standard file/directory organization 
(e.g., Amazon S3, Connectome-in-a-Box, or our downloaded “packages”), the 
Movement_RelativeRMS.txt files can be found at 
${subj}/MNINonLinear/Results/${run}.

If not, the “CREST” resource (search the previous HCP-Users emails) is what we 
intend users to use; it mimics the standard file/directory organization in 
terms of where files are located.

See FAQ 15 here
https://wiki.humanconnectome.org/display/PublicData/HCP+Users+FAQ
and the archived user list posts referenced therein.

Regarding your item (2), I just confirmed that Movement_Regressors.txt exists 
for all 4 REST runs for all 6 of those subjects at MNINonLinear/Results/${run} 
in our unpacked packages, and thus should exist in Amazon S3, 
Connectome-in-a-Box, and via the CREST resource as well.

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.  Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Leonardo Tozzi 

Date: Tuesday, May 14, 2019 at 3:50 PM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] Downloading movement regressors from HYA release

Dear Experts,

Similarly to what described in this thread:

https://www.mail-archive.com/hcp-users@humanconnectome.org/msg01342.html

I have downloaded motion regressors for files from the healthy young adult 
release with the following command:

curl -o …/${sub}_${run}_Movement_Regressors.txt -u USER:PASSWORD 
https://db.humanconnectome.org/data/projects/HCP_1200/subjects/${sub}/experiments/${sub}_3T/resources/${run}_FIX/files/${run}/Movement_Regressors.txt

However, I have 2 follow-up questions.

  1.  I would like to download the relative motion file, 
Movement_RelativeRMS.txt, to do some motion censoring on the resting state 
data. But if I try the same command with Movement_RelativeRMS.txt instead of 
Movement_Regressors.txt I get an error saying that the file does not exist.
  2.  For some subjects, I can’t find even the Movement_Regressors.txt. Is 
there any reason for this? These are the IDs: '127226' '130114' '169040' 
'329844' '908860' '971160'

Thank you,


Leonardo Tozzi, MD, PhD
Williams PanLab | Postdoctoral Fellow
Stanford University | 401 Quarry Rd
lto...@stanford.edu | (650) 5615738


___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users



Re: [HCP-Users] A question about the HCP pipeline 4.0.0

2019-05-17 Thread Harms, Michael

Hmmm.  What is the specific error message that you are getting?  In the 4.0.0 
pipelines, the necessary custom scripts that you mentioned are provided with 
the pipeline distribution, and are contained in the 
HCPpipelines/FreeSurfer/custom directory.

It sounds like something is not configured correctly; e.g., perhaps your 
$HCPPIPEDIR variable is not actually set to the location of the 4.0.0 
distribution?
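
A quick way to check (the custom directory is the one mentioned above):

echo ${HCPPIPEDIR}
ls ${HCPPIPEDIR}/FreeSurfer/custom/
# should list recon-all.v6.hires, conf2hires, and longmc; if it doesn't,
# HCPPIPEDIR is not pointing at the 4.0.0 distribution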

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.  Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Aaron C 

Date: Friday, May 17, 2019 at 11:05 AM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] A question about the HCP pipeline 4.0.0

Dear HCP experts,

I have a question about the HCP pipeline 4.0.0. When using 
FreeSurferPipelineBatch.sh, it cannot find recon-all.v6.hires, conf2hires, and 
longmc. Is there a specific version of FreeSurfer 6.0 for HCP? Thank you.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users




Re: [HCP-Users] volumetric segmentation of subcortical structures

2019-05-16 Thread Harms, Michael

Hi,
If you simply want all the FreeSurfer quantification in tabular form, you can 
select the “Expanded FreeSurfer Data” link under “Quick Downloads”, available 
here https://db.humanconnectome.org/data/projects/HCP_1200

(You will need a ConnectomeDB account).

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.  Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of "Glasser, Matthew" 

Date: Thursday, May 16, 2019 at 6:14 AM
To: "Mazzetti, C. (Cecilia)" , 
"hcp-users@humanconnectome.org" 
Subject: Re: [HCP-Users] volumetric segmentation of subcortical structures

${StudyFolder}/${Subject}/MNINonLinear/wmparc.nii.gz in the structural packages.
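
If you want a quick volume for a single structure out of that file, here is a 
hedged sketch using fslstats; the label value (10 = Left-Thalamus-Proper) is 
taken from the standard FreeSurfer color LUT and should be double-checked 
against $FREESURFER_HOME/FreeSurferColorLUT.txt:

wmparc=${StudyFolder}/${Subject}/MNINonLinear/wmparc.nii.gz
fslstats ${wmparc} -l 9.5 -u 10.5 -V    # prints voxel count and volume (mm^3) for label 10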

Matt.

From:  on behalf of "Mazzetti, C. 
(Cecilia)" 
Date: Thursday, May 16, 2019 at 3:51 AM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] volumetric segmentation of subcortical structures


Dear all,

I am required to come up with a proof of concept for subcortical 
lateralizations found in my study. Ideally, consistent lateralizations derived 
from a bigger dataset such as the HCP would do the job more than fine. A 
colleague told me there should be a file somewhere with structural segmentation 
data (i.e., volumes) already computed for the MRIs in the database. I am 
wondering whether someone knows: 1. if this is true, and 2. if so, whether and 
how it is possible to access it.



Thanks very much in advance to anyone willing to help


Best,
Cecilia



___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users




Re: [HCP-Users] Question on download dMRI.

2019-05-15 Thread Harms, Michael

Hi,
If you want to try downloading via https://db.humanconnectome.org and Aspera, 
simply select the “All Family Subjects” category under “Download Image Data” 
and then select the “Diffusion” modality on the “Download Packages” page.   
That will select all 1065 subjects with Diffusion data – it’s 1.3 TB of data.  
Certainly no need to select the subjects one at a time.

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.  Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Seung Yong Hwang 

Date: Wednesday, May 15, 2019 at 8:59 PM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] Question on download dMRI.

Dear all,

Currently, I am struggling with the diffusion MRI download.

I need to process almost all subjects (1200 subjects) for my research, but I 
can download only about 400 subjects' dMRI data through AWS.
When dMRI data are available on AWS, they are under the directory 
"HCP/subject_id/T1w/Diffusion" (subject_id: 211215, etc.), so I can download 
them with shell and R.

However, if the dMRI data are not on AWS, I need to download them via Aspera 
and click through all 800 subjects (subject_id: 257946, 962058, etc.).
It is almost impossible to download these manually, so I am wondering whether 
anyone knows a way to download these 800 subjects' dMRI data automatically.

Best,
Seungyong
--
Seungyong (SEAN) Hwang
Ph.D Candidate
Graduate Group in Biostatistics
University of California, DAVIS

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users




Re: [HCP-Users] Downloading movement regressors from HYA release

2019-05-15 Thread Harms, Michael

Hi,
The Movement_RelativeRMS.txt files aren’t part of the specific “${run}_FIX” 
resource that you tried to reference.

If you have access to outputs in our standard file/directory organization 
(e.g., Amazon S3, Connectome-in-a-Box, or our downloaded “packages”), the 
Movement_RelativeRMS.txt files can be found at 
${subj}/MNINonLinear/Results/${run}.

If not, the “CREST” resource (search the previous HCP-Users emails) is what we 
intend users to use; it mimics the standard file/directory organization in 
terms of where files are located.

See FAQ 15 here
https://wiki.humanconnectome.org/display/PublicData/HCP+Users+FAQ
and the archived user list posts referenced therein.

Regarding your item (2), I just confirmed that Movement_Regressors.txt exists 
for all 4 REST runs for all 6 of those subjects at MNINonLinear/Results/${run} 
in our unpacked packages, and thus should exist in Amazon S3, 
Connectome-in-a-Box, and via the CREST resource as well.

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.  Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Leonardo Tozzi 

Date: Tuesday, May 14, 2019 at 3:50 PM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] Downloading movement regressors from HYA release

Dear Experts,

Similarly to what described in this thread:

https://www.mail-archive.com/hcp-users@humanconnectome.org/msg01342.html

I have downloaded motion regressors for files from the healthy young adult 
release with the following command:

curl -o …/${sub}_${run}_Movement_Regressors.txt -u USER:PASSWORD 
https://db.humanconnectome.org/data/projects/HCP_1200/subjects/${sub}/experiments/${sub}_3T/resources/${run}_FIX/files/${run}/Movement_Regressors.txt

However, I have 2 follow-up questions.

  1.  I would like to download the relative motion file, 
Movement_RelativeRMS.txt, to do some motion censoring on the resting state 
data. But if I try the same command with Movement_RelativeRMS.txt instead of 
Movement_Regressors.txt I get an error saying that the file does not exist.
  2.  For some subjects, I can’t find even the Movement_Regressors.txt. Is 
there any reason for this? These are the IDs: '127226' '130114' '169040' 
'329844' '908860' '971160'

Thank you,


Leonardo Tozzi, MD, PhD
Williams PanLab | Postdoctoral Fellow
Stanford University | 401 Quarry Rd
lto...@stanford.edu | (650) 5615738


___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users




Re: [HCP-Users] Quick questions

2019-05-14 Thread Harms, Michael

In the case of the 7T data, the 1/2/3/4 are just used to distinguish the 
different resting state runs, acquired in different scan sessions.  The numbers 
are not intended to imply that each scan was acquired on a different day.  We’d 
have to look, but I suspect that 1/2 were typically acquired on one day, and 
3/4 on a different day.

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of "Glasser, Matthew" 

Date: Tuesday, May 14, 2019 at 6:39 PM
To: Yoav Feldman 
Cc: Erez Simony , michael tolochinsky 
, hcp-users 
Subject: Re: [HCP-Users] Quick questions

LR/RL/AP/PA refer to phase encoding directions.  1/2/3/4 refer to acquisition 
days.  I use all of the resting state data when doing analyses.

3T: TR=0.72s, 2x2x2mm, 4x1200 frames
7T: TR=1s, 1.6x1.6x1.6mm, 4x900 frames

Matt.

From: Yoav Feldman 
Date: Tuesday, May 14, 2019 at 1:38 PM
To: "Glasser, Matthew" 
Cc: Erez Simony , michael tolochinsky 

Subject: Quick questions

Dear Matthew,

We are looking at the HCP resting-state data (grayordinate) coming from the 7T and 3T 
magnets, and have a few quick questions:

1) We would like to work on the 3T and 7T results of the same 184 subjects that have both.
For the 7T, there is one result directory per resting-state scan (1-4) for each subject:

[Screen Shot 2019-05-14 at 21.34.14.png]

For the 3T, we see the following two directories per resting-state scan (1 & 2):

[Screen Shot 2019-05-14 at 17.06.57.png]

What are the differences between the two directories, LR and RL? Which directory should 
we use?

2) Where can we find the scan parameters for the 7T and 3T resting-state acquisitions? In 
particular, what is the difference in spatial resolution between the two magnets for 
resting state (in common space)?

We much appreciate your help!

Thank you in advance
Yoav and Michael




___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users




Re: [HCP-Users] MSMAll vs. MSMSulc reliability in our data

2019-05-09 Thread Harms, Michael

While I’m not surprised that the ICCs would be lower for an anatomical-based 
measure for MSMAll than MSMSulc, I am surprised by the magnitude of the change 
(from 0.9 to 0.65), especially for a parcellated analysis, since only changes 
in the precise border of the parcellations should be affecting the results.

Your results imply that the test and retest MSMAll registrations are very 
different from each other.

Wouldn’t the Strain maps for MSMSulc vs MSMAll be informative here?  You might 
also want to examine some sort of measure of the distortion between the two 
MSMAll registrations directly.
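
For example, something along these lines would quantify, per subject and hemisphere, how 
far the MSMAll registration has moved relative to MSMSulc (a sketch only: it assumes your 
Workbench has the -surface-distortion command and that the sphere filenames follow the 
standard native-space naming in MNINonLinear/Native):

# distortion of the MSMAll-registered sphere relative to the MSMSulc-registered sphere
wb_command -surface-distortion \
    ${Subject}.L.sphere.MSMSulc.native.surf.gii \
    ${Subject}.L.sphere.MSMAll.native.surf.gii \
    ${Subject}.L.MSMSulc_to_MSMAll_distortion.native.shape.gii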

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of "Glasser, Matthew" 

Date: Thursday, May 9, 2019 at 4:23 PM
To: Maria Sison , Stephen Smith 
Cc: HCP 讨论组 
Subject: Re: [HCP-Users] MSMAll vs. MSMSulc reliability in our data

Not running sICA+FIX might well be a part of the problem.  The TR is quite long 
as Steve says, which will limit the accuracy of sICA+FIX cleanup some also.  
Also, surface area and thickness might prefer MSMSulc due to correlations with 
folding patterns.  Myelin, task, and resting state fMRI will more tend to 
correlate with MSMAll.

Matt.

From:  on behalf of Maria Sison 

Date: Thursday, May 9, 2019 at 3:40 PM
To: Steve Smith 
Cc: HCP 讨论组 
Subject: Re: [HCP-Users] MSMAll vs. MSMSulc reliability in our data

Thank you so much, this is very helpful and interesting to think about. We 
concatenated rest and tasks and regressed out tasks to get around 1000 TRs of 
pseudo-rest, which we then used for MSMAll. Still not nearly as much as HCP, but I would 
be interested to hear what a ballpark minimum data requirement for MSMAll would be.

Best,
Maria


From: Steve Smith 
Sent: Thursday, May 9, 2019 4:19:24 PM
To: Maria Sison
Cc: HCP 讨论组
Subject: Re: [HCP-Users] MSMAll vs. MSMSulc reliability in our data

Hi - probably the single primary thing is number of timepoints - though things 
like TR and spatial resolution will also affect this.

My guess is still that you probably don't have enough timepoints here to get decent 
single-subject RSN maps (decent enough for MSMAll, that is).  Emma or Matt might have more 
direct insight into the minimum amount of data you need to get MSMAll working well.  
Unless you can combine more of your datasets together (even if just for the purposes of 
MSM), you might be better off with MSMSulc.

Cheers.

ps with this setup I would definitely push multiband at least as high as 6 if 
not 8.








On 9 May 2019, at 15:12, Maria Sison 
mailto:maria.si...@duke.edu>> wrote:

Hello,

Here’s our rfMRI protocol: each participant was scanned using a Siemens Skyra 
3T scanner equipped with a 64-channel head/neck coil. A series of 72 
interleaved axial T2-weighted functional slices were acquired using a 3-fold 
multi-band accelerated echo planar imaging sequence with the following 
parameters: TR = 2000 ms, TE = 27 msec, flip angle = 90°, field-of-view = 200 
mm, voxel size = 2 mm isotropic, slice thickness = 2 mm without gap. Total scan 
length is 496 s.

Out of curiosity, which parameters would be most important for MSMAll?

Thank you,
Maria


From: Steve Smith mailto:st...@fmrib.ox.ac.uk>>
Sent: Thursday, May 9, 2019 3:56:49 PM
To: Maria Sison
Cc: HCP 讨论组
Subject: Re: [HCP-Users] MSMAll vs. MSMSulc reliability in our data

Hi - what is your rfMRI protocol?  You might be right that the difference is in the 
preprocessing - but my first guess would be that, if the rfMRI data are not as high 
quality as the HCP rfMRI data, they might not be good enough to reliably drive MSMAll?

Cheers.





On 9 May 2019, at 14:45, Maria Sison 
mailto:maria.si...@duke.edu>> wrote:

Dear experts,

We have run the HCP minimal preprocessing pipelines on our data (1 mm isotropic 
T1w and FLAIR + rest and 4 tasks) and compared test-retest reliability for 
MSMSulc and MSMAll in 20 subjects. Specifically, we looked at intraclass 
correlations for parcellated cortical thickness and surface area and found that 
they were much lower for MSMAll compared to MSMSulc in our test-retest sample 
(MSMSulc on average above 0.9 and for MSMAll around 0.65 on average). When we 
looked in HCP retest data, the ICCs for MSMAll were more similar to those for 
MSMSulc (both above 0.9), but still slightly lower.

There are a few major differences in how we ran the pipeline. We skipped sICA+FIX and ran 
our own preprocessing on task and rest fMRI after fMRIVolume but before fMRISurface 
(bandpass filtering, motion correction, censoring, CompCor, and task regression). We 
thought our processing would be ok for cleaning task fMRI, but I see 

Re: [HCP-Users] PALM analysis errors with HCP data - data cannot be sorted

2019-04-29 Thread Harms, Michael

Hi,
I suggest directing your inquiry to the FSL-Users list.

Cheers,
-MH

--
Michael Harms, Ph.D.

---

Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

On 4/29/19, 6:15 AM, "hcp-users-boun...@humanconnectome.org on behalf of 
Samuel Berry"  wrote:

Hello,

I have been having some issues when running one-sample PALM analysis on a 4d 
nifti image that contains 3T zmaps from a seed-based correlation analysis. The 
error I get is:

Error using palm_competitive
Data cannot be sorted. Check for NaNs that might be present, or precision 
issues that may cause over/underflow. If you are using “-approx tail”, consider 
adding “nouncorrected”.

Having read some other posts on this error I have taken the following steps.

1) Changed the seed to be ‘twist’
2) Run fslmaths -nan on my data before running the analysis
3) Converted the 4d file to ‘double’ using fslmaths <4dimg>  -odt double
4) Run with and without tail approximation, always with the -nouncorrected 
option.
5) Tried without the EB blocks
6) Used up to 10,000 permutations

Unfortunately I always get the same error. I went into the palm_competitive 
script and saved the variables created and it shows that I do indeed have NaNs 
(from the gg variable). I tried, just to see what happens, to bypass the error 
by replacing any NaNs with the mean of the ’S’ variable which did let me 
progress past the error. However my output is then a fwep image that is all ‘1’ 
and a tstat image that is just NaN. Do you have any idea what I am doing wrong? 
I have been stuck on this for a while now and can’t figure it out!

My PALM input is the following (with the variations I have mentioned above, and 
-n 5 being just for testing).

palm -i <4d_img>, -T, -eb ,  -save1-p,  -n 5, -seed twist, -m 
, -o 

Many thanks,
Sam



___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users





Re: [HCP-Users] "activation" tables for reporting pscalar results

2019-04-24 Thread Harms, Michael

Well, that raises the question of whether surface-based results should just be 
automatically “lumped in” with volume-based results by tools such as neurosynth to begin 
with…

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From: Joseph Orr 
Date: Wednesday, April 24, 2019 at 10:51 AM
To: "Harms, Michael" 
Cc: HCP Users 
Subject: Re: [HCP-Users] "activation" tables for reporting pscalar results

Well I am planning on doing that, but that doesn't necessarily help with 
automated meta-analytic tools like neurosynth that mine for tables.
--
Joseph M. Orr, Ph.D.
Assistant Professor
Department of Psychological and Brain Sciences
Texas A&M Institute for Neuroscience
Texas A&M University
College Station, TX


On Wed, Apr 24, 2019 at 10:36 AM Harms, Michael 
mailto:mha...@wustl.edu>> wrote:

Why not simply report the parcel name and its values?  And consider putting the 
scene on BALSA, so that others can easily access the data.

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: 
mha...@wustl.edu<mailto:mha...@wustl.edu>

From: 
mailto:hcp-users-boun...@humanconnectome.org>>
 on behalf of Joseph Orr mailto:joseph@tamu.edu>>
Date: Wednesday, April 24, 2019 at 10:06 AM
To: HCP Users 
mailto:hcp-users@humanconnectome.org>>
Subject: [HCP-Users] "activation" tables for reporting pscalar results

I am trying to determine the best approach for producing tables of pscalar 
results. I haven't seen any papers reporting pscalar results that have tables, 
but I anticipate reviewers wanting to see these, and tables are critical for 
meta-analyses. Since there aren't peaks, I was thinking of calculating the 
center of mass after converting the significant parcels to a volume. Has anyone 
done this already for the Multi-Modal Parcellation? Or is there a reason I'm not thinking 
of that would make this approach not ideal, or even not valid?
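
For what it's worth, once a significant parcel has been converted to a volume ROI, its 
center of gravity is easy to read out with fslstats; a minimal sketch (the ROI filename is 
just a placeholder) is:

fslstats parcel_roi.nii.gz -c   # centre of gravity in mm coordinates
fslstats parcel_roi.nii.gz -C   # centre of gravity in voxel coordinates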

Thanks,
Joe
--
Joseph M. Orr, Ph.D.
Assistant Professor
Department of Psychological and Brain Sciences
Texas A&M Institute for Neuroscience
Texas A&M University
College Station, TX

___
HCP-Users mailing list
HCP-Users@humanconnectome.org<mailto:HCP-Users@humanconnectome.org>
http://lists.humanconnectome.org/mailman/listinfo/hcp-users



___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] "activation" tables for reporting pscalar results

2019-04-24 Thread Harms, Michael

Why not simply report the parcel name and its values?  And consider putting the 
scene on BALSA, so that others can easily access the data.

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Joseph Orr 

Date: Wednesday, April 24, 2019 at 10:06 AM
To: HCP Users 
Subject: [HCP-Users] "activation" tables for reporting pscalar results

I am trying to determine the best approach for producing tables of pscalar 
results. I haven't seen any papers reporting pscalar results that have tables, 
but I anticipate reviewers wanting to see these, and tables are critical for 
meta-analyses. Since there aren't peaks, I was thinking of calculating the 
center of mass after converting the significant parcels to a volume. Has anyone 
done this already for the Multi-Modal Parcellation? Or is there a reason I'm not thinking 
of that would make this approach not ideal, or even not valid?

Thanks,
Joe
--
Joseph M. Orr, Ph.D.
Assistant Professor
Department of Psychological and Brain Sciences
Texas A&M Institute for Neuroscience
Texas A&M University
College Station, TX

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users




Re: [HCP-Users] hp2000 filter not applied to hp2000_clean.nii.gz volume data for some (one?) subjects?

2019-04-23 Thread Harms, Michael

For users that want to follow this, please see:
https://github.com/Washington-University/HCPpipelines/issues/108

It has something to do with the fact that we needed to apply manual 
reclassification of the FIX output in that particular subject/run.

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Keith Jamison 

Date: Tuesday, April 23, 2019 at 3:59 PM
To: HCP Users 
Subject: [HCP-Users] hp2000 filter not applied to hp2000_clean.nii.gz volume 
data for some (one?) subjects?

For subject 204218, both REST1_LR and REST1_RL, I noticed a linear trend in the 
*_hp2000_clean.nii.gz NIFTI time series, but the hp2000_clean.dtseries.nii 
CIFTI files do not have this trend. See attached figures showing this issue for 
both REST1_LR and REST1_RL for 204218. The overall mean time series has a 
negative trend for NIFTI, but in the voxel time series on the left you can see 
that some have positive trend and some have negative. To test, I did run 
fslmaths-based filtering on hp2000_clean.nii.gz and I no longer see any linear 
trend.
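
(For reference, the fslmaths-based check was essentially the following -- a sketch, with 
the highpass sigma in volumes taken as hp_period/(2*TR) = 2000/(2*0.72), i.e. roughly 1389 
for these runs, and -1 disabling the lowpass:)

fslmaths rfMRI_REST1_LR_hp2000_clean.nii.gz -bptf 1389 -1 rfMRI_REST1_LR_hp2000_clean_rehp.nii.gz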

I tried one scan in one additional subject, 102311 REST1_LR, and did not see 
this linear trend in either NIFTI or CIFTI (also attached).

Note: I did remove the overall mean of each voxel timecourse before plotting, and for the 
NIFTI I'm only showing gray matter voxels, as determined by downsampling aparc+aseg.nii.gz 
and excluding labels for WM, CSF, ventricles, and a few miscellaneous structures. I also 
tried looking at all non-zero voxels, as well as only those marked in 
RibbonVolumeToSurfaceMapping/goodvoxels.nii.gz, but the issue of linear trends is the same.

Any idea what might be going on with this subject? I haven't tried this in 
anyone other than 204218 (bad) and 102311 (good).

-Keith



___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users




Re: [HCP-Users] unequal length of movement regressors

2019-04-22 Thread Harms, Michael

We have seen this occur on rare occasions, and don’t have an explanation for it.

Try just running fMRIVolume again.

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Moataz Assem 

Date: Monday, April 22, 2019 at 1:16 PM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] unequal length of movement regressors

Hi,

Is there a reason why Movement_Regressors.txt might have columns of unequal length (i.e., 
differing numbers of timepoints)?
I have seen this happen (infrequently) in different runs for different subjects 
for some HCP-style data we collected.
Preprocessing was done using v3.27.0
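
A quick way to check which runs are affected (a sketch, using the standard file names) is 
to compare the row count of Movement_Regressors.txt against the number of volumes in the 
run, and to confirm that every row has the same number of columns:

wc -l < Movement_Regressors.txt                        # should equal the number of timepoints
fslnvols rfMRI_REST1_LR.nii.gz                         # volumes in the corresponding run
awk '{print NF}' Movement_Regressors.txt | sort -nu    # should print a single value (12)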

Moataz



___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users




Re: [HCP-Users] wb_command -metric-tfce

2019-04-20 Thread Harms, Michael

Hi,
We suggest you use PALM, since you need to use permutation to determine the 
distribution of the TFCE metric.
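
For a two-group comparison of surface metric data, the PALM call looks something like the 
following (a sketch only: the file names are placeholders, the two groups are encoded in 
the design/contrast files as in any GLM tool, and the midthickness surface supplies the 
geometry that TFCE needs):

palm -i all_subjects_curvature.L.func.gii \
     -s L.midthickness.32k_fs_LR.surf.gii \
     -d design.mat -t contrasts.con \
     -T -n 5000 -logp -o curv_groupdiff_L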

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Reza Rajimehr 

Date: Saturday, April 20, 2019 at 7:18 PM
To: hcp-users 
Subject: [HCP-Users] wb_command -metric-tfce

Hi,

We have curvature data from two groups of subjects in a common anatomical space (MSMAll). 
We performed a univariate comparison between the two groups using a t-test. We now have a 
map showing the vertex-wise curvature difference between the two groups (the difference is 
shown only for vertices with a significant group difference). The next step is to do 
cluster-wise correction using the TFCE method. It looks like this command can do what we 
want:

https://www.humanconnectome.org/software/workbench-command/-metric-tfce

However, we couldn’t find any example command, and its usage is a bit unclear 
for us. For example, how should we specify the two groups?

Any help would be appreciated.

Note: Our analysis here is somewhat similar to the analysis in Figure 5B in Van 
Essen et al. 2012 paper:

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3432236/pdf/bhr291.pdf

Best,
Reza

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users




Re: [HCP-Users] DeDriftAndResamplePipeline error

2019-04-20 Thread Harms, Michael

That particular error message can be ignored.  We’ll fix the scripts so that it 
doesn’t occur in the future.

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Marta Moreno 

Date: Saturday, April 20, 2019 at 12:02 PM
To: "Glasser, Matthew" 
Cc: HCP Users 
Subject: Re: [HCP-Users] DeDriftAndResamplePipeline error

Thanks for your response!

The case syntax on Mac is as follows, and is the same on unix/linux as far as I know:
  case word in [ [(] pattern [ | pattern ] ... ) list ;; ] … esac

Also, there are 2 errors in the log file DeDriftAndResamplePipeline.sh.e42974 but 3 
case...esac statements in DeDriftAndResamplePipeline.sh.

I do not think this is a problem, since the pipeline does seem to have completed 
successfully, but do you have any other suggestions?

Thanks for your help,
Leah.


On Apr 19, 2019, at 8:26 AM, Glasser, Matthew 
mailto:glass...@wustl.edu>> wrote:

It does seem to have completed successfully.  I wonder if “case” doesn’t work 
the same on Mac?

Matt.

From: Marta Moreno mailto:mmorenoort...@icloud.com>>
Date: Thursday, April 18, 2019 at 11:19 PM
To: Timothy Coalson mailto:tsc...@mst.edu>>
Cc: Matt Glasser mailto:glass...@wustl.edu>>, "Harwell, 
John" mailto:jharw...@wustl.edu>>, HCP Users 
mailto:hcp-users@humanconnectome.org>>
Subject: Re: [HCP-Users] DeDriftAndResamplePipeline error

I think it worked now. The script finished pretty fast without prompting an error on the 
screen, except for the following, found in DeDriftAndResamplePipeline.sh.e42974:

 # Do NOT wrap the following in quotes (o.w. the entire set of commands gets 
interpreted as a single string)
 |
Error: The input character is not valid in MATLAB statements or expressions.

 # Do NOT wrap the following in quotes (o.w. the entire set of commands gets 
interpreted as a single string)
 |
Error: The input character is not valid in MATLAB statements or expressions.




I am attaching the log files to make sure the DeDriftAndResamplePipeline.sh 
script was completed successfully.


Thanks a lot!


Leah.



On Apr 18, 2019, at 2:45 PM, Timothy Coalson 
mailto:tsc...@mst.edu>> wrote:

Make sure you have the whole pipelines repo for 4.0.0, do not try to mix and 
match folders from different versions, and make sure your setup script is 
pointed to the 4.0.0 version when running things from 4.0.0.  The log_Warn 
function is defined inside global/scripts, and it should get sourced 
automatically based on HCPPIPEDIR, so make sure that is set correctly (pointed 
to the 4.0.0 version).

Tim


On Thu, Apr 18, 2019 at 1:39 PM Marta Moreno 
mailto:mmorenoort...@icloud.com>> wrote:
Thanks for your response. And sorry to bother again with this issue but I am 
still getting the following error: ReApplyFixMultiRunPipeline.sh: line 592: 
log_Warn: command not found

Please find log files attached.

Pipelines for MR+FIX, MSMAll and DeDriftAndResample are from version version 
4.0.0.
PreFreeSurfer, FreeSurfer, PostFreeSurfer, fMRIVolume, fMRISurface are from 
version  3_22
Since MR+FIX and MSMAll ran successfully, why should there be a version issue in 
ReApplyFixMultiRunPipeline.sh?

I want to be sure this is a version issue because I have already run 
PreFreeSurfer, FreeSurfer, PostFreeSurfer, fMRIVolume, fMRISurface version  
3_22 on a sample of 30 patients pre/post tx.

Thanks a lot for your help and patience.

Leah.



On Apr 15, 2019, at 9:39 PM, Timothy Coalson 
mailto:tsc...@mst.edu>> wrote:

I would also suggest changing your log level to INFO in wb_view, preferences 
(the wb_command option does not store the logging level change to preferences). 
 We should probably change the default level, or change the level of that 
volume coloring message.

Tim


On Mon, Apr 15, 2019 at 8:34 PM Timothy Coalson 
mailto:tsc...@mst.edu>> wrote:
I have pushed a similar edit to reapply MR fix, please update to the latest 
master.

Tim


On Mon, Apr 15, 2019 at 8:27 PM Timothy Coalson 
mailto:tsc...@mst.edu>> wrote:
They weren't instructions, I pushed an edit, and it was a different script.

Tim


On Mon, Apr 15, 2019 at 8:08 PM Glasser, Matthew 
mailto:glass...@wustl.edu>> wrote:
Here is the error:

readlink: illegal option -- f
usage: readlink [-n] [file ...]

I believe Tim already gave you instructions for this.

Also, the log_Warn line is again concerning as to whether you followed the 
installation instructions and all version 4.0.0 files here.

Matt.

From: Marta Moreno mailto:mmorenoort...@icloud.com>>
Date: Monday, April 15, 2019 at 8:53 AM
To: Matt Glasser mailto:glass...@wustl.edu>>
Cc: HCP Users 
mailto:hcp-users@humanconnectome.org>>, Timothy 
Coalson mailto:tsc...@mst.edu>>, "Brown, Tim" 
mailto:tbbr...@wustl.edu>>
Subject: Re: [HCP-Users] 

Re: [HCP-Users] Subject list for the HCP Q2 release

2019-04-14 Thread Harms, Michael

FYI: The “prediction of fluid intelligence” aspect of Finn et al. (2015) has already been 
extended to the final HCP-YA release by a different research group in this study:

https://www.ncbi.nlm.nih.gov/pubmed/30104429
cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of "Glasser, Matthew" 

Date: Sunday, April 14, 2019 at 11:06 AM
To: Manasij Venkatesh , "hcp-users@humanconnectome.org" 

Subject: Re: [HCP-Users] Subject list for the HCP Q2 release

We might be able to dig that up, but perhaps if you contacted the authors they 
might simply have the list they used in an old script?  Also, it is worth 
keeping in mind that subjects who were released were occasionally later 
excluded for one reason or another.  Thus, it may not be possible to get the 
exact same set of subjects from the HCP DB now.  An alternative replication 
strategy would be to attempt to replicate the authors' analysis with a larger 
dataset and see if it still holds up.  This might end up making the more 
powerful replication statement and avoid the above issues.

Matt.

From: 
mailto:hcp-users-boun...@humanconnectome.org>>
 on behalf of Manasij Venkatesh mailto:mana...@umd.edu>>
Date: Sunday, April 14, 2019 at 11:02 AM
To: "hcp-users@humanconnectome.org" 
mailto:hcp-users@humanconnectome.org>>
Subject: [HCP-Users] Subject list for the HCP Q2 release

Hello,

I'm trying to replicate some findings from this very interesting Nature 
Neuroscience paper:  https://www.nature.com/articles/nn.4135

The following is in the subject information:
>  HCP data. We used the Q2 HCP data release, which was all the HCP data 
> publicly available at the time that this project began. The full Q2 release 
> contains data on 142 healthy subjects; we restricted our analysis to subjects 
> for whom all six fMRI sessions were available (n = 126; 40 males, age 22–35).

Unfortunately, I'm unable to find the list of subjects in the Q2 release. More 
specifically, the subjects that had all six fMRI sessions that were recorded at 
that time. Can you please help me find these subjects?

Sincerely,
Manasij

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users




Re: [HCP-Users] MSMAllPipeline error

2019-04-13 Thread Harms, Michael

We extended that feature such that it should be an accepted option for all the 
"ICAFIX"-related scripts, but we haven't had a chance yet to extend it to the 
context of MSMAll and TaskAnalysis.  Hopefully in the near future...

--
Michael Harms, Ph.D.

---

Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

On 4/13/19, 11:28 AM, "hcp-users-boun...@humanconnectome.org on behalf of 
Glasser, Matthew"  wrote:

I wouldn't use hp=pd2 unless you know what you are doing, as that option
has not been fully tested.  I run with hp=0.
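
(With hp=0, the corresponding MSMAll batch settings would presumably become something like 
the following -- a sketch only, assuming the hp0 suffix carries through into the cleaned 
filenames:)

HighPass="0"
fMRIProcSTRING="_Atlas_hp0_clean"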

Matt.

On 4/13/19, 10:41 AM, "hcp-users-boun...@humanconnectome.org on behalf of
Marta Moreno"  wrote:

>Dear Experts,
>
>I am running the following script MSMAllPipelineBatch.sh from
>${StudyFolder}/${Subject}/scripts after running MR ICA+FIX with success,
>and I am getting the following error:
>
>ERROR: failed to open file
>'/Volumes/data/data3/NTTMS/NTTMS_s002/NTTMS_s002_170812/MNINonLinear/Resul
>ts/RS_fMRI_MR/RS_fMRI_MR_Atlas_hppd2_clean_vn_tempcompute.dscalar.nii',
>file does not exist, or folder permissions prevent seeing it
>
>I set up the script as follow:
>
>fMRINames="RS_fMRI_MR"
>OutfMRIName="RS_fMRI_MR_REST"
>HighPass="pd2"
>fMRIProcSTRING="_Atlas_hppd2_clean"
>MSMAllTemplates="${HCPPIPEDIR}/global/templates/MSMAll"
>RegName="MSMAll_InitalReg"
>HighResMesh="164"
>LowResMesh="32"
>InRegName="MSMSulc"
>MatlabMode="1" #Mode=0 compiled Matlab, Mode=1 interpreted Matlab
>
I checked, and the file called '*tempcompute.dscalar.nii' is not there.
>
>What am I doing wrong? Something went wrong in the previous step while
>running MR ICA+FIX that I am not aware of?
>
>Thanks a lot!
>
>Leah.
>
>
>
>___
>HCP-Users mailing list
>HCP-Users@humanconnectome.org
>http://lists.humanconnectome.org/mailman/listinfo/hcp-users




___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users





Re: [HCP-Users] file_array objects can not be permuted

2019-04-08 Thread Harms, Michael

Sorry, but I think you are going to have to get support from Anderson and the 
FSL list for this one.

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Yu Han 

Date: Monday, April 8, 2019 at 2:17 PM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] file_array objects can not be permuted

Dear Expert,

I am running a PALM analysis with two groups: A represents the group with autism, N 
represents the group without autism. I currently have four subjects from the N group and 
one subject from the A group. Below I have listed my contrast, design, etc., and the error 
I am running into. I searched the FSL blog; some people say it is the version of PALM and 
recommend using an older version of PALM instead. Any suggestions/feedback would be much 
appreciated. Thank you in advance for your help!

Best,
Yu


The contrast.csv file looks like this:

1,0
0,1
1,-1
-1,1

The design.csv file looks like this:

1 0
0 1
0 1
0 1
0 1


The command I ran looks like this:
palm -i tfMRI_EmoRec_ES_vs_HS_s5_beta.dscalar.nii -n 0 -transposedata -o 
EmoRec_ES_vs_HS_PALM -logp -d design.csv -t contrast.csv -ise -saveglm


And I got this error.
Found FSL in /gpfs1/arch/x86_64-rhel7/fsl-5.0.11/fsl
Found FreeSurfer in /gpfs1/arch/x86_64-rhel7/freesurfer-5.3.0-HCP
Found HCP Workbench executable in 
/gpfs1/home/y/h/yhan8/tools/workbench/bin_rh_linux64/wb_command
Reading input 1/1: tfMRI_EmoRec_ES_vs_HS_s5_beta.dscalar.nii
Error using permute 
(/users/y/h/yhan8/PALM/palm-alpha114/fileio/@file_array/permute.m:10)
file_array objects can not be permuted.

Error in gifti_Data 
(/users/y/h/yhan8/PALM/palm-alpha114/fileio/@gifti/private/read_gifti_file.m:201->permute)
Error in gifti_DataArray 
(/users/y/h/yhan8/PALM/palm-alpha114/fileio/@gifti/private/read_gifti_file.m:122->gifti_Data)
Error in read_gifti_file 
(/users/y/h/yhan8/PALM/palm-alpha114/fileio/@gifti/private/read_gifti_file.m:45->gifti_DataArray)
Error in gifti 
(/users/y/h/yhan8/PALM/palm-alpha114/fileio/@gifti/gifti.m:89->read_gifti_file)
Error in palm_ciftiread 
(/users/y/h/yhan8/PALM/palm-alpha114/palm_ciftiread.m:84->gifti)
Error in palm_miscread 
(/users/y/h/yhan8/PALM/palm-alpha114/palm_miscread.m:181->palm_ciftiread)
Error in palm_ready 
(/users/y/h/yhan8/PALM/palm-alpha114/palm_ready.m:47->palm_miscread)
Error in palm_takeargs 
(/users/y/h/yhan8/PALM/palm-alpha114/palm_takeargs.m:1645->palm_ready)
Error in palm_core 
(/users/y/h/yhan8/PALM/palm-alpha114/palm_core.m:33->palm_takeargs)
Error in palm (/users/y/h/yhan8/PALM/palm-alpha114/palm.m:81->palm_core)





Yu Han
Ph.D. Candidate
Neuroscience Graduate Program
The Vermont Complex Systems Center
University of Vermont


___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users




Re: [HCP-Users] MR ICA+FIX error

2019-04-06 Thread Harms, Michael

This indicates that you aren’t using the version of hcp_fix_multi_run from the 
v4.0.0 pipelines, which supports the use of all 3 matlab modes.

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of "Glasser, Matthew" 

Date: Saturday, April 6, 2019 at 2:57 PM
To: Marta Moreno , NEUROSCIENCE tim 
Cc: HCP Users 
Subject: Re: [HCP-Users] MR ICA+FIX error

I would use mode=1
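
(For example, a minimal way to do that is either to edit the default in settings.sh or to 
export the variable before calling hcp_fix_multi_run:)

export FSL_FIX_MATLAB_MODE=1   # 1 = interpreted MATLAB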

Matt.

From: 
mailto:hcp-users-boun...@humanconnectome.org>>
 on behalf of Marta Moreno 
mailto:mmorenoort...@icloud.com>>
Date: Saturday, April 6, 2019 at 2:46 PM
To: Timothy Coalson mailto:tsc...@mst.edu>>
Cc: HCP Users 
mailto:hcp-users@humanconnectome.org>>
Subject: Re: [HCP-Users] MR ICA+FIX error

Dear Tim,

I am getting the following error:

hcp_fix_multi_run - ABORTING: Unsupported MATLAB run mode value 
(FSL_FIX_MATLAB_MODE) in 
/usr/local/bin/gaurav_folder_new/HCP/Connectome_Project_3_22/Pipelines/ICAFIX/hcp_fix_multi_run/settings.sh:


In settings.sh, I have set up the following:

# Part III General settings
# =
# This variable selects how we run the MATLAB portions of FIX.
# It takes the values 0-2:
#   0 - Try running the compiled version of the function
#   1 - Use the MATLAB script version
#   2 - Use Octave script version
if [ -z "${FSL_FIX_MATLAB_MODE}" ]; then
FSL_FIX_MATLAB_MODE=0
fi
if [[ ${FSL_FIX_MATLAB_MODE} = 2 && -z ${FSL_FIX_OCTAVE} ]]; then
echo "ERROR in $0: Can't find Octave command"
exit 1
fi


 Not sure if this is needed but FSL_FIX_OCTAVE is disabled in my settings.sh 
since I tried to install octave via MacPorts and am getting the following 
errors:


Error: Failed to build mpich-default: command execution failed
Error: See 
/opt/local/var/macports/logs/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_science_mpich/mpich-default/main.log
 for details.
Error: Follow https://guide.macports.org/#project.tickets to report a bug.
Error: Processing of port octave failed


Thanks a lot,


Leah.




On Apr 1, 2019, at 2:58 PM, Timothy Coalson 
mailto:tsc...@mst.edu>> wrote:

I have pushed a change (it launched the matlab part without error, which is 
where that path is used), try the latest master.

Tim


On Mon, Apr 1, 2019 at 1:49 PM Marta Moreno 
mailto:mmorenoort...@icloud.com>> wrote:
Thanks for your response, Tim.
What should I do specifically to make it work. I am not sure I am following the 
steps as written.

Thanks again,
Leah.

Sent from my iPhone

On Apr 1, 2019, at 2:40 PM, Timothy Coalson 
mailto:tsc...@mst.edu>> wrote:
Looks like mac's readlink is incapable of being easily useful for this - 
without -f, it doesn't even output anything if it isn't a symlink, while the 
point is to find the real location whether it is a symlink or not (because it 
needs other files from that directory).

I guess I will make it test whether "$0" is a symlink first.
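
In the meantime, one portable workaround (just a sketch of the usual trick, not what the 
pipelines actually ship) is to resolve the path with python, which is available on stock 
macOS:

# hypothetical helper that behaves like GNU 'readlink -f'
realpath_f() {
    python -c 'import os,sys; print(os.path.realpath(sys.argv[1]))' "$1"
}
this_script_dir=$(dirname "$(realpath_f "$0")")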

Tim


On Mon, Apr 1, 2019 at 9:31 AM Harms, Michael 
mailto:mha...@wustl.edu>> wrote:
Tim will have to comment on the ‘readlink -f’ issue, since I think he 
introduced that particular syntax.

Please keep posts cc’ed to the HCP-User list, so that we can archive the 
discussion, and other users can benefit from it.

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: 
mha...@wustl.edu<mailto:mha...@wustl.edu>

From: Marta Moreno mailto:mmorenoort...@icloud.com>>
Date: Monday, April 1, 2019 at 9:21 AM
To: "Harms, Michael" mailto:mha...@wustl.edu>>
Subject: Re: [HCP-Users] ICA+FIX error

Thanks for your response, Michael. The issue with single-run ICA+FIX is solved after 
running fMRISurface again.

The problem is now with MR ICA+FIX. I am copying the error below for your convenience. 
Matt said it is probably related to the fact that I am using a Mac.

bash-3.2$ hcp_fix_multi_run 
RS_fMRI_1/RS_fMRI_1.nii.gz@RS_fMRI_2/RS_fMRI_2.nii.gz<mailto:RS_fMRI_1/RS_fMRI_1.nii.gz@RS_fMRI_2/RS_fMRI_2.nii.gz>
 RS_fMRI_MR 2000 TRUE
Sat Mar 30 16:45:07 EDT 2019 - hcp_fix_multi_run - HCPPIPEDIR: 
/usr/local/bin/HCP/Pipelines
Sat Mar 30 16:45:07 EDT 2019 - hcp_fix_multi_run - CARET7DIR: 
/Applications/workbench/bin_macosx64/
Sat Mar 30 16:45:07 EDT 2019 - hcp_fix_multi_run - FSLDIR: /usr/local/fsl
Sat Mar 30 16:45:07 EDT 2019 - hcp_fix_multi_run - FSL_FIXDIR: 
/usr/local/bin/HCP/Pipelines/ICAFIX/hcp_fix_multi_run
readlink: illegal option -- f

Re: [HCP-Users] MSM binaries versions

2019-04-04 Thread Harms, Michael

Also, see here
https://github.com/ecr05/MSM_HOCR/issues/5
for some info from Emma.


--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of "Glasser, Matthew" 

Date: Thursday, April 4, 2019 at 3:03 PM
To: NEUROSCIENCE tim , Moataz Assem 

Cc: "hcp-users@humanconnectome.org" 
Subject: Re: [HCP-Users] MSM binaries versions

Use version 3.0.0 in GitHub.  The one used in FSL is not supported by the HCP 
Pipelines as it does not have the HOCR options.  All that is needed is the MSM 
binary.
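
Concretely, that just means pointing MSMBINDIR (e.g., in SetUpHCPPipeline.sh) at whatever 
directory holds the v3.0.0 msm binary; the path below is an assumption:

export MSMBINDIR="${HOME}/MSM_HOCR_v3/bin"   # only needs to contain the 'msm' binary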

Matt.

From: 
mailto:hcp-users-boun...@humanconnectome.org>>
 on behalf of Timothy Coalson mailto:tsc...@mst.edu>>
Date: Thursday, April 4, 2019 at 2:15 PM
To: Moataz Assem 
mailto:moataz.as...@mrc-cbu.cam.ac.uk>>
Cc: "hcp-users@humanconnectome.org" 
mailto:hcp-users@humanconnectome.org>>
Subject: Re: [HCP-Users] MSM binaries versions

The 1.0 and 3.0 versions on github are nearly identical, that was just a naming 
issue.

The version in FSL may be based on version 2, and is missing a library needed 
for HOCR, so some options in v3 aren't available.  You should be able to use 
the fsl versions of the executables other than msm (so, msmresample, etc) with 
any version of msm.

I'm not sure about your other questions.

Tim


On Thu, Apr 4, 2019 at 1:29 PM Moataz Assem 
mailto:moataz.as...@mrc-cbu.cam.ac.uk>> wrote:
Hi,

What is the recommended version of the MSM binaries to use? The repo's releases page 
(https://github.com/ecr05/MSM_HOCR/releases) has v1.0.0 and v3.0.0, and I was previously 
using v2 from here: https://www.doc.ic.ac.uk/~ecr05/MSM_HOCR_v2/
Also, what is the difference between these binaries and the ones downloaded with FSL? In 
other words, can I just point MSMBINDIR in SetUpHCPPipeline.sh to ~/fsl/bin/, since it 
contains all the msm-related functions?

Also, I would appreciate a clarification on which compiled files need to exist in the 
directory pointed to by the MSMBINDIR variable.

Thanks

Moataz

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users



Re: [HCP-Users] ICA+FIX error

2019-04-01 Thread Harms, Michael

Check RS_fMRI_2_hp2000.ica/.fix.log for clues.  Also, what’s in the stderr 
output from the run?


--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Marta Moreno 

Date: Sunday, March 31, 2019 at 6:44 PM
To: HCP Users 
Subject: [HCP-Users] ICA+FIX error

Dear Experts,

I am running hcp_fix and getting the following error with just one of my subjects; please 
see below. It seems the script stopped at the end, in the section "Rename some files 
(relative to the default names coded in fix_3_clean)", at the first line: 
"$FSLDIR/bin/immv ${fmrihp}.ica/filtered_func_data_clean ${fmrihp}_clean", which could not 
find a supported file with prefix ".ica/filtered_func_data_clean".

(…)
Sun Mar 31 19:18:12 EDT 2019 - hcp_fix - Done running FIX
Sun Mar 31 19:18:13 EDT 2019 - hcp_fix - ABORTING: Something went wrong;  
RS_fMRI_2_hp2000.ica/Atlas_clean.dtseries.nii wasn't created

How can I solve this problem?

Thanks,

Leah.


___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users




Re: [HCP-Users] Question on fsl 6.0.1

2019-03-29 Thread Harms, Michael

The MR-FIX scripts released as part of HCPpipelines v4.0.0 should work with FSL 6.0.1.

As far as MSMAll and FSL 6.0.1, to my knowledge, we haven't explicitly tested 
that combination yet.

Cheers,
-MH

--
Michael Harms, Ph.D.

---

Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

On 3/29/19, 12:30 PM, "hcp-users-boun...@humanconnectome.org on behalf of 
Marta Moreno"  wrote:

Dear Experts,

Is FSL 6.0.1 ready for MR ICA+FIX and MSMAll? I have not seen any post about it here yet.

Thanks,

Leah.

Sent from my iPhone

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users





Re: [HCP-Users] PostFreesurfer possible bug

2019-03-20 Thread Harms, Michael
Your --subjectDIR flag isn’t set correctly.  Take a look at 
Examples/Scripts/FreeSurferPipelineBatch.sh for how FreeSurferPipeline.sh 
should be called.  (Or modify the StudyFolder, Subjlist, and EnvironmentScript 
entries in Examples/Scripts/FreeSurferPipelineBatch.sh script and use that to 
actually run FreeSurferPipeline.sh).

The Examples/Scripts are the intended entry point for users.
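
In other words, with the paths from your call, the batch script's convention amounts to 
something like this (a sketch; the key point is that --subjectDIR points at the T1w 
folder, so the FreeSurfer output lands in $StudyFolder/$Subject/T1w/$Subject):

StudyFolder=/root/hcppilelines/piano_hcp/working_directory
Subject=102
"$HCPPIPEDIR"/FreeSurfer/FreeSurferPipeline.sh \
    --subject="$Subject" \
    --subjectDIR="$StudyFolder/$Subject/T1w" \
    --t1="$StudyFolder/$Subject/T1w/T1w_acpc_dc_restore.nii.gz" \
    --t1brain="$StudyFolder/$Subject/T1w/T1w_acpc_dc_restore_brain.nii.gz" \
    --t2="$StudyFolder/$Subject/T1w/T2w_acpc_dc_restore.nii.gz"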

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From: Shachar Gal 
Date: Wednesday, March 20, 2019 at 3:32 PM
To: "Harms, Michael" 
Cc: "hcp-users@humanconnectome.org" 
Subject: Re: [HCP-Users] PostFreesurfer possible bug

I am using version 4.
This is my call:
/root/hcppilelines/HCPpipelines-master/FreeSurfer/FreeSurferPipeline.sh 
--subject=102 --subjectDIR=/root/hcppilelines/piano_hcp/working_directory/102 
--t1=/root/hcppilelines/piano_hcp/working_directory/102/T1w/T1w_acpc_dc_restore.nii.gz
 
--t1brain=/root/hcppilelines/piano_hcp/working_directory/102/T1w/T1w_acpc_dc_restore_brain.nii.gz
 
--t2=/root/hcppilelines/piano_hcp/working_directory/102/T1w/T2w_acpc_dc_restore.nii.gz

On Wed, 20 Mar 2019 at 22:26, Harms, Michael 
mailto:mha...@wustl.edu>> wrote:

No idea.  Are you using a modified/customized version of the HCPpipelines?  
Which exact version are you using?

What is your exact call to FreeSurferPipeline.sh?

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: 
mha...@wustl.edu<mailto:mha...@wustl.edu>

From: Shachar Gal mailto:gal.shac...@gmail.com>>
Date: Wednesday, March 20, 2019 at 3:23 PM
To: "Harms, Michael" mailto:mha...@wustl.edu>>
Cc: "hcp-users@humanconnectome.org<mailto:hcp-users@humanconnectome.org>" 
mailto:hcp-users@humanconnectome.org>>
Subject: Re: [HCP-Users] PostFreesurfer possible bug

Well, for some reason, when I run the pipelines it gets created under the subject 
directory.
This is the ls for my subject directory:
drwxr-x---  3 shachar Users 4096 Mar 19 16:24 unprocessed
drwxr-xr-x  6 rootroot  4096 Mar 20 10:47 T2w
drwxrwxr-x 10 rootroot  4096 Mar 20 11:46 102
lrwxrwxrwx  1 rootroot40 Mar 20 18:48 fsaverage -> 
/usr/local/freesurfer/subjects/fsaverage
drwxr-xr-x  8 rootroot  4096 Mar 20 21:28 T1w
drwxr-xr-x  8 rootroot  4096 Mar 20 21:28 MNINonLinear

and this is the ls for my T1w folder

-rw-r- 1 root root  9483849 Mar 20 10:25 T1w1_gdc.nii.gz
-rw-r- 1 root root  9483849 Mar 20 10:25 T1w.nii.gz
-rw-r--r-- 1 root root  8576542 Mar 20 10:26 T1w_acpc.nii.gz
drwxr-xr-x 2 root root 4096 Mar 20 10:26 ACPCAlignment
-rw-r--r-- 1 root root   150860 Mar 20 10:35 T1w_acpc_brain_mask.nii.gz
-rw-r--r-- 1 root root  3944745 Mar 20 10:35 T1w_acpc_brain.nii.gz
drwxr-xr-x 2 root root 4096 Mar 20 10:35 BrainExtraction_FNIRTbased
-rw-r--r-- 1 root root  3949152 Mar 20 10:58 T1w_acpc_dc_brain.nii.gz
-rw-r--r-- 1 root root 31765336 Mar 20 11:09 BiasField_acpc_dc.nii.gz
drwxr-xr-x 2 root root 4096 Mar 20 11:09 BiasFieldCorrection_sqrtT1wXT1w
-rw-r--r-- 1 root root 10344806 Mar 20 11:10 T1w_acpc_dc.nii.gz
-rw-r--r-- 1 root root 27636096 Mar 20 11:10 T1w_acpc_dc_restore.nii.gz
-rw-r--r-- 1 root root  7850049 Mar 20 11:10 T1w_acpc_dc_restore_brain.nii.gz
drwxr-xr-x 2 root root 4096 Mar 20 11:11 xfms
-rw-r--r-- 1 root root 12258808 Mar 20 11:11 T2w_acpc_dc.nii.gz
-rw-r--r-- 1 root root 28123873 Mar 20 11:11 T2w_acpc_dc_restore.nii.gz
-rw-r--r-- 1 root root  8000278 Mar 20 11:11 T2w_acpc_dc_restore_brain.nii.gz
drwxr-xr-x 2 root root 4096 Mar 20 21:28 Native
drwxr-xr-x 2 root root 4096 Mar 20 21:28 fsaverage_LR32k

Any idea why that would be?


On Wed, 20 Mar 2019 at 21:49, Harms, Michael 
mailto:mha...@wustl.edu>> wrote:

Hi,
The FreeSurferFolder is located at $StudyFolder/$Subject/T1w/$Subject, which is 
what PostFreeSurferPipeline.sh is setting up via the following combination of 
lines:

FreeSurferFolder="$Subject"  #L101
T1wFolder="$StudyFolder"/"$Subject"/"$T1wFolder"  #L137
FreeSurferFolder="$T1wFolder"/"$FreeSurferFolder"  #L140

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  E

Re: [HCP-Users] PostFreesurfer possible bug

2019-03-20 Thread Harms, Michael

No idea.  Are you using a modified/customized version of the HCPpipelines?  
Which exact version are you using?

What is your exact call to FreeSurferPipeline.sh?

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From: Shachar Gal 
Date: Wednesday, March 20, 2019 at 3:23 PM
To: "Harms, Michael" 
Cc: "hcp-users@humanconnectome.org" 
Subject: Re: [HCP-Users] PostFreesurfer possible bug

Well, for some reason, when I run the pipelines it gets created under the subject 
directory.
This is the ls for my subject directory:
drwxr-x---  3 shachar Users 4096 Mar 19 16:24 unprocessed
drwxr-xr-x  6 rootroot  4096 Mar 20 10:47 T2w
drwxrwxr-x 10 rootroot  4096 Mar 20 11:46 102
lrwxrwxrwx  1 rootroot40 Mar 20 18:48 fsaverage -> 
/usr/local/freesurfer/subjects/fsaverage
drwxr-xr-x  8 rootroot  4096 Mar 20 21:28 T1w
drwxr-xr-x  8 rootroot  4096 Mar 20 21:28 MNINonLinear

and this is the ls for my T1w folder

-rw-r- 1 root root  9483849 Mar 20 10:25 T1w1_gdc.nii.gz
-rw-r- 1 root root  9483849 Mar 20 10:25 T1w.nii.gz
-rw-r--r-- 1 root root  8576542 Mar 20 10:26 T1w_acpc.nii.gz
drwxr-xr-x 2 root root 4096 Mar 20 10:26 ACPCAlignment
-rw-r--r-- 1 root root   150860 Mar 20 10:35 T1w_acpc_brain_mask.nii.gz
-rw-r--r-- 1 root root  3944745 Mar 20 10:35 T1w_acpc_brain.nii.gz
drwxr-xr-x 2 root root 4096 Mar 20 10:35 BrainExtraction_FNIRTbased
-rw-r--r-- 1 root root  3949152 Mar 20 10:58 T1w_acpc_dc_brain.nii.gz
-rw-r--r-- 1 root root 31765336 Mar 20 11:09 BiasField_acpc_dc.nii.gz
drwxr-xr-x 2 root root 4096 Mar 20 11:09 BiasFieldCorrection_sqrtT1wXT1w
-rw-r--r-- 1 root root 10344806 Mar 20 11:10 T1w_acpc_dc.nii.gz
-rw-r--r-- 1 root root 27636096 Mar 20 11:10 T1w_acpc_dc_restore.nii.gz
-rw-r--r-- 1 root root  7850049 Mar 20 11:10 T1w_acpc_dc_restore_brain.nii.gz
drwxr-xr-x 2 root root 4096 Mar 20 11:11 xfms
-rw-r--r-- 1 root root 12258808 Mar 20 11:11 T2w_acpc_dc.nii.gz
-rw-r--r-- 1 root root 28123873 Mar 20 11:11 T2w_acpc_dc_restore.nii.gz
-rw-r--r-- 1 root root  8000278 Mar 20 11:11 T2w_acpc_dc_restore_brain.nii.gz
drwxr-xr-x 2 root root 4096 Mar 20 21:28 Native
drwxr-xr-x 2 root root 4096 Mar 20 21:28 fsaverage_LR32k

any idea why would that be?


On Wed, 20 Mar 2019 at 21:49, Harms, Michael 
mailto:mha...@wustl.edu>> wrote:

Hi,
The FreeSurferFolder is located at $StudyFolder/$Subject/T1w/$Subject, which is 
what PostFreeSurferPipeline.sh is setting up via the following combination of 
lines:

FreeSurferFolder="$Subject"  #L101
T1wFolder="$StudyFolder"/"$Subject"/"$T1wFolder"  #L137
FreeSurferFolder="$T1wFolder"/"$FreeSurferFolder"  #L140

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: 
mha...@wustl.edu<mailto:mha...@wustl.edu>

From: 
mailto:hcp-users-boun...@humanconnectome.org>>
 on behalf of Shachar Gal mailto:gal.shac...@gmail.com>>
Date: Wednesday, March 20, 2019 at 2:42 PM
To: "hcp-users@humanconnectome.org<mailto:hcp-users@humanconnectome.org>" 
mailto:hcp-users@humanconnectome.org>>
Subject: [HCP-Users] PostFreesurfer possible bug

Dear experts,

in the PostFreeSurferPipeline script, the FreeSurferFolder is set to be 
"$T1wFolder"/"$FreeSurferFolder"
even though the FreeSurfer folder isn't actually under the T1w folder... it's
right inside the subject's directory
(WorkingDirectory/SubjectNumber/SubjectNumber)

Is this a bug, or am I missing something?

thanks,
Shachar Gal

___
HCP-Users mailing list
HCP-Users@humanconnectome.org<mailto:HCP-Users@humanconnectome.org>
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised 

Re: [HCP-Users] PostFreesurfer possible bug

2019-03-20 Thread Harms, Michael

Hi,
The FreeSurferFolder is located at $StudyFolder/$Subject/T1w/$Subject, which is 
what PostFreeSurferPipeline.sh is setting up via the following combination of 
lines:

FreeSurferFolder="$Subject"  #L101
T1wFolder="$StudyFolder"/"$Subject"/"$T1wFolder"  #L137
FreeSurferFolder="$T1wFolder"/"$FreeSurferFolder"  #L140

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Shachar Gal 

Date: Wednesday, March 20, 2019 at 2:42 PM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] PostFreesurfer possible bug

Dear experts,

in the PostFreeSurferPipeline script, the FreeSurferFolder is set to be 
"$T1wFolder"/"$FreeSurferFolder"
even though the FreeSurfer folder isn't actually under the T1w folder... it's
right inside the subject's directory
(WorkingDirectory/SubjectNumber/SubjectNumber)

Is this a bug, or am I missing something?

thanks,
Shachar Gal

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] A question about generating quality control scene file for the resting-state data

2019-03-18 Thread Harms, Michael

Running the PostFix script will generate a scene that allows you to review the 
FIX classification.

Don’t recall seeing a surface contour/glitch quite like that before.  It 
definitely merits further investigation.  E.g., what do the adjacent slices 
look like?  What does the actual surface look like?

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Aaron C 

Date: Monday, March 18, 2019 at 8:44 AM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] A question about generating quality control scene file for 
the resting-state data

Dear HCP experts,

I have a question about the quality control of rfMRI data processed by the HCP 
pipeline. Is there any shared script to create the quality control scene file 
("rfMRI_1.scene") described in the HCP course practical material 5 
(https://wustl.app.box.com/s/xfs2506iz6pa6t7bfhhkno3baphnppvy)?

Also, for the attached figure (native space), is this indicating a problem in 
using the HCP structural preprocessing pipeline? Thank you.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] structural preprocessing on Philips data?

2019-03-13 Thread Harms, Michael

Hi,
Sounds like a coding error.  If you’ve converted your T1w and T2w scans to 
proper NIFTIs, then you should be able to use those as inputs.  Did you review 
Examples/Scripts/PreFreeSurferPipelineBatch.sh ?
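
For the PAR/REC conversion itself, a minimal sketch (assuming dcm2niix is installed;
the output names and folders are placeholders):

# convert Philips PAR/REC to compressed NIFTI with a BIDS-style json sidecar
dcm2niix -z y -b y -f T1w_%p -o ./nifti /path/to/parrec_folder

and then pass the resulting .nii.gz files as the --t1 / --t2 inputs in the batch
script.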

If you get it working, let us know how the myelin maps look!

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Lucy Evans 

Date: Wednesday, March 13, 2019 at 6:14 AM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] structural preprocessing on Philips data?

Dear HCPers,

I’m wondering if it’s possible to do the structural preprocessing pipelines on 
Philips 3T data? I’m trying to run the PreFreeSurfer script and it is not able 
to find my T1w or T2w images. These were PAR REC files that I converted to 
NIFTI. Could this be the issue or is it more likely that I’ve made a coding 
error?

Thanks in advance for your help.

Best wishes,
Lucy

Lucy Evans | PhD Student & Graduate Teaching Assistant | DNEP PGR Rep
Division of Neuroscience & Experimental Psychology | School of Biological 
Sciences | Faculty of Biology, Medicine & Health | The University of Manchester
Room 124e Zochonis Building | Brunswick Street | Manchester M13 9PL | Email: 
lucy.ev...@manchester.ac.uk


___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] A question about the HCP structural pipeline

2019-03-08 Thread Harms, Michael

$StudyFolder/$Subject/T1w/$Subject contains all the FreeSurfer output.
So, $StudyFolder/$Subject/T1w/$Subject/stats contains the usual FS quantification.
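
For example, a minimal sketch for pulling the aseg volumes into one table
(assuming FreeSurfer is set up; the subject ID 102 is a placeholder):

export SUBJECTS_DIR=$StudyFolder/102/T1w
asegstats2table --subjects 102 --meas volume --tablefile aseg_volumes.txt

or simply read the values directly from $SUBJECTS_DIR/102/stats/aseg.stats.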

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Aaron C 

Date: Friday, March 8, 2019 at 10:13 AM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] A question about the HCP structural pipeline

Dear HCP experts,

I have a question about the HCP structural pipeline. For my own data processed 
by the HCP structural pipeline, where could I find the values of brain tissue 
volumes? Thank you.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] rs-fMRI with-in subject comparison

2019-03-04 Thread Harms, Michael

As for a version of the parcellation with network assignments, see here:
https://github.com/ColeLab/ColeAnticevicNetPartition

cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Timothy Coalson 

Date: Monday, March 4, 2019 at 12:38 PM
To: Tali Weiss 
Cc: "hcp-users@humanconnectome.org" 
Subject: Re: [HCP-Users] rs-fMRI with-in subject comparison

There isn't a dedicated command to get the parcel names, but they are in the 
output of wb_command -file-information on the parcellated file, or you can take 
them from -cifti-label-export-table on the dlabel file.
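
For example (a sketch only; the file names are placeholders):

# list the parcel names along with the other metadata of the parcellated file
wb_command -file-information myelin.pscalar.nii

# or dump the label table (name, key, RGBA) of the dlabel file used to parcellate
wb_command -cifti-label-export-table parcellation.dlabel.nii 1 label_table.txt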

Tim


On Mon, Mar 4, 2019 at 1:55 AM Tali Weiss 
mailto:tali.we...@weizmann.ac.il>> wrote:
i did
wb_command -cifti-parcellate
wb_command -cifti-convert -to-text (to be continued in matlab)

1. Where can I find the names of each of the 360 parcels?
2. I want to classify the parcels into networks (DMN, visual, ...). Is there a
script in HCP that does it?

From: Glasser, Matthew [glass...@wustl.edu]
Sent: Sunday, March 03, 2019 5:12 PM
To: Tali Weiss; 
hcp-users@humanconnectome.org
Subject: Re: [HCP-Users] rs-fMRI with-in subject comparison
1.  The HCP-YA data were not variance normalized.
2.  wb_command -cifti-parcellate
3.  There isn’t a good sub-cortical parcellation like the cortical parcellation 
yet unfortunately.

If you are comparing functional connectivity across runs within a subject, you 
don’t need to concatenate or variance normalize the runs.
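
If you do decide to demean and variance normalize a run, a minimal sketch of the
FAQ recipe (file names are placeholders; the per-grayordinate mean and stdev are
computed first and then repeated across timepoints):

wb_command -cifti-reduce rfMRI.dtseries.nii MEAN rfMRI_mean.dscalar.nii
wb_command -cifti-reduce rfMRI.dtseries.nii STDEV rfMRI_stdev.dscalar.nii
wb_command -cifti-math '(x - m) / s' rfMRI_norm.dtseries.nii \
  -var x rfMRI.dtseries.nii \
  -var m rfMRI_mean.dscalar.nii -select 1 1 -repeat \
  -var s rfMRI_stdev.dscalar.nii -select 1 1 -repeat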

Matt.

From: Tali Weiss mailto:tali.we...@weizmann.ac.il>>
Date: Sunday, March 3, 2019 at 4:28 AM
To: Matt Glasser mailto:glass...@wustl.edu>>, 
"hcp-users@humanconnectome.org" 
mailto:hcp-users@humanconnectome.org>>
Subject: RE: [HCP-Users] rs-fMRI with-in subject comparison

Thank you Mattew!

1. Download Packages: Resting State fMRI FIX-Denoised (Compact)
{Subject}_REST/MNINonLinear/Results/{fMRIName}/{fMRIName}_Atlas_MSMAll_hp2000_clean.dtseries.nii
- Were the raw data z-scored (overall standard deviation) and then cleaned by
sICA+FIX?
- In wb there is a layer: .dynconn.nii. Is there one for each of the 4 rs-scans of
each subject?

2. I’m not sure which command I need to use to extract the timecourse of each
parcel and then compute correlations between all parcels of each network.

input_label= 
Q1-Q6_RelatedValidation210.CorticalAreas_dil_Final_Final_Areas_Group_Colors.32k_fs_LR.dlabel.nii
wb_command -cifti-all-labels-to-rois $input_label 1 ROIvalidation210.dscalar.nii

What do I need to do next?

3. I would also like to compute correlations between subcortical regions (volume).
I read your article
https://www.ncbi.nlm.nih.gov/pubmed/29925602

What is your recommendation for defining the subcortical volumes?
My analysis is a “within subject” paradigm (comparing scans from different
days).


From: Glasser, Matthew [glass...@wustl.edu]
Sent: Thursday, February 28, 2019 3:39 AM
To: Tali Weiss; 
hcp-users@humanconnectome.org
Subject: Re: [HCP-Users] rs-fMRI with-in subject comparison
1.  You would need to post the full path and filename.
2.  This will be handled by multi-run sICA+FIX in the future.  We will 
recommend all data be cleaned with sICA+FIX.  Really you want to be dividing by 
the unstructured noise standard deviation, rather than the overall standard 
deviation.
3.  Parcels have more statistical power than grayordinates.  The HCP’s 
multi-modal parcellation is here: https://balsa.wustl.edu/file/show/3VLx

Matt.

From: 
mailto:hcp-users-boun...@humanconnectome.org>>
 on behalf of Tali Weiss 
mailto:tali.we...@weizmann.ac.il>>
Date: Wednesday, February 27, 2019 at 7:26 AM
To: "hcp-users@humanconnectome.org" 
mailto:hcp-users@humanconnectome.org>>
Subject: [HCP-Users] rs-fMRI with-in subject comparison

Dear Prof.Smith,

I really appreciate your help.
I would like to compare the second rs-fMRI scan from two different days of the same
subject.

1. When I open MSMAll_hp2000_clean.dtseries in WB, I also get a layer "dynconn",
for example: rfMRI_REST1_LR_Atlas_MSMAll_hp2000_clean.dynconn.nii
Are those group averages?

2. It is recommended in the HCP Users FAQ: "demean and normalize the individual 
timeseries."
wb_command -cifti-math '(x - mean) / stdev' 
I am confused because it says demean and normalize (z-scoring includes
demeaning; am I missing something?).

My design is "within", so should I only demean? Or, because the scans are from
different days, should I apply a z-score?

3. I believe that statistically there are not enough time points in one scan to 
use all grayordinates.

Re: [HCP-Users] extract myelin content of individual subjects in Gordon parcels

2019-03-02 Thread Harms, Michael

As an aside, the S1200 Group Average dataset available at 
https://db.humanconnectome.org/data/projects/HCP_1200 includes a version of the 
Gordon parcellation with the parcels reordered according to networks, which is 
very useful when viewing matrices for network-related structure.  Based on the 
name of the dlabel.nii in your command, it appears that you are using the 
“original” version (in which the parcels are not ordered according to network).

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of "Glasser, Matthew" 

Date: Saturday, March 2, 2019 at 3:16 PM
To: Antonin Skoch , "hcp-users@humanconnectome.org" 

Subject: Re: [HCP-Users] extract myelin content of individual subjects in 
Gordon parcels

I think it looks okay.

Matt.

From: 
mailto:hcp-users-boun...@humanconnectome.org>>
 on behalf of Antonin Skoch mailto:a...@ikem.cz>>
Reply-To: Antonin Skoch mailto:a...@ikem.cz>>
Date: Saturday, March 2, 2019 at 4:40 AM
To: "hcp-users@humanconnectome.org" 
mailto:hcp-users@humanconnectome.org>>
Subject: [HCP-Users] extract myelin content of individual subjects in Gordon 
parcels

Dear experts,

I want to extract average myelin content in individual subjects from parcels of 
Gordon atlas for further statistical analysis in external package.

Could you please confirm that following commands are OK?

wb_command -cifti-parcellate 
subj_ID/MNINonLinear/fsaverage_LR32k/subj_ID.MyelinMap.32k_fs_LR.dscalar.nii  
.Gordon333_workbench_parcels/Parcels/Parcels_LR.dlabel.nii COLUMN 
subj_ID_MyelinMap.pscalar.nii -method MEAN

wb_command -cifti-convert -to-text subj_ID_MyelinMap.pscalar.nii 
subj_ID_MyelinMap.txt

Thank you in advance,

Antonin Skoch

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] generating z-stat file

2019-03-02 Thread Harms, Michael

Hi,
You cannot create a z-stat map from a beta map.  We suggest that you use PALM 
to compute group-wise statistics from individual subject beta (cope) maps.
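
A minimal sketch of such a call (a sketch only, not a prescription; the merged cope
file, design, and contrast names are placeholders, and the PALM options should be
checked against its documentation for your design):

# all subjects' beta/cope maps merged along the map dimension into one dscalar
palm -i all_subjects_cope.dscalar.nii \
     -d design.mat -t design.con \
     -o palm_results -n 5000 -logp

which yields permutation-based (and, if requested, family-wise-error corrected)
p-value maps that can be thresholded instead of a z-stat.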

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of "Glasser, Matthew" 

Date: Saturday, March 2, 2019 at 3:15 PM
To: Yu Han , "hcp-users@humanconnectome.org" 

Subject: Re: [HCP-Users] generating z-stat file

What code are you using to make the beta map?

Matt.

From: 
mailto:hcp-users-boun...@humanconnectome.org>>
 on behalf of Yu Han mailto:yu@uvm.edu>>
Date: Saturday, March 2, 2019 at 1:36 PM
To: "hcp-users@humanconnectome.org" 
mailto:hcp-users@humanconnectome.org>>
Subject: [HCP-Users] generating z-stat file

Dear experts,

I am trying to threshold a z-statistic task-fMRI dscalar file using the 
-cifti-math command.

wb_command -cifti-math '(x > 5.0088) + 2 * (x < -5.0088)' outputfile -var x inputfile

The input file needs to be a z-stat; however, my current script only generates
the beta map file beta.dscalar.nii.

How do I generate the z-stat file? Or is there a way to convert the beta file 
to a z-stat file?

My ultimate goal is to find out whether the activation is significant or not, 
so I am looking to set a p value threshold or generate some statistics etc.

Thank you in advance!

Best,
Yu


Yu Han
Ph.D. Candidate
Neuroscience Graduate Program
University of Vermont


___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] ICA FIX output missing

2019-02-26 Thread Harms, Michael

a) It may give the same answer, but if for no other reason than speed of execution,
execution, TaskfMRILevel1.sh should be smarter and have an option to not apply 
a temporal filter a second time.

b) If it turns out that the space-spanned by the FIX noise components overlaps 
with the task GLM (for whatever reason), then noise will be re-introduced to 
some degree.  That’s simply the math.  You’re arguing that as a practical 
matter that won’t be the case, or that the effect will be tiny, which may 
indeed turn out to be the case, but that should be empirically demonstrated 
first.

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From: "Glasser, Matthew" 
Date: Tuesday, February 26, 2019 at 8:19 PM
To: NEUROSCIENCE tim , Leonardo Tozzi 
Cc: "Harms, Michael" , "hcp-users@humanconnectome.org" 

Subject: Re: [HCP-Users] ICA FIX output missing

Regarding task analysis, I don’t really agree with Mike:

a) The sICA+FIX filter is essentially equivalent to a linear detrend whereas 
the task filter is more aggressive.  Thus, applying both is going to give 
essentially the same answer as applying the more aggressive one.
b) Because the noise component removal is done as a non-aggressive regression 
(where the neural signal is protected by the neural components) and the task 
design is noiseless, I don’t see how the task design will reintroduce noise.  
There could be statistical efficiency benefits to exactly replicating the 
regression on the task design, but these are likely to be small.  Overall, this 
is not an effect that should lead one to avoid cleaning data before running a 
task analysis, as the statistical benefits of cleaning are clear, both from the 
perspective of removing biases and reducing uncorrelated variance.

Matt.

From: Timothy Coalson mailto:tsc...@mst.edu>>
Date: Tuesday, February 26, 2019 at 6:37 PM
To: Leonardo Tozzi mailto:lto...@stanford.edu>>
Cc: "Harms, Michael" mailto:mha...@wustl.edu>>, Matt Glasser 
mailto:glass...@wustl.edu>>, 
"hcp-users@humanconnectome.org<mailto:hcp-users@humanconnectome.org>" 
mailto:hcp-users@humanconnectome.org>>
Subject: Re: [HCP-Users] ICA FIX output missing

That is saying that you don't have the matlab gifti library installed (or it 
isn't on your matlab path).
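
A quick way to check (a sketch; the gifti toolbox location below is a placeholder
for wherever you unpacked it):

matlab -nodisplay -nosplash -r "addpath('/path/to/gifti'); disp(which('gifti')); exit"

If which('gifti') comes back empty, add that addpath line (or the equivalent
pathdef entry) to the MATLAB path that FIX uses before re-running.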

Tim


On Tue, Feb 26, 2019 at 6:09 PM Leonardo Tozzi 
mailto:lto...@stanford.edu>> wrote:
Dear Michael,

Thank you very much for all the consideration on the use of FIX for the task 
data.

I have tried the addition you suggest. I think the command is detected, but I 
get the following error in tfMRI_EMOTION_RL_hp2000.ica/.fix.log:


Undefined function or variable 'gifti'.

Error in ciftiopen (line 31)
cifti = gifti([tmpfile '.gii']);

Error in fix_3_clean (line 46)
  BO=ciftiopen('Atlas.dtseries.nii',WBC);




Would you have any thoughts on this?
Thank you,


Leonardo Tozzi, MD, PhD
Williams PanLab | Postdoctoral Fellow
Stanford University | 401 Quarry Rd
lto...@stanford.edu<mailto:lto...@stanford.edu> | (650) 5615738


From: "Harms, Michael" mailto:mha...@wustl.edu>>
Date: Tuesday, February 26, 2019 at 7:37 AM
To: "Glasser, Matthew" mailto:glass...@wustl.edu>>, 
Leonardo Tozzi mailto:lto...@stanford.edu>>, 
"hcp-users@humanconnectome.org<mailto:hcp-users@humanconnectome.org>" 
mailto:hcp-users@humanconnectome.org>>
Cc: "Burgess, Gregory" mailto:gburg...@wustl.edu>>
Subject: Re: [HCP-Users] ICA FIX output missing


Hi Leonardo,
Couple things:

1)  In the context of FIX, things get a little convoluted, since the FIX 
distribution has its own settings.sh file that needs to be set appropriately.  
If you’ve hard-coded the FSL_FIX_WBC variable in that settings.sh file, then 
the location to wb_command in the Examples/Scripts/SetUpHCPPipeline.sh isn’t 
necessarily relevant.  In the settings.sh file for FIX on our cluster, we use 
the following construction:

if [ -x "$(command -v wb_command)" ]; then
FSL_FIX_WBC=$(command -v wb_command)
else
echo "ERROR in $0: wb_command (Workbench) must be in your path"
exit 1
fi
so that FIX does actually respect that location of wb_command that is already 
in your path.

2) Regarding MR-FIX and the TaskfMRIAnalysis scripts, while they may run after 
MR-FIX, there are two issues that need to be addressed yet:
a) The temporal filter, which was presumably already applied during MR-FIX, 
gets applied again with TaskfMRILevel1.sh.  This script needs to be modified to 
be smarter regarding the temporal filtering (i.e., provide an option to NOT 
reapply the temporal filter).
b) The 

Re: [HCP-Users] ICA FIX output missing

2019-02-26 Thread Harms, Michael

Hi Leonardo,
Couple things:

1)  In the context of FIX, things get a little convoluted, since the FIX 
distribution has its own settings.sh file that needs to be set appropriately.  
If you’ve hard-coded the FSL_FIX_WBC variable in that settings.sh file, then 
the location to wb_command in the Examples/Scripts/SetUpHCPPipeline.sh isn’t 
necessarily relevant.  In the settings.sh file for FIX on our cluster, we use 
the following construction:

if [ -x "$(command -v wb_command)" ]; then
FSL_FIX_WBC=$(command -v wb_command)
else
echo "ERROR in $0: wb_command (Workbench) must be in your path"
exit 1
fi
so that FIX does actually respect that location of wb_command that is already 
in your path.

2) Regarding MR-FIX and the TaskfMRIAnalysis scripts, while they may run after 
MR-FIX, there are two issues that need to be addressed yet:
a) The temporal filter, which was presumably already applied during MR-FIX, 
gets applied again with TaskfMRILevel1.sh.  This script needs to be modified to 
be smarter regarding the temporal filtering (i.e., provide an option to NOT 
reapply the temporal filter).
b) The space spanned by the noise regressors from FIX is not regressed out of 
the task regressor prior to the GLM, which means that variance removed during 
FIX can be reintroduced during the task GLM fitting (depending on the extent to 
which the space spanned by the noise regressors overlaps with the task GLM).

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From: "Glasser, Matthew" 
Date: Monday, February 25, 2019 at 6:53 PM
To: Leonardo Tozzi , "Harms, Michael" , 
"hcp-users@humanconnectome.org" 
Subject: Re: [HCP-Users] ICA FIX output missing

You’ll want to be using wb_command 1.3.2.  I am not aware of any modifications 
that are necessary to use the TaskfMRIAnalysis scripts on MR+FIX data and have 
analyzed hundreds of subjects after MR+FIX.

As for this issue, is wb_command set properly here: 
https://github.com/Washington-University/HCPpipelines/blob/master/Examples/Scripts/SetUpHCPPipeline.sh

What about on your ${PATH}?

As for MR+FIX itself, we are only waiting on an FSL 6.0.1 release as testing 
has concluded successfully.

Matt.

From: 
mailto:hcp-users-boun...@humanconnectome.org>>
 on behalf of Leonardo Tozzi mailto:lto...@stanford.edu>>
Date: Monday, February 25, 2019 at 5:09 PM
To: "Harms, Michael" mailto:mha...@wustl.edu>>, 
"hcp-users@humanconnectome.org<mailto:hcp-users@humanconnectome.org>" 
mailto:hcp-users@humanconnectome.org>>
Subject: Re: [HCP-Users] ICA FIX output missing

Dear Michael,

Thank you for pointing me to the logfiles.

It seems like the script is not finding the directory where wb_command is. In 
my case, I am loading it as a module in my HPC cluster. I have also put its 
path in ICAFIX/fix1.067/settings.sh as follows:


# Set this to the location of the HCP Workbench command for your platform
FSL_FIX_WBC='/share/software/user/open/workbench/1.3.1/bin/wb_command';



However, the script does not seem to “see” this path and instead uses the 
setting I was using on my local machine. In the logfile 
tfMRI_EMOTION_RL_hp2000.ica/.fix.log, I get the following error:


/bin/bash: /Applications/workbench/bin_macosx64/wb_command: No such file or 
directory


Is there another place in the scripts that is overriding my settings.sh?
Concerning the length or the runs, I will look into the multirun 
implementation, but indeed my intention was of using the TaskfMRIAnalysis 
scripts to get my “activations”.
Thank you,

Leonardo Tozzi, MD, PhD
Williams PanLab | Postdoctoral Fellow
Stanford University | 401 Quarry Rd
lto...@stanford.edu<mailto:lto...@stanford.edu> | (650) 5615738


From: "Harms, Michael" mailto:mha...@wustl.edu>>
Date: Monday, February 25, 2019 at 9:59 AM
To: Leonardo Tozzi mailto:lto...@stanford.edu>>, 
"hcp-users@humanconnectome.org<mailto:hcp-users@humanconnectome.org>" 
mailto:hcp-users@humanconnectome.org>>
Subject: Re: [HCP-Users] ICA FIX output missing


Hi,
The log files for ICA FIX are a bit scattered.
In the .ica directory check: .fix_2b_predict.log (from the prediction – i.e,. R 
code) and .fix.log (from the cleaning stage).
And in the .ica/fix directory, check logMatlab.txt (I believe that is from the 
feature extraction stage).

Note that our recommendation is to use “multi-run” FIX on the task data, due to 
its shorter run length.  We hope to have an announcement on that in the near 
future.  In that regard, you would implement your desired filtering as part of 
the MR-FIX cleaning (and there is a new “polynomial detrend”

Re: [HCP-Users] ICA FIX output missing

2019-02-25 Thread Harms, Michael

Hi,
The log files for ICA FIX are a bit scattered.
In the .ica directory check: .fix_2b_predict.log (from the prediction – i.e,. R 
code) and .fix.log (from the cleaning stage).
And in the .ica/fix directory, check logMatlab.txt (I believe that is from the 
feature extraction stage).

Note that our recommendation is to use “multi-run” FIX on the task data, due to 
its shorter run length.  We hope to have an announcement on that in the near 
future.  In that regard, you would implement your desired filtering as part of 
the MR-FIX cleaning (and there is a new “polynomial detrend” option, for faster 
execution), although I believe that we haven’t quite gotten around to adapting 
the TaskfMRIAnalysis scripts to work on data from MR-FIX.

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Leonardo Tozzi 

Date: Monday, February 25, 2019 at 11:42 AM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] ICA FIX output missing

Dear Experts,

I have been trying to use ICA FIX to denoise task data of a large number of 
subjects from the HYA release.

I have managed to make it run with no errors on our local HPC cluster, but I 
still have one problem. In the vast majority of subjects, even if FIX works and 
produces a file (“fix4melview_HCP_hp2000_thr10.txt”) which shows which 
components are noise, I don’t get the final output, for example 
“tfMRI_EMOTION_RL_Atlas_hp2000_clean.dtseries.nii”. This is especially puzzling 
since it does seem to work for a minority of subjects. The matlab log also 
shows no errors.

A related question I would have is what filter you would recommend for task 
data. My intention is to use a GLM on the cleaned data, so is 2000 (linear 
detrending) ok, since then a lower high-pass will be applied in the GLM step?

Thank you very much,



Leonardo Tozzi, MD, PhD
Williams PanLab | Postdoctoral Fellow
Stanford University | 401 Quarry Rd
lto...@stanford.edu | (650) 5615738


___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Glasser parcellation with network labels

2019-02-20 Thread Harms, Michael

If you access the GitHub repo mentioned in the paper, you’ll find all the 
specifics you need.

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From: Anita Sinha 
Date: Wednesday, February 20, 2019 at 9:25 AM
To: "Harms, Michael" , "hcp-users@humanconnectome.org" 

Subject: Re: [HCP-Users] Glasser parcellation with network labels


Dr. Harms,



In the paper, I see how the network partitions were done, but is there a table 
or other figure that explicitly defines what network partition each of the 
cortical areas in Glasser parcellation belongs to? I'm trying to build an 
adjacency matrix based on groupings of connections that belong to the same 
network partition (=1) and connections between 2 ROIs in different networks 
(=0) to find possible sub-graphs, however, I can't seem to find out what 
explicit regions belong to each network partition.



If you could provide some clarification, that would be greatly appreciated!



Regards,



Anita

________
From: Harms, Michael 
Sent: Wednesday, February 20, 2019 9:07:19 AM
To: Anita Sinha; hcp-users@humanconnectome.org
Subject: Re: [HCP-Users] Glasser parcellation with network labels




See this paper:

https://www.ncbi.nlm.nih.gov/pubmed/30291974





--

Michael Harms, Ph.D.

---

Associate Professor of Psychiatry

Washington University School of Medicine

Department of Psychiatry, Box 8134

660 South Euclid Ave.Tel: 314-747-6173

St. Louis, MO  63110  Email: mha...@wustl.edu



From:  on behalf of Anita Sinha 

Date: Wednesday, February 20, 2019 at 9:02 AM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] Glasser parcellation with network labels



To Whom It May Concern,



Is there any resource that lists what network (visual, auditory, motor, etc.) 
each parcel in Glasser parcellation belongs to? In the Supplementary Material 
(Neuroanatomical Results) of "A multi-modal parcellation of human cerebral 
cortex" by Glasser et. al., Table 1 lists all of the areas within the 
parcellation, but doesn't list which network each area belongs to.



If you could point me to where I can find this information, that would be 
greatly appreciated.



Thanks!



Regards,



Anita

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users





The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.



___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Glasser parcellation with network labels

2019-02-20 Thread Harms, Michael

See this paper:

https://www.ncbi.nlm.nih.gov/pubmed/30291974


--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Anita Sinha 

Date: Wednesday, February 20, 2019 at 9:02 AM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] Glasser parcellation with network labels


To Whom It May Concern,



Is there any resource that lists what network (visual, auditory, motor, etc.) 
each parcel in Glasser parcellation belongs to? In the Supplementary Material 
(Neuroanatomical Results) of "A multi-modal parcellation of human cerebral 
cortex" by Glasser et. al., Table 1 lists all of the areas within the 
parcellation, but doesn't list which network each area belongs to.



If you could point me to where I can find this information, that would be 
greatly appreciated.



Thanks!



Regards,



Anita

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Some questions about data quality check

2019-02-13 Thread Harms, Michael

Another way to put this is that hippocampus is not cortical (it is part of the 
medial wall), and thus FreeSurfer does not attempt to generate surfaces that 
follow the structures in that region.  They even have a FAQ on that specific 
question:

https://surfer.nmr.mgh.harvard.edu/fswiki/UserContributions/FAQ#Q.Thesurfacesnearthemedialwall.2Chippocampus.2Candamygdalaaren.27taccuratelyfollowingthestructuresthere.HowcanIfixthis.3F

cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of "Reid, Erin" 

Date: Wednesday, February 13, 2019 at 11:13 AM
To: Aaron C 
Cc: "hcp-users@humanconnectome.org" 
Subject: Re: [HCP-Users] Some questions about data quality check

Hello Aaron,

1. The hippocampus is a known area that FreeSurfer has a little difficulty with 
so we don’t expect perfection in that area.  The subject shown in the document 
is quite good however.

2. Yes, "good" volume distortion has fewer blobs.  The blobs are indicating 
where the individual surface was manipulated in order to register to the atlas. 
 So the fewer manipulations the better.

Hope this helps.

Erin


On Feb 13, 2019, at 12:27 AM, Aaron C 
mailto:aaroncr...@outlook.com>> wrote:

Dear HCP experts,

I have some questions about data quality check when using the HCP pipeline.

1. In the panel on the lower left corner on page 23 of the data quality slides 
(https://wustl.app.box.com/s/krekn5svmg4rgig2ossqqhuhcbrgg8c6), is the brain 
segmentation there missing some parts of the temporal lobe?

2. On page 25 of the slides, what are the criteria for determining "good" or "bad"
volume distortion? The fewer above-threshold blobs the better?

I also have a question about structural connectome of the HCP diffusion data. 
Is there any shared script to generate parcel to parcel structural connectivity 
using Dr. Glasser's multi-modal parcellation scheme? Thank you.
___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users



___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Multiple Comparison Correction on Cifti/gifti

2019-02-11 Thread Harms, Michael

Hi,
We’d suggest that you use PALM for inference.

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of "Metcalf, Nicholas" 

Date: Monday, February 11, 2019 at 5:38 PM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] Multiple Comparison Correction on Cifti/gifti

Hello,

Does anyone have or know of a matlab/workbench routine for performing multiple 
comparison correction on a cifti or gifti metric?

Nick

--
Senior Neuroimaging Engineer
Washington University in St Louis Medical School
Department of Neurology
4525 Scott Ave
Suite 2124
St. Louis, MO 63110
Ph: 314-362-6376
Cell: 636-375-4051



The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Imaging Protocols for HCP Version 2019.01.14

2019-02-11 Thread Harms, Michael

Hi,
We are working on getting the link functioning.

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of "Arpinar, Volkan" 

Date: Monday, February 11, 2019 at 9:29 AM
To: hcp-users 
Subject: [HCP-Users] Imaging Protocols for HCP Version 2019.01.14

Hello HCP users,

Does anyone have the “Lifespan HCP-Development and HCP-Aging Siemens Prisma protocol
package” version 2019.01.14 zip file? Could you please share with me the
protocol files that have the timings?

The link given below on humanconnectome.org did not work for me (tried
multiple computers and browsers)
https://www.humanconnectome.org/storage/app/media/protocols/HCP_VE11C_Prisma_2019.01.14.zip
under
https://www.humanconnectome.org/study/hcp-lifespan-aging/project-protocol/imaging-protocols-hcp-aging

Thanks,
Volkan Emre Arpinar


From:  on behalf of Caio Seguin 

Date: Sunday, February 10, 2019 at 3:59 PM
Cc: hcp-users 
Subject: Re: [HCP-Users] Sharing HCP-derived brain networks and graph-theoric 
measures


Hello everyone,

Thanks for all the ideas, this has been very helpful. I will try to follow your 
suggestions to upload the data to BALSA. I will let you know if I have any 
further questions.

Best,
Caio


Em sáb, 9 de fev de 2019 às 04:09, Elam, Jennifer 
mailto:e...@wustl.edu>> escreveu:

Hi Caio,

Ideally you would share the data both ways: as your original arbitrary files,
and converted and visually displayed as parcel x parcel pconn CIFTI files on
the cortex and/or in matrices in Workbench scene files. Alternatively (or
additionally), you could just create scenes of your figure images from your 
paper, by loading image files into Workbench and display them -- I can give you 
further instructions on how to do this.



Either way you would then have a scene file that you can upload to BALSA along 
with your arbitrary files. BALSA could then display one of your scenes and have 
your study dataset be searchable on the BALSA homepage.



If you have any questions on how to put together your dataset for BALSA, please 
let me know.



Best,

Jenn


Jennifer Elam, Ph.D.
Scientific Outreach, Human Connectome Project
Washington University School of Medicine
Department of Neuroscience, Box 8108
660 South Euclid Avenue
St. Louis, MO 63110
314-362-9387
e...@wustl.edu
www.humanconnectome.org


From: John Smith mailto:jackdaw...@gmail.com>>
Sent: Friday, February 8, 2019 10:29:45 AM
To: NEUROSCIENCE tim
Cc: Caio Seguin; Elam, Jennifer; hcp-users
Subject: Re: [HCP-Users] Sharing HCP-derived brain networks and graph-theoric 
measures

The "Upload Files" button on the Files modal should take you to a page with a 
widget that allows you to upload arbitrary files to your BALSA study. It is 
assumed that extra files are there to serve as documentation or additional 
figures, so currently the uploader accepts files with the following extensions: 
zip, txt, rtf, pdf, odt, odp, wpd, doc, docx, ppt, pptx, jpg, png, fig, m, gif, 
csv. A file of any type can be uploaded so long as it is in a directory 
contained in a zip file. Within that base directory, such files can be nested 
into other directories, and that is where they will appear when the dataset as 
a whole is downloaded. Any files uploaded outside of a zip will be assumed to 
exist at the base directory for the study. As a final note, files that are not 
directly used by a scene will not be downloaded unless the user has selected to 
download the entire study or has specifically selected those files for download.

-John

On Thu, Feb 7, 2019 at 5:08 PM Timothy Coalson 
mailto:tsc...@mst.edu>> wrote:
As I recall, BALSA also allows arbitrary additional files to be uploaded to 
studies.  I'm not sure about the details of how to do this, though (there is an 
"upload files" button in the "files" modal for a study you own, but I'm not 
sure where those files end up).

Tim


On Thu, Feb 7, 2019 at 4:52 PM Caio Seguin 
mailto:caioseg...@gmail.com>> wrote:
Thanks Tim and Matt for the quick reply.

Most of the files are NxN connectivity matrices, where N could denote, for 
instance, ROIs from different parcellation schemes or resting-state functional 
networks.

Ok, so one option is to transform these matrices into cifti files and share 
them through BALSA. On the one hand, this 

Re: [HCP-Users] Sharing HCP-derived brain networks and graph-theoric measures

2019-02-07 Thread Harms, Michael

Supporting “arbitrary” formats requires putting a whole system in place for 
describing the format, and making it queryable.   The good thing is that if the 
pconn’s are available, users could use those inputs to their own derived 
graph-theoretic measures.

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Caio Seguin 

Date: Thursday, February 7, 2019 at 4:52 PM
To: NEUROSCIENCE tim 
Cc: hcp-users 
Subject: Re: [HCP-Users] Sharing HCP-derived brain networks and graph-theoric 
measures

Thanks Tim and Matt for the quick reply.

Most of the files are NxN connectivity matrices, where N could denote, for 
instance, ROIs from different parcellation schemes or resting-state functional 
networks.

Ok, so one option is to transform these matrices into cifti files and share 
them through BALSA. On the one hand, this is a nice solution since it takes care of the
data use terms. On the other, it is a bit of a roundabout way to store these
files in the context of my manuscript. The matrices are used to derive 
graph-theoretic measures about brain organization (rather than for 
visualization purposes), so researchers interested in that would need to 
convert the cifti files back to CSV.

More generally, do you suggest any methods to share HCP-derived files in an 
arbitrary format?

Thanks in advance for the help.

Best,
Caio


Em sex, 8 de fev de 2019 às 06:19, Timothy Coalson 
mailto:tsc...@mst.edu>> escreveu:
If your data is organized as a value per parcel/network, you should be able to 
turn it into parcellated cifti files, which can be displayed in wb_view (and 
therefore in scenes) as a matrix and/or as colored regions on the surfaces and 
in the volume.

See wb_command -cifti-parcellate (to make a template parcellated cifti file you 
can use to import data into), -cifti-label-import (to get your network ROIs 
into the format -cifti-parcellate wants), and -cifti-convert (and its 
-from-text option, to read csv or other text data and output cifti).
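
Strung together, a sketch of that round trip might look like this (all file names
are placeholders):

# 1) turn an integer-valued network ROI file into a dlabel file
wb_command -cifti-label-import networks.dscalar.nii network_labels.txt networks.dlabel.nii

# 2) make a template parcellated file from any dense file in the same space
wb_command -cifti-parcellate template.dscalar.nii networks.dlabel.nii COLUMN template.pscalar.nii

# 3) import your per-network values (text, one value per row) into that template
wb_command -cifti-convert -from-text my_measures.txt template.pscalar.nii my_measures.pscalar.nii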

Tim


On Thu, Feb 7, 2019 at 7:05 AM Caio Seguin 
mailto:caioseg...@gmail.com>> wrote:
Dear experts,

I have used diffusion and resting-state functional MRI data from the HCP to 
derive whole brain connectomes for individual participants. I used the 
connectomes to compute graph-theoretic measures that are part of a manuscript
I am working on.

My question concerns the sharing of these connectomes and graph-theoretic 
measures. My current understanding is that sharing this data is ok as long as I 
make sure users abide by the HCP data usage terms. What are your suggestions on
how to do this?

I've seen BALSA proposed to this end, since it provides a built-in mechanism of 
user terms, but my files are CSV or .mat files rather than WB scenes.

Thanks in advance for your help.

Best regards,
Caio Seguin


___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Some questions about using HCP structural preprocessing pipeline

2019-02-07 Thread Harms, Michael

Re 1)
If you are using a modern version of ‘dcm2niix’, then in the context of the T1 
and T2 scans, the values for T1wSampleSpacing and T2wSampleSpacing are 
available as the “DwellTime” entry in the BIDS-style json sidecar files.
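
For example (a sketch; assumes jq is available and that the sidecars are named like
the NIFTIs):

T1wSampleSpacing=$(jq -r '.DwellTime' T1w.json)
T2wSampleSpacing=$(jq -r '.DwellTime' T2w.json)

and those values can then be passed to the PreFreeSurfer --t1samplespacing /
--t2samplespacing options.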

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of "Glasser, Matthew" 

Date: Thursday, February 7, 2019 at 7:09 AM
To: Aaron C , "hcp-users@humanconnectome.org" 

Subject: Re: [HCP-Users] Some questions about using HCP structural 
preprocessing pipeline

1.  This depends on the acquisition parameters.  You can find this in the DICOM 
header in the named fields.
2.  How do the T2w data look in the T1w folder?  That matters most.
3.  You need to update Connectome Workbench.
4.  Doesn’t matter.
5.  I hope so.

Matt.

From: 
mailto:hcp-users-boun...@humanconnectome.org>>
 on behalf of Aaron C mailto:aaroncr...@outlook.com>>
Date: Wednesday, February 6, 2019 at 11:57 PM
To: "hcp-users@humanconnectome.org" 
mailto:hcp-users@humanconnectome.org>>
Subject: [HCP-Users] Some questions about using HCP structural preprocessing 
pipeline


Dear HCP experts,



I have some questions about using the HCP structural preprocessing pipeline.

  1.  For the parameters “T1wSampleSpacing” and “T2wSampleSpacing”, are they
the same across different Skyra scanners? Will it significantly affect
preprocessing quality if I set them to “NONE”?
  2.  I noticed some cortical regions were excessively removed in the file
“T2w_acpc_brain.nii.gz” in the “T2w” folder. Would this indicate a problem with
brain extraction? If so, would you please suggest how to fix this problem?
  3.  I received the error “Unexpected parameter: -local-affine-method” when 
using “PostFreeSurferPipelineBatch.sh”. It occurred when the script tried to 
call wb_command to do some operations with the flag “-local-affine-method”. 
Would you please suggest how to fix this error?
  4.  I received a lot of “name collision in input name” warnings. I guess
these warnings do not matter?
  5.  Will the updated pipeline using FreeSurfer 6.0.0 possibly be available in 
the near future (sometime this month)?

Thank you.


___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Transforming Freesurfer .mgz segmented files into HCP_2mm MNI Surface and Volume space

2019-02-01 Thread Harms, Michael

So, I’m assuming you ran it using the outputs from PreFreeSurfer as its inputs? 
 Otherwise, the mapping that Matt suggested you use from PostFreeSurfer won’t 
be appropriate! ☺

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From: "Srivastava, Benjamin (NYSPI)" 
Date: Wednesday, January 30, 2019 at 4:40 PM
To: "Harms, Michael" , "Glasser, Matthew" 
, "hcp-users@humanconnectome.org" 

Subject: Re: [HCP-Users] Transforming Freesurfer .mgz segmented files into 
HCP_2mm MNI Surface and Volume space

Hi we just used the isotropic T1w scan

A. Benjamin Srivastava, M.D.
Clinical and Research Fellow
Division on Substance Use Disorders
Department of Psychiatry
Columbia University Medical Center
New York State Psychiatric Institute



From: "Harms, Michael" 
Date: Wednesday, January 30, 2019 at 9:05 AM
To: "Glasser, Matthew" , "Srivastava, Benjamin (NYSPI)" 
, "hcp-users@humanconnectome.org" 

Subject: Re: [HCP-Users] Transforming Freesurfer .mgz segmented files into 
HCP_2mm MNI Surface and Volume space


ATTENTION: This email came from an external source. Do not open attachments or 
click on links from unknown senders or unexpected emails.


Hi,
I’m curious, did you run the hippocampal/amygdala segmentation using only an 
isotropic T1w scan, or did you add in a high in-plane resolution T2w, as is 
often used for best performance of the hp/amyg segmentation routine?

Cheers,
-MH



--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of "Glasser, Matthew" 

Date: Tuesday, January 29, 2019 at 5:31 PM
To: "Srivastava, Benjamin (NYSPI)" , 
"hcp-users@humanconnectome.org" 
Subject: Re: [HCP-Users] Transforming Freesurfer .mgz segmented files into 
HCP_2mm MNI Surface and Volume space

As far as the volume space, have a look at the PostFreeSurfer pipeline for how 
wmparc.mgz gets mapped.  The hippocampus is not fully represented on FreeSurfer 
cortical surfaces.

Matt.

From: 
mailto:hcp-users-boun...@humanconnectome.org>>
 on behalf of "Srivastava, Benjamin (NYSPI)" 
mailto:benjamin.srivast...@nyspi.columbia.edu>>
Date: Tuesday, January 29, 2019 at 1:56 PM
To: "hcp-users@humanconnectome.org<mailto:hcp-users@humanconnectome.org>" 
mailto:hcp-users@humanconnectome.org>>
Subject: [HCP-Users] Transforming Freesurfer .mgz segmented files into HCP_2mm 
MNI Surface and Volume space

Hi,

We have processed data using the 3_22 HCP Pipelines. We have run (successfully) 
the hippocampal amygdala segmentation script with Freesurfer 6 using 
segmentHA_T1.sh. We are now trying to transform the .mgz segmented files into 
the HCP_2mm MNI surface and volume spaces. Does anyone have any experience 
using FreeSurfer2CaretConvertAndRegisterNonlinear.sh to accomplish this?

Thanks!

A. Benjamin Srivastava, M.D.
Clinical and Research Fellow
Division on Substance Use Disorders
Department of Psychiatry
Columbia University Medical Center
New York State Psychiatric Institute



___
HCP-Users mailing list
HCP-Users@humanconnectome.org<mailto:HCP-Users@humanconnectome.org>
https://protect2.fireeye.com/url?k=e2fa6289-bede94e8-e2f89bbc-0cc47a6d17e0-6ccae0759b4d5ca5=http://lists.humanconnectome.org/mailman/listinfo/hcp-users<https://protect2.fireeye.com/url?k=8c2a1c38-d00eea59-8c28e50d-0cc47a6d17e0-184f81f6847535cc=http://lists.humanconnectome.org/mailman/listinfo/hcp-users>


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
https://protect2.fireeye.com/url?k=48b75344-1493a525-48b5aa71-0cc47a6d17e0-293635e4995a6ca5=http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

Re: [HCP-Users] Transforming Freesurfer .mgz segmented files into HCP_2mm MNI Surface and Volume space

2019-01-30 Thread Harms, Michael

Hi,
I’m curious, did you run the hippocampal/amygdala segmentation using only an 
isotropic T1w scan, or did you add in a high in-plane resolution T2w, as is 
often used for best performance of the hp/amyg segmentation routine?

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of "Glasser, Matthew" 

Date: Tuesday, January 29, 2019 at 5:31 PM
To: "Srivastava, Benjamin (NYSPI)" , 
"hcp-users@humanconnectome.org" 
Subject: Re: [HCP-Users] Transforming Freesurfer .mgz segmented files into 
HCP_2mm MNI Surface and Volume space

As far as the volume space, have a look at the PostFreeSurfer pipeline for how 
wmparc.mgz gets mapped.  The hippocampus is not fully represented on FreeSurfer 
cortical surfaces.
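(For anyone following along: the volume mapping there is essentially a nearest-neighbour resample of wmparc through the subject’s nonlinear warp, followed by re-importing the FreeSurfer label table with wb_command -volume-label-import. A very rough sketch only, with illustrative file names rather than the pipeline’s exact variables — check the PostFreeSurfer scripts themselves for the real commands:

applywarp --rel --interp=nn \
  -i T1w/wmparc.nii.gz \
  -r MNINonLinear/T1w_restore.2.nii.gz \
  -w MNINonLinear/xfms/acpc_dc2standard.nii.gz \
  -o MNINonLinear/wmparc.2.nii.gz
# then wb_command -volume-label-import to re-attach the label names/colors
)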

Matt.

From: 
mailto:hcp-users-boun...@humanconnectome.org>>
 on behalf of "Srivastava, Benjamin (NYSPI)" 
mailto:benjamin.srivast...@nyspi.columbia.edu>>
Date: Tuesday, January 29, 2019 at 1:56 PM
To: "hcp-users@humanconnectome.org" 
mailto:hcp-users@humanconnectome.org>>
Subject: [HCP-Users] Transforming Freesurfer .mgz segmented files into HCP_2mm 
MNI Surface and Volume space

Hi,

We have processed data using the 3_22 HCP Pipelines. We have run (successfully) 
the hippocampal amygdala segmentation script with Freesurfer 6 using 
segmentHA_T1.sh. We are now trying to transform the .mgz segmented files into 
the HCP_2mm MNI surface and volume spaces. Does anyone have any experience 
using FreeSurfer2CaretConvertAndRegisterNonlinear.sh to accomplish this?

Thanks!

A. Benjamin Srivastava, M.D.
Clinical and Research Fellow
Division on Substance Use Disorders
Department of Psychiatry
Columbia University Medical Center
New York State Psychiatric Institute



___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Detailed manual rating score of structural images

2019-01-22 Thread Harms, Michael

Hi,
Just a small addendum: There were a small number of HCP-YA subjects for which 
we used T1 or T2 scans that we had rated “fair” to allow for subject inclusion 
into the study.  The number of such subjects was very small however.

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From: "Elam, Jennifer" 
Date: Tuesday, January 22, 2019 at 9:48 AM
To: Xuhong Liao , hcp-users 
, "Harms, Michael" 
Subject: Re: [HCP-Users] Detailed manual rating score of structural images


Hi Xuhong,

All of the structural data released by HCP is of good or excellent quality 
(scores of 3-4 on our QC scale). We have not publicly released the exact 
ratings for released subjects. The file you are referring to is on our 
non-public wiki and is referred to as an example in the SOP for other groups 
who are setting up neuroimaging projects to understand how we did things for 
HCP.



Best,

Jenn


Jennifer Elam, Ph.D.
Scientific Outreach, Human Connectome Project
Washington University School of Medicine
Department of Neuroscience, Box 8108
660 South Euclid Avenue
St. Louis, MO 63110
314-362-9387
e...@wustl.edu<mailto:e...@wustl.edu>
www.humanconnectome.org<http://www.humanconnectome.org/>


From: hcp-users-boun...@humanconnectome.org 
 on behalf of Xuhong Liao 

Sent: Tuesday, January 22, 2019 3:57:09 AM
To: hcp-users; Harms, Michael
Subject: Re: [HCP-Users] Detailed manual rating score of structural images

Many thanks,

I noticed that the standard QC rating (1 = Poor, to 4 = Excellent) of each 
T1/T2 weighted image was added in the 'Phasell_T1W/T2w_QC.xls' located on the 
HCP wiki (https://wiki.humanconnectome.org/dashboard.action). But this link 
fails to open. How can I obtain these rating scores?

Regards,
Xuhong Liao

Xuhong Liao, Ph. D, Associate Professor
School of Systems Science,
Beijing Normal University
No.19 Xinjiekouwai Street,
Beijing, 100875, China
liaoxuh...@gmail.com
<mailto:liaoxuh...@gmail.com>


mailto:hcp-users-requ...@humanconnectome.org>>
 wrote on Monday, January 21, 2019 at 2:00 AM:
Send HCP-Users mailing list submissions to
hcp-users@humanconnectome.org<mailto:hcp-users@humanconnectome.org>

To subscribe or unsubscribe via the World Wide Web, visit
http://lists.humanconnectome.org/mailman/listinfo/hcp-users
or, via email, send a message with subject or body 'help' to

hcp-users-requ...@humanconnectome.org<mailto:hcp-users-requ...@humanconnectome.org>

You can reach the person managing the list at

hcp-users-ow...@humanconnectome.org<mailto:hcp-users-ow...@humanconnectome.org>

When replying, please edit your Subject line so it is more specific
than "Re: Contents of HCP-Users digest..."
Today's Topics:

   1. Detailed manual rating score of structural images (Xuhong Liao)
   2. Re: Detailed manual rating score of structural images
  (Harms, Michael)



-- Forwarded message --
From: Xuhong Liao mailto:liaoxuh...@gmail.com>>
To: hcp-users 
mailto:hcp-users@humanconnectome.org>>
Cc:
Bcc:
Date: Sun, 20 Jan 2019 17:29:05 +0800
Subject: [HCP-Users] Detailed manual rating score of structural images
I have got the S-1200 dataset in which the subjects with QC issues are marked 
with codes from "A" to "E".

Could anyone tell me how to obtain the detailed manual rating scores of each 
subject described in the quality control SOP?

Regards,
Xuhong Liao, PhD
--
Xuhong Liao, Associate Professor
School of Systems Science,
Beijing Normal University
No.19 Xinjiekouwai Street,
Beijing, 100875, China
liaoxuh...@gmail.com
<mailto:liaoxuh...@gmail.com>



-- Forwarded message --
From: "Harms, Michael" mailto:mha...@wustl.edu>>
To: Xuhong Liao mailto:liaoxuh...@gmail.com>>, hcp-users 
mailto:hcp-users@humanconnectome.org>>
Cc:
Bcc:
Date: Sun, 20 Jan 2019 14:19:53 +
Subject: Re: [HCP-Users] Detailed manual rating score of structural images



Hi,



https://wiki.humanconnectome.org/pages/viewpage.action?pageId=88901591





--

Michael Harms, Ph.D.

---

Associate Professor of Psychiatry

Washington University School of Medicine

Department of Psychiatry, Box 8134

660 South Euclid Ave.Tel: 314-747-6173

St. Louis, MO  63110  Email: 
mha...@wustl.edu<mailto:mha...@wustl.edu>



From: 
mailto:hcp-users-boun...@humanconnectome.org>>
 on behalf of Xuhong Liao mailto:liaoxuh...@gmail.com>>
Date: Sunday, January 20, 2019 at 3:29 AM
To: hcp-users 
mailto:hcp-users@humanconnectome.org>>
Subject: [HCP-Users] Detailed manual rating score of structural images

Re: [HCP-Users] Detailed manual rating score of structural images

2019-01-20 Thread Harms, Michael

Hi,

https://wiki.humanconnectome.org/pages/viewpage.action?pageId=88901591


--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Xuhong Liao 

Date: Sunday, January 20, 2019 at 3:29 AM
To: hcp-users 
Subject: [HCP-Users] Detailed manual rating score of structural images

I have got the S-1200 dataset in which the subjects with QC issues are marked 
with codes from "A" to "E".

Could anyone tell me how to obtain the detailed manual rating scores of each 
subject described in the quality control SOP?

Regards,
Xuhong Liao, PhD
--
Xuhong Liao, Associate Professor
School of Systems Science,
Beijing Normal University
No.19 Xinjiekouwai Street,
Beijing, 100875, China
liaoxuh...@gmail.com


___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] CIFTI to Matlab

2019-01-17 Thread Harms, Michael

Are you actually using 3 dots, instead of 2 dots, to specify your path?

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From: Anita Sinha 
Date: Thursday, January 17, 2019 at 4:38 PM
To: "Glasser, Matthew" , "Harms, Michael" 
, "hcp-users@humanconnectome.org" 

Subject: Re: [HCP-Users] CIFTI to Matlab


Matt/Michael,



When I run g = gifti, I get g = gifti object 1-by-1. I have confirmed that I 
have the correct file paths to the wb_command, however, it seems to duplicate 
part of a file path string, which is why it cannot find the file. Is this a 
known issue?



This is what I'm running:


y = '/.../.../CM.dtseries.nii';
ciftiFile = ciftiopen(y,'Z:/.../workbench/bin_windows64/wb_command');
CM = ciftiFile.cdata;

and receive the same error:

Error using read_gifti_file_standalone (line 20)
[GIFTI] Loading of XML file 
C:\Users\AMS217\AppData\Local\Temp\tp366ef3da_4687_4d6e_9b4a_bb7f0ce6fd8e.gii 
failed.

Error in gifti (line 100)
this = read_gifti_file_standalone(varargin{1},giftistruct);

Error in ciftiopen (line 34)
cifti = gifti([tmpfile '.gii']);



Do I need to convert it to char or something so it is read in correctly?



Regards,



Anita Sinha

Biomedical Engineering Graduate Student

University of Wisconsin-Madison

amsi...@wisc.edu


From: Glasser, Matthew 
Sent: Thursday, January 17, 2019 4:27:17 PM
To: Harms, Michael; Anita Sinha; hcp-users@humanconnectome.org
Subject: Re: [HCP-Users] CIFTI to Matlab

That could also be due to it not finding the file or not finding wb_command.

Matt.

From: 
mailto:hcp-users-boun...@humanconnectome.org>>
 on behalf of "Harms, Michael" mailto:mha...@wustl.edu>>
Date: Thursday, January 17, 2019 at 4:05 PM
To: Anita Sinha mailto:amsi...@wisc.edu>>, 
"hcp-users@humanconnectome.org<mailto:hcp-users@humanconnectome.org>" 
mailto:hcp-users@humanconnectome.org>>
Subject: Re: [HCP-Users] CIFTI to Matlab




Hi,

It sounds like you don’t have the gifti library properly installed.



Within matlab, what happens when you type

g = gifti



Do you get a gifti structure?



Cheers,

-MH





--

Michael Harms, Ph.D.

---

Associate Professor of Psychiatry

Washington University School of Medicine

Department of Psychiatry, Box 8134

660 South Euclid Ave.Tel: 314-747-6173

St. Louis, MO  63110  Email: 
mha...@wustl.edu<mailto:mha...@wustl.edu>



From: 
mailto:hcp-users-boun...@humanconnectome.org>>
 on behalf of Anita Sinha mailto:amsi...@wisc.edu>>
Date: Thursday, January 17, 2019 at 3:37 PM
To: "hcp-users@humanconnectome.org<mailto:hcp-users@humanconnectome.org>" 
mailto:hcp-users@humanconnectome.org>>
Subject: [HCP-Users] CIFTI to Matlab



To Whom It May Concern,



I am attempting to open up a dtseries.nii CIFTI file into Matlab to extract the 
time series matrix from rs-fMRI data. I have downloaded workbench and gifti-1.8 
and am following the directions outlined in 
https://wiki.humanconnectome.org/pages/viewpage.action?pageId=63963178 in #2 
"How do you get CIFTI files into Matlab, but it isn't working.



When I run this command:

cii = ciftiopen('path/to/file','path/to/wb_command'); CIFTIdata = cii.cdata



with the appropriate file paths, I keep receiving this error:



Error using read_gifti_file_standalone (line 20)

[GIFTI] Loading of XML file 
C:\Users\...\AppData\Local\Temp\tp904b5934_f040_4259_8a21_0b10e15aecc8.gii 
failed.



Error in gifti (line 100)

this = read_gifti_file_standalone(varargin{1},giftistruct);



Error in ciftiopen (line 34)

cifti = gifti([tmpfile '.gii']);



I have added workbench and gifti to the path and saved everything in the same 
directory to mitigate file directory mismatch, but cannot get past this error.



Could you provide some help on how to resolve this?



Thank you for your time.



Regards,



Anita

___
HCP-Users mailing list
HCP-Users@humanconnectome.org<mailto:HCP-Users@humanconnectome.org>
http://lists.humanconnectome.org/mailman/listinfo/hcp-users





The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

Re: [HCP-Users] CIFTI to Matlab

2019-01-17 Thread Harms, Michael

Hi,
It sounds like you don’t have the gifti library properly installed.

Within matlab, what happens when you type
g = gifti

Do you get a gifti structure?

Cheers,
-MH


--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Anita Sinha 

Date: Thursday, January 17, 2019 at 3:37 PM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] CIFTI to Matlab

To Whom It May Concern,

I am attempting to open up a dtseries.nii CIFTI file into Matlab to extract the 
time series matrix from rs-fMRI data. I have downloaded workbench and gifti-1.8 
and am following the directions outlined in 
https://wiki.humanconnectome.org/pages/viewpage.action?pageId=63963178 in #2 
"How do you get CIFTI files into Matlab, but it isn't working.

When I run this command:
cii = ciftiopen('path/to/file','path/to/wb_command'); CIFTIdata = cii.cdata

with the appropriate file paths, I keep receiving this error:

Error using read_gifti_file_standalone (line 20)
[GIFTI] Loading of XML file 
C:\Users\...\AppData\Local\Temp\tp904b5934_f040_4259_8a21_0b10e15aecc8.gii 
failed.

Error in gifti (line 100)
this = read_gifti_file_standalone(varargin{1},giftistruct);

Error in ciftiopen (line 34)
cifti = gifti([tmpfile '.gii']);

I have added workbench and gifti to the path and saved everything in the same 
directory to mitigate file directory mismatch, but cannot get past this error.

Could you provide some help on how to resolve this?

Thank you for your time.

Regards,

Anita

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] dMRI data order for eddy

2019-01-06 Thread Harms, Michael

Hi,
Please see comments related to this in Examples/DiffusionPreprocessingBatch.sh.
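In short, that batch script builds one "@"-separated list of series per phase-encoding polarity and passes both lists to the pipeline together. A simplified sketch of the convention (file names are illustrative, and whether RL or LR counts as the "positive" polarity is spelled out in the script's comments, so check those rather than trusting this sketch):

PosData="${RawDir}/${Subject}_DWI_dir95_RL.nii.gz@${RawDir}/${Subject}_DWI_dir96_RL.nii.gz@${RawDir}/${Subject}_DWI_dir97_RL.nii.gz"
NegData="${RawDir}/${Subject}_DWI_dir95_LR.nii.gz@${RawDir}/${Subject}_DWI_dir96_LR.nii.gz@${RawDir}/${Subject}_DWI_dir97_LR.nii.gz"

${HCPPIPEDIR}/DiffusionPreprocessing/DiffPreprocPipeline.sh \
  --posData="${PosData}" --negData="${NegData}" ...

i.e., the pipeline sees all of one polarity's series together, in the order you list them within each string.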

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Szabolcs David 

Date: Sunday, January 6, 2019 at 6:36 AM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] dMRI data order for eddy

Hi All,

What was the order of the diffusion MRI data as an input for eddy?
According to 'HCP_S1200_Release_Appendix_I.pdf', p. 46, the image acquisition 
order was:

95 RL
95 LR
96 RL
96 LR
97 RL
97 LR
Based on the eddy log files, the input order should be something like this (the 
position of the b0s reveals this, since those volumes have zero eddy-current-
related components by definition):

95 RL
96 RL
97 RL
95 LR
96 LR
97 LR

or start with LR then RL... but which one? Could you please let me know?

Best,
Szabolcs

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Temporal signal to noise ratio of HCP subjects

2018-12-21 Thread Harms, Michael

Hi,
tSNR (and a fairly sophisticated variance partitioning) is available via the 
“RestingStateStats” pipeline.
If you have the full sets of packages from HCP-YA, see the 
${subject}/MNINonLinear/Results/${fMRIName}/${fMRIName}_Atlas_stats.txt file, 
which includes a “tSNR” entry.
(Associated maps are available in the ${fMRIName}_Atlas_stats.dscalar.nii file).

Note that in the context of RestingStateStats, “tSNR” is defined as:
TSNR = MEAN ./ sqrt(UnstructNoiseVar);
i.e,. the noise estimate is derived from an estimate of the unstructured 
(“gaussian”) noise; not simply the standard deviation of the time series.
See RestingStateStats.m for the full algorithmic details.
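If you just want a quick-and-dirty tSNR map yourself, wb_command can compute the simple mean/SD version directly (note this is not the same quantity as the RestingStateStats “tSNR” described above, which uses the unstructured-noise variance in the denominator). File names here are illustrative:

wb_command -cifti-reduce rfMRI_REST1_LR_Atlas.dtseries.nii TSNR rfMRI_REST1_LR_tSNR.dscalar.nii
# TSNR here is the mean divided by the (sample) standard deviation, per grayordinate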

In terms of structural QC, take a look at this:
https://github.com/Washington-University/StructuralQC
and the “Data Quality” (lecture and practical portions) slides from the 2018 
HCP Course, available here:
https://store.humanconnectome.org/courses/2018/exploring-the-human-connectome.php

cheers,
-MH


--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Aaron C 

Date: Thursday, December 20, 2018 at 11:29 AM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] Temporal signal to noise ratio of HCP subjects

Dear HCP experts,

I am looking for temporal signal to noise ratio of the resting state data of 
HCP subjects. Is that information available somewhere?

Also, is there any existing scene file to check GM/WM segmentation quality? 
Thank you.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Time series data

2018-12-19 Thread Harms, Michael

If you are looking for a network assignment of the parcels in the Glasser 
parcellation, see:
https://github.com/ColeLab/ColeAnticevicNetPartition
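And for extracting parcel-wise (ROI) time series from the CIFTI data in the first place, wb_command can do that directly. A minimal sketch, where the .dlabel.nii file name is a placeholder for whichever parcellation you download:

wb_command -cifti-parcellate \
  rfMRI_REST1_LR_Atlas_MSMAll_hp2000_clean.dtseries.nii \
  Glasser360.32k_fs_LR.dlabel.nii \
  COLUMN \
  rfMRI_REST1_LR_Atlas_MSMAll.ptseries.nii
# yields one mean time series per parcel; the .ptseries.nii can then be read
# into MATLAB with ciftiopen for connectivity / graph analyses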

cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of "Glasser, Matthew" 

Date: Wednesday, December 19, 2018 at 6:30 PM
To: "Rakib Al-Fahad (ralfahad)" , 
"HCP-Users@humanconnectome.org" 
Subject: Re: [HCP-Users] Time series data

See my recommendation below.

Matt.

From: "Rakib Al-Fahad (ralfahad)" 
mailto:ralfa...@memphis.edu>>
Date: Wednesday, December 19, 2018 at 5:46 PM
To: Matt Glasser mailto:glass...@wustl.edu>>, 
"HCP-Users@humanconnectome.org" 
mailto:HCP-Users@humanconnectome.org>>
Subject: Re: [HCP-Users] Time series data

Matt,

I agree with your point.

My specialization is in signal processing and machine learning. I am not sure 
about ICA based time series. If we consider 100 ICA components, how can we 
define each node name for graph theoretical measures? For example, the paper 
‘Chronnectomic patterns and neural flexibility underlie executive function” 
[Jason et al. NeuroImage 147 (2017): 861-871] talked about 100 components. They 
define DNN, subcortical, frontal, etc. networks. Can you give me any reference 
that can guide which component belongs to which network or how to name them?

If somehow I discover that connectivity between Component_1 and Component_3 is 
significantly related to some behavior, how can I express it in neuroscience 
terms? I believe my question is clear now.


Thanks
Rakib

From: "Glasser, Matthew" mailto:glass...@wustl.edu>>
Date: Wednesday, December 19, 2018 at 5:28 PM
To: "Rakib Al-Fahad (ralfahad)" 
mailto:ralfa...@memphis.edu>>, 
"HCP-Users@humanconnectome.org" 
mailto:HCP-Users@humanconnectome.org>>
Subject: Re: [HCP-Users] Time series data

What neuroscience questions are ICA timeseries unable to answer?  Particularly 
that gyral/sulcal folding-based parcellation would be able to answer?  If you 
want a neuroanatomical parcellation into cortical areas, this is available here:

https://balsa.wustl.edu/file/show/3VLx

Matt.

From: 
mailto:hcp-users-boun...@humanconnectome.org>>
 on behalf of "Rakib Al-Fahad (ralfahad)" 
mailto:ralfa...@memphis.edu>>
Date: Wednesday, December 19, 2018 at 4:40 PM
To: "HCP-Users@humanconnectome.org" 
mailto:HCP-Users@humanconnectome.org>>
Subject: [HCP-Users] Time series data

Hello All,

I want to analyze time series data and dynamic brain connectivity from rfMRI 
data. I don’t want to use ICA-based time series because they cannot answer a 
lot of neuroscience questions. I prefer ROI-based analysis. A 45+ ROI template 
would be useful (e.g. a FreeSurfer template). Is it possible to use workbench 
on the processed data to extract ROI-based time series, or do I have to run 
FreeSurfer on the data? Please help me with some guidance and references.


Rakib Al-Fahad
Ph.D. Candidate
Electrical and Computer Engineering
The University of Memphis
901.279.4128




___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

Re: [HCP-Users] Noise level estimate from the data

2018-12-17 Thread Harms, Michael

Hi,
I don’t think that those values in the .fsf file get used for anything in terms 
of the HCPpipelines, so in that sense their value is irrelevant.   Perhaps Greg 
will recall if they were set in any meaningful manner way-back when we 
constructed the design.fsf templates.

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Jeiran Choupan 

Date: Monday, December 17, 2018 at 12:39 PM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] Noise level estimate from the data

Hello,

I am trying the first level FEAT analysis on Task fMRI data, and I noticed that 
`noise level%` & `Temporal smoothness` values set in the design.fsf file are 
different than the default values in the software. I was wondering how you have 
calculated the noise level and temporal smoothness values .

"
# Noise level
set fmri(noise) 0.412671

# Noise AR(1)
set fmri(noisear) 0.189140
“
The `Estimate from data` button in the FEAT GUI (v6.00) on my computer is 
generating an oddly large number (larger than one)!


When I script the noise level estimation myself and run it on my own data 
(collected with HCP protocol (we use TR=0.8, 2mm isotropic)) I receive the 
following values on average:

# Noise level
set fmri(noise) 0.1

# Noise AR(1)
set fmri(noisear) 0.2



I was wondering why the noise level is so different from the values used for 
subject `100307`, or whether this value looks relatively okay.

I appreciate if you could please help me with this.

Thank you

Jeiran Choupan, PhD

Research Scientist
The Laboratory for Functional and Computational Vision, USC Dornsife
Dana and David Dornsife College of Letters, Arts and Sciences
Department of Psychology, University of Southern California

3620 McClintock Ave
SGM 1017
Los Angeles, CA 90089


___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] HCP_FIX

2018-12-14 Thread Harms, Michael

In terms of getting the /reg directory needed to run your own FIX training, 
Matt’s earlier point is that that is handled in the hcp_fix script as well.  
See the lines following the portion of the script with “mkdir -p reg”

i.e., if you’ve already run hcp_fix, you should then be able to run ‘fix -t’ to 
create your own training file (assuming you’ve added the necessary 
‘hand_labels_noise.txt’ file to each .ica directory).
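For example (paths are illustrative; see “fix -h” for the exact usage in your FIX version):

fix -t MyStudyTraining -l \
  /path/to/subj01/MNINonLinear/Results/rfMRI_REST1_LR/rfMRI_REST1_LR_hp2000.ica \
  /path/to/subj02/MNINonLinear/Results/rfMRI_REST1_LR/rfMRI_REST1_LR_hp2000.ica
# -l adds leave-one-out accuracy testing; each listed .ica directory needs its
# hand_labels_noise.txt (the list of noise component numbers) already in place

which writes out MyStudyTraining.RData for later use as the training-file argument.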

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From: "Glasser, Matthew" 
Date: Thursday, December 13, 2018 at 2:02 PM
To: Shachar Gal 
Cc: "Harms, Michael" , "hcp-users@humanconnectome.org" 
, Ido Tavor , Niv Tik 

Subject: Re: [HCP-Users] HCP_FIX

You can specify your own training file to hcp_fix as an optional last argument.

Matt.

From: Shachar Gal mailto:gal.shac...@gmail.com>>
Date: Thursday, December 13, 2018 at 6:39 AM
To: Matt Glasser mailto:glass...@wustl.edu>>
Cc: "Harms, Michael" mailto:mha...@wustl.edu>>, 
"hcp-users@humanconnectome.org<mailto:hcp-users@humanconnectome.org>" 
mailto:hcp-users@humanconnectome.org>>, Ido 
Tavor mailto:idota...@gmail.com>>, Niv Tik 
mailto:niv...@gmail.com>>
Subject: Re: [HCP-Users] HCP_FIX

if I understand correctly, what happens in these line is that if I don't have a 
training data file, it uses the hcp training data as default.
but this is not our purpose - we wish to create our own training data file, 
based on the hand classification we did on our data.

Shachar

On Tue, 11 Dec 2018 at 21:58, Glasser, Matthew 
mailto:glass...@wustl.edu>> wrote:
See lines 177-193 from the hcp_fix script.

Matt.

From: Shachar Gal mailto:gal.shac...@gmail.com>>
Date: Tuesday, December 11, 2018 at 6:33 AM
To: Matt Glasser mailto:glass...@wustl.edu>>
Cc: "Harms, Michael" mailto:mha...@wustl.edu>>, 
"hcp-users@humanconnectome.org<mailto:hcp-users@humanconnectome.org>" 
mailto:hcp-users@humanconnectome.org>>, Ido 
Tavor mailto:idota...@gmail.com>>, Niv Tik 
mailto:niv...@gmail.com>>
Subject: Re: [HCP-Users] HCP_FIX

I'm sorry if I'm pressing the issue too much, but I still can't see how 
hcp_fix addresses the fact that the main FIX script expects several files that 
are not available from the registration process in the HCP pipeline.

Thanks for your patience,
Shachar

On Tue, Dec 11, 2018, 14:07 Glasser, Matthew 
mailto:glass...@wustl.edu>> wrote:
You can still do training after running hcp_fix.  As for not using hcp_fix 
initially and just running the main fix script, I haven’t tried that.  You 
might need to ask on the main FSL list.

Matt.

From: Shachar Gal mailto:gal.shac...@gmail.com>>
Date: Tuesday, December 11, 2018 at 5:21 AM
To: Matt Glasser mailto:glass...@wustl.edu>>
Cc: "Harms, Michael" mailto:mha...@wustl.edu>>, 
"hcp-users@humanconnectome.org<mailto:hcp-users@humanconnectome.org>" 
mailto:hcp-users@humanconnectome.org>>, Ido 
Tavor mailto:idota...@gmail.com>>
Subject: Re: [HCP-Users] HCP_FIX

hey Matt,
I was trying to use the FIX script from the fix directory that came with the 
downloadable FIX tar file <http://www.fmrib.ox.ac.uk/~steve/ftp/fix.tar.gz>.
I did so because I didn't see in the content of the hcp_fix script that it runs 
the training mode, and I needed to create the training file (as I said before, 
I'm at the stage where I have hand-labelled the components of the resting-state 
data for 10 of my subjects after running MELODIC, and I wish to train FIX using 
these labels).
if I'm mistaken in my workflow, I would appreciate any further instructions as 
to the correct workflow for using the FIX pipelines.

thank you



On Mon, 10 Dec 2018 at 23:13, Glasser, Matthew 
mailto:glass...@wustl.edu>> wrote:
Did you use the hcp_fix script or some other method of running FIX on HCP data.

Matt.

From: Shachar Gal mailto:gal.shac...@gmail.com>>
Date: Monday, December 10, 2018 at 7:47 AM
To: "Harms, Michael" mailto:mha...@wustl.edu>>
Cc: Matt Glasser mailto:glass...@wustl.edu>>, 
"hcp-users@humanconnectome.org<mailto:hcp-users@humanconnectome.org>" 
mailto:hcp-users@humanconnectome.org>>, Ido 
Tavor mailto:idota...@gmail.com>>
Subject: Re: [HCP-Users] HCP_FIX

hey again,

thanks for pointing out the relevant lines for the dual regression.

I have some further questions about the workflow of running FIX.
so I did hand classification for 10 of my subjects, and wished to run FIX with 
the -t option, in order to create the training data.
but then I realized t

Re: [HCP-Users] fix error, vn maps

2018-12-06 Thread Harms, Michael

ciftisavereset wasn’t compiled into the existing fix 1.067 compiled MATLAB.  I 
just tested a new compile in which this is fixed.  We are waiting for Oxford to 
either update the fix 1.067 package or release a new fix version.

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Aaron R 

Date: Thursday, December 6, 2018 at 10:29 AM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] fix error, vn maps

Dear HCP users,
I'm getting an error with fix 1.067: the CIFTI variance normalization file is 
not created. The file filtered_func_data_clean_vn.nii.gz is created, however, 
and everything else is OK. In .fix.log, I see:

The file
   '/usr/local/gifti/ciftisavereset.m'
   is not in the application's expanded CTF archive at
'/home/aaron/.mcrCache8.3/fix_3_1'.
This is typically caused by calls to ADDPATH in your startup.m or matlabrc.m 
files. Please see the compiler documentation and use the ISDEPLOYED function to 
ensure ADDPATH commands are not executed by deployed applications.
Previously accessible file "/usr/local/gifti/ciftisavereset.m" is now 
inaccessible.

Error in fix_3_clean (line 131)
I'm using the compiled matlab option. Any ideas?
Thanks in advance,
Aaron

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Number-to-anatomical labels: need help!

2018-12-03 Thread Harms, Michael

Hi,
It’s the standard FreeSurfer label table -- 
$FREESURFER_HOME/FreeSurferColorLUT.txt
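A quick way to look up any index from the command line (index 17 is just an example):

grep -w '^17' "$FREESURFER_HOME"/FreeSurferColorLUT.txt
# should print something like:  17  Left-Hippocampus  220 20 10 0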

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of giorgia cona 

Date: Monday, December 3, 2018 at 10:18 AM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] Number-to-anatomical labels: need help!


Dear HCP users,



Thanks for your great work.



We were looking to harvest the following files in the HCP structural dataset

- aparc.a2009s+aseg.nii.gz

- aparc+aseg.nii.gz

However, we have been struggling to find a file that matches the numbers in the 
neuroimaging files with anatomical labels.

Thanks for your help,



Giorgia

--
Giorgia Cona, PhD.

Department of General Psychology
Padua Neuroscience Center (PNC)
University of Padua

Via Venezia 8, 35131
Tel: 0498276291
E-mail: giorgia.c...@unipd.it
https://www.researchgate.net/profile/Giorgia_Cona

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] PALM basic execution error - setup issue?

2018-11-26 Thread Harms, Michael

Hi,
The error in palm_miscread.m is telling you that your execution environment 
isn’t finding ‘wb_command’ in your PATH, so you need to resolve that by setting 
up your environment appropriately.
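For example, something along these lines in your shell (the install location is only illustrative — point it at wherever you unpacked workbench):

export PATH="/usr/local/workbench/bin_linux64:$PATH"
which wb_command    # should now print the full path instead of "not found"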

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Darren Campbell 

Date: Monday, November 26, 2018 at 11:22 AM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] PALM basic execution error - setup issue?

Hi
   I am having a basic palm execution error. The errors seem to reflect 
program/file access issues. See the feedback below.
  Background information: Currently, I am able to start FSL, FreeSurfer, 
workbench, and octave from the command line. Palm also seems to run from the 
command line, but I have not run any specific tasks with it. I am gradually 
learning how to use these various programs to examine the HCP task-fMRI data.
  Any suggestions would be appreciated.
Thanks. Darren

 palm -i Y.dtseries.nii -d ./Progs/Design.mat -t ./Progs/Design.con -o results 
-n 500 -corrcon -logp -accel tail -nouncorrected
=
 Permutation Analysis of Linear Models
=
Running PALM alpha112 using Octave 4.2.2 with the following options:
-i Y.dtseries.nii
-d ./Progs/Design.mat
-t ./Progs/Design.con
-o results
-n 500
-corrcon
-logp
-accel tail
-nouncorrected
Found FSL in /usr/local/fsl
Found FreeSurfer in /usr/local/freesurfer
sh: 1: which: not found
Reading input 1/1: Y.dtseries.nii
Error using palm_miscread (/home/brain/palm-alpha112/palm_miscread.m:171)
Currently cannot read/write CIFTI files without the HCP Workbench.

Error in palm_ready (/home/brain/palm-alpha112/palm_ready.m:47->palm_miscread)
Error in palm_takeargs 
(/home/brain/palm-alpha112/palm_takeargs.m:1644->palm_ready)
Error in palm_core (/home/brain/palm-alpha112/palm_core.m:33->palm_takeargs)
Error in palm (/home/brain/palm-alpha112/palm.m:81->palm_core)

Dr. Darren Campbell
Associate Professor
Department of Psychology
Nipissing University
Office: H236
Phone: 705-474-3450 Ext. 4524
Lab: A222F


___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] HCP_FIX

2018-11-26 Thread Harms, Michael

Hi,
The cleaning of the CIFTI data is done automatically in fix_3_clean.m of the 
FIX distribution, which is called by the fix script itself.

The only relevant aspect of hcp_fix to that process is that hcp_fix creates the 
following symbolic link:

if [ -f ../${fmri_orig}_Atlas.dtseries.nii ] ; then
  $FSLDIR/bin/imln ../${fmri_orig}_Atlas.dtseries.nii Atlas.dtseries.nii
fi

since fix_3_clean.m is hard-coded to look for a file named Atlas.dtseries.nii

cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Shachar Gal 

Date: Friday, November 23, 2018 at 2:37 AM
To: "Glasser, Matthew" 
Cc: "hcp-users@humanconnectome.org" 
Subject: Re: [HCP-Users] HCP_FIX

hey matt,
the thing is, I didn't see anything in the hcp_fix script that projects the "fix 
cleaned" volume to the surface, so when does that happen?
considering that you mentioned you're currently editing these scripts, I think 
we would just use the original FSL FIX, and later project the fix cleaned 
volume to surface using your 'fMRISurface' pipeline.

thanks,
Shachar


On Tue, 20 Nov 2018 at 04:10, Glasser, Matthew 
mailto:glass...@wustl.edu>> wrote:
The hcp_fix script is for enabling a few enhancements specific to HCP-Style 
processing.  These scripts are currently being edited and tested and we will 
have new versions announced soon.  PostFix generates the Workbench scenes for 
hand classification of components.

Matt.

From: 
mailto:hcp-users-boun...@humanconnectome.org>>
 on behalf of Shachar Gal mailto:gal.shac...@gmail.com>>
Date: Monday, November 19, 2018 at 3:30 AM
To: "hcp-users@humanconnectome.org" 
mailto:hcp-users@humanconnectome.org>>
Subject: [HCP-Users] HCP_FIX

Dear experts,
I was was looking for some documentation regarding the differences between the 
standard FSL FIX script, and the HCP's modified version, but could not find 
such.
could you point me to such a documentation, if exists, or perhaps elaborate 
briefly on the subject?
on the same subject, is there any similar documentation regarding postfix? I'm 
not sure what's its purpose in the cleaning and applying procedure.

thanks.
Shachar Gal


___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] HCP_FIX

2018-11-22 Thread Harms, Michael

Hi,
What is the overall context of this inquiry?  Are you getting errors with 
running the version of hcp_fix in ICAFIX/hcp_fix?  If so, simply revert to 
using the version of hcp_fix supplied with the FIX distribution itself.  That 
doesn’t include some of the enhancements that Matt mentioned are being worked 
on, but it works.

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of "Glasser, Matthew" 

Date: Thursday, November 22, 2018 at 8:29 AM
To: Shachar Gal 
Cc: "hcp-users@humanconnectome.org" 
Subject: Re: [HCP-Users] HCP_FIX

hcp_fix cleans both the volume and CIFTI data in parallel.  If you regress the 
spatial ICA timecourses from the volume-based ICA into the CIFTI data, you 
produce CIFTI spatial maps.

Matt.

From: Shachar Gal mailto:gal.shac...@gmail.com>>
Date: Thursday, November 22, 2018 at 6:10 AM
To: Matt Glasser mailto:glass...@wustl.edu>>
Cc: "hcp-users@humanconnectome.org" 
mailto:hcp-users@humanconnectome.org>>
Subject: Re: [HCP-Users] HCP_FIX

hey matt,
the thing is, I didn't see anything in the hcp_fix script that projects the "fix 
cleaned" volume to the surface, so when does that happen?
considering that you mentioned you're currently editing these scripts, I think 
we would just use the original FSL FIX, and later project the fix cleaned 
volume to surface using your 'fMRISurface' pipeline.

thanks,
Shachar


On Tue, 20 Nov 2018 at 04:10, Glasser, Matthew 
mailto:glass...@wustl.edu>> wrote:
The hcp_fix script is for enabling a few enhancements specific to HCP-Style 
processing.  These scripts are currently being edited and tested and we will 
have new versions announced soon.  PostFix generates the Workbench scenes for 
hand classification of components.

Matt.

From: 
mailto:hcp-users-boun...@humanconnectome.org>>
 on behalf of Shachar Gal mailto:gal.shac...@gmail.com>>
Date: Monday, November 19, 2018 at 3:30 AM
To: "hcp-users@humanconnectome.org" 
mailto:hcp-users@humanconnectome.org>>
Subject: [HCP-Users] HCP_FIX

Dear experts,
I was was looking for some documentation regarding the differences between the 
standard FSL FIX script, and the HCP's modified version, but could not find 
such.
could you point me to such a documentation, if exists, or perhaps elaborate 
briefly on the subject?
on the same subject, is there any similar documentation regarding postfix? I'm 
not sure what's its purpose in the cleaning and applying procedure.

thanks.
Shachar Gal


___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Number of cortical vertices in cifti and gifti files

2018-11-16 Thread Harms, Michael

See the various -cifti-create-* commands.
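For example, to rebuild a dense file from per-hemisphere GIFTI inputs (file names are illustrative):

wb_command -cifti-create-dense-timeseries combined.dtseries.nii \
  -left-metric  L.func.gii -roi-left  L.atlasroi.32k_fs_LR.shape.gii \
  -right-metric R.func.gii -roi-right R.atlasroi.32k_fs_LR.shape.gii

(the -roi-* arguments are what keep the medial wall vertices out, which is why the CIFTI ends up with 59412 rather than 64984 cortical entries).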


--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From: Reza Rajimehr 
Date: Friday, November 16, 2018 at 1:50 PM
To: "Harms, Michael" 
Cc: "Glasser, Matthew" , "hcp-users@humanconnectome.org" 

Subject: Re: [HCP-Users] Number of cortical vertices in cifti and gifti files

Is there a command to convert LH (or RH) gifti file to hcp cifti file?


On Fri, Nov 16, 2018 at 10:55 PM Harms, Michael 
mailto:mha...@wustl.edu>> wrote:

No, it is more complicated than that.

I believe what you need is -cifti-export-dense-mapping

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.    Tel: 314-747-6173
St. Louis, MO  63110  Email: 
mha...@wustl.edu<mailto:mha...@wustl.edu>

From: 
mailto:hcp-users-boun...@humanconnectome.org>>
 on behalf of Reza Rajimehr mailto:rajim...@gmail.com>>
Date: Friday, November 16, 2018 at 1:17 PM
To: "Glasser, Matthew" mailto:glass...@wustl.edu>>
Cc: "hcp-users@humanconnectome.org<mailto:hcp-users@humanconnectome.org>" 
mailto:hcp-users@humanconnectome.org>>
Subject: Re: [HCP-Users] Number of cortical vertices in cifti and gifti files

Thanks! Can I simply say that:

For left hemisphere:
vertex number in cifti = vertex number (up to 29706) in LH gifti

For right hemisphere:
vertex number in cifti = vertex number (up to 29706) in RH gifti + 29706

Or the mapping is more complicated than this?


On Fri, Nov 16, 2018 at 9:56 PM Glasser, Matthew 
mailto:glass...@wustl.edu>> wrote:
That is correct, the medial wall is kept out.  Usually when I want to do that I 
split the CIFTI file into hemispheric GIFTI files, but perhaps there is a good 
way to load in a specific mapping based on something we can output from 
wb_command.

Matt.

From: 
mailto:hcp-users-boun...@humanconnectome.org>>
 on behalf of Reza Rajimehr mailto:rajim...@gmail.com>>
Date: Friday, November 16, 2018 at 10:17 AM
To: "hcp-users@humanconnectome.org<mailto:hcp-users@humanconnectome.org>" 
mailto:hcp-users@humanconnectome.org>>
Subject: [HCP-Users] Number of cortical vertices in cifti and gifti files

Hi,

A cifti file has 91282 vertices/voxels, a combined LR cifti file has 59412 
vertices, and an individual hemisphere gifti file has 32492 vertices. So the 
number of cortical vertices in cifti files is less than the number of cortical 
vertices in gifti files (left hemi vertices + right hemi vertices = 64984). 
Looks like this is related to not having medial wall vertices in the cifti 
files, right?

We have loaded these files in Matlab. Now we want to know which vertex in right 
(or left) hemisphere gifti file corresponds to which vertex in the cifti file. 
How can we achieve this?

Best,
Reza

___
HCP-Users mailing list
HCP-Users@humanconnectome.org<mailto:HCP-Users@humanconnectome.org>
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org<mailto:HCP-Users@humanconnectome.org>
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

Re: [HCP-Users] Number of cortical vertices in cifti and gifti files

2018-11-16 Thread Harms, Michael

No, it is more complicated than that.

I believe what you need is -cifti-export-dense-mapping
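A minimal sketch of that (output names are illustrative):

wb_command -cifti-export-dense-mapping \
  rfMRI_REST1_LR_Atlas.dtseries.nii COLUMN \
  -surface CORTEX_LEFT  left_mapping.txt \
  -surface CORTEX_RIGHT right_mapping.txt
# each row pairs a CIFTI index with the corresponding 32k mesh vertex number,
# which is the lookup table you'd load into Matlab (mind 0- vs 1-based indexing)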

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Reza Rajimehr 

Date: Friday, November 16, 2018 at 1:17 PM
To: "Glasser, Matthew" 
Cc: "hcp-users@humanconnectome.org" 
Subject: Re: [HCP-Users] Number of cortical vertices in cifti and gifti files

Thanks! Can I simply say that:

For left hemisphere:
vertex number in cifti = vertex number (up to 29706) in LH gifti

For right hemisphere:
vertex number in cifti = vertex number (up to 29706) in RH gifti + 29706

Or the mapping is more complicated than this?


On Fri, Nov 16, 2018 at 9:56 PM Glasser, Matthew 
mailto:glass...@wustl.edu>> wrote:
That is correct, the medial wall is kept out.  Usually when I want to do that I 
split the CIFTI file into hemispheric GIFTI files, but perhaps there is a good 
way to load in a specific mapping based on something we can output from 
wb_command.

Matt.

From: 
mailto:hcp-users-boun...@humanconnectome.org>>
 on behalf of Reza Rajimehr mailto:rajim...@gmail.com>>
Date: Friday, November 16, 2018 at 10:17 AM
To: "hcp-users@humanconnectome.org" 
mailto:hcp-users@humanconnectome.org>>
Subject: [HCP-Users] Number of cortical vertices in cifti and gifti files

Hi,

A cifti file has 91282 vertices/voxels, a combined LR cifti file has 59412 
vertices, and an individual hemisphere gifti file has 32492 vertices. So the 
number of cortical vertices in cifti files is less than the number of cortical 
vertices in gifti files (left hemi vertices + right hemi vertices = 64984). 
Looks like this is related to not having medial wall vertices in the cifti 
files, right?

We have loaded these files in Matlab. Now we want to know which vertex in right 
(or left) hemisphere gifti file corresponds to which vertex in the cifti file. 
How can we achieve this?

Best,
Reza

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users




Re: [HCP-Users] List of ALL unrelated subject

2018-11-02 Thread Harms, Michael

Hi,

1)  We didn’t repeat the exercise of generating an “unrelated subjects” list 
following the complete S1200, and rather doubt that we will at this point in 
time.  Sorry.

2) There is no unique “unrelated subjects” list, since you can pick any one 
participant from each family.  We generated that list a long time ago, and I 
don’t recall the exact algorithm that we used at the time.

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Kausar Abbas 

Date: Friday, November 2, 2018 at 3:18 PM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] List of ALL unrelated subject

Hi,

We have been using the "100 unrelated subjects" list from the S1200 release, but 
there are many more unrelated subjects in the HCP data.

We found a list of '340 unrelated subjects' under S900, but there are a couple 
of points of confusion:
1. Why don't we have a list of unrelated subjects under S1200?
2. More importantly, after comparing the subject IDs, we found that only 44 of 
the "100 unrelated subjects from S1200" are in the "340 unrelated list". The 
remaining 56 subjects are missing.

We are wondering if there is a 'perfect list' that we can't find. Thanks.

--
Dr. Kausar Abbas
Postdoctoral Research Assistant, CONNplexity Lab
https://engineering.purdue.edu/ConnplexityLab
School of Industrial Engineering
Purdue Institute of Integrative Neuroscience
Purdue University, West Lafayette, IN, USA.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Debugging IcaFIxProcessingBatch.sh

2018-11-01 Thread Harms, Michael

Hi,
Have you configured your R installation correctly, per the FSL FIX Wiki page?

https://fsl.fmrib.ox.ac.uk/fsl/fslwiki/FIX/UserGuide

cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of "Jayasekera, Dinal" 

Date: Thursday, November 1, 2018 at 4:10 PM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] Debugging IcaFIxProcessingBatch.sh


Dear HCP community,



I am currently running the IcaFixProcessingBatch script on the data processing 
using the minimal functional pipelines. However, for each of my resting state 
conditions, I get the following error:


/home/Desktop/Applications/fix1.066/fix: 252: 
/home/Desktop/Applications/fix1.066/fix: R: not found
No valid labelling file specified
Could not find a supported file with prefix 
"rfMRI_REST1_AP_hp2000.ica/filtered_func_data_clean"

Has anyone had any experience with a similar issue? Is this error indicative of 
a missing step from before?

Kind regards,
Dinal Jayasekera

PhD Candidate | InSITE Fellow
Ammar Hawasli Lab
Department of Biomedical Engineering | Washington University in St. Louis

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Software version for HCP Pipeline (v 3.27.0)

2018-11-01 Thread Harms, Michael

Correction. The latest FSL is now v6.0, which just got released the other day!  
Feel free to give that a try, and report back if there are any aspects of the 
HCPpipelines that don’t work in the context of using FSL 6.0.

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From: "Harms, Michael" 
Date: Thursday, November 1, 2018 at 10:25 AM
To: Aaron C , "hcp-users@humanconnectome.org" 

Cc: Timothy Brown 
Subject: Re: [HCP-Users] Software version for HCP Pipeline (v 3.27.0)


FS: 5.3.0-HCP.  We are working toward compatibility with FS 6.x – hopefully in 
the not too distant future.
FSL: latest (5.0.11 currently)
Workbench: latest (1.3.2 currently)

We know that the following is way out of date, and needs to be updated:
https://github.com/Washington-University/HCPpipelines/wiki/v3.4.0-Release-Notes,-Installation,-and-Usage

cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Aaron C 

Date: Thursday, November 1, 2018 at 10:13 AM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] Software version for HCP Pipeline (v 3.27.0)


Dear HCP experts,



Would you please suggest FreeSurfer, FSL, and Connectome Workbench version for 
using HCP Pipeline v 3.27.0? Thank you.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Software version for HCP Pipeline (v 3.27.0)

2018-11-01 Thread Harms, Michael

FS: 5.3.0-HCP.  We are working toward compatibility with FS 6.x – hopefully in 
the not too distant future.
FSL: latest (5.0.11 currently)
Workbench: latest (1.3.2 currently)

We know that the following is way out of date, and needs to be updated:
https://github.com/Washington-University/HCPpipelines/wiki/v3.4.0-Release-Notes,-Installation,-and-Usage

cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Aaron C 

Date: Thursday, November 1, 2018 at 10:13 AM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] Software version for HCP Pipeline (v 3.27.0)


Dear HCP experts,



Would you please suggest FreeSurfer, FSL, and Connectome Workbench version for 
using HCP Pipeline v 3.27.0? Thank you.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Question about dense functional connectome

2018-10-25 Thread Harms, Michael

As an aside, if ‘ft_read_cifti’ is expanding all the data to type double (64 
bit), then you are looking at needing > 66 GB to load a 91282x91282 dconn.nii 
in matlab.  I assume that is what is happening based on your experience.

If you want to see how far you can get in Matlab, I would at least try 
switching to using ‘ciftiopen.m’ to load it into matlab, which uses the gifti 
toolbox, which preserves the underlying data type of the .dconn.nii (type 
single; i.e., 32 bit) upon load into matlab.

I tested that just now on a system with 64 GB of RAM (and 64 GB of swap), and 
after what must have been at least 5 min, it loaded successfully into matlab 
(R2015a) with a size in matlab of 32 GB (and a type of ‘single’).  Note however 
that as that process was ongoing, the memory usage of the overall system hit 56 
GB (not counting cached memory), which decreased to 33 GB when the load was 
completed (presumably that is reflective of there essentially being two copies 
in memory during the loading process).

Working with a dconn in matlab is inevitably going to be cumbersome and slow, 
but by using this approach you might actually be able to get it loaded and see 
where you can get from that point.  Obviously, you would have to be extremely 
conscious of memory issues for whatever operations you try to perform in matlab.
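
For instance, something along these lines (a sketch, assuming ciftiopen.m and the
gifti toolbox it relies on are on your MATLAB path and wb_command is on the system
path; the file name and the index list are placeholders):

% Sketch: load the dconn at single precision and, if only a submatrix is
% needed, extract it and free the full matrix as soon as possible.
cii  = ciftiopen('group_zcorr.dconn.nii', 'wb_command');
conn = cii.cdata;                 % 91282 x 91282, type single (~32 GB)

idx = [101 2054 30877];           % placeholder: your own 1-based grayordinate indices
sub = conn(idx, idx);             % e.g., 913 x 913 once idx has 913 entries
clear conn cii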

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Reza Rajimehr 

Date: Thursday, October 25, 2018 at 3:17 AM
To: "Glasser, Matthew" 
Cc: "hcp-users@humanconnectome.org" 
Subject: Re: [HCP-Users] Question about dense functional connectome

We tried to open *.dconn.nii in Matlab using ft_read_cifti on an Ubuntu system 
with 48 GB RAM. It used 48 GB RAM and 18 GB (out of 60 GB) swap, then Matlab 
gave an out-of-memory error! We will try to use a system with even more RAM.

Tim: We have a list of 913 grayordinate voxels (we have their voxel 
numbers/indices). These voxels are somewhat distributed in gray matter. We are 
basically interested in the functional connectivity matrix just for these 913 
voxels (i.e. a 913 * 913 matrix). Is it possible to use wb_command to select 
part of *.dconn.nii corresponding to the voxels of interest?

Best,
Reza


On Thu, Oct 25, 2018 at 3:49 AM Glasser, Matthew 
mailto:glass...@wustl.edu>> wrote:
Use swap space.

Matt.

From: 
mailto:hcp-users-boun...@humanconnectome.org>>
 on behalf of Timothy Coalson mailto:tsc...@mst.edu>>
Date: Wednesday, October 24, 2018 at 4:57 PM
To: Reza Rajimehr mailto:rajim...@gmail.com>>
Cc: "hcp-users@humanconnectome.org" 
mailto:hcp-users@humanconnectome.org>>
Subject: Re: [HCP-Users] Question about dense functional connectome

The cifti format doesn't support that, because it is designed to allow use 
without loading the entire file into memory - there is no obvious file 
organization that would allow efficient loading of a full row from a single 
triangular file, when seek times are nontrivial (rotating disks).  I don't 
believe we have these files in any other format.  I'm also not sure how easy it 
would be to work on such a matrix in matlab.  If wb_command is capable of doing 
the operations you want, it is usually possible for it to do them without 
loading the entire cifti file into memory at once.  wb_view will also display 
maps from it without loading the entire file into memory.

DDR4 has 16GB modules available at reasonable prices, which should allow recent 
computers with even 4 slots for memory to expand to 64GB.  You should also 
consider the "high-end desktop" platforms (threadripper, lga2066, lga2011), 
which typically offer 8 memory slots, and processors with more cores.

Tim


On Wed, Oct 24, 2018 at 3:56 PM, Reza Rajimehr 
mailto:rajim...@gmail.com>> wrote:
Hi,

We are trying to unzip and load dense functional connectome 
(HCP_S900_820_rfMRI_MSMAll_groupPCA_d4500ROW_zcorr.zip or 
HCP_S900_820_rfMRI_MSMSulc_groupPCA_d4500ROW_zcorr.zip from
https://www.humanconnectome.org/study/hcp-young-adult/article/announcing-release-of-s900-ptn-and-other-group-average-data)
 in Matlab on a system that has 32 GB RAM. This is of course impossible due to 
the fact that the files are ~33 GB in size. Due to matrix symmetry, it would be 
sufficient to store only the upper or lower triangular part of a functional 
connectivity matrix, reducing memory occupancy by about 50% (Loewe et al., 
2016). Does HCP provide such files that could be memory-friendly?

Best,
Reza

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users



Re: [HCP-Users] T1w_acpc_dc_restore_brain.nii.gz

2018-10-09 Thread Harms, Michael

Hi,

See http://www.ncbi.nlm.nih.gov/pubmed/23668970
specifically Figure 9.

“dc” is the readout distortion correction.
“restore” is the bias field correction (for the receive bias field)

The gradient distortion correction is not included as part of the filename, so 
the file could just as well have been named 
“T1w_gdc_acpc_rdc_restore_brain.nii.gz”

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of X K 

Date: Tuesday, October 9, 2018 at 8:50 AM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] T1w_acpc_dc_restore_brain.nii.gz

Hi,

I am using the HCP data. Here I came across a question about the file 
T1w_acpc_dc_restore_brain.nii.gz
I am curious what has been done between T1w.nii.gz and this file. From the 
document, I guess that
acpc indicates linear registration (flirt) with dof=6;
dc might indicate distortion correction, but how was this done? Using flirt 
with dof=12?
I am not sure whether 'restore' has a particular meaning.

Thanks.

Best,
Xiangzhen

--
-
Xiangzhen Kong (孔祥祯)
Research Staff
Language and Genetics Department,
Max Planck Institute for Psycholinguistics,
Wundtlaan 1, 6525 XD, The Netherlands.
-


___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Time-of-day when scans where performed?

2018-10-01 Thread Harms, Michael

Hi,
You can find that info in the "3T/7T MRI Session Summary CSVs" available for 
download via
https://db.humanconnectome.org/data/projects/HCP_1200

cheers,
-MH

--
Michael Harms, Ph.D.

---

Associate Professor of Psychiatry

Washington University School of Medicine

Department of Psychiatry, Box 8134

660 South Euclid Ave.Tel: 314-747-6173

St. Louis, MO  63110  Email: mha...@wustl.edu

On 10/1/18, 10:43 AM, "hcp-users-boun...@humanconnectome.org on behalf of 
Nicola Toschi"  wrote:

Dear List,

I am running inference on a hypothesis that is influenced by circadian
rhythms and was wondering what the easiest way would be to retrieve, for
any particular scan, at what time of day it was performed. Is this
possible without access to the DICOM files?

Guidance much appreciated!

Best,

Nicola



---
This email has been checked for viruses by Avast antivirus software.
https://www.avast.com/antivirus

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users



The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Intermediate freesurfer files not available through CREST?

2018-09-25 Thread Harms, Michael

They should be.

Have you searched the list for previous recipes for REST calls into the CREST 
resource?  It may be simply that your path isn't correct.
The FS data folder structure begins at <resource>/T1w/<SubjectID>, so you probably 
want something like <resource>/T1w/<SubjectID>/surf/lh.white


--
Michael Harms, Ph.D.

---

Associate Professor of Psychiatry

Washington University School of Medicine

Department of Psychiatry, Box 8134

660 South Euclid Ave.Tel: 314-747-6173

St. Louis, MO  63110  Email: mha...@wustl.edu

On 9/25/18, 9:56 AM, "Nicola Toschi"  wrote:

Hi Michael,

thank you for your reply.

I am working with the whole S1200 release and would actually much prefer to
fetch only the files I need (namely surfaces, thickness, area, curvature
and jacobian - more or less) in a targeted way (not least because of
data footprint issues on compute clusters).

Are those files available via CREST?

Thank you!

nicola


On 09/25/2018 03:17 PM, Harms, Michael wrote:
> Hi,
> You can obtain the entire FreeSurfer output via the Structural Extended 
> packages.  Perhaps that would be an easier way for you to proceed?
>
> Cheers,
> -MH
>
> --
> Michael Harms, Ph.D.
>
> ---
>
> Associate Professor of Psychiatry
>
> Washington University School of Medicine
>
> Department of Psychiatry, Box 8134
>
> 660 South Euclid Ave.Tel: 314-747-6173
>
> St. Louis, MO  63110  Email: mha...@wustl.edu
>
> On 9/24/18, 6:25 PM, "hcp-users-boun...@humanconnectome.org on behalf of 
> Nicola Toschi"  tos...@med.uniroma2.it> wrote:
>
> Hello List,
>
> I am trying to retrieve the files present in the 'surf' directory of the
> freesurfer reconstruction via CREST. However, when trying to retrieve
> (e.g) lh.white I always end up with a file containing only error messages.
>
> I would be grateful for any pointers. below is an example of my curl
> call after opening a session.
>
> curl --cookie JSESSIONID= -O
> https://db.humanconnectome.org/data/archive/projects/HCP_1200/subjects/100307/experiments/100307_CREST/resources/100307_CREST/files/T1w/surf/lh.white
>
> Thank you very much in advance!
>
> Nicola
>
>
> ---
> This email has been checked for viruses by Avast antivirus software.
> https://www.avast.com/antivirus
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>
>
> 
> The materials in this message are private and may contain Protected 
> Healthcare Information or other information of a sensitive nature. If you are 
> not the intended recipient, be advised that any unauthorized use, disclosure, 
> copying or the taking of any action in reliance on the contents of this 
> information is strictly prohibited. If you have received this email in error, 
> please immediately notify the sender via telephone or return mail.




The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Intermediate freesurfer files not available through CREST?

2018-09-25 Thread Harms, Michael

Hi,
You can obtain the entire FreeSurfer output via the Structural Extended 
packages.  Perhaps that would be an easier way for you to proceed?

Cheers,
-MH

--
Michael Harms, Ph.D.

---

Associate Professor of Psychiatry

Washington University School of Medicine

Department of Psychiatry, Box 8134

660 South Euclid Ave.Tel: 314-747-6173

St. Louis, MO  63110  Email: mha...@wustl.edu

On 9/24/18, 6:25 PM, "hcp-users-boun...@humanconnectome.org on behalf of 
Nicola Toschi"  wrote:

Hello List,

I am trying to retrieve the files present in the 'surf' directory of the
freesurfer reconstruction via CREST. However, when trying to retrieve
(e.g) lh.white I always end up with a file containing only error messages.

I would be grateful for any pointers. below is an example of my curl
call after opening a session.

curl --cookie JSESSIONID= -O
https://db.humanconnectome.org/data/archive/projects/HCP_1200/subjects/100307/experiments/100307_CREST/resources/100307_CREST/files/T1w/surf/lh.white

Thank you very much in advance!

Nicola


---
This email has been checked for viruses by Avast antivirus software.
https://www.avast.com/antivirus

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users



The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] questions about the motion metrics (Movement_RelativeRMS_mean.txt)

2018-09-18 Thread Harms, Michael

1. The translations are in mm.  I don’t recall off the top of my head whether the 
rotations are in degrees or radians, but I believe that is documented in the 
release manual. (Or check the HCP Pipeline code).
release manual. (Or check the HCP Pipeline code).

2. RMS is based on a combination of both the translations and rotations.

3. Yes, the mean is taken along the temporal dimension (a quick check is sketched 
below).
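
For example (the path is a placeholder for the file that sits alongside it in
MNINonLinear/Results/<task>):

% Sketch: the *_mean file should simply be the temporal average of the
% per-frame values in Movement_RelativeRMS.txt (one value per volume, in mm).
rel = load('Movement_RelativeRMS.txt');
fprintf('mean relative RMS = %.6f mm\n', mean(rel));
% compare against the single number in Movement_RelativeRMS_mean.txt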

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Gengyan Zhao 

Date: Tuesday, September 18, 2018 at 5:47 PM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] questions about the motion metrics 
(Movement_RelativeRMS_mean.txt)


Hello HCP masters,



I have several questions about the motion metrics generated by the minimal 
preprocessing pipeline.

For the file Movement_RelativeRMS_mean.txt:

1) what's the unit of the number in it?

2) How was the number calculated? Does the RMS include only the 3 translations 
or all the 3 translations and the 3 rotations?

3) Is the mean taken along the temporal dimension?



Thank you very much in advance.



Best,

Gengyan

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] making a chart/graph out of myelin maps

2018-09-06 Thread Harms, Michael

Hi,
You can use -cifti-parcellate with, e.g., the Glasser parcellation.
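
For example, roughly (a sketch run from MATLAB; the file names are placeholders;
substitute each subject's MyelinMap dscalar and whatever .dlabel.nii parcellation
file you are using):

% Sketch: average the myelin map within each parcel, then read the parcel means.
myelin = 'Subject.MyelinMap_BC.32k_fs_LR.dscalar.nii';   % placeholder
dlabel = 'Glasser360.32k_fs_LR.dlabel.nii';              % placeholder parcellation
pout   = 'Subject.MyelinMap_BC.32k_fs_LR.pscalar.nii';

system(['wb_command -cifti-parcellate ' myelin ' ' dlabel ...
        ' COLUMN ' pout ' -method MEAN']);

cii = ciftiopen(pout, 'wb_command');     % ciftiopen.m from the HCP matlab tools
parcel_means = cii.cdata;                % one mean myelin value per parcel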

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Darko Komnenić 

Date: Thursday, September 6, 2018 at 9:50 AM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] making a chart/graph out of myelin maps

Dear HCP experts,
after obtaining myelin maps for the participants in my sample, I was wondering 
if there's an easy way to turn the visual information that we see when we 
overlay the maps on a surface, into numerical information? For instance, having 
the average myelin value per region for each participant, so that their values 
can be plotted and compared? Are there already output files that contain this 
information?
Thanks in advance!
Best,
Darko

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Question regarding intracranial volume

2018-08-30 Thread Harms, Michael

Hi,
As the link to the FS page that Jenn provided makes clear, the FS value for 
“ICV” is solely an estimate derived from the determinant of the talairach 
transform.  You cannot actually measure a true ICV using just a T1w image.

Also, be aware that we didn’t QC the accuracy of the talairach transform (since 
the surfaces can be completely fine even if that particular transform is off), 
and thus the values for ICV aren’t QC’ed.

If you simply are looking for a covariate to “control” for brain size, I’d use 
a different measure derived from the surfaces such as SupraTentorial or 
SupraTentorialNotVent (if you don’t want the ventricles included):
https://surfer.nmr.mgh.harvard.edu/fswiki/MorphometryStats

If you generate a scatterplot of ICV vs. SupraTentorial you’ll see a number of 
data points that fall relatively far from the regression line.  Those are 
probably subjects for which the talairach transform (and thus the ICV estimate) 
isn’t accurate.
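
A quick way to generate that scatterplot (a MATLAB sketch; the CSV name and the two
column names are assumptions; check the header of your own unrestricted-data download):

% Sketch: FreeSurfer eTIV vs. supratentorial volume from the unrestricted CSV.
T   = readtable('unrestricted_hcp_s1200.csv');   % placeholder file name
icv = T.FS_IntraCranial_Vol;                     % assumed column names
sup = T.FS_SupraTentorial_Vol;

ok = ~isnan(icv) & ~isnan(sup);
scatter(sup(ok), icv(ok), 10, 'filled'); hold on
p  = polyfit(sup(ok), icv(ok), 1);               % simple regression line
xs = linspace(min(sup(ok)), max(sup(ok)), 100);
plot(xs, polyval(p, xs), 'r-')
xlabel('SupraTentorial volume (mm^3)'); ylabel('Estimated ICV (mm^3)')
% points far from the line flag subjects whose talairach transform (and thus
% the ICV estimate) may be inaccurate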

cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of "Elam, Jennifer" 

Date: Thursday, August 30, 2018 at 10:40 AM
To: "D. van der Linden" , Benjamin Risk 

Cc: "hcp-users@humanconnectome.org" 
Subject: Re: [HCP-Users] Question regarding intracranial volume


Hi Dimitri,

The intracranial volume estimates we report are outputs from FreeSurfer which 
reports values in mm3. See http://surfer.nmr.mgh.harvard.edu/fswiki/eTIV



Others may want to comment on your finding that the average ICV in males in the 
S900 was 1700 vs. 1500 in other studies.



Best,

Jenn


Jennifer Elam, Ph.D.
Scientific Outreach, Human Connectome Project
Washington University School of Medicine
Department of Neuroscience, Box 8108
660 South Euclid Avenue
St. Louis, MO 63110
314-362-9387
e...@wustl.edu
www.humanconnectome.org


From: D. van der Linden 
Sent: Thursday, August 30, 2018 8:01:52 AM
To: Elam, Jennifer; Benjamin Risk
Cc: hcp-users@humanconnectome.org
Subject: Question regarding intracranial volume


Dear sir, madam,



I have a question regarding the HCP data.

In the data there are various measures of brain volume and also an estimate of 
intracranial volume.

It was not clear however whether those values represent voxels or mm3?



For example the average intracranial volume in the HCP 900 batch was around 
1700 for males, which seems higher than the values around 1500mm3 that are 
reported in several studies.



So, my question is:

Do the intracranial values represent mm3 or voxels, or something else?



Kind regards,



Dimitri van der Linden

Erasmus University Rotterdam

The netherlands



___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] QC

2018-08-30 Thread Harms, Michael

Hi,
Script to generate scenes for structural QC are available here:
https://github.com/Washington-University/StructuralQC

The joint lecture/practical on “Data Quality” for the HCP course, available 
here for the 2017 Course, would be useful for you to review as well:
https://store.humanconnectome.org/courses/2017/exploring-the-human-connectome.php

cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of "Stevens, Michael" 

Date: Wednesday, August 29, 2018 at 5:24 PM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] QC

Just wondering… The SOP manual for HCP data 
(https://www.humanconnectome.org/storage/app/media/documentation/s1200/HCP_S1200_Release_Appendix_IV.pdf)
 refers to a bash script for the FreeSurfer output that produces handy scenes 
that assist QC.  For instance, “gen_scenes_two-structurals_Q#.sh”.

Is this or any other helpful QC script available to the research community to 
facilitate HCP protocol use on our own studies?

Thanks,
Mike


Michael C. Stevens, Ph.D.
Director, CNDLAB, Olin Neuropsychiatry Research Center
Director, Child & Adolescent Research, The Institute of Living
Adjunct Professor of Psychiatry, Yale University School of Medicine


This e-mail message, including any attachments, is for the sole use of the 
intended recipient(s) and may contain confidential and privileged information. 
Any unauthorized review, use, disclosure, or distribution is prohibited. If you 
are not the intended recipient, or an employee or agent responsible for 
delivering the message to the intended recipient, please contact the sender by 
reply e-mail and destroy all copies of the original message, including any 
attachments.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] TOPUP Settings

2018-08-30 Thread Harms, Michael

FWIW, the readout distortion in the structurals is such a small effect that it is 
difficult to confirm the appropriate polarities for that correction by trial and 
error.

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of Keith Jamison 

Date: Thursday, August 30, 2018 at 8:51 AM
To: "Glasser, Matthew" 
Cc: "hcp-users@humanconnectome.org" , "Stevens, 
Michael" 
Subject: Re: [HCP-Users] TOPUP Settings

Note: The --UnwarpDir in PreFreeSurfer is the readout direction of the T1w and 
T2w images themselves, which in this case is completely unrelated to the plane 
in which the SE fieldmaps read out (which is set in --SEUnwarpDir). fMRI and 
DWI are 2D and their unwarpdir are more obvious. For the 3D T1w and T2w 
sequences, the readout direction in which most distortion occurs is trickier to 
figure out, which is why Matt suggests trial and error.  For HCP acquisitions 
it's always been "z".

-Keith

On Wed, Aug 29, 2018 at 7:02 PM, Glasser, Matthew 
mailto:glass...@wustl.edu>> wrote:
Hi Mike,

As far as I know the only sure way is with trial and error, but perhaps the 
latest dcm2niix has eliminated that.

Matt.

From: 
mailto:hcp-users-boun...@humanconnectome.org>>
 on behalf of "Stevens, Michael" 
mailto:michael.stev...@hhchealth.org>>
Date: Wednesday, August 29, 2018 at 5:54 PM
To: "hcp-users@humanconnectome.org" 
mailto:hcp-users@humanconnectome.org>>
Subject: [HCP-Users] TOPUP Settings

Also, I’d like to confirm some settings for the HCP scripts, as I’m helping 
some colleagues set up a new protocol that we plan to use HCP pipeline 
processing on.

For spin echo-based unwarping, if HCP’s standard SE field map sequences are 
used and are acquired A>>P first, P>>A second, should the proper setting for 
UnwarpDir in the PreFreeSurfer script unwarping of the T1w image be ‘y’ or 
‘y-‘?  And further… Shouldn’t the same UnwarpDir setting be used for fMRI 
unwarping?

I set my own HCP protocols up a couple of years back and I’m straining my 
memory to recall exactly how the HCP scripts parse all this and create the 
commands that call topup.  I’m hoping to avoid some painstaking trial-and-error 
by asking.

Thanks,
Mike


This e-mail message, including any attachments, is for the sole use of the 
intended recipient(s) and may contain confidential and privileged information. 
Any unauthorized review, use, disclosure, or distribution is prohibited. If you 
are not the intended recipient, or an employee or agent responsible for 
delivering the message to the intended recipient, please contact the sender by 
reply e-mail and destroy all copies of the original message, including any 
attachments.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users




Re: [HCP-Users] TOPUP Settings

2018-08-30 Thread Harms, Michael

Hi,
Note that UnwarpDir in the PreFreeSurfer script is a very different thing from 
UnwarpDir in the fMRIVolume script.  In the former, it specifies the *readout* 
direction of the *structurals* -- typically ‘z’ for how we collect T1 and T2 
structurals on Siemens scanners (assuming that the NIFTI you are using as input 
is either “LAS” or “RAS” oriented!!).  In the latter, it is the *phase 
encoding* direction of the *BOLD* scan.

In the PreFreeSurfer script, the phase encoding direction of the ancillary SE 
field map scans is specified by the SEUnwarpDir variable.  If those are AP/PA, 
we use SEUnwarpDir = y, and enter the SpinEchoFieldMap_PA scan as the input for 
the SpinEchoPhaseEncodePositive variable, and the SpinEchoFieldMap_AP scan as 
the input for the SpinEchoPhaseEncodeNegative variable.

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu

From:  on behalf of "Glasser, Matthew" 

Date: Wednesday, August 29, 2018 at 6:03 PM
To: "Stevens, Michael" , 
"hcp-users@humanconnectome.org" 
Subject: Re: [HCP-Users] TOPUP Settings

Hi Mike,

As far as I know the only sure way is with trial and error, but perhaps the 
latest dcm2niix has eliminated that.

Matt.

From: 
mailto:hcp-users-boun...@humanconnectome.org>>
 on behalf of "Stevens, Michael" 
mailto:michael.stev...@hhchealth.org>>
Date: Wednesday, August 29, 2018 at 5:54 PM
To: "hcp-users@humanconnectome.org" 
mailto:hcp-users@humanconnectome.org>>
Subject: [HCP-Users] TOPUP Settings

Also, I’d like to confirm some settings for the HCP scripts, as I’m helping 
some colleagues set up a new protocol that we plan to use HCP pipeline 
processing on.

For spin echo-based unwarping, if HCP’s standard SE field map sequences are 
used and are acquired A>>P first, P>>A second, should the proper setting for 
UnwarpDir in the PreFreeSurfer script unwarping of the T1w image be ‘y’ or 
‘y-‘?  And further… Shouldn’t the same UnwarpDir setting be used for fMRI 
unwarping?

I set my own HCP protocols up a couple of years back and I’m straining my 
memory to recall exactly how the HCP scripts parse all this and create the 
commands that call topup.  I’m hoping to avoid some painstaking trial-and-error 
by asking.

Thanks,
Mike


This e-mail message, including any attachments, is for the sole use of the 
intended recipient(s) and may contain confidential and privileged information. 
Any unauthorized review, use, disclosure, or distribution is prohibited. If you 
are not the intended recipient, or an employee or agent responsible for 
delivering the message to the intended recipient, please contact the sender by 
reply e-mail and destroy all copies of the original message, including any 
attachments.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users




Re: [HCP-Users] help on Simple higher level FEAT analysis

2018-08-20 Thread Harms, Michael

Hi,
FSL/FEAT are not currently equipped to handle grayordinate data.

We have examples of a “third-level” (across subject) analyses as part of the 
Practicals in the HCP Course.  The content for the 2018 course (in June) will 
hopefully become available online in the near future.  Until then, the material 
from the 2017 course is available, which covered the same basic content in this 
regard:
https://store.humanconnectome.org/courses/2017/exploring-the-human-connectome.php
(see the “tfMRI and PALM” tutorial).

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu
From:  on behalf of "Linnman, 
Clas,Ph.D." 
Date: Monday, August 20, 2018 at 11:00 AM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] help on Simple higher level FEAT analysis


Hi,

I am trying to do a simple higher level FEAT analysis on some task fMRI data.

I would like to correlate a variable to the motor responses.

From 
https://wiki.humanconnectome.org/display/PublicData/Advice+for+FEAT+Analysis+of+HCP+task+fMRI+data

I tried the following
Higher-level Analysis
FEAT GUI, choose that "Inputs are lower-level FEAT directories". Under "Select 
FEAT directories", for each participant in your group analysis, enter the paths 
to the .feat directories for the specific lower-level cope of interest (e.g., 
/myDATA/100307/MNINonLinear/Results/tfMRI_MOTOR/tfMRI_MOTOR_hp200_s2_level2.feat/GrayordinatesStats/cope13.feat
FSL then tells me it can not find the design.fsl file in the directory
But if I chose the above directory as the input 
(/myDATA/100307/MNINonLinear/Results/tfMRI_MOTOR/tfMRI_MOTOR_hp200_s2_level2.feat)
FSL tells me the directory contains no stats/cope image

Is there an example on how to do this?

Thanks
Clas



The information in this e-mail is intended only for the person to whom it is
addressed. If you believe this e-mail was sent to you in error and the e-mail
contains patient information, please contact the Partners Compliance HelpLine at
http://www.partners.org/complianceline . If the e-mail was sent to you in error
but does not contain patient information, please contact the sender and properly
dispose of the e-mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] DICOM to NIFTI in HCP standard

2018-08-16 Thread Harms, Michael

Hi,
Sorry, but not to my knowledge.  We use ‘dcm2niix’ currently for DICOM to NIFTI 
conversion (‘dcm2niix’ generates nice .json files containing a bunch of 
relevant parameters of the scan).  That particular step is pretty 
straightforward.
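
For that conversion step, something like the following is typical (a sketch; the
directory names are placeholders, and the flags are worth double-checking against
dcm2niix -h for your version):

% Sketch: convert one session's DICOMs with dcm2niix (a command-line tool;
% wrapped in system() here only to keep the examples in MATLAB).
% -b y writes the .json sidecars, -z y gzips, -f sets the output name pattern.
dicom_dir = '/path/to/session/DICOM';   % placeholder
out_dir   = '/path/to/session/nifti';   % placeholder
system(sprintf('dcm2niix -b y -z y -f %%p_%%s -o %s %s', out_dir, dicom_dir));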

Then, as part of our “session building” process, we have code that does various 
checks and then “bundles” the ancillary files (e.g., SpinEchoFieldMaps) with 
their appropriate associated “main scan” into our “unproc” directory.  That 
session-building “magic” is highly customized, and is not code that is 
currently applicable to generalized use.

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu
From:  on behalf of Shachar Gal 

Date: Thursday, August 16, 2018 at 12:27 PM
To: "hcp-users@humanconnectome.org" 
Subject: [HCP-Users] DICOM to NIFTI in HCP standard

Hello,
Upon starting to analyse our data, which was acquired with an HCP-like 
protocol, with the HCP pipelines, we were wondering if there is an available 
script or tool that converts DICOM to NIFTI in the HCP dir structure.

Hope to hear from you.

Shachar Gal,
Tel Aviv University

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Diffusion Times (Delta and delta) for HCP 1200 release?

2018-08-13 Thread Harms, Michael

Via Essa Yacoub:

from Stam's paper -
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3720790/


Sampling in q-space includes 3 shells at b=1000, 2000 and 3000 s/mm2 (diffusion 
times are Δ=43.1 ms and δ=10.6 ms, and Gmax=97.4 mT/m after vendor-supplied 
gradient duty cycle optimization).


--
Michael Harms, Ph.D.

---

Associate Professor of Psychiatry

Washington University School of Medicine

Department of Psychiatry, Box 8134

660 South Euclid Ave.Tel: 314-747-6173

St. Louis, MO  63110  Email: mha...@wustl.edu
On 8/13/18, 12:57 PM, "hcp-users-boun...@humanconnectome.org on behalf of 
Nicola Toschi"  wrote:

Dear List,

I was looking for the data mentioned in the subject line (specifics about
the monopolar sequence used in diffusion scanning, i.e. the diffusion
times bigdelta and smalldelta).

I could not find them in the manual or appendix (forgive me if this is
an oversight). Could someone please point me to them?

Thank you very much in advance,

nicola


___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users



The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Finding a significant difference only in one hemisphere

2018-08-08 Thread Harms, Michael

Hi,
Viewing on the inflated surface would probably be more helpful.

Also, how many subjects per group?

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu
From: Darko Komnenić 
Date: Wednesday, August 8, 2018 at 2:08 PM
To: "Glasser, Matthew" 
Cc: "Harms, Michael" , "Winkler, Anderson (NIH/NIMH) [E]" 
, "hcp-users@humanconnectome.org" 

Subject: Re: [HCP-Users] Finding a significant difference only in one hemisphere

Thanks a lot for the clarifications everyone!
The first two images attached are average myelin maps for controls and 
patients, respectively.
The third image is what I hope to be the effect size map. I added the -saveglm 
flag to palm, and then used command -cifti-create-dense-from-template to merge 
the dpv_cohen files for left and right hemisphere. Let me know if this does not 
work.
Thanks in advance for any comments!
Best,
Darko




On Wed, Aug 8, 2018 at 2:22 AM, Glasser, Matthew 
mailto:glass...@wustl.edu>> wrote:
What is an effect size map?  An effect size map, in the case of a t-test of a 
difference between two groups, is simply the difference in the means.  A 
standardized effect size map is Cohen’s d, which is helpful if you want to 
compare effect sizes of different measures that are not on the same scale.  
Looking at both the difference between means and the means themselves could be 
helpful in tracking down artifacts.  Looking at maps of statistical 
significance is not helpful, despite what you typically see done in the 
neuroimaging literature.

Matt.

From: 
mailto:hcp-users-boun...@humanconnectome.org>>
 on behalf of "Harms, Michael" mailto:mha...@wustl.edu>>
Date: Tuesday, August 7, 2018 at 3:47 PM
To: "Winkler, Anderson (NIH/NIMH) [E]" 
mailto:anderson.wink...@nih.gov>>, Darko Komnenić 
mailto:komnen...@gmail.com>>

Cc: "hcp-users@humanconnectome.org<mailto:hcp-users@humanconnectome.org>" 
mailto:hcp-users@humanconnectome.org>>
Subject: Re: [HCP-Users] Finding a significant difference only in one hemisphere


A Cohen’s d map of the group difference might be helpful, but even more basic 
would be to just merge the individual myelin maps (-cifti-merge, or use 
wb_shortcuts -cifti-concatenate) and then average them (-cifti-average).

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: 
mha...@wustl.edu<mailto:mha...@wustl.edu>
From: "Winkler, Anderson (NIH/NIMH) [E]" 
mailto:anderson.wink...@nih.gov>>
Date: Tuesday, August 7, 2018 at 3:02 PM
To: Darko Komnenić mailto:komnen...@gmail.com>>, "Harms, 
Michael" mailto:mha...@wustl.edu>>
Cc: "hcp-users@humanconnectome.org<mailto:hcp-users@humanconnectome.org>" 
mailto:hcp-users@humanconnectome.org>>
Subject: Re: [HCP-Users] Finding a significant difference only in one hemisphere

Hi Darko,

The option “-saveglm” in PALM should produce Cohen’s d maps.

All the best,

Anderson


From: Darko Komnenić mailto:komnen...@gmail.com>>
Date: Tuesday, August 7, 2018 at 15:25
To: "Harms, Michael" mailto:mha...@wustl.edu>>
Cc: "hcp-users@humanconnectome.org<mailto:hcp-users@humanconnectome.org>" 
mailto:hcp-users@humanconnectome.org>>
Subject: Re: [HCP-Users] Finding a significant difference only in one hemisphere

Hi Michael,
I don't seem to have an average myelin map as an output of the analysis. Is it 
just something I can make by merging and averaging individual myelin maps in 
workbench, or should it have been the output of the GLM analysis? Sorry for 
this most likely really basic question.
Thanks in advance!
Best,
Darko

On Tue, Aug 7, 2018 at 3:53 PM, Harms, Michael 
mailto:mha...@wustl.edu>> wrote:

It might be helpful to simply see the average myelin map for each group.  Do 
those look appropriate?

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: 
mha...@wustl.edu<mailto:mha...@wustl.edu>
From: 
mailto:hcp-users-boun...@humanconnectome.or

Re: [HCP-Users] Finding a significant difference only in one hemisphere

2018-08-07 Thread Harms, Michael

A Cohen’s d map of the group difference might be helpful, but even more basic 
would be to just merge the individual myelin maps (-cifti-merge, or use 
wb_shortcuts -cifti-concatenate) and then average them (-cifti-average).
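
For example, averaging the individual maps directly (a sketch; the subject file
names are placeholders):

% Sketch: average per-subject myelin maps into one group dscalar.
files = {'sub01.MyelinMap_BC.32k_fs_LR.dscalar.nii', ...
         'sub02.MyelinMap_BC.32k_fs_LR.dscalar.nii'};   % ...one entry per subject
args  = sprintf(' -cifti %s', files{:});
system(['wb_command -cifti-average group_mean_MyelinMap.dscalar.nii' args]);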

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu
From: "Winkler, Anderson (NIH/NIMH) [E]" 
Date: Tuesday, August 7, 2018 at 3:02 PM
To: Darko Komnenić , "Harms, Michael" 
Cc: "hcp-users@humanconnectome.org" 
Subject: Re: [HCP-Users] Finding a significant difference only in one hemisphere

Hi Darko,

The option “-saveglm” in PALM should produce Cohen’s d maps.

All the best,

Anderson


From: Darko Komnenić 
Date: Tuesday, August 7, 2018 at 15:25
To: "Harms, Michael" 
Cc: "hcp-users@humanconnectome.org" 
Subject: Re: [HCP-Users] Finding a significant difference only in one hemisphere

Hi Michael,
I don't seem to have an average myelin map as an output of the analysis. Is it 
just something I can make by merging and averaging individual myelin maps in 
workbench, or should it have been the output of the GLM analysis? Sorry for 
this most likely really basic question.
Thanks in advance!
Best,
Darko

On Tue, Aug 7, 2018 at 3:53 PM, Harms, Michael <mha...@wustl.edu> wrote:

It might be helpful to simply see the average myelin map for each group.  Do 
those look appropriate?

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu
From: <hcp-users-boun...@humanconnectome.org> on behalf of Darko Komnenić <komnen...@gmail.com>
Date: Tuesday, August 7, 2018 at 6:50 AM
To: "Glasser, Matthew" <glass...@wustl.edu>
Cc: "hcp-users@humanconnectome.org" <hcp-users@humanconnectome.org>
Subject: Re: [HCP-Users] Finding a significant difference only in one hemisphere

Hi Matt,
I'm not sure what you mean by a simple effect size map. The file overlaid on 
the surfaces here is a file showing the difference between control and patient 
groups, after correction for multiple comparisons. Do I use that to make an 
effect size map, or something else?
Best,
Darko

On Mon, Aug 6, 2018 at 9:03 PM, Glasser, Matthew <glass...@wustl.edu> wrote:
Can you just make a simple effect size map with + and – and no thresholding?

Matt.
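
A minimal sketch of one way to build such an unthresholded difference map with wb_command, assuming a group-average myelin map has already been made for each group; the file names here are hypothetical:

  # signed difference between the two group averages (positive = patients > controls)
  wb_command -cifti-math 'patients - controls' PatientsMinusControls_MyelinMap.dscalar.nii \
      -var patients Patients_MyelinMap_avg.dscalar.nii \
      -var controls Controls_MyelinMap_avg.dscalar.nii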

From: Darko Komnenić <komnen...@gmail.com>
Date: Monday, August 6, 2018 at 12:09 PM
To: Matt Glasser <glass...@wustl.edu>
Cc: "hcp-users@humanconnectome.org" <hcp-users@humanconnectome.org>
Subject: Re: [HCP-Users] Finding a significant difference only in one hemisphere

Hi Matt,
here's a screenshot of an unthresholded map.
Thanks in advance!
Best,
Darko

On Fri, Aug 3, 2018 at 10:12 PM, Glasser, Matthew <glass...@wustl.edu> wrote:
Thresholded maps of statistical significance are essentially uninterpretable as 
to the existence of artifacts.  Please provide an unthresholded effect size map.

Peace,

Matt.

From: Darko Komnenić <komnen...@gmail.com>
Date: Friday, August 3, 2018 at 12:12 PM
To: Matt Glasser <glass...@wustl.edu>
Cc: "hcp-users@humanconnectome.org" <hcp-users@humanconnectome.org>
Subject: Re: [HCP-Users] Finding a significant difference only in one hemisphere

Hi Matt,
here's a screenshot
Thanks in advance!
Best,
Darko

On Thu, Aug 2, 2018 at 10:01 PM, Glasser, Matthew <glass...@wustl.edu> wrote:
How about posting some pics.

Matt.

From: <hcp-users-boun...@humanconnectome.org> on behalf of Darko Komnenić <komnen...@gmail.com>
Date: Thursday, August 2, 2018 at 9:16 AM
To: "hcp-users@humanconnectome.org" <hcp-users@humanconnectome.org>
Subject: [HCP-Users] Finding a significant difference only in one hemisphere

Dear HCP experts,
I ran an analysis comparing cortical myelination between patients and controls 
and found a significant difference between groups, but only in the left 
hemisphere. What's making me doubt my results is that the differences are 
observed in large areas of the left hemisphere, but absolutel

Re: [HCP-Users] HCPpipelines Task Analysis Level 2 design error

2018-07-27 Thread Harms, Michael

Yes, if you have 4 runs in your level 2, then you probably need to start with 
fresh templates derived from a 4 run volume analysis.

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu
From:  on behalf of "Glasser, Matthew" 

Date: Friday, July 27, 2018 at 4:02 PM
To: Mathias Goncalves , "HCP-Users@humanconnectome.org" 

Subject: Re: [HCP-Users] HCPpipelines Task Analysis Level 2 design error

I think this is tricky because the fsf format is somewhat opaque.  What we did 
to make the original FSFs for the HCP was to run the volume analyses in FEAT 
for one subject and then modify the FSFs (minimally) for CIFTI.

Matt.

From: <hcp-users-boun...@humanconnectome.org> on behalf of Mathias Goncalves <mathi...@mit.edu>
Date: Friday, July 27, 2018 at 1:33 PM
To: "HCP-Users@humanconnectome.org" <HCP-Users@humanconnectome.org>
Subject: [HCP-Users] HCPpipelines Task Analysis Level 2 design error

Hi all,

I've been using the HCPpipelines scripts for preprocessing / task-analysis, but 
am having trouble with the level2 task analysis. I'm running the current latest 
(v3.27.0), and have modeled my files similar to the FSF templates on the repo.

I've attached a log of my output, as well as a level 2 design fsf from a 
particular task (made up of 4 runs). Any suggestions would be appreciated, 
thank you.

Cheers,
Mathias

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users



___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users



___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Missing output after running genericfMRISurfaceProcessingPipelineBatch.sh

2018-07-24 Thread Harms, Michael

Are you looking in $SUBJID/MNINonLinear/Results/$fMRIName ?
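
For example (the study folder, subject ID, and run name below are placeholders for your own values):

  ls ${StudyFolder}/${Subject}/MNINonLinear/Results/${fMRIName}/
  # the surface pipeline's main output there is typically ${fMRIName}_Atlas.dtseries.nii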


--
Michael Harms, Ph.D.

---

Associate Professor of Psychiatry

Washington University School of Medicine

Department of Psychiatry, Box 8134

660 South Euclid Ave.Tel: 314-747-6173

St. Louis, MO  63110  Email: mha...@wustl.edu
On 7/24/18, 1:53 PM, "hcp-users-boun...@humanconnectome.org on behalf of Marc 
Dubin"  wrote:

Hello,

When I run, genericfMRIVolumeProcessingPipelineBatch.sh, everything runs
smoothly and many output files are generated in the specified subject
folder.

However, when I run genericfMRISurfaceProcessingPipelineBatch.sh, it
runs for a few hours, ends gracefully, but then I can’t find any new
output files that have been generated.

In both cases, I am analyzing the same resting state data and the
TaskList is identical for both volume and surface scripts (including
just the resting state AP and PA scans).

Am I looking in the wrong place?

Thanks!

Marc

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users




___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] HCP pipeline working directory documentation

2018-07-24 Thread Harms, Michael

Hi,
I took a very quick look to see what direction this is heading.

One initial thing I noticed -- the files in $SUBJID/T1w are not in MNI152 space.

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu
From:  on behalf of "Glasser, Matthew" 

Date: Tuesday, July 24, 2018 at 10:23 AM
To: "Theis, Nicholas" , "hcp-users@humanconnectome.org" 

Subject: Re: [HCP-Users] HCP pipeline working directory documentation

I am not aware of such a document.  I am happy to review it when you are ready 
for accuracy.

Matt.

From: <hcp-users-boun...@humanconnectome.org> on behalf of "Theis, Nicholas" <the...@upmc.edu>
Date: Tuesday, July 24, 2018 at 9:54 AM
To: "hcp-users@humanconnectome.org" <hcp-users@humanconnectome.org>
Subject: [HCP-Users] HCP pipeline working directory documentation


Dear HCP creators and users-



To better understand the HCP pipelines we have been documenting its output 
files.  We want to identify the specific files produced by each stage of the 
pipeline (PreFreeSurfer, FreeSurfer, etc), and label what MRI preprocessing 
step the files represent (skull stripping, readout distortion correction, etc).



To your knowledge, has anyone already created a similar document?  We are 
hoping to make this list as complete - and helpful - as possible, although it 
is still a work in progress, and some aspects of it reflect our particular 
set-up.  Feel free to view the document in the link below.



https://docs.google.com/spreadsheets/d/16mKW14FvQr_Im57eVvhePeT-yuRl7IiaOfP7eDEfppE/edit?usp=sharing



If this is something the wider community would find useful then we are happy to 
share it by other means, especially as it is completed.  Any suggestions or 
corrections are welcomed.



Best,

Nick Theis







___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users



___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users



___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Generating Field Maps for PreFreeSurferPipelineBatch.sh input

2018-07-22 Thread Harms, Michael

Use the current version of ‘dcm2niix’.  The one you are using is quite old 
given the pace of changes to the converter.
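
For reference, a minimal conversion call with a current dcm2niix (the output and input directories are placeholders):

  # -b y writes the BIDS .json sidecar, -z y gzips the NIfTI output
  dcm2niix -b y -z y -o /path/to/nifti_output /path/to/dicom_input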

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu
From: Marc Dubin 
Date: Saturday, July 21, 2018 at 11:13 PM
To: "Harms, Michael" 
Cc: "Glasser, Matthew" , "hcp-users@humanconnectome.org" 

Subject: [HCP-Users] Generating Field Maps for PreFreeSurferPipelineBatch.sh 
input


Thanks…..for some reason DwellTime isn’t listed in my json files (attached).

On 21 Jul 2018, at 23:59, Harms, Michael wrote:
Hi,
Use a DICOM reader to check the value of (0019,1018).  Or, IIRC, the value is 
extracted as “DwellTime” in the BIDS json returned by ‘dcm2niix’.
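
For example, once the series has been converted with a current dcm2niix, something like the following (the sidecar file name is a placeholder):

  # print the dwell time recorded in the BIDS sidecar
  grep '"DwellTime"' T1w_MPR1.json
  # or, if jq is installed:
  jq '.DwellTime' T1w_MPR1.json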

Cheers,
-MH


--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu
From: Marc Dubin 
Date: Saturday, July 21, 2018 at 10:28 PM
To: "Harms, Michael" 
Cc: "Glasser, Matthew" , "hcp-users@humanconnectome.org" 

Subject: [HCP-Users] Generating Field Maps for PreFreeSurferPipelineBatch.sh 
input


Hi Matt,

Do you know how to find the following? I was unable to find them in the 
dcm2niix json file or in the Matlab metadata structure. I just want to be sure 
my values are the same as the default values listed below.

Thanks!

Marc
DICOM field (0019,1018) in s or "NONE" if not used
T1wSampleSpacing="0.074"

# DICOM field (0019,1018) in s or "NONE" if not used
T2wSampleSpacing="0.021"

On 19 Jul 2018, at 22:55, Harms, Michael wrote:

As defined in the text comment, “ReconMatrixPE” is just the size (dimension) of 
your images in the PE direction.  You don’t need to attempt to “derive” it from 
fields in the DICOM – just check your converted NIFTI itself!
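
For instance, with FSL available (the file name is a placeholder, and which of dim1/dim2 is the phase-encode dimension depends on your acquisition):

  # list the image dimensions; ReconMatrixPE is the matrix size along the PE axis
  fslinfo SpinEchoFieldMap_AP.nii.gz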

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu
From: Marc Dubin 
Date: Thursday, July 19, 2018 at 9:46 PM
To: "Glasser, Matthew" 
Cc: "Harms, Michael" , "hcp-users@humanconnectome.org" 

Subject: [HCP-Users] Generating Field Maps for PreFreeSurferPipelineBatch.sh 
input


I used the Matlab DICOM viewer and got BWPPPE as a private field, but did not 
see a field for ReconMatrixPE (is the latter a field in the DICOM header or 
computed from fields?)

Thanks again!

On 19 Jul 2018, at 22:36, Glasser, Matthew wrote:
I would just pull the numbers directly from the indicated DICOM fields using a 
DICOM header viewer such as this one:

https://wiki.xnat.org/xnat-tools/dicombrowser

And then compute using the indicated formulas.  Unfortunately it seems one of 
the field names has been deleted and now it is confusing, but here is what it 
used to say:

https://github.com/Washington-University/HCPpipelines/commit/ca8c80ac7a615f8b8ebbd889b5f396bdcd788db8#diff-1bec2a8a1f62ce72a23680add2b08468

Matt.

From: Marc Dubin <mrd9...@med.cornell.edu>
Date: Thursday, July 19, 2018 at 9:24 PM
To: Matt Glasser <glass...@wustl.edu>
Cc: "Harms, Michael" <mha...@wustl.edu>, "hcp-users@humanconnectome.org" <hcp-users@humanconnectome.org>
Subject: [HCP-Users] Generating Field Maps for PreFreeSurferPipelineBatch.sh 
input


Thank you, Matthew!

Is the Size of the reconstructed image in the PE (phase encoding) direction the 
same as the number of phase encoding steps (109 in the case of my scans)?

Marc

On 18 Jul 2018, at 9:11, Glasser, Matthew wrote:
You need to follow the instru

Re: [HCP-Users] Generating Field Maps for PreFreeSurferPipelineBatch.sh input

2018-07-21 Thread Harms, Michael
Hi,
Use a DICOM reader to check the value of (0019,1018).  Or, IIRC, the value is 
extracted as “DwellTime” in the BIDS json returned by ‘dcm2niix’.

Cheers,
-MH


--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu
From: Marc Dubin 
Date: Saturday, July 21, 2018 at 10:28 PM
To: "Harms, Michael" 
Cc: "Glasser, Matthew" , "hcp-users@humanconnectome.org" 

Subject: [HCP-Users] Generating Field Maps for PreFreeSurferPipelineBatch.sh 
input


Hi Matt,

Do you know how to find the following? I was unable to find them in the 
dcm2niix json file or in the Matlab metadata structure. I just want to be sure 
my values are the same as the default values listed below.

Thanks!

Marc
DICOM field (0019,1018) in s or "NONE" if not used
T1wSampleSpacing="0.074"

# DICOM field (0019,1018) in s or "NONE" if not used
T2wSampleSpacing="0.021"

On 19 Jul 2018, at 22:55, Harms, Michael wrote:

As defined in the text comment, “ReconMatrixPE” is just the size (dimension) of 
your images in the PE direction.  You don’t need to attempt to “derive” it from 
fields in the DICOM – just check your converted NIFTI itself!

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu
From: Marc Dubin 
Date: Thursday, July 19, 2018 at 9:46 PM
To: "Glasser, Matthew" 
Cc: "Harms, Michael" , "hcp-users@humanconnectome.org" 

Subject: [HCP-Users] Generating Field Maps for PreFreeSurferPipelineBatch.sh 
input


I used the Matlab DICOM viewer and got BWPPPE as a private field, but did not 
see a field for ReconMatrixPE (is the latter a field in the DICOM header or 
computed from fields?)

Thanks again!

On 19 Jul 2018, at 22:36, Glasser, Matthew wrote:
I would just pull the numbers directly from the indicated DICOM fields using a 
DICOM header viewer such as this one:

https://wiki.xnat.org/xnat-tools/dicombrowser

And then compute using the indicated formulas.  Unfortunately it seems one of 
the field names has been deleted and now it is confusing, but here is what it 
used to say:

https://github.com/Washington-University/HCPpipelines/commit/ca8c80ac7a615f8b8ebbd889b5f396bdcd788db8#diff-1bec2a8a1f62ce72a23680add2b08468

Matt.

From: Marc Dubin <mrd9...@med.cornell.edu>
Date: Thursday, July 19, 2018 at 9:24 PM
To: Matt Glasser <glass...@wustl.edu>
Cc: "Harms, Michael" <mha...@wustl.edu>, "hcp-users@humanconnectome.org" <hcp-users@humanconnectome.org>
Subject: [HCP-Users] Generating Field Maps for PreFreeSurferPipelineBatch.sh 
input


Thank you, Matthew!

Is the Size of the reconstructed image in the PE (phase encoding) direction the 
same as the number of phase encoding steps (109 in the case of my scans)?

Marc

On 18 Jul 2018, at 9:11, Glasser, Matthew wrote:
You need to follow the instructions listed to find those DICOM fields and put 
in the values.

b02b0.cnf is fine.

Peace,

Matt.

From: <hcp-users-boun...@humanconnectome.org> on behalf of Marc Dubin <mrd9...@med.cornell.edu>
Date: Tuesday, July 17, 2018 at 11:29 PM
To: "Harms, Michael" <mha...@wustl.edu>
Cc: "hcp-users@humanconnectome.org" <hcp-users@humanconnectome.org>
Subject: [HCP-Users] Generating Field Maps for PreFreeSurferPipelineBatch.sh 
input


Thank you, Michael!

Do you know for this section:

"Effective" Echo Spacing of Spin Echo Field Maps. Specified in seconds.

# Set to "NONE" if not used.

# SEEchoSpacing = 1/(BWPPPE * ReconMatrixPE)

#   where BWPPPE is the "Bandwidth

Re: [HCP-Users] Generating Field Maps for PreFreeSurferPipelineBatch.sh input

2018-07-19 Thread Harms, Michael

As defined in the text comment, “ReconMatrixPE” is just the size (dimension) of 
your images in the PE direction.  You don’t need to attempt to “derive” it from 
fields in the DICOM – just check your converted NIFTI itself!

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu
From: Marc Dubin 
Date: Thursday, July 19, 2018 at 9:46 PM
To: "Glasser, Matthew" 
Cc: "Harms, Michael" , "hcp-users@humanconnectome.org" 

Subject: [HCP-Users] Generating Field Maps for PreFreeSurferPipelineBatch.sh 
input


I used the Matlab DICOM viewer and got BWPPPE as a private field, but did not 
see a field for ReconMatrixPE (is the latter a field in the DICOM header or 
computed from fields?)

Thanks again!

On 19 Jul 2018, at 22:36, Glasser, Matthew wrote:
I would just pull the numbers directly from the indicated DICOM fields using a 
DICOM header viewer such as this one:

https://wiki.xnat.org/xnat-tools/dicombrowser

And then compute using the indicated formulas.  Unfortunately it seems one of 
the field names has been deleted and now it is confusing, but here is what it 
used to say:

https://github.com/Washington-University/HCPpipelines/commit/ca8c80ac7a615f8b8ebbd889b5f396bdcd788db8#diff-1bec2a8a1f62ce72a23680add2b08468

Matt.

From: Marc Dubin <mrd9...@med.cornell.edu>
Date: Thursday, July 19, 2018 at 9:24 PM
To: Matt Glasser <glass...@wustl.edu>
Cc: "Harms, Michael" <mha...@wustl.edu>, "hcp-users@humanconnectome.org" <hcp-users@humanconnectome.org>
Subject: [HCP-Users] Generating Field Maps for PreFreeSurferPipelineBatch.sh 
input


Thank you, Matthew!

Is the Size of the reconstructed image in the PE (phase encoding) direction the 
same as the number of phase encoding steps (109 in the case of my scans)?

Marc

On 18 Jul 2018, at 9:11, Glasser, Matthew wrote:
You need to follow the instructions listed to find those DICOM fields and put 
in the values.

b02b0.cnf is fine.

Peace,

Matt.

From: <hcp-users-boun...@humanconnectome.org> on behalf of Marc Dubin <mrd9...@med.cornell.edu>
Date: Tuesday, July 17, 2018 at 11:29 PM
To: "Harms, Michael" <mha...@wustl.edu>
Cc: "hcp-users@humanconnectome.org" <hcp-users@humanconnectome.org>
Subject: [HCP-Users] Generating Field Maps for PreFreeSurferPipelineBatch.sh 
input


Thank you, Michael!

Do you know for this section:

"Effective" Echo Spacing of Spin Echo Field Maps. Specified in seconds.

# Set to "NONE" if not used.

# SEEchoSpacing = 1/(BWPPPE * ReconMatrixPE)

#   where BWPPPE is the "BandwidthPerPixelPhaseEncode" = DICOM field 
(0019,1028) for Siemens, and

#   ReconMatrixPE = size of the reconstructed SEFM images in the PE 
dimension

# In-plane acceleration, phase oversampling, phase resolution, phase 
field-of-view, and interpolation

# all potentially need to be accounted for (which they are in Siemen's 
reported BWPPPE)

#

# Example value for when using Spin Echo Field Maps from the HCP-YA

#   0.000580002668012

SEEchoSpacing="NONE"

How would I know what the appropriate value of SEEchoSpacing would be for my 
data?

Also, is it safe to assume: TopUpConfig="${HCPPIPEDIR_Config}/b02b0.cnf”. ?

Thanks again!

On 18 Jul 2018, at 0:20, Harms, Michael wrote:

Hi,
If you are using SEFMs, you don't need to generate magnitude and phase field 
maps. It is handled internally. See the Batch script in the Examples directory.

Cheers,
-MH

--
Michael Harms, Ph.D.

---

Associate Professor of Psychiatry

Washington University School of Medicine

Department of Psychiatry, Box 8134

660 South Euclid Ave. Tel: 314-747-6173

St. Louis, MO 63110 Email: mha...@wustl.edu
On 7/17/18, 11:12 PM, 
"hcp-user

Re: [HCP-Users] Generating Field Maps for PreFreeSurferPipelineBatch.sh input

2018-07-19 Thread Harms, Michael

Use the text in the current master as your guide (not the older guidance) 
because the current guidance has been specifically constructed to be accurate 
in all known situations with Siemens data, include partial fourier, in-plane 
acceleration, phase oversampling, interpolation, etc, etc.

If you are using a modern version of ‘dcm2niix’ for your DICOM to NIFTI 
conversion, the necessary values are all provided for you as part of the BIDS 
sidecar (.json) file.
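
As an illustration of pulling those values out of a dcm2niix sidecar and applying the 1/(BWPPPE * ReconMatrixPE) formula from the batch-script comments; the field names below are the ones recent dcm2niix versions write, and the file name is a placeholder:

  # the effective echo spacing is often written directly:
  jq '.EffectiveEchoSpacing' SpinEchoFieldMap_AP.json

  # or recompute it from the two ingredients named in the comments:
  BWPPPE=$(jq '.BandwidthPerPixelPhaseEncode' SpinEchoFieldMap_AP.json)
  RECONPE=$(jq '.ReconMatrixPE' SpinEchoFieldMap_AP.json)
  echo "$BWPPPE $RECONPE" | awk '{printf "%.12f\n", 1/($1*$2)}'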

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.Tel: 314-747-6173
St. Louis, MO  63110  Email: mha...@wustl.edu
From: "Glasser, Matthew" 
Date: Thursday, July 19, 2018 at 9:36 PM
To: Marc Dubin 
Cc: "Harms, Michael" , "hcp-users@humanconnectome.org" 

Subject: Re: [HCP-Users] Generating Field Maps for 
PreFreeSurferPipelineBatch.sh input

I would just pull the numbers directly from the indicated DICOM fields using a 
DICOM header viewer such as this one:

https://wiki.xnat.org/xnat-tools/dicombrowser

And then compute using the indicated formulas.  Unfortunately it seems one of 
the field names has been deleted and now it is confusing, but here is what it 
used to say:

https://github.com/Washington-University/HCPpipelines/commit/ca8c80ac7a615f8b8ebbd889b5f396bdcd788db8#diff-1bec2a8a1f62ce72a23680add2b08468

Matt.

From: Marc Dubin <mrd9...@med.cornell.edu>
Date: Thursday, July 19, 2018 at 9:24 PM
To: Matt Glasser <glass...@wustl.edu>
Cc: "Harms, Michael" <mha...@wustl.edu>, "hcp-users@humanconnectome.org" <hcp-users@humanconnectome.org>
Subject: [HCP-Users] Generating Field Maps for PreFreeSurferPipelineBatch.sh 
input


Thank you, Matthew!

Is the Size of the reconstructed image in the PE (phase encoding) direction the 
same as the number of phase encoding steps (109 in the case of my scans)?

Marc

On 18 Jul 2018, at 9:11, Glasser, Matthew wrote:
You need to follow the instructions listed to find those DICOM fields and put 
in the values.

b02b0.cnf is fine.

Peace,

Matt.

From: <hcp-users-boun...@humanconnectome.org> on behalf of Marc Dubin <mrd9...@med.cornell.edu>
Date: Tuesday, July 17, 2018 at 11:29 PM
To: "Harms, Michael" <mha...@wustl.edu>
Cc: "hcp-users@humanconnectome.org" <hcp-users@humanconnectome.org>
Subject: [HCP-Users] Generating Field Maps for PreFreeSurferPipelineBatch.sh 
input


Thank you, Michael!

Do you know for this section:

"Effective" Echo Spacing of Spin Echo Field Maps. Specified in seconds.

# Set to "NONE" if not used.

# SEEchoSpacing = 1/(BWPPPE * ReconMatrixPE)

#   where BWPPPE is the "BandwidthPerPixelPhaseEncode" = DICOM field 
(0019,1028) for Siemens, and

#   ReconMatrixPE = size of the reconstructed SEFM images in the PE 
dimension

# In-plane acceleration, phase oversampling, phase resolution, phase 
field-of-view, and interpolation

# all potentially need to be accounted for (which they are in Siemen's 
reported BWPPPE)

#

# Example value for when using Spin Echo Field Maps from the HCP-YA

#   0.000580002668012

SEEchoSpacing="NONE"

How would I know what the appropriate value of SEEchoSpacing would be for my 
data?

Also, is it safe to assume: TopUpConfig="${HCPPIPEDIR_Config}/b02b0.cnf”. ?

Thanks again!

On 18 Jul 2018, at 0:20, Harms, Michael wrote:

Hi,
If you are using SEFMs, you don't need to generate magnitude and phase field 
maps. It is handled internally. See the Batch script in the Examples directory.

Cheers,
-MH

--
Michael Harms, Ph.D.

---

Associate Professor of Psychiatry

Washington University School of Medicine

Department of Psychiatry, Box 8134

660 South Euclid Ave. Tel: 314-747-6173

St. Louis, MO 63110 Email: mha...@wustl.edu
On 7/17/18, 11:12 PM, 
"hcp-users-boun...@humanconnectome.org<mailto:hcp-users-boun...@humanconnectome.org>
 on behalf of Marc Dubin" 
mailto:hcp-users-boun...@humanconnectome.org>
 on behalf of mrd9...@med.cornell.edu<mailto:mrd9...@med.cornell.edu>> wrote:

Dear All,

I am new to the HCP MR pipelines and was wondering if anyone can provide
guidance on how to generate magnitude and phase field maps for input to
PreFreeSurferPipelineBatch.sh

The files that I currently have from the Siemens 3T scanner are:

SpinEchoFieldMap_PA
SpinEchoFieldMap_AP

Thanks for any suggestions!!

Marc Dubin


Marc Dubin, MD PhD
Assistant Professor of Psychiatry and Neuroscience
Department o

Re: [HCP-Users] Generating Field Maps for PreFreeSurferPipelineBatch.sh input

2018-07-18 Thread Harms, Michael

If you pull the most recent (master) version of PreFreeSurferPipelineBatch.sh 
from GitHub, you'll get one in which we recently improved the comments to 
hopefully make things clearer.
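
For example (assuming a plain git checkout rather than a release tarball):

  git clone https://github.com/Washington-University/HCPpipelines.git
  # or, in an existing checkout:
  cd HCPpipelines && git checkout master && git pull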

Cheers,
-MH

--
Michael Harms, Ph.D.

---

Associate Professor of Psychiatry

Washington University School of Medicine

Department of Psychiatry, Box 8134

660 South Euclid Ave.Tel: 314-747-6173

St. Louis, MO  63110  Email: mha...@wustl.edu
On 7/17/18, 11:48 PM, "Marc Dubin"  wrote:

Hi Michael,

Another quick question…

What would be appropriate values for:

DwellTime

SEUnwarpDir

TopupConfig

Thanks!

Marc


On 18 Jul 2018, at 0:20, Harms, Michael wrote:

> Hi,
> If you are using SEFMs, you don't need to generate magnitude and phase
> field maps.  It is handled internally.  See the Batch script in the
> Examples directory.
>
> Cheers,
> -MH
>
> --
> Michael Harms, Ph.D.
>
> ---
>
> Associate Professor of Psychiatry
>
> Washington University School of Medicine
>
> Department of Psychiatry, Box 8134
>
> 660 South Euclid Ave.Tel: 314-747-6173
>
> St. Louis, MO  63110  Email: mha...@wustl.edu
> On 7/17/18, 11:12 PM, "hcp-users-boun...@humanconnectome.org on
> behalf of Marc Dubin"  of mrd9...@med.cornell.edu> wrote:
>
> Dear All,
>
> I am new to the HCP MR pipelines and was wondering if anyone can
> provide
> guidance on how to generate magnitude and phase field maps for input
> to
> PreFreeSurferPipelineBatch.sh
>
> The files that I currently have from the Siemens 3T scanner are:
>
> SpinEchoFieldMap_PA
> SpinEchoFieldMap_AP
>
> Thanks for any suggestions!!
>
> Marc Dubin
>
>
> Marc Dubin, MD PhD
> Assistant Professor of Psychiatry and Neuroscience
> Department of Psychiatry and Brain and Mind Research Institute
>
> Weill Cornell Medicine | New York-Presbyterian
> Psychiatry-Box 140
> 525 East 68th Street
> New York, NY 10065
> T 212.746.5817
>
> mrd9...@med.cornell.edu
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>
>
> 




___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Generating Field Maps for PreFreeSurferPipelineBatch.sh input

2018-07-17 Thread Harms, Michael

Hi,
If you are using SEFMs, you don't need to generate magnitude and phase field 
maps.  It is handled internally.  See the Batch script in the Examples 
directory.
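
As a rough sketch of how that looks in the batch script -- the variable names below are recalled from the example script and the paths are placeholders, so verify them against Examples/Scripts/PreFreeSurferPipelineBatch.sh in your copy of the pipelines:

  MagnitudeInputName="NONE"   # no GRE magnitude field map needed with SEFMs
  PhaseInputName="NONE"       # no GRE phase field map needed with SEFMs
  SpinEchoPhaseEncodeNegative="${StudyFolder}/${Subject}/unprocessed/SpinEchoFieldMap_AP.nii.gz"
  SpinEchoPhaseEncodePositive="${StudyFolder}/${Subject}/unprocessed/SpinEchoFieldMap_PA.nii.gz"
  AvgrdcSTRING="TOPUP"        # tells the pipeline to run topup-based distortion correction
  TopupConfig="${HCPPIPEDIR_Config}/b02b0.cnf"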

Cheers,
-MH

--
Michael Harms, Ph.D.

---

Associate Professor of Psychiatry

Washington University School of Medicine

Department of Psychiatry, Box 8134

660 South Euclid Ave.Tel: 314-747-6173

St. Louis, MO  63110  Email: mha...@wustl.edu
On 7/17/18, 11:12 PM, "hcp-users-boun...@humanconnectome.org on behalf of Marc 
Dubin"  wrote:

Dear All,

I am new to the HCP MR pipelines and was wondering if anyone can provide
guidance on how to generate magnitude and phase field maps for input to
PreFreeSurferPipelineBatch.sh

The files that I currently have from the Siemens 3T scanner are:

SpinEchoFieldMap_PA
SpinEchoFieldMap_AP

Thanks for any suggestions!!

Marc Dubin


Marc Dubin, MD PhD
Assistant Professor of Psychiatry and Neuroscience
Department of Psychiatry and Brain and Mind Research Institute

Weill Cornell Medicine | New York-Presbyterian
Psychiatry-Box 140
525 East 68th Street
New York, NY 10065
T 212.746.5817

mrd9...@med.cornell.edu
___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users




___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users

