Re: [HCP-Users] HCPpipelines Task Analysis Level 2 design error

2018-07-27 Thread Harms, Michael
Yes, if you have 4 runs in your level 2, then you probably need to start with fresh templates derived from a 4-run volume analysis. Cheers, -MH -- Michael Harms, Ph.D. --- Associate Professor of Psychiatry Washington University School of
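For context, a 4-input higher-level FEAT design has to declare all four lower-level runs in the .fsf itself. The snippet below is only a rough sketch of how one might check a template against the standard FEAT keys; the template file name is a placeholder, not an actual HCPpipelines template.

    # Hedged sketch: confirm a level-2 template really declares 4 inputs.
    # "level2_template.fsf" is a placeholder name; the keys are standard FEAT .fsf keys.
    grep -E 'set fmri\((multiple|npts|inputtype)\)|set feat_files' level2_template.fsf
    # A 4-run design would be expected to show, roughly:
    #   set fmri(multiple) 4
    #   set fmri(npts) 4
    #   set feat_files(1) "<run 1 input>"  ...  set feat_files(4) "<run 4 input>"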

Re: [HCP-Users] HCPpipelines Task Analysis Level 2 design error

2018-07-27 Thread Glasser, Matthew
I think this is tricky because the fsf format is somewhat opaque. What we did to make the original FSFs for the HCP was to run the volume analyses in FEAT for one subject and then modify the FSFs (minimally) for CIFTI. Matt. From: <hcp-users-boun...@humanconnectome.org> on behalf of
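That workflow can be sketched roughly as follows; every file name is a placeholder, and the CIFTI edits themselves (inputs, outputs, grayordinate paths) still have to be made by hand before the HCPpipelines task scripts can use the template.

    # Rough sketch of the approach described above; all names are placeholders.
    # 1) Build and save the volume analysis for one subject in the Feat GUI, then run it:
    feat one_subject_volume.fsf
    # 2) Copy the working design and minimally edit it for the CIFTI analysis:
    cp one_subject_volume.fsf cifti_template.fsf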

Re: [HCP-Users] MSMAll

2018-07-27 Thread Glasser, Matthew
We have not found that to be necessary. sICA+FIX worked on the HCP task data using the HCP resting-state data training set. Matt. From: Timothy Hendrickson <hendr...@umn.edu> Date: Friday, July 27, 2018 at 3:15 PM To: Matt Glasser <glass...@wustl.edu> Cc:
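As an illustration of reusing the shipped training data rather than a study-specific set, a single-run FIX call with the HCP resting-state training file looks roughly like this; the Melodic directory, FIX install path, and threshold are placeholders, while HCP_hp2000.RData is one of the training files distributed with FIX.

    # Hedged example: classify and clean one run with FIX using the HCP training file.
    # The <run>.ica directory, install path, and threshold (10) are placeholders.
    /opt/fix/fix tfMRI_run1.ica /opt/fix/training_files/HCP_hp2000.RData 10 -m -h 2000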

Re: [HCP-Users] MSMAll

2018-07-27 Thread Timothy Hendrickson
I noticed the following on the FSL FIX FAQ page: "Can I use FIX to clean task fMRI data? - Yes, although you will probably need to create a study-specific training dataset." Can you speak to how multi-run ICA+FIX gets away without needing a study-specific training dataset? -Tim Timothy

Re: [HCP-Users] wb_command error

2018-07-27 Thread Glasser, Matthew
Hi Mohana, Is this the latest Connectome Workbench? Matt. From: <hcp-users-boun...@humanconnectome.org> on behalf of Mohana Ramaratnam <mohanakann...@gmail.com> Date: Friday, July 27, 2018 at 12:44 PM To: "HCP-Users@humanconnectome.org"
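A quick way to confirm which Workbench build is actually being run (output format varies by release):

    # Print the Connectome Workbench version and check which binary is on the PATH
    wb_command -version
    which wb_command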

Re: [HCP-Users] Movie tasks fMRI

2018-07-27 Thread Elam, Jennifer
Hi Michel, The Movie stimulus files used for 7T are available on the S1200 Project page in ConnectomeDB under "Task Resources". See more info on the 7T movie-watching experiment on pp. 59-61 of the S1200 Reference

Re: [HCP-Users] Accessing 7T data in the temporary s3 bucket

2018-07-27 Thread Elam, Jennifer
Hi John, Actually, the 7T HCP data is not available on AWS S3 yet due to changes in the AWS public access program that have necessitated a switchover to a new HCP AWS account and the creation of the temporary hcp-openaccess-temp S3 bucket. Once we have the hcp-openaccess S3 bucket back up under

[HCP-Users] Accessing 7T data in the temporary s3 bucket

2018-07-27 Thread Rodgers-Lee, John (NIH/NIMH) [C]
Hi, I’m having some issues downloading the 7T data from the temporary s3 bucket. I expect to be able to find 7T data for the following subjects (among others): 995174, 100610, and 878877. I cannot see any of it in the bucket, though; for example, when I list the results for each subject with: aws s3 ls
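For what it's worth, a listing against the temporary bucket would look roughly like the example below; it assumes the temporary bucket mirrors the HCP_1200/<subject>/ layout of the main bucket and that AWS credentials for the HCP account are configured (the profile name is a placeholder).

    # Hedged example: list one subject's prefix in the temporary bucket.
    # Layout and profile name are assumptions, not confirmed for hcp-openaccess-temp.
    aws s3 ls s3://hcp-openaccess-temp/HCP_1200/100610/ --profile hcp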

[HCP-Users] wb_command error

2018-07-27 Thread Mohana Ramaratnam
Does anyone know why this error would be thrown? /usr/local/workbench/bin_rh_linux64/../exe_rh_linux64/wb_command -metric-merge /opt/data/build/TestUmass/20180718_021040/P01_MR1/MNINonLinear/Native/P01_MR1.L.StrainR_FS.native.shape.gii -metric /opt/data/build/TestUmass/20180718_021040/P01
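For comparison, a well-formed -metric-merge call is just an output metric followed by one or more -metric inputs; the file names below are placeholders.

    # Minimal well-formed -metric-merge call; all inputs must come from the same
    # hemisphere/structure and have the same vertex count. File names are placeholders.
    wb_command -metric-merge merged.shape.gii \
        -metric first.shape.gii \
        -metric second.shape.gii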

Re: [HCP-Users] issues with palm tfce for HCP task func data

2018-07-27 Thread Winkler, Anderson (NIH/NIMH) [E]
PS: Note that this is about Inf, but you found NaNs. These NaNs are caused by the Infs, e.g., Inf-Inf = NaN, which is what happens in line 103 of palm_competitive, dd = diff(S(:,c)). From: "Winkler, Anderson (NIH/NIMH) [E]" Date: Friday, July 27, 2018 at 08:29 To: HCP Users Subject: Re:
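The mechanism is easy to reproduce outside PALM, for example from the shell with Octave (or MATLAB) on the path:

    # Demonstrates the Inf - Inf = NaN behaviour described above: diff() over a
    # column containing repeated Inf values yields NaN entries.
    octave --eval 'S = [1; Inf; Inf]; dd = diff(S)'
    # dd comes back as [Inf; NaN]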