Re: [HCP-Users] Diffusion transforms

2016-07-12 Thread Glasser, Matthew
Why do you need these files? They are processing intermediates that aren't intended to be released because the released data already has this transformation applied to it. Peace, Matt.

Re: [HCP-Users] Extracting ROI data from HCP resting state data - 2400 data points instead of 1200 ?

2016-07-12 Thread Glasser, Matthew
I would recommend using the data with _hp2000_clean in the name. I am referring to taking the mean across time at each point in space and subtracting that from the data. Peace, Matt. From: David Hofmann > Date: Tuesday, July 12, 2016 at
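The demeaning described above (subtracting the temporal mean at each point in space) can be sketched with NumPy. This is an illustrative example, not HCP pipeline code: the array `ts` and its dimensions are hypothetical stand-ins for a run's dense timeseries (real HCP grayordinate data has 91,282 spatial points and 1,200 timepoints per run).

```python
import numpy as np

# Hypothetical stand-in for one run's timeseries, shaped
# (timepoints x spatial points); real data would be loaded from the
# *_hp2000_clean dense timeseries file instead.
rng = np.random.default_rng(0)
ts = rng.standard_normal((1200, 100))  # 1200 TRs, 100 example points

# Demean: subtract the mean across time at each point in space.
ts_demeaned = ts - ts.mean(axis=0, keepdims=True)

# Every spatial point now has (numerically) zero temporal mean.
print(np.abs(ts_demeaned.mean(axis=0)).max())
```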

Re: [HCP-Users] Extracting ROI data from HCP resting state data - 2400 data points instead of 1200 ?

2016-07-12 Thread Glasser, Matthew
Also, it appears you haven't cleaned the data or removed the mean image from it. Matt. From: > on behalf of Stephen Smith > Date: Tuesday, July 12, 2016 at

Re: [HCP-Users] Mapping volumetric maps on brain surfaces

2016-07-12 Thread Jean-François Cabana
Hi Matthew, Thanks for your answer. Why do you recommend not doing the upsampling? I am not familiar with volume to surface mapping, but the reason I was doing upsampling was because we will also look at the white matter microstructure, and that also may include doing tractography. As some

Re: [HCP-Users] Extracting ROI data from HCP resting state data - 2400 data points instead of 1200 ?

2016-07-12 Thread Stephen Smith
Hi - no, we do not (in general for resting-state) ever recommend temporal concatenation like this before further analyses - for the reason you're seeing here. For example, for the HCP released netmats, we take the 4 runs, one at a time, estimate the 4 (zstat) netmats, and average those. Cheers.
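The per-run approach described above can be sketched as follows. This is a simplified illustration, not the exact HCP/FSLNets pipeline: it uses a plain Fisher r-to-z transform rather than FSLNets' normalized z-stats, and the timeseries arrays are random stand-ins for real node timeseries.

```python
import numpy as np

# Illustrative dimensions: 4 runs, 1200 timepoints each, 15 nodes.
rng = np.random.default_rng(1)
n_nodes, n_tp, n_runs = 15, 1200, 4

z_netmats = []
for _ in range(n_runs):
    ts = rng.standard_normal((n_tp, n_nodes))  # one run: time x nodes
    r = np.corrcoef(ts, rowvar=False)          # node x node correlation
    np.fill_diagonal(r, 0.0)                   # drop self-correlations
    z_netmats.append(np.arctanh(r))            # Fisher r-to-z

# Average the per-run z netmats instead of concatenating runs in time.
z_mean = np.mean(z_netmats, axis=0)
print(z_mean.shape)
```

Averaging in z-space keeps each run's netmat estimated on a stationary segment, which is why it avoids the artifacts that temporal concatenation introduces.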