> timeseries get very large when you have a lot of subjects and
> timepoints.
>
> Matt.
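>
> For a sense of scale (my own back-of-the-envelope arithmetic, not figures from this thread; the subject and timepoint counts are hypothetical), a dense connectome over the standard 91,282 grayordinates in float32 is around 33 GB, and concatenated timeseries grow linearly with subjects and timepoints:

```python
# Back-of-the-envelope memory estimates (illustrative values, not from the thread).
N_GRAYORDINATES = 91282   # standard HCP 2mm grayordinate count
BYTES_PER_FLOAT = 4       # float32

# A dense connectome (dconn) is an N x N correlation matrix.
dconn_bytes = N_GRAYORDINATES ** 2 * BYTES_PER_FLOAT
print(f"one dconn: {dconn_bytes / 1e9:.1f} GB")   # ~33.3 GB

# Concatenated timeseries: N grayordinates x (subjects * timepoints per subject).
subjects, timepoints = 100, 1200                  # hypothetical study size
dtseries_bytes = N_GRAYORDINATES * subjects * timepoints * BYTES_PER_FLOAT
print(f"concatenated timeseries: {dtseries_bytes / 1e9:.1f} GB")  # ~43.8 GB
```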
>
> From: Timothy Coalson
> Date: Wednesday, November 28, 2018 at 2:54 PM
> To: "Kenley, Jeanette"
> Cc: hcp-users, "Kaplan, Sydney"
From: Glasser, Matthew
Sent: Wednesday, November 28, 2018 6:47 PM
To: NEUROSCIENCE tim; Kenley, Jeanette
Cc: hcp-users; Kaplan, Sydney
Subject: Re: [HCP-Users] average dconn from individual dconns
To be more specific: in the HCP we use a technique called MIGP to make group
fMRI data and generate dense connectomes.
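The core idea of MIGP (MELODIC's Incremental Group-PCA) is to keep a running low-rank summary of the stacked group timeseries instead of holding every subject in memory at once. A rough numpy sketch of that idea (my illustration of the concept, not the HCP implementation; function and variable names are made up, and the toy dimensions are arbitrary):

```python
import numpy as np

def migp_sketch(subject_timeseries, k):
    """Incrementally build a rank-k temporal summary of stacked group data.

    subject_timeseries: iterable of (timepoints x grayordinates) arrays.
    Returns a (k x grayordinates) matrix whose rows approximate the span of
    the top-k principal components of the full concatenation.
    """
    W = None
    for ts in subject_timeseries:
        ts = ts - ts.mean(axis=0)            # demean each grayordinate per subject
        stacked = ts if W is None else np.vstack([W, ts])
        # Keep only the top-k right singular directions, scaled by singular values,
        # so memory stays bounded at k rows no matter how many subjects stream in.
        U, s, Vt = np.linalg.svd(stacked, full_matrices=False)
        W = s[:k, None] * Vt[:k]
    return W

rng = np.random.default_rng(0)
toy_subjects = [rng.standard_normal((50, 200)) for _ in range(5)]
W = migp_sketch(toy_subjects, k=20)
print(W.shape)  # (20, 200)
```

The point is that the working set is always (k + timepoints) rows rather than (subjects * timepoints) rows.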
From: Timothy Coalson
Date: Wednesday, November 28, 2018 at 2:54 PM
To: "Kenley, Jeanette"
Cc: hcp-users, "Kaplan, Sydney"
Subject: Re: [HCP-Users] average dconn from individual dconns
The HCP pipelines deliberately resample the subcortical data in such a way
that the subcortical voxels used in each subject are the same; this is how
we handle the problem you are having.
If you concatenate your timeseries across subjects before correlation, you
don't need to generate a dconn for each subject.
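The difference between the two routes can be seen in a small numpy sketch (my illustration with toy random data; element-wise averaging of per-subject correlation matrices is broadly similar to, but not numerically identical to, correlating the concatenated timeseries):

```python
import numpy as np

rng = np.random.default_rng(42)
# Toy data: 3 subjects, 100 timepoints each, 10 "grayordinates".
subjects = [rng.standard_normal((100, 10)) for _ in range(3)]

# Route 1: one correlation matrix per subject, then element-wise mean
# (what averaging individual dconns amounts to).
avg_of_corrs = np.mean([np.corrcoef(ts.T) for ts in subjects], axis=0)

# Route 2: concatenate timeseries across subjects, correlate once
# (no per-subject dconn is ever materialized).
corr_of_concat = np.corrcoef(np.vstack(subjects).T)

# Both are 10x10 with unit diagonals, but the off-diagonals differ in general.
print(avg_of_corrs.shape, corr_of_concat.shape)
print(np.allclose(avg_of_corrs, corr_of_concat))  # False for this toy data
```

Which of the two is appropriate depends on the analysis; the practical advantage of route 2 is simply that the large per-subject dconn files never need to exist.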
I am still new to the wb_command suite and am trying to understand how best to
use its tools.
I have created an individual cifti (dconn.nii) for each of my subjects in
32k fs_LR space and would like to make an average.
I would like to use:
wb_command -cifti-average output.dconn.nii -cifti subject1.dconn.nii