Hello HCP users,

I am new to HCP, so please excuse me if I'm asking something that should
be obvious. I've tried to find a solution but haven't been successful so
far.
I am trying to use the file
HCP_S900_820_rfMRI_MSMAll_groupPCA_d4500ROW_zcorr.dconn.nii, which to my
understanding contains the full dense functional connectivity (FC) matrix
averaged over 820 subjects. I want to feed this matrix into a MATLAB
function. Of course, this is problematic because of the memory
requirements. I have tried to read the file with ft_read_cifti from
FieldTrip, but MATLAB crashes after allocating about 112 GB of memory, so
clearly this is not the way.
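
For reference, this is roughly what I ran (the FieldTrip path is just a
placeholder):

  % Attempted read of the full dense connectome with FieldTrip.
  % This tries to hold the entire grayordinate-by-grayordinate matrix
  % in memory at once, which is what blows up.
  addpath('/path/to/fieldtrip');   % placeholder path
  ft_defaults;
  fname = 'HCP_S900_820_rfMRI_MSMAll_groupPCA_d4500ROW_zcorr.dconn.nii';
  cii   = ft_read_cifti(fname);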

I know I am supposed to be able to use Connectome Workbench (wb_command)
for this, but I haven't managed to make it work so far. Can someone point
me in the right direction? Is what I'm trying to do impossible?
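
In case it helps to know what I had in mind: my (possibly wrong) guess was
that wb_command -cifti-parcellate could shrink the dense matrix to a
parcellated one before I load it in MATLAB, along these lines (the atlas
dlabel file and intermediate file names are placeholders, and I may well
have the details wrong):

  % Untested sketch: parcellate the dense connectome along both dimensions
  % with an atlas .dlabel.nii, then load the much smaller parcellated
  % matrix into MATLAB.
  dconn  = 'HCP_S900_820_rfMRI_MSMAll_groupPCA_d4500ROW_zcorr.dconn.nii';
  dlabel = 'my_atlas.dlabel.nii';                      % placeholder atlas
  system(['wb_command -cifti-parcellate ' dconn ' ' dlabel ...
          ' COLUMN tmp_parcellated.pdconn.nii']);
  system(['wb_command -cifti-parcellate tmp_parcellated.pdconn.nii ' ...
          dlabel ' ROW result.pconn.nii']);
  pconn = ft_read_cifti('result.pconn.nii');           % fits in memory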

I am using a cluster that can handle up to 132 GB of memory per CPU
(RAM + swap) and runs Debian.

Thank you!!
Katharina

-- 
PhD candidate <http://www.cns.upf.edu/katharina>
Computational Neuroscience Group <http://www.cns.upf.edu/>
Center for Brain and Cognition <http://cbc.upf.edu/>
Universitat Pompeu Fabra <http://www.upf.edu/>
Barcelona, Spain

I am funded by the INDIREA <http://www.indirea.eu/> Marie Curie Initial
Training Network.

