-cifti-separate creates nifti and gifti files from parts of the cifti file. That output is a volume file, and should have been named ending in .nii.gz so that it would be compressed on disk. However, I would not recommend doing it this way for all subcortical structures, as that is a lot of volume files, all of which will have the full original FOV from the volume preprocessing output.
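For example, just giving the output of the command you ran (quoted below) a .nii.gz extension should get you a compressed file, something along the lines of:

wb_command -cifti-separate <input>.dtseries.nii COLUMN -volume PUTAMEN_RIGHT out_putamen_right.nii.gz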
If the only thing you want to do with them is average them into per-area timeseries, I suggest you try -cifti-parcellate before anything else (a sketch of such a command is at the end of this message, after the quoted thread). You may need to first make a dlabel file that combines your chosen surface and subcortical parcellations. If you want all the individual timeseries separately, then you might try the -volume-all option of -cifti-separate, and possibly the -crop option, as well as the -roi options to tell you which voxels have data in them. The -metric option will get you the surface data. Note that you can specify these options simultaneously in a single command:

wb_command -cifti-separate <input> COLUMN -volume-all out_vol.nii.gz -crop -label out_vol_structs.nii.gz -metric CORTEX_LEFT left_data.func.gii -roi left_roi.func.gii -metric CORTEX_RIGHT right_data.func.gii -roi right_roi.func.gii

Tim

On Thu, Aug 11, 2016 at 3:35 PM, Ferdaus Kawsar <ferdaus.kaw...@gmail.com> wrote:
> Hi Tim,
> Thanks for your reply. I used cifti-separate as you suggested to create volumes from the *.dtseries.nii file I have. However, I am having difficulty reading the resulting file. Is it a cifti file? I cannot read this file using either ft_read_cifti() or ciftiopen(). Also, the file is huge (1.8 GB). How do I extract the time-course from this file?
>
> I created the file using the following command:
> wb_command -cifti-separate '*.dtseries.nii' COLUMN -volume PUTAMEN_RIGHT 'out_putament_right'
>
> Best
> -Ferdaus
>
> On Wed, Aug 10, 2016 at 5:42 PM, Timothy Coalson <tsc...@mst.edu> wrote:
>
>> You might want to use "wb_command -cifti-parcellate" to do this for you; it has some extra options to do things like weighted means according to vertex area.
>>
>> Manually matching things up in matlab from loaded cifti files may be difficult, as some .dlabel.nii files don't exclude the medial wall, and may not include the subcortical voxels that the .dtseries.nii files do. You can use -cifti-separate on cifti files to make simpler single-hemisphere full-surface gifti files, and a nifti volume file for the subcortical structures (though doing this on the timeseries will result in a fairly large volume file when loaded in memory).
>>
>> Tim
>>
>> On Wed, Aug 10, 2016 at 2:48 PM, Ferdaus Kawsar <ferdaus.kaw...@gmail.com> wrote:
>>
>>> Hi HCP Team,
>>> I was wondering if someone could help with a task I am trying to do. I need to extract the average time-course for both cortical and sub-cortical parcels.
>>> I was able to extract average time-courses for sub-cortical parcels.
>>>
>>> After I loaded my *.dtseries.nii cifti files (using ft_read_cifti()) in matlab, I could access all the time-courses for sub-cortical regions. There are 19 sub-cortical regions. After I got all the time-courses for a sub-cortical region, a simple mean() in matlab gave me the average time-course. I am curious whether this approach is right.
>>>
>>> Also, how do I get the time-courses for a cortical parcel? My eventual goal is to extract the average time course for each of the 180 parcels in each hemisphere.
>>>
>>> Best regards,
>>> -Ferdaus
>>> --
>>> Ferdaus A. Kawsar, PhD
>>> Research Scientist II
>>> Department of Neurology
>>> Medical College of Wisconsin
>>> Milwaukee, WI
>>
>
> --
> Ferdaus A. Kawsar, PhD
> Research Scientist II
> Department of Neurology
> Medical College of Wisconsin
> Milwaukee, WI

_______________________________________________
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users
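As a rough sketch of the -cifti-parcellate approach mentioned above (the dlabel filename here is only a placeholder for whatever combined surface + subcortical parcellation file you end up making):

wb_command -cifti-parcellate <input>.dtseries.nii combined_parcels.dlabel.nii COLUMN parcellated.ptseries.nii

By default this averages the data within each label, so the output .ptseries.nii contains one average timeseries per parcel, which should be much easier to work with in matlab than the separated volume files.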