Hi Joachim,

While this is possible to do, it will be a little involved.  I am
surprised you are having framerate issues with a 32k mesh, as our software
for displaying CIFTI files, Connectome Workbench, is quite fast when
viewing dense connectomes on multiple 32k meshes at the same time.
Perhaps if I knew a little more about what you were trying to display (off
list if you want to keep it private), I could help you better.  We are
certainly interested in helping others implement their own CIFTI
visualization if their needs fall outside those offered by Connectome
Workbench.

If you take the 32k spheres from the distributed data and downsample them
(Caret5 has a way of doing this, and it sounds like FreeSurfer does too;
Connectome Workbench's command line utilities will have Caret5's algorithm
once we port it), you can then use the wb_command -cifti-resample command
to downsample the dense timeseries or dense connectome.  Note that you
will need to do this along both dimensions of a dense connectome.  We will
shortly be releasing a new version of Connectome Workbench that contains
the -cifti-resample command.  For methodological details, see this paper:

http://www.sciencedirect.com/science/article/pii/S1053811913005053
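
As a rough sketch of what that resampling would look like (the file names
below are placeholders for your own data and downsampled spheres; check
the command's built-in help for the exact argument order once the release
is out):

    # Resample a dense timeseries from the 32k mesh onto a ~10k mesh.
    # The template file defines the target CIFTI space.
    wb_command -cifti-resample \
        fMRI.dtseries.nii COLUMN template_10k.dtseries.nii COLUMN \
        BARYCENTRIC TRILINEAR fMRI_10k.dtseries.nii \
        -left-spheres L.sphere.32k.surf.gii L.sphere.10k.surf.gii \
        -right-spheres R.sphere.32k.surf.gii R.sphere.10k.surf.gii

    # For a dense connectome, run the resample twice, once with COLUMN
    # and once with ROW, so that both dimensions end up on the new mesh.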

Note also that the 32k mesh was chosen for a reason: its 2mm average
vertex spacing on the midthickness surface matches the acquisition
resolution of the fMRI data.

Peace,

Matt.

On 6/4/13 4:37 PM, "[email protected]" <[email protected]>
wrote:

>Hi,
>
>We are working on a visualization method for functional connectivity
>on the cortical surface, and would like to use the HCP functional
>connectivity data towards that end. We require surface representations
>and either the time series associated with the cortical nodes, or a
>correlation matrix with the functional
>connectivity strength for each pair of nodes as input for our method.
>I figured out a way to read CIFTI files; however, even the
>down-sampled representations with ~30k nodes are too complex to handle
>at interactive framerates in our software for the time being.
>
>What would be the best way to reduce the resolution of the surfaces to
>something closer to 10k nodes? I know how to decimate the number of
>nodes with the freesurfer tools in order to get a lower resolution
>surface representation for individuals, but am not certain that this
>is the best approach. Is it possible to get matching time series data
>sampled to a surface different from the ones included in the
>preprocessed data?
>
>Any help or pointers in the right direction would be highly appreciated...
>
>Cheers,
>
>Joachim


