Hi Tim,

Thanks.  That’s clear and sounds like a really reasonable approach.

Can you point me towards the exact files I’d need to reference and maybe 
suggest which function calls I’ll need to use to do the volume-to-surface 
mapping you describe?  I’ll whip up a quick script to loop through about 120 
datasets from this R01 project and let you know how well it works.

Mike


From: Timothy Coalson [mailto:[email protected]]
Sent: Friday, February 23, 2018 6:49 PM
To: Glasser, Matthew
Cc: Stevens, Michael; Erin W. E. Dickie; [email protected]
Subject: Re: [HCP-Users] Best Approach for using old volumetric data to pick 
parcels-of-interest


Surface-based methods may boost your statistical power enough (by better 
alignment, exclusion of irrelevant tissue, and smoothing that doesn't cross 
sulcal banks, if you decide you need smoothing) that you may not need to rely 
as much on existing ROIs.  Parcel-based statistics have a lot of power, because 
the number of multiple comparisons is orders of magnitude smaller, spatially 
independent noise averages out, and the signal averages together.  We believe 
that a lot of 
old data would benefit from reanalysis using surfaces.
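To put rough numbers on the multiple-comparisons point above, here is a small illustrative sketch (not part of the original email; the counts are the standard HCP figures of 91,282 grayordinates in a dense CIFTI file and 360 HCP-MMP1.0 cortical parcels):

```python
# Bonferroni-style illustration of why parcel-based statistics face far
# fewer multiple comparisons than dense (vertex/voxel-wise) statistics.
alpha = 0.05
n_grayordinates = 91_282   # vertices + voxels in a standard dense CIFTI file
n_parcels = 360            # HCP-MMP1.0 cortical parcels (180 per hemisphere)

dense_threshold = alpha / n_grayordinates    # per-test threshold, dense
parcel_threshold = alpha / n_parcels         # per-test threshold, parcellated

reduction = round(n_grayordinates / n_parcels)
print(reduction)  # -> 254 (roughly 254x fewer comparisons)
```

The per-test significance threshold is correspondingly about two orders of magnitude less stringent for parcels, before even considering the noise averaging within each parcel.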

However, our paper is mainly focused on specificity and continuous data.  If 
you have a binary volume ROI and you only need a rough guess of it on the 
surface, you can get approximate answers, in a way that should reduce false 
negatives (and give more false positives) from the surface/volume transition 
problems.  You can map the ROI to the anatomical MNI surfaces of a group of 
subjects, and take the max across subjects.  Any individual subject's mapped 
ROI may miss the expected group ribbon at a given point, but it is very likely 
that every point in the expected group ribbon will overlap with at least one 
subject's map.  If this isn't enough, you can dilate the volume ROI a few mm 
first.
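The map-then-max step above can be sketched as follows (a minimal illustration with made-up values, assuming each subject's binary ROI has already been mapped onto a common surface mesh, e.g. with wb_command -volume-to-surface-mapping):

```python
# Hypothetical per-subject ROI maps on a shared surface mesh: one list per
# subject, one entry per vertex (1 = the vertex fell inside that subject's
# mapped volume ROI, 0 = it did not).
subject_roi_maps = [
    [0, 1, 1, 0, 0],  # subject 1
    [0, 0, 1, 1, 0],  # subject 2
    [1, 0, 1, 0, 0],  # subject 3
]

# Taking the max across subjects keeps any vertex that at least one
# subject's mapped ROI reached, which reduces false negatives from the
# surface/volume transition (at the cost of more false positives).
group_roi = [max(vertex) for vertex in zip(*subject_roi_maps)]
print(group_roi)  # -> [1, 1, 1, 1, 0]
```

On real metric files the same reduction could presumably be done with wb_command -metric-reduce using the MAX operation, and dilating the volume ROI first (wb_command -volume-dilate) would loosen the criterion further, as suggested above.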

Tim


On Fri, Feb 23, 2018 at 11:18 AM, Glasser, Matthew 
<[email protected]<mailto:[email protected]>> wrote:
Hi Mike,

We have a preprint out on this exact question and the conclusion is that it is 
really hard to do this accurately for most brain regions:

https://www.biorxiv.org/content/early/2018/01/29/255620

Really the best idea is probably to go back and reanalyze the old data without 
volume-based smoothing, with alignment done across surfaces instead.  Erin 
Dickie, CCed, is working on tools to make this a little easier, but there are 
still issues, like needing a field map to get accurate fMRI-to-structural 
registration.  The good 
news is that one’s statistical power should be much better if brains are 
actually lined up, and using parcellated analyses instead of smoothing offers 
further benefits.

Matt.

From: <[email protected]> on behalf of "Stevens, Michael" 
<[email protected]>
Date: Friday, February 23, 2018 at 8:58 AM
To: "[email protected]" <[email protected]>
Subject: [HCP-Users] Best Approach for using old volumetric data to pick 
parcels-of-interest

Hi everyone,

There’s been a lot posted here over the past year or two on the challenges and 
limitations of going back-and-forth between volumetric space and HCP-defined 
surface space, with solid arguments for moving to (and sticking with) 
CIFTI-defined brainordinates.  Here, I’m asking a slightly different question… 
The field has decades of research using volume-space fMRI timeseries analyses 
that helps to define where to look in the brain to test new hypotheses.  Has 
anyone got a well-thought-out approach for mapping such volume-space ROIs to 
the parcels within the new HCP 180 atlas?  I ask because the specificity of the 
HCP atlas sometimes offers a half dozen candidate parcels for 
hypothesis-testing for what we previously thought of as just one or two 
regions.  Even though our group currently has a half dozen newer NIH-funded 
studies that use HCP-compliant sequences, most of that work is still predicated 
on a “region-of-interest” approach because the study group sizes are less than 
a hundred, not in the thousands typical of the HCP grant work.  So we still have 
to contend with the statistical power limitations inherent in any ROI approach. 
 It would be great to be able to use our prior volume-space data to have 
greater confidence in selecting among the various parcel-of-interest candidates 
when testing hypotheses.

I’m wondering if anyone’s yet worked out a step-by-step approach for a series 
of warps/surface-maps/transformations that can take ROIs from MNI space and 
give a “best guess” as to which HCP 180 atlas parcel(s) should be queried in 
such instances.  It would be a nice bridge from older work to newer HCP-guided 
work, allowing researchers to avoid the added burden of going back and 
collecting new pilot data with HCP sequences.  A thoughtful list of 
the analytic or conceptual pros/cons of something like this would be helpful as 
well.

Thanks,
Mike



_______________________________________________
HCP-Users mailing list
[email protected]<mailto:[email protected]>
http://lists.humanconnectome.org/mailman/listinfo/hcp-users
