Re: [HCP-Users] A question about the geodesic distance

2018-01-03 Thread Glasser, Matthew
This question appears to presume a standard mesh, meaning some kind of 
registration has occurred.  I think you need to be a bit more specific about 
what you want to do.

Peace,

Matt.

From: hcp-users-boun...@humanconnectome.org on behalf of Aaron C
Date: Wednesday, January 3, 2018 at 8:19 PM
To: "hcp-users@humanconnectome.org"
Subject: [HCP-Users] A question about the geodesic distance


Dear HCP experts,


I have a question about the geodesic distance with respect to the MSM-All 
registration. Given a pair of brain vertices, is the variance of their geodesic 
distance across different subjects still preserved after MSM-All registration? 
Thank you.

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


[HCP-Users] A question about the geodesic distance

2018-01-03 Thread Aaron C
Dear HCP experts,


I have a question about the geodesic distance with respect to the MSM-All 
registration. Given a pair of brain vertices, is the variance of their geodesic 
distance across different subjects still preserved after MSM-All registration? 
Thank you.
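
For concreteness, the kind of per-subject computation I have in mind would be 
something like the following (the vertex indices 1000 and 2000 and the surface 
file name are just hypothetical examples):

# geodesic distance from vertex 1000 to every other vertex on one subject's
# MSM-All registered midthickness surface
wb_command -surface-geodesic-distance subject.L.midthickness.32k_fs_LR.surf.gii \
    1000 dist_from_1000.func.gii
# the value at vertex 2000 in the output is the pairwise geodesic distance;
# repeating this per subject would give the across-subject variance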

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Smoothing, discarding volumes

2018-01-03 Thread Timothy Coalson
To expand on the smoothing issue a bit, even 4mm FWHM spatial smoothing in
the volume causes substantial signal mixing between areas on opposite sides
of sulci, which sounds like a particularly bad idea for ICA.  As I
understand it, group ICA shouldn't care much about spatial noise,
especially when you have a lot of subjects.

However, group ICA does care about alignment, that is, the same element
(voxel or vertex) having the same functional meaning across subjects -
volume alignment is not good at this for most of the human cortex (largely
due to folding variability across subjects), which is why we strongly
recommend using surface registration and analysis for cortex.
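
If you do want some smoothing, doing it on the surface avoids that mixing
across sulci; a minimal sketch with wb_command, assuming hypothetical file
names and a 4mm FWHM kernel on a CIFTI dense timeseries:

# smooth along the cortical surface (and within subcortical structures),
# rather than through sulcal banks in the volume
wb_command -cifti-smoothing rfMRI_REST1_LR_Atlas_hp2000_clean.dtseries.nii \
    4 4 COLUMN rfMRI_REST1_LR_Atlas_hp2000_clean_s4.dtseries.nii -fwhm \
    -left-surface L.midthickness.32k_fs_LR.surf.gii \
    -right-surface R.midthickness.32k_fs_LR.surf.gii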

Tim


On Wed, Jan 3, 2018 at 1:10 PM, Glasser, Matthew wrote:

> There are few circumstances in which one should do unconstrained smoothing in
> the volume.
>
> Peace,
>
> Matt.
>
> On 1/3/18, 12:53 PM, "hcp-users-boun...@humanconnectome.org on behalf of
> Tobias Bachmann"  tobias.bachm...@studserv.uni-leipzig.de> wrote:
>
> >Hi Stephen,
> >
> >Am Mittwoch, 3. Januar 2018, 16:02:06 CET schrieb Stephen Smith:
> >> > On 3 Jan 2018, at 14:44, Tobias Bachmann wrote:
> >> > when using the HCP's ICA+FIX data (NIfTI volume files) for a rather
> >>simple
> >> > group ICA, would you recommend further conventional preprocessing,
> >>i.e.
> >> >
> >> > 1. smoothing (about which conflicting info is to be found)
> >>
> >> spatial or temporal?
> >
> >Spatial.
> >
> >> The simple answer is in general no, although for the purposes of
> >>group-ICA
> >> only (and not any later subject-specific analyses like dual
> >>regression), it
> >> might be the case that some lowpass temporal filtering could boost
> >> effective CNR.
> >
> >I'm indeed planning on using dual regression later on.
> >
> >> > 2. discarding the first few volumes (I found hardly anything regarding
> >> > this
> >> > issue)?
> >>
> >> For most purposes it's not necessary, though there are some slight
> >>residual
> >> starting effects in the data, so you could delete a few.
> >
> >Okay, thanks a lot!
> >
> >Tobias
> >
> >
> >___
> >HCP-Users mailing list
> >HCP-Users@humanconnectome.org
> >http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] volume to average surface with Nearest Neighbour interpolation

2018-01-03 Thread Timothy Coalson
The fs_LR 32k spheres use a resolution (vertex spacing) that is suitable
for 2mm fMRI data, but it sounds like you are using structural-resolution
voxels.  As Matt says, I would put the fs_LR surface into your volume
space, and do only a single mapping, because nearest neighbor or enclosing
voxel mapping is extremely lossy - additionally, I would use the 164k
spheres instead.

Other forms of resampling, meant for continuous data, are not as lossy
because they can approximate the underlying function, but "voxel identity"
is not a continuous function.  I don't know exactly what you are doing, but
I would suggest mapping the data that *is* continuous onto fs_LR registered
surfaces, and then re-posing your "element identity" as vertex indices,
rather than T1w voxels.  If this doesn't let you do what you want, then
maybe you can do per-subject independent volume analysis, and then map the
results of that onto the individual's surface before combining across
subjects?
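
As a concrete sketch of the single-mapping route (file names here are
hypothetical; see Matt's quoted message below for getting the surface into
your volume space first):

# one enclosing-voxel mapping, done in the subject's own conformed space,
# onto a 164k fs_LR surface already moved into that space
wb_command -volume-to-surface-mapping samples_LH_cortex.nii.gz \
    lh.midthickness.164k_fs_LR.conformed.surf.gii \
    samples.164k_fs_LR.shape.gii -enclosing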

If you want to explain your bigger-picture goal, we might have other useful
suggestions.

Tim


On Wed, Jan 3, 2018 at 11:58 AM, Glasser, Matthew wrote:

> I think I would probably resample the subject’s own FS_LR registered
> surfaces into the FreeSurfer space (an exact transformation) and then do a
> single mapping from volume to surface.  You would need to figure out the
> affine matrix that describes this transform.
>
> Peace,
>
> Matt.
>
> From: hcp-users-boun...@humanconnectome.org on behalf of Seán Froudist Walsh
> Date: Wednesday, January 3, 2018 at 10:29 AM
> To: "hcp-users@humanconnectome.org" 
> Subject: [HCP-Users] volume to average surface with Nearest Neighbour
> interpolation
>
> Dear HCP experts,
>
> I am interested in mapping individual voxels in a subject's FreeSurfer
> conformed space (orig.nii) onto the HCP template (fsaverage_LR) while
> maintaining the original voxel values.
>
> All of the voxels lie within the LH cortical ribbon in the (conformed)
> volume space. There are 186 voxels with non-zero values that act as unique
> identifiers, with all other voxels having a value of zero.
>
> I have prepared the native FreeSurfer to HCP transformations, then
> performed volume-to-surface mapping of the sample data, and finally applied
> the FreeSurfer-to-HCP transform to the sample data. I have tried to
> identify the options that perform something like Nearest Neighbour
> assignment, as I need to maintain the original values as identifiers. The
> problem I am facing is that volume-to-surface mapping as done below reduces
> the number of non-zero voxels/vertices from 186 to 94, and the
> FreeSurfer-to-HCP resampling reduces the number of non-zero vertices further
> from 94 to 13 non-zero points.
>
> I would greatly appreciate your guidance as to the best way to achieve my
> desired goal of obtaining all 186 vertices with their original values onto
> the HCP template. Should I map each voxel to the closest voxel on the
> FreeSurfer WM surface, or something similar?
>
> The commands I used are shown below.
>
> Many thanks,
>
> Sean
>
> wb_shortcuts -freesurfer-resample-prep lh.white.surf.gii lh.pial.surf.gii
> lh.sphere.FSave.reg.surf.gii HCP_S1200_GroupAvg_v1/standard_mesh_atlases/
> resample_fsaverage/fs_LR-deformed_to-fsaverage.L.sphere.32k_fs_LR.surf.gii
> lh.midthickness.surf.gii ${current_subject}.l.midthickness.32k_fs_LR.surf.gii
> lh.sphere.HCP.reg.surf.gii
>
> and then created a volume-to-surface mapping, while maintaining the
> original voxel/vertex values using
>
> wb_command -volume-to-surface-mapping ${current_subject}_samples_LH_cortex.nii.gz
> lh.midthickness.surf.gii samples_native.shape.gii -enclosing
>
>
>
> and then applied the transform using
>
> wb_command -metric-resample samples_native.shape.gii
> lh.sphere.HCP.reg.surf.gii HCP_S1200_GroupAvg_v1/standard_mesh_atlases/
> resample_fsaverage/fs_LR-deformed_to-fsaverage.L.sphere.32k_fs_LR.surf.gii
> BARYCENTRIC -largest ${current_subject}_samples_HCP.shape.gii
>
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Smoothing, discarding volumes

2018-01-03 Thread Glasser, Matthew
There are few circumstances in which one should do unconstrained smoothing in
the volume.

Peace,

Matt.

On 1/3/18, 12:53 PM, "hcp-users-boun...@humanconnectome.org on behalf of
Tobias Bachmann"  wrote:

>Hi Stephen,
>
>Am Mittwoch, 3. Januar 2018, 16:02:06 CET schrieb Stephen Smith:
>> > On 3 Jan 2018, at 14:44, Tobias Bachmann wrote:
>> > when using the HCP's ICA+FIX data (NIfTI volume files) for a rather
>>simple
>> > group ICA, would you recommend further conventional preprocessing,
>>i.e.
>> > 
>> > 1. smoothing (about which conflicting info is to be found)
>> 
>> spatial or temporal?
>
>Spatial.
>
>> The simple answer is in general no, although for the purposes of
>>group-ICA
>> only (and not any later subject-specific analyses like dual
>>regression), it
>> might be the case that some lowpass temporal filtering could boost
>> effective CNR.
>
>I'm indeed planning on using dual regression later on.
>
>> > 2. discarding the first few volumes (I found hardly anything regarding
>> > this
>> > issue)?
>> 
>> For most purposes it's not necessary, though there are some slight
>>residual
>> starting effects in the data, so you could delete a few.
>
>Okay, thanks a lot!
>
>Tobias
>
>
>___
>HCP-Users mailing list
>HCP-Users@humanconnectome.org
>http://lists.humanconnectome.org/mailman/listinfo/hcp-users


___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Smoothing, discarding volumes

2018-01-03 Thread Tobias Bachmann
Hi Stephen,

Am Mittwoch, 3. Januar 2018, 16:02:06 CET schrieb Stephen Smith:
> > On 3 Jan 2018, at 14:44, Tobias Bachmann wrote:
> > when using the HCP's ICA+FIX data (NIfTI volume files) for a rather simple
> > group ICA, would you recommend further conventional preprocessing, i.e.
> > 
> > 1. smoothing (about which conflicting info is to be found)
> 
> spatial or temporal?

Spatial.

> The simple answer is in general no, although for the purposes of group-ICA
> only (and not any later subject-specific analyses like dual regression), it
> might be the case that some lowpass temporal filtering could boost
> effective CNR.

I'm indeed planning on using dual regression later on.

> > 2. discarding the first few volumes (I found hardly anything regarding
> > this
> > issue)?
> 
> For most purposes it's not necessary, though there are some slight residual
> starting effects in the data, so you could delete a few.

Okay, thanks a lot!

Tobias


___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] volume to average surface with Nearest Neighbour interpolation

2018-01-03 Thread Glasser, Matthew
I think I would probably resample the subject’s own FS_LR registered surfaces 
into the FreeSurfer space (an exact transformation) and then do a single 
mapping from volume to surface.  You would need to figure out the affine matrix 
that describes this transform.
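
A minimal sketch of that step with wb_command, assuming a FLIRT-style affine
(fs2conformed.mat and the file names here are hypothetical; you would derive
the affine from your two T1w volumes):

# move the subject's fs_LR-registered surface into FreeSurfer conformed space
wb_command -surface-apply-affine lh.midthickness.164k_fs_LR.surf.gii \
    fs2conformed.mat lh.midthickness.164k_fs_LR.conformed.surf.gii \
    -flirt T1w_acpc_dc_restore.nii.gz orig.nii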

Peace,

Matt.

From: hcp-users-boun...@humanconnectome.org on behalf of Seán Froudist Walsh
Date: Wednesday, January 3, 2018 at 10:29 AM
To: "hcp-users@humanconnectome.org"
Subject: [HCP-Users] volume to average surface with Nearest Neighbour 
interpolation

Dear HCP experts,

I am interested in mapping individual voxels in a subject's FreeSurfer 
conformed space (orig.nii) onto the HCP template (fsaverage_LR) while 
maintaining the original voxel values.

All of the voxels lie within the LH cortical ribbon in the (conformed) volume 
space. There are 186 voxels with non-zero values that act as unique 
identifiers, with all other voxels having a value of zero.

I have prepared the native FreeSurfer to HCP transformations, then performed 
volume-to-surface mapping of the sample data, and finally applied the 
FreeSurfer-to-HCP transform to the sample data. I have tried to identify the 
options that perform something like Nearest Neighbour assignment, as I need to 
maintain the original values as identifiers. The problem I am facing is that 
volume-to-surface mapping as done below reduces the number of non-zero 
voxels/vertices from 186 to 94, and the FreeSurfer-to-HCP resampling reduces the 
number of non-zero vertices further from 94 to 13 non-zero points.

I would greatly appreciate your guidance as to the best way to achieve my 
desired goal of obtaining all 186 vertices with their original values onto the 
HCP template. Should I map each voxel to the closest voxel on the FreeSurfer WM 
surface, or something similar?

The commands I used are shown below.

Many thanks,

Sean

wb_shortcuts -freesurfer-resample-prep lh.white.surf.gii lh.pial.surf.gii 
lh.sphere.FSave.reg.surf.gii 
HCP_S1200_GroupAvg_v1/standard_mesh_atlases/resample_fsaverage/fs_LR-deformed_to-fsaverage.L.sphere.32k_fs_LR.surf.gii
 lh.midthickness.surf.gii ${current_subject}.l.midthickness.32k_fs_LR.surf.gii 
lh.sphere.HCP.reg.surf.gii

and then created a volume-to-surface mapping, while maintaining the original 
voxel/vertex values using


wb_command -volume-to-surface-mapping ${current_subject}_samples_LH_cortex.nii.gz 
lh.midthickness.surf.gii samples_native.shape.gii -enclosing


and then applied the transform using


wb_command -metric-resample samples_native.shape.gii lh.sphere.HCP.reg.surf.gii 
HCP_S1200_GroupAvg_v1/standard_mesh_atlases/resample_fsaverage/fs_LR-deformed_to-fsaverage.L.sphere.32k_fs_LR.surf.gii
 BARYCENTRIC -largest ${current_subject}_samples_HCP.shape.gii


___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users



Re: [HCP-Users] Smoothing, discarding volumes

2018-01-03 Thread Glasser, Matthew
1.  I would avoid this.
2.  The cleaned data do have some of this artifact removed.

Peace,

Matt.

From: hcp-users-boun...@humanconnectome.org on behalf of Stephen Smith
Date: Wednesday, January 3, 2018 at 9:02 AM
To: Tobias Bachmann
Cc: "hcp-users@humanconnectome.org"
Subject: Re: [HCP-Users] Smoothing, discarding volumes

Hi


On 3 Jan 2018, at 14:44, Tobias Bachmann wrote:

Dear all,

when using the HCP's ICA+FIX data (NIfTI volume files) for a rather simple
group ICA, would you recommend further conventional preprocessing, i.e.

1. smoothing (about which conflicting info is to be found)

spatial or temporal?

The simple answer is in general no, although for the purposes of group-ICA only 
(and not any later subject-specific analyses like dual regression), it might be 
the case that some lowpass temporal filtering could boost effective CNR.

2. discarding the first few volumes (I found hardly anything regarding this
issue)?

For most purposes it's not necessary, though there are some slight residual 
starting effects in the data, so you could delete a few.

Cheers.




Kind regards,
Tobias Bachmann


___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


---
Stephen M. Smith, Professor of Biomedical Engineering
Head of Analysis,  Oxford University FMRIB Centre

FMRIB, JR Hospital, Headington, Oxford  OX3 9DU, UK
+44 (0) 1865 222726  (fax 222717)
st...@fmrib.ox.ac.uk
http://www.fmrib.ox.ac.uk/~steve
---

Stop the cultural destruction of Tibet






___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users



[HCP-Users] volume to average surface with Nearest Neighbour interpolation

2018-01-03 Thread Seán Froudist Walsh
Dear HCP experts,

I am interested in mapping individual voxels in a subject's FreeSurfer
conformed space (orig.nii) onto the HCP template (fsaverage_LR) while
maintaining the original voxel values.

All of the voxels lie within the LH cortical ribbon in the (conformed)
volume space. There are 186 voxels with non-zero values that act as unique
identifiers, with all other voxels having a value of zero.

I have prepared the native FreeSurfer to HCP transformations, then
performed volume-to-surface mapping of the sample data, and finally applied
the FreeSurfer-to-HCP transform to the sample data. I have tried to
identify the options that perform something like Nearest Neighbour
assignment, as I need to maintain the original values as identifiers. The
problem I am facing is that volume-to-surface mapping as done below reduces
the number of non-zero voxels/vertices from 186 to 94, and the
FreeSurfer-to-HCP resampling reduces the number of non-zero vertices further
from 94 to 13 non-zero points.

I would greatly appreciate your guidance as to the best way to achieve my
desired goal of obtaining all 186 vertices with their original values onto
the HCP template. Should I map each voxel to the closest voxel on the
FreeSurfer WM surface, or something similar?

The commands I used are shown below.

Many thanks,

Sean

wb_shortcuts -freesurfer-resample-prep lh.white.surf.gii lh.pial.surf.gii
lh.sphere.FSave.reg.surf.gii
HCP_S1200_GroupAvg_v1/standard_mesh_atlases/resample_fsaverage/fs_LR-deformed_to-fsaverage.L.sphere.32k_fs_LR.surf.gii
lh.midthickness.surf.gii
${current_subject}.l.midthickness.32k_fs_LR.surf.gii
lh.sphere.HCP.reg.surf.gii

and then created a volume-to-surface mapping, while maintaining the
original voxel/vertex values using

wb_command -volume-to-surface-mapping ${current_subject}_samples_LH_cortex.nii.gz
lh.midthickness.surf.gii samples_native.shape.gii -enclosing



and then applied the transform using

wb_command -metric-resample samples_native.shape.gii
lh.sphere.HCP.reg.surf.gii
HCP_S1200_GroupAvg_v1/standard_mesh_atlases/resample_fsaverage/fs_LR-deformed_to-fsaverage.L.sphere.32k_fs_LR.surf.gii
BARYCENTRIC -largest ${current_subject}_samples_HCP.shape.gii

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Delay Discounting

2018-01-03 Thread Elam, Jennifer
Hi Hae-Min,

A detailed description of the delay discounting measure with references is on 
pp. 183-186 of the HCP Reference Manual. We administered this measure as a 
custom add-on measure through the computerized Penn CNP. I believe you can ask 
the Penn CNP people to include it in your customized battery for your study. 
I'm CC'ing Cindy Hodge and Deanna Barch in case they have more to add about 
accessing the tool.


Best,

Jenn

Jennifer Elam, Ph.D.
Scientific Outreach, Human Connectome Project
Washington University School of Medicine
Department of Neuroscience, Box 8108
660 South Euclid Avenue
St. Louis, MO 63110
314-362-9387
e...@wustl.edu
www.humanconnectome.org



From: hcp-users-boun...@humanconnectome.org on behalf of Jung, Hae-Min

Sent: Wednesday, January 3, 2018 8:12:13 AM
To: hcp-users@humanconnectome.org
Subject: [HCP-Users] Delay Discounting


Hi Everyone,


I couldn't find good details on the administration of the delay discounting 
measure--it's also not a part of the Penn CNP nor the NIH Toolbox. Was it a 
custom computerized tool, or done on paper? If it's a computerized tool, is it 
available to use for our own study?


Best,

Hae-Min Jung

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users



Re: [HCP-Users] Smoothing, discarding volumes

2018-01-03 Thread Stephen Smith
Hi


> On 3 Jan 2018, at 14:44, Tobias Bachmann wrote:
> 
> Dear all,
> 
> when using the HCP's ICA+FIX data (NIfTI volume files) for a rather simple 
> group ICA, would you recommend further conventional preprocessing, i.e.
> 
> 1. smoothing (about which conflicting info is to be found)

spatial or temporal?

The simple answer is in general no, although for the purposes of group-ICA only 
(and not any later subject-specific analyses like dual regression), it might be 
the case that some lowpass temporal filtering could boost effective CNR.

> 2. discarding the first few volumes (I found hardly anything regarding this 
> issue)?

For most purposes it's not necessary, though there are some slight residual 
starting effects in the data, so you could delete a few.
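
If you do, a minimal sketch with FSL's fslroi, assuming you drop the first 10
volumes (file names hypothetical):

# keep volumes 10 (0-based) through the end (-1 means all remaining volumes)
fslroi rfMRI_REST1_LR_hp2000_clean.nii.gz rfMRI_REST1_LR_hp2000_clean_trim.nii.gz 10 -1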

Cheers.



> 
> Kind regards,
> Tobias Bachmann
> 
> 
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users


---
Stephen M. Smith, Professor of Biomedical Engineering
Head of Analysis,  Oxford University FMRIB Centre

FMRIB, JR Hospital, Headington, Oxford  OX3 9DU, UK
+44 (0) 1865 222726  (fax 222717)
st...@fmrib.ox.ac.uk
http://www.fmrib.ox.ac.uk/~steve

---

Stop the cultural destruction of Tibet 






___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


[HCP-Users] Smoothing, discarding volumes

2018-01-03 Thread Tobias Bachmann
Dear all,

when using the HCP's ICA+FIX data (NIfTI volume files) for a rather simple 
group ICA, would you recommend further conventional preprocessing, i.e.

1. smoothing (about which conflicting info is to be found)
2. discarding the first few volumes (I found hardly anything regarding this 
issue)?

Kind regards,
Tobias Bachmann


___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


[HCP-Users] Delay Discounting

2018-01-03 Thread Jung, Hae-Min
Hi Everyone,


I couldn't find good details on the administration of the delay discounting 
measure--it's also not a part of the Penn CNP nor the NIH Toolbox. Was it a 
custom computerized tool, or done on paper? If it's a computerized tool, is it 
available to use for our own study?


Best,

Hae-Min Jung

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users