[HCP-Users] Postdoctoral position at Harvard University

2019-02-21 Thread Hecht, Erin Elisabeth
Hello HCP Users,

I am writing to invite applications for a neuroimaging postdoc job in my lab.  
I'm looking for someone with expertise in collecting, processing, and analyzing 
structural and functional neuroimaging data, scripting, Unix, etc.  The project 
involves noninvasive research with dogs.  There will also be opportunities to 
get involved in human and nonhuman primate research, ex vivo imaging at 
Martinos, and histology and digital microscopy.  The job posting is below.  
Please feel free to share/distribute.  Applicants are welcome to contact me 
directly.

Thank you,

Erin E. Hecht, Ph.D.
Assistant Professor, Department of Human Evolutionary Biology, Harvard 
University
Mailing address: 11 Divinity Ave, Cambridge, MA 02138
Office phone: 617-384-8642 | Email: 
erin_he...@fas.harvard.edu
http://hechtlab.org | http://caninebrains.org/

Postdoctoral position available in the Evolutionary Neuroscience Lab (PI: Erin 
Hecht) in the Department of Human Evolutionary Biology at Harvard University
https://academicpositions.harvard.edu/postings/8638

The Department of Human Evolutionary Biology at Harvard University invites 
applications for a Postdoctoral Fellow in the Evolutionary Neuroscience 
Laboratory of Dr. Erin Hecht. The focus of the position will be to study 
neurodevelopmental adaptations for the acquisition of learned skills. The 
project involves longitudinal neuroimaging of military working dogs, from 
puppyhood through adulthood, as they progress through a formal on-base training 
regimen. The position requires several weeks of paid travel per year to a 
research site in Texas.

This is a one-year position, expected to begin in spring or summer 2019, with 
possibility of renewal dependent upon adequate funding and satisfactory 
performance. The research will take place in the Evolutionary Neuroscience 
Laboratory directed by Dr. Hecht and located in the Peabody Museum on Harvard 
University's Cambridge, Massachusetts campus, and on-site at the Lackland Air 
Force Base outside San Antonio.

Broadly, the Evolutionary Neuroscience Laboratory studies how brains change in 
response to selection pressure on behavior, and how brains acquire heritable 
adaptations for complex, learned behaviors. Comparisons between modern humans 
and our living primate relatives provide a way to address this question in the 
context of our own evolutionary history. In addition, the lab investigates 
general mechanisms of brain-behavior evolution by studying "unnatural 
selection" in intentionally-bred animals, including domestic dogs and foxes. 
The lab uses a variety of experimental and computational tools, such as 
structural and functional neuroimaging, histology and digital microscopy, and 
behavior analysis.

A doctoral degree is required for this position.  Desired qualifications 
include research in neuroscience, neuroimaging, development, and/or animal 
behavior. The project will require working with dogs and coordinating with an 
interdisciplinary research team that will include scientists, veterinarians, 
dog handlers and trainers, and military personnel. This research is 
non-invasive.  Please submit a letter of interest, an updated CV, and the names 
of three references by email to Dr. Erin Hecht at erin_he...@fas.harvard.edu. 
Evaluation will begin at the time the advertisement is placed and will continue 
until the position is filled.



___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


[HCP-Users] Postdoctoral position(s) in functional/structural imaging of autism spectrum disorders

2019-02-21 Thread Inna Fishman
Candidates are sought for 1-2 postdoctoral positions at the San Diego State 
University (SDSU) Brain Development Imaging Laboratories (BDIL), affiliated 
with the SDSU Center for Autism and Developmental Disorders.



The postdoctoral position(s) will be supported by several existing and newly 
funded NIH projects examining functional network organization, brain structure, 
and cognition across the lifespan (toddlers to aging) in autism spectrum 
disorders (ASD). The postdoctoral fellow(s) will be involved in multiple 
NIH-funded projects, with a particular role in a newly funded study of auditory 
precursors of early language development, utilizing sleep MRI protocols in young 
children. The postdoctoral fellow(s) will regularly interact with collaborating 
faculty at SDSU and the University of California, San Diego, and will have 
access to San Diego's rich cognitive neuroscience, clinical neuroscience, and 
autism research communities.

The ideal candidates will have a strong background in neuroimaging. Clinical 
training is not required but would be a strength. The postdoctoral scholar(s) 
will have the opportunity to develop and implement new analysis pipelines for 
high-resolution functional and/or structural imaging as part of the new study 
on auditory development, as well as to contribute to continuing analysis of our 
existing multimodal MRI datasets across the lifespan. The fellow(s) will assist 
with supervision and training of graduate and undergraduate students from 
diverse graduate/doctoral programs, including Clinical Neuropsychology, 
Cognitive Science, and Computational Bioinformatics.

The Principal Investigators (Drs. Ralph-Axel Mueller, Ruth Carper, Inna 
Fishman) are committed to supporting fellows on their trajectory to an 
independent research career, including ensuring that they submit competitive 
applications for fellowships and external grant funding.

 

Requirements

- PhD in Neuroscience, Cognitive Science, (Clinical/Developmental/Cognitive) 
Psychology, Biomedical Engineering, or another relevant field.

- Experience in data acquisition, preprocessing, and analysis of brain MRI 
data.

- Preference will be given to applicants with:

  - Experience in multiple brain imaging modalities (e.g., both diffusion and 
functional imaging).

  - Experience with multiple analysis methods/toolkits in the relevant MRI 
modality.

  - Experience with ASDs, other developmental disorders, or with typically 
developing children or aging populations.

 

Brain Development Imaging Laboratories (BDIL) & Research Facilities

BDIL (bdil.sdsu.edu) is a diverse and collaborative research group with three 
Principal Investigators, multiple postdoctoral trainees and research faculty, 
and PhD and Master’s students in Psychology, Cognitive Science, Clinical 
Psychology, and Bioinformatics from both SDSU and the University of California, 
San Diego (UCSD). Our research studies employ multiple imaging modalities 
(anatomical, diffusion, and functional [connectivity] MRI) as well as 
behavioral and neuropsychological assessments to investigate the brain bases 
of ASDs and age-related change across the lifespan (1-65 years). BDIL 
collaborates with experts in MR physics, Radiology, EEG, MEG, and 
Bioinformatics to implement innovative imaging and analysis techniques (e.g., 
simultaneous fMRI/EEG recording, combined MEG and MRI). BDIL is affiliated with 
the SDSU Center for Autism and Developmental Disorders (autism.sdsu.edu), a 
clinical research and training core facility.

 

To Apply

Please e-mail a CV, research statement, reprints, and three letters of 
recommendation to Ralph-Axel Mueller at rmuel...@sdsu.edu.


___
Inna Fishman, PhD
Director, SDSU Center for Autism and Developmental Disorders
Associate Research Professor, Dept. of Psychology
Associate Clinical Director, Brain Development Imaging Laboratories
San Diego State University
Phone: (619) 594-2299
Email: inna.fish...@sdsu.edu 



___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] unwarping 164k surface to subject-specific ACPC-aligned headcoordinate space

2019-02-21 Thread Timothy Coalson
I see; I didn't realize we generated the 32k versions in T1w space but not
the 164k.

The output of the command should be named with "midthickness", not
"sphere"; its shape will be the same as the input surface.  You will need
to use the midthickness surface from the T1w/Native folder as the input to
get the result you want (only rigidly aligned).  I think the
"100307.L.sphere.164k_fs_LR.surf.gii" sphere is just a copy of the standard
sphere, so it should work correctly, but I'm not sure.
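
For concreteness, a minimal sketch of the corrected call under the standard
HCP per-subject directory layout; the output location and file name below are
only suggestions, the key changes being the T1w/Native midthickness input and
the "midthickness" output name:

# Resample the rigidly aligned native-mesh midthickness onto the 164k_fs_LR mesh.
# Current sphere: the subject's MSMAll native sphere; new sphere: the subject's
# copy of the standard 164k sphere (per the thread above).
wb_command -surface-resample \
100307/T1w/Native/100307.L.midthickness.native.surf.gii \
100307/MNINonLinear/Native/100307.L.sphere.MSMAll.native.surf.gii \
100307/MNINonLinear/100307.L.sphere.164k_fs_LR.surf.gii \
BARYCENTRIC \
100307/T1w/100307.L.midthickness.164k_fs_LR.surf.gii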

Tim


On Thu, Feb 21, 2019 at 11:00 AM CHAUMON Maximilien <
maximilien.chau...@icm-institute.org> wrote:

> Thank you!
>
> In the data I have downloaded, the only place where I find a 164k surface
> is in the MNInonlinear directory, so I assume some nonlinear transformation
> has been applied. The T1w directory has no such high res surf.
>
> So this command should do what I want, right? (I split it across several
> lines for clarity)
> wb_command -surface-resample \
> 100307/MNINonLinear/Native/100307.L.midthickness.native.surf.gii \
> 100307/MNINonLinear/Native/100307.L.sphere.MSMAll.native.surf.gii \
> 100307/MNINonLinear/100307.L.sphere.164k_fs_LR.surf.gii \
> BARYCENTRIC \
> 100307/T1w/100307.L.sphere.164k_fs_LR.surf.gii
>
> Many thanks for your help!
>
>
>
> On Wed, Feb 20, 2019 at 20:34, Timothy Coalson  wrote:
>
>> Sorry, the recommended sphere for resampling any subject will of course
>> be that subject's version of that file, not specifically subject 100307's
>> sphere.
>>
>> Tim
>>
>>
>> On Wed, Feb 20, 2019 at 1:31 PM Timothy Coalson  wrote:
>>
>>> On Wed, Feb 20, 2019 at 8:03 AM CHAUMON Maximilien <
>>> maximilien.chau...@icm-institute.org> wrote:
>>>
 Hello,

 I'm looking at fine changes in MEG forward leadfields and would like to
 use the 164k meshes in each subject (I know 164k vertices are overkill, but
 I need this high res rendering for one of my figures). I'm interested in
 the actual original location of individual vertices in the brain in
 subject-specific ACPC-aligned headcoordinate space. I don't want any non
 linear spatial transformation applied to the mesh.
 So I would like to use the 164k mesh with coordinates without the
 nonlinear transformation that (as far as I understand) was applied to all
 the 164k_fs_LR files. Is there an easy way to revert the non linear
 transformation?

>>>
>>> Only the surface files under the MNINonLinear folder have had any
>>> nonlinear anatomical warp applied.  The surface files under T1w all line up
>>> with the distortion corrected, rigidly aligned T1w image (we don't really
>>> keep scanner space around, and we often call this distortion corrected
>>> rigid alignment space "native volume space").  Surface
>>> registration/resampling does not deform the anatomy, it just tiles the same
>>> contour in 3D space with a new set of triangles.
>>>
>>> Note that averaging the coordinates of vertices across subjects will
>>> change their location, and this will affect geometry and "foldedness", to a
>>> degree depending on what registration was used.  As long as you stick with
>>> individual surfaces, you don't need to worry about this.
>>>
>>>
 I would then use the file
 {Subject}.{Hemi}.midthickness.164k_fs_LR.surf.gii and apply the inverse
 transformation,

 Alternatively, is there a way to easily downsample
 {Subject}.{Hemi}.midthickness.native.surf.gii to 164k vertices? is this
 then in subject-specific ACPC-aligned headcoordinate space? how could I
 move to that space?

>>>
>>> This is what the existing 164k surfaces in the T1w folder already are.
>>> Native mesh is commonly ~130k for HCP scans, so 164k is actually a small
>>> upsampling.  We recommend the MSMAll versions, as the same vertex number
>>> across subjects is more often in the same area than for other registrations.
>>>
>>> For reference (since we have already dealt with this resampling for
>>> you), the recommended sphere to use for resampling from native mesh to any
>>> fs_LR mesh is "MNINonLinear/Native/100307.L.sphere.MSMAll.native.surf.gii"
>>> (and the R version, of course).  Ignore the fact that it is in the
>>> MNINonLinear folder, sphere surfaces don't have a volume space, that is
>>> just where it got put.  The standard spheres for fs_LR are in the pipelines
>>> under global/templates/standard_mesh_atlases/, read the readme file.  The
>>> command to do surface resampling is wb_command -surface-resample.
>>>
>>>
 Does that make sense?

 Many thanks,
 Max

 ___
 HCP-Users mailing list
 HCP-Users@humanconnectome.org
 http://lists.humanconnectome.org/mailman/listinfo/hcp-users

>>>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] unwarping 164k surface to subject-specific ACPC-aligned headcoordinate space

2019-02-21 Thread CHAUMON Maximilien
Thank you!

In the data I have downloaded, the only place where I find a 164k surface is in 
the MNInonlinear directory, so I assume some nonlinear transformation has been 
applied. The T1w directory has no such high res surf.

So this command should do what I want, right? (I split it across several lines 
for clarity)
wb_command -surface-resample \
100307/MNINonLinear/Native/100307.L.midthickness.native.surf.gii \
100307/MNINonLinear/Native/100307.L.sphere.MSMAll.native.surf.gii \
100307/MNINonLinear/100307.L.sphere.164k_fs_LR.surf.gii \
BARYCENTRIC \
100307/T1w/100307.L.sphere.164k_fs_LR.surf.gii

Many thanks for your help!



On Wed, Feb 20, 2019 at 20:34, Timothy Coalson <tsc...@mst.edu> wrote:
Sorry, the recommended sphere for resampling any subject will of course be that 
subject's version of that file, not specifically subject 100307's sphere.

Tim


On Wed, Feb 20, 2019 at 1:31 PM Timothy Coalson <tsc...@mst.edu> wrote:
On Wed, Feb 20, 2019 at 8:03 AM CHAUMON Maximilien 
<maximilien.chau...@icm-institute.org> wrote:
Hello,

I'm looking at fine changes in MEG forward leadfields and would like to use the 
164k meshes in each subject (I know 164k vertices are overkill, but I need this 
high res rendering for one of my figures). I'm interested in the actual 
original location of individual vertices in the brain in subject-specific 
ACPC-aligned headcoordinate space. I don't want any non linear spatial 
transformation applied to the mesh.
So I would like to use the 164k mesh with coordinates without the nonlinear 
transformation that (as far as I understand) was applied to all the 164k_fs_LR 
files. Is there an easy way to revert the non linear transformation?

Only the surface files under the MNINonLinear folder have had any nonlinear 
anatomical warp applied.  The surface files under T1w all line up with the 
distortion corrected, rigidly aligned T1w image (we don't really keep scanner 
space around, and we often call this distortion corrected rigid alignment space 
"native volume space").  Surface registration/resampling does not deform the 
anatomy, it just tiles the same contour in 3D space with a new set of triangles.

Note that averaging the coordinates of vertices across subjects will change 
their location, and this will affect geometry and "foldedness", to a degree 
depending on what registration was used.  As long as you stick with individual 
surfaces, you don't need to worry about this.

I would then use the file {Subject}.{Hemi}.midthickness.164k_fs_LR.surf.gii and 
apply the inverse transformation,

Alternatively, is there a way to easily downsample 
{Subject}.{Hemi}.midthickness.native.surf.gii to 164k vertices? is this then in 
subject-specific ACPC-aligned headcoordinate space? how could I move to that 
space?

This is what the existing 164k surfaces in the T1w folder already are.  Native 
mesh is commonly ~130k for HCP scans, so 164k is actually a small upsampling.  
We recommend the MSMAll versions, as the same vertex number across subjects is 
more often in the same area than for other registrations.

For reference (since we have already dealt with this resampling for you), the 
recommended sphere to use for resampling from native mesh to any fs_LR mesh is 
"MNINonLinear/Native/100307.L.sphere.MSMAll.native.surf.gii" (and the R 
version, of course).  Ignore the fact that it is in the MNINonLinear folder, 
sphere surfaces don't have a volume space, that is just where it got put.  The 
standard spheres for fs_LR are in the pipelines under 
global/templates/standard_mesh_atlases/, read the readme file.  The command to 
do surface resampling is wb_command -surface-resample.
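
Generalizing this to any subject (per the note earlier in this thread that each 
subject's own sphere must be used), a hypothetical parameterized form; SUBJECT 
and HEMI are placeholders, and the output path is only a suggestion:

# Resample a subject's rigidly aligned midthickness onto the 164k_fs_LR mesh,
# using that subject's own MSMAll native sphere as the current sphere.
SUBJECT=100307   # placeholder subject ID
HEMI=L           # L or R
wb_command -surface-resample \
${SUBJECT}/T1w/Native/${SUBJECT}.${HEMI}.midthickness.native.surf.gii \
${SUBJECT}/MNINonLinear/Native/${SUBJECT}.${HEMI}.sphere.MSMAll.native.surf.gii \
${SUBJECT}/MNINonLinear/${SUBJECT}.${HEMI}.sphere.164k_fs_LR.surf.gii \
BARYCENTRIC \
${SUBJECT}/T1w/${SUBJECT}.${HEMI}.midthickness.164k_fs_LR.surf.gii

The standard 164k spheres in the pipelines under 
global/templates/standard_mesh_atlases/ could be substituted for the new-sphere 
argument (see the readme there).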

Does that make sense?

Many thanks,
Max

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


[HCP-Users] A few questions about HCP behavioral data

2019-02-21 Thread Xinyang Liu
Dear HCP experts,

We are currently analyzing some of the behavioral data from your dataset and 
have three short questions about the IRT modelling and theta-score derivation 
described in the "NIH Toolbox Scoring and Interpretation Guide". To build 
scores that are comparable across the NIH Toolbox task data and the Penn and 
Working Memory data, we would like to know more about the NIH IRT scores. We 
hope you can help us out.

In the "NIH Toolbox Scoring and Interpretation Guide" it is mentioned that 
IRT-scores were calculated before the consecutive scores, such as Age-adjusted 
or Unadjusted. Was that also true for the HCP data?
What was the database to estimate the IRT model for the HCP data? Was the model 
estimated based on the HCP data only or was it somehow combined with the 
norming sample of the NIH Toolbox?
What type of IRT model was run? A simple Rasch model or did you add other free 
parameters, such as guessing probability (2PL)?
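
For reference, the textbook item response functions behind these labels, where 
$\theta$ is the latent trait, $b_i$ the item difficulty, $a_i$ the 
discrimination, and $c_i$ the guessing parameter (this is only a sketch of the 
standard forms, not a claim about which model the HCP scoring actually used):

\[
\begin{aligned}
\text{Rasch (1PL):}\quad & P(X_i = 1 \mid \theta) = \frac{1}{1 + e^{-(\theta - b_i)}}\\
\text{2PL:}\quad & P(X_i = 1 \mid \theta) = \frac{1}{1 + e^{-a_i(\theta - b_i)}}\\
\text{3PL:}\quad & P(X_i = 1 \mid \theta) = c_i + \frac{1 - c_i}{1 + e^{-a_i(\theta - b_i)}}
\end{aligned}
\]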

Thank you very much for your help.

Kind regards,
Xinyang




___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users