Re: [HCP-Users] NIFTI to CIFTI

2019-06-21 Thread Timothy Coalson
I don't know what 3dDeconvolve does, but if you used it to try to do
anything spatial (spatial sharpening, PSF, etc), then the files produced by
wb_command -cifti-convert would be entirely inappropriate.  The only
meaningful operations that can be done on -cifti-convert "volume-ish" files
are temporal ones, because the spatial aspects are completely jumbled.

That said, yes, if you made a volume-ish file by -cifti-convert and put
that through an afni tool, the cifti-template can be the input file you
used in -cifti-convert to make the volume-ish input in the first place.  If
the cifti files you are using are standard 91k grayordinates, you could use
the appropriate file from the Pipelines/global/templates folder if that was
somehow more convenient.  Since we do not expect tools to preserve anything
about the volume-ish files except the dimensions, we can't use the
volume-ish file to also store the information required to put the "voxels"
back into the right vertices, which is why a cifti file is required as the
template, to define which "voxel" actually represents each vertex, etc.

You will likely need the -reset-scalars option, as I expect the statistics
output has a different number of frames than the timeseries.
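
A minimal sketch of the full round trip (filenames are hypothetical):

  # CIFTI -> "volume-ish" NIFTI (safe for temporal processing only)
  wb_command -cifti-convert -to-nifti input.dtseries.nii fake_volume.nii.gz
  # ...run AFNI 3dDeconvolve on fake_volume.nii.gz, producing stats.nii.gz...
  # back to CIFTI, using the original cifti file as the template
  wb_command -cifti-convert -from-nifti stats.nii.gz input.dtseries.nii \
      stats.dscalar.nii -reset-scalars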

Tim


On Fri, Jun 21, 2019 at 12:24 PM Pipoly, Marco A wrote:

> Hello All,
>
>
> I am relatively new to working with HCP tfMRI connectome data and am
> running into a bit of confusion about the following: I have statistical map
> outputs (in stats.nii.gz format) from CIFTI data that was converted to
> NIFTI and put into AFNI 3dDeconvolve. I am interested in converting this
> output (stats.nii.gz) back into CIFTI in order to regain the surface and
> structure information (which does not appear to remain when you run
> wb_command CIFTI->NIFTI).
>
>
> It appears this will do the trick?:
>
> [-from-nifti] - convert a NIFTI (1 or 2) file made with this command back
>  into CIFTI
>   <input-nifti> - the input nifti file
>   <cifti-template> - a cifti file with the dimension(s) and mapping(s)
> that should be used
>   <cifti-out> - output - the output cifti file
>
> If so, what exactly does it mean by "cifti-template?" Would this be the
> original dtseries file I used to produce the AFNI 3dDeconvolve outputs, or
> is there a generic template brain(s) for the various CIFTI data types?
>
>
> Best,
>
>
> Marco A. Pipoly, B.S.
> NSF Graduate Research Fellow
> Graduate Student, PhD in Neuroscience
> University of Iowa
> marco-pip...@uiowa.edu
>
>
>
>
>
>



Re: [HCP-Users] A question about scene file

2019-06-21 Thread Timothy Coalson
Sorry, in my second paragraph I meant "breaking the usual convention of the
scene file XML".

Tim


On Fri, Jun 21, 2019 at 2:28 PM Timothy Coalson  wrote:

> The paths inside a scene file's XML are supposed to be relative to the
> location of the scene file.  You are expected to keep the scene file in a
> directory near the data it refers to, so that the relative paths don't have
> to cross much of your filesystem structure.  You can use wb_command
> -scene-file-relocate to move the scene file and regenerate its internal
> paths, to put it closer to the data directory.  If you need to move it to a
> system that doesn't have the same filesystems mounted the same way, there
> is wb_command -zip-scene-file (or the "Zip..." button in the wb_view scene
> window), which will generate a self-contained archive with all the
> referenced files and the necessary directory layout.
>
> If the QC scene was generated with a script, it may be that the substituted
> paths were absolute, breaking the usual convention of the script.  Using
> -scene-file-relocate or loading and saving the scene file may convert these
> paths to relative.
>
> Tim
>
>
> On Fri, Jun 21, 2019 at 12:23 PM Aaron C  wrote:
>
>> Dear HCP experts,
>>
>> The scene file (a structural processing QC scene) I generated in a Linux
>> computer doesn't work in a Windows computer. It seems that the file paths
>> were hard-coded in the scene file. Is there a way to make it more portable?
>> Thank you.
>>
>



Re: [HCP-Users] A question about scene file

2019-06-21 Thread Timothy Coalson
The paths inside a scene file's XML are supposed to be relative to the
location of the scene file.  You are expected to keep the scene file in a
directory near the data it refers to, so that the relative paths don't have
to cross much of your filesystem structure.  You can use wb_command
-scene-file-relocate to move the scene file and regenerate its internal
paths, to put it closer to the data directory.  If you need to move it to a
system that doesn't have the same filesystems mounted the same way, there
is wb_command -zip-scene-file (or the "Zip..." button in the wb_view scene
window), which will generate a self-contained archive with all the
referenced files and the necessary directory layout.

If the QC scene was generated with a script, it may be that the substituted
paths were absolute, breaking the usual convention of the script.  Using
-scene-file-relocate or loading and saving the scene file may convert these
paths to relative.
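
A minimal sketch of both options (paths are hypothetical):

  # move the scene file and regenerate its internal relative paths
  wb_command -scene-file-relocate qc.scene /data/study/subject1/qc.scene
  # or package the scene plus everything it references for another machine
  wb_command -zip-scene-file qc.scene qc_extracted qc_scene.zip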

Tim


On Fri, Jun 21, 2019 at 12:23 PM Aaron C  wrote:

> Dear HCP experts,
>
> The scene file (a structural processing QC scene) I generated in a Linux
> computer doesn't work in a Windows computer. It seems that the file paths
> were hard-coded in the scene file. Is there a way to make it more portable?
> Thank you.
>



Re: [HCP-Users] table for parcellation

2019-06-21 Thread Timothy Coalson
(i) You can use wb_command -cifti-label-export-table on the dlabel file to
get the order of the parcels in a fixed format, though there are extra
lines and numbers in the output text file.  360 parcels is a rather long
table, so you might consider a matrix figure instead and only mention the
highlights.  If you make the figure in wb_view, then you can upload the
scene to BALSA, which will include the data files used to make it, allowing
others to use your data directly instead of looking through a table.
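
A minimal sketch of (i) (the map index "1" and the filenames are
hypothetical; the "extra lines and numbers" are the key and color values
that accompany each parcel name):

  wb_command -cifti-label-export-table parcellation.dlabel.nii 1 label_table.txt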

Someone else will need to answer (ii).

Tim


On Thu, Jun 20, 2019 at 10:32 PM Marta Moreno wrote:

> Dear experts,
>
> After running:
>
> 1.  PreFreeSurfer
> 2.  FreeSurfer
> 3.  PostFreeSurfer
> 4.  fMRIVolume
> 5.  fMRISurface
> 6.  ICA+FIX (MR+FIX)
> 7.  MSMAll (Do MSM Surface Registration)
> 8.  dedrift and resample pipeline
>
> 2 questions:
>
> (i) I used wb_command -cifti-parcellate and wb_command -cifti-correlation
> to create a parcellation.pconn.nii file for each subject. I have loaded
> all files into matlab and now would like to create a table that includes
> all subjects’ correlation values from parcel 264 to all 360 parcels, and
> includes the variable names for all 360 parcels. How can I do this?
>
> (ii) Is the file “RS_fMRI_MR_Atlas_MSMAll_Test_hp0_clean.dtseries.nii”
> the final output after (8) to use for wb_command -cifti-parcellate and
> wb_command -cifti-correlation?
>
> Thanks,
>
> L.
>
>
>



Re: [HCP-Users] problems to download the example data to run HCP Pipelines and NHPPipelines

2019-06-18 Thread Timothy Coalson
Could you clarify what exactly you tried to download?  I don't know of any
non-human data in connectomedb, either.

Tim


On Tue, Jun 18, 2019 at 9:17 AM DE CASTRO Vanessa wrote:

> Good morning, I'm trying to download the example data to run the HCP
> Pipelines, and also the data related to the non-human primates, and there
> is no way, even with the latest version of Aspera Connect. What can I do?
> Thank you very much.
>
> Sincerely,
>
> --
>
> Vanessa DeCastro, PhD
>
> Centre de Recherche Cerveau et Cognition - UMR 5549 - CNRS
> Pavillon Baudot CHU Purpan
> 31052 Toulouse Cedex 03, France
>
>



Re: [HCP-Users] -volume-label-import

2019-06-14 Thread Timothy Coalson
You cannot have multiple label keys with the same name in a single label
map.

There are two very separate things that label volumes can do in cifti
create commands.  One is to set the name of a particular subset of voxels
to group them into a "structure" (structures have a limited set of
available names, and are mostly used to prevent things like smoothing or
gradient from crossing structure boundaries unless desired).  The other is
to act as data inside the file (for dlabel files), for which there is no
limit on what the names can be.

I'm not clear on what you are actually trying to do.  If your volume space
already matches our 2mm template, and you want to use your labels with our
91k grayordinate data, you could just use the
Pipelines/global/templates/91282_Greyordinates/91282_Greyordinates.dscalar.nii
as the template and your volume file as the -volume-all argument.  If you
want to include voxels that our 91k grayordinates don't, then you need to
create a new cifti space, which takes some more steps.
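
For that simple case, a minimal sketch (the label volume filename is
hypothetical):

  wb_command -cifti-create-dense-from-template \
      Pipelines/global/templates/91282_Greyordinates/91282_Greyordinates.dscalar.nii \
      my_labels.dlabel.nii -volume-all my_label_volume.nii.gz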

Tim


On Fri, Jun 14, 2019 at 12:34 PM Sanchez, Juan (NYSPI) <juan.sanc...@nyspi.columbia.edu> wrote:

> Dear HCP community
>
> I am trying to create a labeled volume that will work with
> -cifti-create-dense-from-template
>
> The volume data has over 100 ROIs, and the cifti create function only
> imports the specified labels:
>
>
>  CORTEX_LEFT CORTEX_RIGHT CEREBELLUM ACCUMBENS_LEFT ACCUMBENS_RIGHT
> ALL_GREY_MATTER ALL_WHITE_MATTER AMYGDALA_LEFT AMYGDALA_RIGHT BRAIN_STEM
> CAUDATE_LEFT CAUDATE_RIGHT CEREBELLAR_WHITE_MATTER_LEFT
> CEREBELLAR_WHITE_MATTER_RIGHT CEREBELLUM_LEFT CEREBELLUM_RIGHT
> CEREBRAL_WHITE_MATTER_LEFT CEREBRAL_WHITE_MATTER_RIGHT CORTEX
> DIENCEPHALON_VENTRAL_LEFT DIENCEPHALON_VENTRAL_RIGHT HIPPOCAMPUS_LEFT
> HIPPOCAMPUS_RIGHT INVALID OTHER OTHER_GREY_MATTER OTHER_WHITE_MATTER
> PALLIDUM_LEFT PALLIDUM_RIGHT PUTAMEN_LEFT PUTAMEN_RIGHT THALAMUS_LEFT
> THALAMUS_RIGHT
>
>
> I need to import all of the labeled ROI values from the nii into the
> subcortical cifti. I tried labeling all of the ROIs with OTHER, but a name
> collision in the input names did not allow that to work.
>
>
> Does anyone know how to solve this, or a workaround?
>
> Thanks so much
>
> J
>
>



Re: [HCP-Users] wb_command -convert-matrix4-to-workbench-sparse

2019-06-07 Thread Timothy Coalson
Tractography visualization is somewhat rough around the edges.  What was
the full probtrackx command you used?  Do you have Bingham parameter
volumes for the fiber orientations (mean, stdev, theta, phi, psi, ka, kb),
or only the fiber orientation sample volumes?

Tim


On Fri, Jun 7, 2019 at 7:48 AM DE CASTRO Vanessa wrote:

> Good morning, I would like to transform my data obtained with FSL's
> probtrackx, to visualize it with wb_view. But the instructions to run the
> command are not clear to me. I've tried several options but there is always
> something missing or not recognized (I'm very naive with all this... just
> starting). Could you show a practical example to run it, please?
>
> Thank you very much
>
>
> --
>
> Vanessa DeCastro, PhD
>
> Centre de Recherche Cerveau et Cognition - UMR 5549 - CNRS
> Pavillon Baudot CHU Purpan
> 31052 Toulouse Cedex 03, France
>
>



Re: [HCP-Users] wb_command -cifti-gradient

2019-06-05 Thread Timothy Coalson
"wb_command -gifti-help" is intended to help explain these file formats:

https://www.humanconnectome.org/software/workbench-command/-gifti-help

There are other -*-help options for other formats, and other aspects of
wb_command, like how to read the command usage info:

https://www.humanconnectome.org/software/workbench-command
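
For the original question, a minimal sketch (filenames are hypothetical):

  wb_command -cifti-gradient thickness.32k_fs_LR.dscalar.nii COLUMN \
      thickness_gradient.dscalar.nii \
      -left-surface L.midthickness.32k_fs_LR.surf.gii \
      -right-surface R.midthickness.32k_fs_LR.surf.gii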

Tim


On Tue, Jun 4, 2019 at 10:05 PM Glasser, Matthew  wrote:

> You can use the midthickness surfaces .surf.gii in the same folder.
>
>
>
> Matt.
>
>
>
> *From: *秦键 
> *Date: *Tuesday, June 4, 2019 at 10:02 PM
> *To: *"Glasser, Matthew" 
> *Cc: *"hcp-users@humanconnectome.org" 
> *Subject: *Re: [HCP-Users] wb_command -cifti-gradient
>
>
>
> A 32k fs_LR file with the suffix dscalar.nii:
> thickness.32k_fs_LR.dscalar.nii, for example.
>
>
>
>
>
> 秦键 (Qin Jian)
>
> Email: qinjian...@126.com
>
> On 06/05/2019 10:36, Glasser, Matthew  wrote:
>
> What file did you try to run this on?
>
>
> Matt.
>
>
>
> *From: * on behalf of 秦键 <qinjian...@126.com>
> *Date: *Tuesday, June 4, 2019 at 9:32 PM
> *To: *"hcp-users@humanconnectome.org" 
> *Subject: *[HCP-Users] wb_command -cifti-gradient
>
>
>
>
>
> Dear professors,
>
> When I used wb_command -cifti-gradient on an fs_LR 32k cifti file, I was
> asked to input the left and right surface files. Where can I find the left
> and right surface files, and what is their format?  Can I have an
> example of the use of wb_command -cifti-gradient?
>
> Thank you and best wishes!


Re: [HCP-Users] pdconn analysis problems

2019-06-03 Thread Timothy Coalson
Also, if you literally want spatial gradient magnitude, you can run
-cifti-gradient on the correct dimension of the pdconn (or dpconn) file.

Tim


On Mon, Jun 3, 2019 at 8:14 PM Timothy Coalson  wrote:

> If you just want to look at them first, you can load them into wb_view.
> Depending on whether it is pdconn or dpconn (via a transpose), you will
> either get a dense map when you click a parcel, or a parcellated map when
> you click a vertex/voxel.
>
> Tim
>
>
> On Mon, Jun 3, 2019 at 8:04 PM Joseph Orr  wrote:
>
>> I wanted to contrast the connectivity of the different parcels in order
>> to look for evidence of gradients in networks. I'll try it in matlab.
>> --
>> Joseph M. Orr, Ph.D.
>> Assistant Professor
>> Department of Psychological and Brain Sciences
>> Texas A&M Institute for Neuroscience
>> Texas A&M University
>> College Station, TX
>>
>>
>> On Mon, Jun 3, 2019 at 7:58 PM Glasser, Matthew 
>> wrote:
>>
>>> Are you wanting to view the files?  You could probably translate the
>>> file into a .dscalar.nii using matlab.
>>>
>>>
>>>
>>> Matt.
>>>
>>>
>>>
>>> *From: * on behalf of Joseph Orr
>>> 
>>> *Date: *Monday, June 3, 2019 at 7:55 PM
>>> *To: *HCP Users 
>>> *Subject: *Re: [HCP-Users] pdconn analysis problems
>>>
>>>
>>>
>>> Thanks Tim, running cifti-transpose and separating on COLUMN solved both
>>> issues. Is there a way to separate the different parcels that make up a
>>> pdconn so that I can compare the connectivity maps between parcels?
>>>
>>>
>>>
>>> Thanks,
>>>
>>> Joe
>>>
>>>
>>>
>>> --
>>>
>>> Joseph M. Orr, Ph.D.
>>>
>>> Assistant Professor
>>>
>>> Department of Psychological and Brain Sciences
>>>
>>> Texas A&M Institute for Neuroscience
>>>
>>> Texas A&M University
>>>
>>> College Station, TX
>>>
>>>
>>>
>>>
>>>
>>> On Mon, Jun 3, 2019 at 5:14 PM Timothy Coalson  wrote:
>>>
>>> "Segmentation fault" is a computer term about invalid memory access, not
>>> related to the neuroscience term of segmentation.  Due to using the wrong
>>> variable while copying map names, this command can crash when using ROW
>>> when the rows are longer than the columns.  You can get around it by
>>> transposing and separating with COLUMN instead.  This will be fixed in the
>>> next release.
>>>
>>>
>>>
>>> I don't know if this is also the reason palm was crashing.  We could
>>> make a bleeding edge build available if you want to test it.
>>>
>>>
>>>
>>> Tim
>>>
>>>
>>>
>>>
>>>
>>> On Sat, Jun 1, 2019 at 2:17 PM Joseph Orr  wrote:
>>>
>>> Sure thing, here's a link to the file. Let me know if there are access
>>> problems and I can try another sharing via dropbox.
>>>
>>> L-ctx_R-CB_crosscorr.pdconn.nii
>>> <https://drive.google.com/a/tamu.edu/file/d/1W9n40sZgLNSn8ODIYf9xqBoAZ5b5C71E/view?usp=drive_web>
>>>
>>>
>>> --
>>>
>>> Joseph M. Orr, Ph.D.
>>>
>>> Assistant Professor
>>>
>>> Department of Psychological and Brain Sciences
>>>
>>> Texas A&M Institute for Neuroscience
>>>
>>> Texas A&M University
>>>
>>> College Station, TX
>>>
>>>
>>>
>>>
>>>
>>> On Sat, Jun 1, 2019 at 1:31 PM Glasser, Matthew 
>>> wrote:
>>>
>>> It sounds like there might be both Workbench and PALM bugs here.
>>> Perhaps you could upload the data somewhere (off list if needed), so Tim
>>> and Anderson could take a look?
>>>
>>>
>>>
>>> Matt.
>>>
>>>
>>>

Re: [HCP-Users] pdconn analysis problems

2019-06-03 Thread Timothy Coalson
If you just want to look at them first, you can load them into wb_view.
Depending on whether it is pdconn or dpconn (via a transpose), you will
either get a dense map when you click a parcel, or a parcellated map when
you click a vertex/voxel.

Tim


On Mon, Jun 3, 2019 at 8:04 PM Joseph Orr  wrote:

> I wanted to contrast the connectivity of the different parcels in order to
> look for evidence of gradients in networks. I'll try it in matlab.
> --
> Joseph M. Orr, Ph.D.
> Assistant Professor
> Department of Psychological and Brain Sciences
> Texas A&M Institute for Neuroscience
> Texas A&M University
> College Station, TX
>
>
> On Mon, Jun 3, 2019 at 7:58 PM Glasser, Matthew 
> wrote:
>
>> Are you wanting to view the files?  You could probably translate the file
>> into a .dscalar.nii using matlab.
>>
>>
>>
>> Matt.
>>
>>
>>
>> *From: * on behalf of Joseph Orr <
>> joseph@tamu.edu>
>> *Date: *Monday, June 3, 2019 at 7:55 PM
>> *To: *HCP Users 
>> *Subject: *Re: [HCP-Users] pdconn analysis problems
>>
>>
>>
>> Thanks Tim, running cifti-transpose and separating on COLUMN solved both
>> issues. Is there a way to separate the different parcels that make up a
>> pdconn so that I can compare the connectivity maps between parcels?
>>
>>
>>
>> Thanks,
>>
>> Joe
>>
>>
>>
>> --
>>
>> Joseph M. Orr, Ph.D.
>>
>> Assistant Professor
>>
>> Department of Psychological and Brain Sciences
>>
>> Texas A&M Institute for Neuroscience
>>
>> Texas A&M University
>>
>> College Station, TX
>>
>>
>>
>>
>>
>> On Mon, Jun 3, 2019 at 5:14 PM Timothy Coalson  wrote:
>>
>> "Segmentation fault" is a computer term about invalid memory access, not
>> related to the neuroscience term of segmentation.  Due to using the wrong
>> variable while copying map names, this command can crash when using ROW
>> when the rows are longer than the columns.  You can get around it by
>> transposing and separating with COLUMN instead.  This will be fixed in the
>> next release.
>>
>>
>>
>> I don't know if this is also the reason palm was crashing.  We could make
>> a bleeding edge build available if you want to test it.
>>
>>
>>
>> Tim
>>
>>
>>
>>
>>
>> On Sat, Jun 1, 2019 at 2:17 PM Joseph Orr  wrote:
>>
>> Sure thing, here's a link to the file. Let me know if there are access
>> problems and I can try another sharing via dropbox.
>>
>> L-ctx_R-CB_crosscorr.pdconn.nii
>> <https://drive.google.com/a/tamu.edu/file/d/1W9n40sZgLNSn8ODIYf9xqBoAZ5b5C71E/view?usp=drive_web>
>>
>>
>> --
>>
>> Joseph M. Orr, Ph.D.
>>
>> Assistant Professor
>>
>> Department of Psychological and Brain Sciences
>>
>> Texas A&M Institute for Neuroscience
>>
>> Texas A&M University
>>
>> College Station, TX
>>
>>
>>
>>
>>
>> On Sat, Jun 1, 2019 at 1:31 PM Glasser, Matthew 
>> wrote:
>>
>> It sounds like there might be both Workbench and PALM bugs here.  Perhaps
>> you could upload the data somewhere (off list if needed), so Tim and
>> Anderson could take a look?
>>
>>
>>
>> Matt.
>>
>>
>>

Re: [HCP-Users] pdconn analysis problems

2019-06-03 Thread Timothy Coalson
You don't need matlab for that: -cifti-change-mapping will let you reset a
dimension to scalars.  Depending on the input file, you may also need a
-cifti-transpose to get a dscalar.
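
A minimal sketch (filenames are hypothetical; which direction to reset
depends on how the file is arranged):

  # replace the parcel dimension with generic scalar map names
  wb_command -cifti-change-mapping input.pdconn.nii ROW output.dscalar.nii -scalar
  # if the result comes out transposed, flip the file first (or after)
  wb_command -cifti-transpose input.pdconn.nii transposed.dpconn.nii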

Tim


On Mon, Jun 3, 2019 at 7:58 PM Glasser, Matthew  wrote:

> Are you wanting to view the files?  You could probably translate the file
> into a .dscalar.nii using matlab.
>
>
>
> Matt.
>
>
>
> *From: * on behalf of Joseph Orr <
> joseph@tamu.edu>
> *Date: *Monday, June 3, 2019 at 7:55 PM
> *To: *HCP Users 
> *Subject: *Re: [HCP-Users] pdconn analysis problems
>
>
>
> Thanks Tim, running cifti-transpose and separating on COLUMN solved both
> issues. Is there a way to separate the different parcels that make up a
> pdconn so that I can compare the connectivity maps between parcels?
>
>
>
> Thanks,
>
> Joe
>
>
>
> --
>
> Joseph M. Orr, Ph.D.
>
> Assistant Professor
>
> Department of Psychological and Brain Sciences
>
> Texas A&M Institute for Neuroscience
>
> Texas A&M University
>
> College Station, TX
>
>
>
>
>
> On Mon, Jun 3, 2019 at 5:14 PM Timothy Coalson  wrote:
>
> "Segmentation fault" is a computer term about invalid memory access, not
> related to the neuroscience term of segmentation.  Due to using the wrong
> variable while copying map names, this command can crash when using ROW
> when the rows are longer than the columns.  You can get around it by
> transposing and separating with COLUMN instead.  This will be fixed in the
> next release.
>
>
>
> I don't know if this is also the reason palm was crashing.  We could make
> a bleeding edge build available if you want to test it.
>
>
>
> Tim
>
>
>
>
>
> On Sat, Jun 1, 2019 at 2:17 PM Joseph Orr  wrote:
>
> Sure thing, here's a link to the file. Let me know if there are access
> problems and I can try another sharing via dropbox.
>
> L-ctx_R-CB_crosscorr.pdconn.nii
> <https://drive.google.com/a/tamu.edu/file/d/1W9n40sZgLNSn8ODIYf9xqBoAZ5b5C71E/view?usp=drive_web>
>
>
> --
>
> Joseph M. Orr, Ph.D.
>
> Assistant Professor
>
> Department of Psychological and Brain Sciences
>
> Texas A&M Institute for Neuroscience
>
> Texas A&M University
>
> College Station, TX
>
>
>
>
>
> On Sat, Jun 1, 2019 at 1:31 PM Glasser, Matthew 
> wrote:
>
> It sounds like there might be both Workbench and PALM bugs here.  Perhaps
> you could upload the data somewhere (off list if needed), so Tim and
> Anderson could take a look?
>
>
>
> Matt.
>
>
>
> *From: * on behalf of Joseph Orr <
> joseph@tamu.edu>
> *Date: *Saturday, June 1, 2019 at 1:00 PM
> *To: *HCP Users 
> *Subject: *[HCP-Users] pdconn analysis problems
>
>
>
> I have a pdconn input (cortical ptseries by subcortical dtseries) that I'd
> like to analyze with PALM, but I'm having some trouble. I only found one
> old post related to this, but the only suggestion was to use -transpose
> data flag in palm. When palm tries to read in the pdconn, I get an error
> "Undefined function or variable 'Y'". The command line output is below. I
> tried with the data transposed and not, but I get the same error. I tried
> to separate the pdconn to just the volume, but this yielded a segmentation
> error: ($ wb_command -cifti-separate input.pdconn.nii ROW -volume-all test
>
> /Applications/workbench/bin_macosx64/wb_command: line 14:  1248
> Segmentation fault: 11
>  "$directory"/../macosx64_apps/wb_command.app/Contents/MacOS/wb_command "$@"
> ).
>
>
>
> Are there any additional commands I can run on a pdconn to separate each
> parcel and have a series of dconn files? I'd be interested in doing this in
> order to compare the dense connectivity maps for different parcels.
>
>
>
> Thanks!
>
> Joe
>
>
>
> *Command line output for palm*
>
> Running PALM alpha115 using MATLAB 9.5.0.1067069 (R2018b) Update 4 with
> the following options:
> -i input.pdconn.nii
> -transposedata
> -o palm
> -d design.mat
> -t design.con
> -T
> Found FSL in /usr/local/fsl
> Found FreeSurfer in /Applications/freesurfer
> Found HCP Workbench executable in
> /Applications/workbench/bin_macosx64/wb_command
> Reading input 1/1: input.pdconn.nii
> Error using palm_ready
> (/Users/josephorr/Documents/MATLAB/palm-alpha115/palm_ready.m:141)
> Undefined function or variable 'Y'
>
>

Re: [HCP-Users] MMP parcellation for 7T

2019-06-03 Thread Timothy Coalson
In particular, if you are only tracking the parcels x parcels matrix, using
the 59k surfaces should make effectively no difference.  They mostly exist
to try to capture the higher resolution fMRI data.  Even if you were
capturing per-vertex tractography counts, the uncertainty in the
probabilistic tractography might render this higher resolution useless.

Tim


On Sun, Jun 2, 2019 at 7:38 PM Glasser, Matthew  wrote:

> If you haven’t run the analysis yet, I would encourage you to just use the
> 32k meshes.  No one has shown the 59k meshes to have a clear benefit and
> they take up a lot more space.
>
>
>
> Matt.
>
>
>
> *From: *"Shadi, Kamal" 
> *Date: *Sunday, June 2, 2019 at 7:35 PM
> *To: *"Glasser, Matthew" , "
> hcp-users@humanconnectome.org" 
> *Subject: *Re: [HCP-Users] MMP parcellation for 7T
>
>
>
> I want to run probtrackx using 7T diffusion data and the MMP parcellation,
> and since the 7T data has 59k meshes I would like to have the ROIs in the
> same mesh resolution.
>
>
>
>
>
> --
>
> *From:* Glasser, Matthew 
> *Sent:* Sunday, June 2, 2019 7:42 PM
> *To:* Shadi, Kamal; hcp-users@humanconnectome.org
> *Subject:* Re: [HCP-Users] MMP parcellation for 7T
>
>
>
> I take it you are using the experimental 59k surfaces, rather than the 32k
> surfaces for general use?
>
>
>
> Matt.
>
>
>
> *From: *"Shadi, Kamal" 
> *Date: *Sunday, June 2, 2019 at 6:40 PM
> *To: *"Glasser, Matthew" , "
> hcp-users@humanconnectome.org" 
> *Subject: *Re: [HCP-Users] MMP parcellation for 7T
>
>
>
> Is there a recommended way to up-sample the MMP ROIs to 7T surfaces?
>
>
>
> Thanks,
>
> Kamal
>
>
>
> *From: *"Glasser, Matthew" 
> *Date: *Sunday, June 2, 2019 at 1:35 PM
> *To: *"Shadi, Kamal" , "
> hcp-users@humanconnectome.org" 
> *Subject: *Re: [HCP-Users] MMP parcellation for 7T
>
>
>
> There isn’t a separate 7T file yet.  We do plan to investigate the
> parcellation with 7T fMRI data.
>
>
>
> Matt.
>
>
>
> *From: * on behalf of "Shadi,
> Kamal" 
> *Date: *Sunday, June 2, 2019 at 11:32 AM
> *To: *"hcp-users@humanconnectome.org" 
> *Subject: *[HCP-Users] MMP parcellation for 7T
>
>
>
> Dear HCP Experts,
>
>
>
> Is there a *dlabel.nii* file containing 180 MMP ROIs per hemisphere for
> 7T data release? I can find the file for 3T release in BALSA
> (Q1-Q6_RelatedParcellation210.L.CorticalAreas_dil_Colors.32k_fs_LR.dlabel.nii)
> but I could not find a similar file for 7T.
>
>
>
> Thanks in advance,
>
> Kamal
>
>
>
>
>



Re: [HCP-Users] Convert nifti-ROIs to cifti format (subcortical)

2019-05-29 Thread Timothy Coalson
Yes, it is a similar problem to working in "image space": basically, the
FSL tools work in a space resulting from taking the voxel indices and
multiplying them by the voxel dimensions, so a particular corner voxel
always has coordinates of (0, 0, 0) mm.  Because of the transition from
origin-less ANALYZE volume format to a completely defined coordinate space
in NIfTI, different software has ended up with different quirks as to
whether it properly handles NIfTI coordinates.  I think ANTs does use the
full NIfTI coordinate specification, though internally they flip it to the
ITK convention (which only matters if you need to interpret ANTs transforms
yourself, as the identity transform still does the right thing).
Unfortunately, FSL continued to use the ANALYZE coordinate specification
when doing registration and resampling of NIfTI volumes.

We started connectome workbench much more recently than FSL started, and
although workbench inherited some code from caret5 (which also used the
full NIfTI coordinate specification, but also supported ANALYZE and other
formats, leading to some issues), its volume code was rewritten from
scratch, and only supports NIfTI volumes, so we were able to avoid legacy
issues and use the full NIfTI coordinates from day 1.  FSL didn't have this
luxury, as they started before NIfTI even existed.

As to your situation, when using FSL tools to resample to a different voxel
grid, you have to either carefully generate your volume spaces in advance
to have their corners aligned in NIfTI space the same way that FSL assumes
they align in its ANALYZE-like space, or you have to deal with FSL adding
translations to your voxel coordinates every time you resample between
these voxel spaces (by using affine files containing the opposite
translations).

The different MNI template resolutions we use in the pipelines have these
special corner voxels aligned already in their NIfTI coordinates, so for
the pipelines we can tolerate the quirks of FSL resampling tools (and their
more advanced tools don't seem to have equivalents in other software
packages, and we work with them on improvements), but I can't recommend FSL
resampling tools to the unwary as the first choice for a new task in an
unknown volume space.

Tim


On Wed, May 29, 2019 at 1:16 PM Jaime Caballero  wrote:

> Thank you, Timothy
>
> I did the resampling with ANTs' WarpImageMultiTransform, as I thought the
> software used for this didn't matter. I have now done the same with
> wb_command as you suggested and the resulting ROIs are exactly the same.
>
> I could not make FSL's applywarp work, so I cannot tell for now what would
> come out of that. Anyway, I'm not sure I understand why it would fail. Is
> it something related to rounding coordinates or working directly in image
> space?
>
> Regards,
> Jaime
>
> El mar., 28 may. 2019 a las 20:46, Timothy Coalson ()
> escribió:
>
>> If you need to preserve the mm coordinates of the ROI, I would not trust
>> FSL's resampling with an identity transform to get it right, because that
>> will produce a different shift depending on what coordinates a particular
>> corner voxel is at in each image (as I understand it, FSL's conventions
>> come from originally handling ANALYZE format images).  Instead, wb_command
>> -volume-affine-resample, given an identity matrix (and NOT specifying
>> -flirt) will do the resampling via nifti mm coordinates, with no guesswork.
>>
>> Tim
>>
>>
>> On Tue, May 28, 2019 at 8:08 AM Jaime Caballero 
>> wrote:
>>
>>> Thank you so much!
>>>
>>> It worked fine, and the resulting ROIs are where they are supposed to be.
>>>
>>> Regards,
>>> Jaime
>>>
>>> El lun., 27 may. 2019 22:20, Glasser, Matthew 
>>> escribió:
>>>
>>>> That file does exist in the structural package then.
>>>>
>>>>
>>>>
>>>> Matt.
>>>>
>>>>
>>>>
>>>> *From: *Jaime Caballero 
>>>> *Date: *Monday, May 27, 2019 at 3:20 PM
>>>> *To: *"Glasser, Matthew" 
>>>> *Cc: *"hcp-users@humanconnectome.org" 
>>>> *Subject: *Re: [HCP-Users] Convert nifti-ROIs to cifti format
>>>> (subcortical)
>>>>
>>>>
>>>>
>>>> Sorry for the confusion.
>>>>
>>>>
>>>>
>>>> Locally acquired data wasn't processed using the HCP pipelines, it is
>>>> in a different resolution and it was processed in volume space, no problem
>>>> there.
>>>>
>>>>
>>>>
>>>> I want to use Choi's parcellation with HCP data. All the process I
>>>> described is my workaround to adapt the Choi files to the HCP files.

Re: [HCP-Users] Convert nifti-ROIs to cifti format (subcortical)

2019-05-28 Thread Timothy Coalson
If you need to preserve the mm coordinates of the ROI, I would not trust
FSL's resampling with an identity transform to get it right, because that
will produce a different shift depending on what coordinates a particular
corner voxel is at in each image (as I understand it, FSL's conventions
come from originally handling ANALYZE format images).  Instead, wb_command
-volume-affine-resample, given an identity matrix (and NOT specifying
-flirt) will do the resampling via nifti mm coordinates, with no guesswork.
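
A minimal sketch (the ROI and template filenames are hypothetical;
identity.txt is a plain-text 4x4 identity matrix):

  # identity.txt contains:
  # 1 0 0 0
  # 0 1 0 0
  # 0 0 1 0
  # 0 0 0 1
  wb_command -volume-affine-resample roi.nii.gz identity.txt \
      Atlas_ROIs.2.nii.gz ENCLOSING_VOXEL roi_resampled.nii.gz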

Tim


On Tue, May 28, 2019 at 8:08 AM Jaime Caballero  wrote:

> Thank you so much!
>
> It worked fine, and the resulting ROIs are where they are supposed to be.
>
> Regards,
> Jaime
>
> El lun., 27 may. 2019 22:20, Glasser, Matthew 
> escribió:
>
>> That file does exist in the structural package then.
>>
>>
>>
>> Matt.
>>
>>
>>
>> *From: *Jaime Caballero 
>> *Date: *Monday, May 27, 2019 at 3:20 PM
>> *To: *"Glasser, Matthew" 
>> *Cc: *"hcp-users@humanconnectome.org" 
>> *Subject: *Re: [HCP-Users] Convert nifti-ROIs to cifti format
>> (subcortical)
>>
>>
>>
>> Sorry for the confusion.
>>
>>
>>
>> Locally acquired data wasn't processed using the HCP pipelines, it is in
>> a different resolution and it was processed in volume space, no problem
>> there.
>>
>>
>>
>> I want to use Choi's parcellation with HCP data. All the process I
>> described is my workaround to adapt the Choi files to the HCP files.
>>
>>
>>
>> Jaime
>>
>>
>>
>> El lun., 27 may. 2019 22:12, Glasser, Matthew 
>> escribió:
>>
>> I thought you said you were using a locally collected sample you ran the
>> HCP Pipelines on?
>>
>> Matt.
>>
>>
>>
>> *From: *Jaime Caballero 
>> *Date: *Monday, May 27, 2019 at 3:10 PM
>> *To: *"Glasser, Matthew" 
>> *Subject: *Re: [HCP-Users] Convert nifti-ROIs to cifti format
>> (subcortical)
>>
>>
>>
>> Ok, thank you! I will try that.
>>
>>
>>
>> I assume the reference image
>> ${StudyFolder}/${Subject}/MNINonLinear/ROIs/Atlas_ROIs.2.nii.gz is to be
>> downloaded with the structural package?
>>
>>
>>
>> Regards,
>>
>> Jaime
>>
>>
>>
>> El lun., 27 may. 2019 a las 21:00, Glasser, Matthew ()
>> escribió:
>>
>> To make the .dscalar.nii file, you seem to be on the right track.  If the
>> Choi ROIs are properly in MNI space, hopefully you could simply use
>> applywarp --interp=nn -i <input> -r
>> ${StudyFolder}/${Subject}/MNINonLinear/ROIs/Atlas_ROIs.2.nii.gz -o
>> <output> and then the wb_command -cifti-create-dense-from-template you
>> mention.
>>
>>
>>
>> Matt.
>>
>>
>>
>> *From: *Jaime Caballero 
>> *Date: *Monday, May 27, 2019 at 12:56 PM
>> *To: *"Glasser, Matthew" 
>> *Cc: *"hcp-users@humanconnectome.org" 
>> *Subject: *Re: [HCP-Users] Convert nifti-ROIs to cifti format
>> (subcortical)
>>
>>
>>
>> The objective is to extract functional connectivity between cortical and
>> striatal ROIs, and ALFF and ReHo from both cortical and striatal ROIs.
>>
>>
>>
>> Up to this point I have imported each subject's dtseries.nii file into
>> MATLAB, and also the previously defined ROIs in an HCP-compatible format.
>> For the dtseries I have a 96854x1200 matrix, and for the ROI a 96854x1
>> matrix containing a mask, which I use to extract the time series I'm
>> interested in for further processing.
>>
>>
>>
>> Jaime
>>
>>
>>
>>
>>
>> El lun., 27 may. 2019 a las 19:14, Glasser, Matthew ()
>> escribió:
>>
>> What do you plan to do with the file?
>>
>>
>>
>> Matt.
>>
>>
>>
>> *From: *Jaime Caballero 
>> *Date: *Monday, May 27, 2019 at 12:08 PM
>> *To: *"Glasser, Matthew" 
>> *Cc: *"hcp-users@humanconnectome.org" 
>> *Subject: *Re: [HCP-Users] Convert nifti-ROIs to cifti format
>> (subcortical)
>>
>>
>>
>> A .dscalar.nii output, I think. Basically I want an equivalent of the
>> nifti ROI, but in cifti: for each voxel/vertex, value 1 if inside the ROI,
>> 0 if outside. Would a dlabel file be better for this application?
>>
>>
>>
>> El lun., 27 may. 2019 a las 19:03, Glasser, Matthew ()
>> escribió:
>>
>> Are you wanting a .dlabel.nii output or a .dscalar.nii output?
>>
>>
>>
>> Matt.
>>
>>
>>
>> *From: * on behalf of Jaime
>> Caballero 
>> *Date: *Monday, May 27, 2019 at 10:37 AM
>> *To: *"hcp-users@humanconnectome.org" 
>> *Subject: *[HCP-Users] Convert nifti-ROIs to cifti format (subcortical)
>>
>>
>>
>> Dear experts
>>
>>
>>
>> In my center we are studying cortico-striatal functional connectivity,
>> and cortical/striatal local measures (ALFF, ReHo) on a locally acquired
>> sample. For that we are using Choi's functional parcellation, distributed
>> as a volumetric NIFTI file, in MNI152 space. We want to validate our
>> measures with a subset of the S1200 release (resting state, 3T).
>> Specifically I have used the ICA-FIX cleaned and MSM-all registered files,
>> i.e.:
>>
>>
>> /MNINonLinear/Results/rfMRI_REST1_LR/rfMRI_REST1_LR_Atlas_MSMAll_hp2000_clean.dtseries.nii
>>
>>
>>
>> The thing is I have doubts on the way to convert nifti ROIs to cifti
>> format in this case.
>>
>>
>>
>> My first approach:
>>
>>
>>
>> 1. Load a cifti template file and 

Re: [HCP-Users] Error while running "GenericfMRIVolumeProcessingPipeline.sh"

2019-05-21 Thread Timothy Coalson
This may not be related to your particular problem, but you need to have
FSL 6.0.1 for some of the pipelines (MR FIX in particular).  Using fslhd on
the BothPhases and Mask files should give others on the list some
information to work with.
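
For instance (paths are hypothetical, following the ${WD} in the error
message):

  fslhd ${WD}/BothPhases.nii.gz
  fslhd ${WD}/Mask.nii.gz
  # check that dim1-dim3 and pixdim1-pixdim3 agree between the two files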

Tim


On Tue, May 21, 2019 at 8:18 AM Simon Wein <simon.w...@psychologie.uni-regensburg.de> wrote:

> Dear all,
>
> we are experiencing problems when performing the fMRI volumetric
> processing with "GenericfMRIVolumeProcessingPipeline.sh":
>
> Image Exception : #3 :: Attempted to multiply images of different sizes
> terminate called after throwing an instance of 'std::runtime_error'
>   what():  Attempted to multiply images of different sizes
> /loctmp/CUDA/Pipelines-master_fs6/global/scripts/TopupPreprocessingAll.sh:
> line 269: 10710 Aborted ${FSLDIR}/bin/fslmaths
> ${WD}/BothPhases -abs -add 1 -mas ${WD}/Mask -dilM -dilM -dilM -dilM -dilM
> ${WD}/BothPhases
>
>
> Environment:
> 1. Debian 9.0
> 2. HCP pipeline 4.0.0
> 3. Workbench 1.3.2
> 4. FreeSurfer 6.0
> 5. FSL 6.0
> 6. gradunwarp (HCP) 1.0.3
>
>
> We would be thankful for any help!
>
> Kind regards
> Simon
>



Re: [HCP-Users] Probabilistic tractography for dense connectome

2019-05-17 Thread Timothy Coalson
You can also do that with wb_command -cifti-parcellate using "-method SUM".
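
A minimal sketch for a dense connectome (filenames are hypothetical; each
dimension gets parcellated in turn):

  wb_command -cifti-parcellate dense.dconn.nii parcels.dlabel.nii COLUMN \
      halfway.pdconn.nii -method SUM
  wb_command -cifti-parcellate halfway.pdconn.nii parcels.dlabel.nii ROW \
      parcellated.pconn.nii -method SUM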

Tim


On Fri, May 17, 2019 at 11:04 AM Aaron C  wrote:

> Hi Stam and Matt,
>
> I have one more question about this. I got the dense connectome and would
> like to calculate structural connectivity between the parcels in Matt's
> 360-parcel parcellation scheme. For the connectivity between parcel A and
> B, in the dense connectome matrix, I first located the rows belonging to
> A and the columns belonging to B, and then summed up the values in this
> submatrix. I used this summed value as the connectivity between parcel A
> and B. Would this make sense? Thank you.
>
> Aaron
> --
> *From:* Stamatios Sotiropoulos 
> *Sent:* Wednesday, May 1, 2019 7:05 PM
> *To:* Glasser, Matthew
> *Cc:* Aaron C; hcp-users@humanconnectome.org
> *Subject:* Re: [HCP-Users] Probabilistic tractography for dense connectome
>
> This may be the reason indeed.
>
> The Pretractography script creates both native space and MNI space ROIs.
> You need to make sure you use the MNI ones (I.e. created via the
> MakeTrajectorySpace_MNI.sh script).
>
> Stam
>
> On 1 May 2019, at 23:52, Glasser, Matthew  wrote:
>
> Is that because this is a native space grayordinates instead of an MNI
> space grayordinates and thus the masks are subject specific?
>
> Matt.
>
> From: Aaron C 
> Date: Wednesday, May 1, 2019 at 12:27 PM
> To: Stamatios Sotiropoulos 
> Cc: Matt Glasser , "hcp-users@humanconnectome.org" <
> hcp-users@humanconnectome.org>
> Subject: Re: [HCP-Users] Probabilistic tractography for dense connectome
>
> Hi Stam,
>
> I tried your PreTractography script to generate these files needed
> for probtrackx2, and then used the following command (the command from the
> HCP course for generating dense connectome):
>
> probtrackx2_gpu --samples=../T1w/Diffusion.bedpostX/merged \
> --mask=../T1w/Diffusion.bedpostX/nodif_brain_mask \
> --xfm=xfms/standard2acpc_dc \
> --invxfm=xfms/acpc_dc2standard --seedref=T1w_restore.2.nii.gz \
> --loopcheck --forcedir -c 0.2 --sampvox=2 --randfib=1 \
> --stop=Connectomes/stop --wtstop=Connectomes/wtstop \
> --waypoints=ROIs/Whole_Brain_Trajectory_ROI_2 \
> -x ROIs/Whole_Brain_Trajectory_ROI_2 --omatrix3 \
> --target3=Connectomes/Grayordinates.txt --dir=Connectomes
>
> The command completed without error. I then used the MATLAB to load the
> connectivity matrix:
>
> x=load('fdt_matrix3.dot');
> M=spconvert(x);
>
> However, the dimension of M is only 86392 x 86392, not 91282 x 91282.
>
> So I tried the same probtrackx2 command, but instead used the files from
> the HCP course virtual machine for the input to probtrackx2 (so this time
> I know the input files should be correct), but the dimension is still 86392
> x 86392, not 91282 x 91282.
>
> If possible, would you please give me some hints to find out the missing
> grayordinates in this connectivity matrix? Thank you!
>
> Aaron
> --
> *From:* Stamatios Sotiropoulos 
> *Sent:* Friday, April 26, 2019 10:58 AM
> *To:* Aaron C
> *Cc:* Glasser, Matthew; hcp-users@humanconnectome.org
> *Subject:* Re: [HCP-Users] Probabilistic tractography for dense connectome
>
> Hi Aaron
>
> You need the PreTractography script, available in one of the branches of
> the WU-pipelines.
>
>
> https://github.com/Washington-University/HCPpipelines/tree/diffusion-tractography/DiffusionTractography
>
> Best wishes
> Stam
>
>
>
> On 26 Apr 2019, at 15:50, Aaron C  wrote:
>
> Hi Matt,
>
> Thank you for letting me know. The full command I mentioned is as follows:
>
> probtrackx2 --samples=../T1w/Diffusion.bedpostX/merged \
> --mask=../T1w/Diffusion.bedpostX/nodif_brain_mask \
> --xfm=xfms/standard2acpc_dc \
> --invxfm=xfms/acpc_dc2standard --seedref=T1w_restore.2.nii.gz \
> --loopcheck --forcedir -c 0.2 --sampvox=2 --randfib=1 \
> --stop=Connectome/stop --wtstop=Connectome/wtstop \
> --waypoints=ROIs/Whole_Brain_Trajectory_ROI_2 \
> -x ROIs/Whole_Brain_Trajectory_ROI_2 --omatrix3 \
> --target3=Connectomes/GrayOrdinates.txt --dir=Connectomes
>
> It's in the HCP course practical "Fibre Orientation Models and
> Tractography Analysis" taught by Matteo Bastiani. Thank you.
> --
> *From:* Glasser, Matthew 
> *Sent:* Thursday, April 25, 2019 7:04 PM
> *To:* Aaron C; hcp-users@humanconnectome.org
> *Cc:* Stamatios Sotiropoulos
> *Subject:* Re: [HCP-Users] Probabilistic tractography for dense connectome
>
> Not as far as I am aware, but Stam might know.
>
> Matt.
>
> From:  on behalf of Aaron C <
> aaroncr...@outlook.com>
> Date: Thursday, April 25, 2019 at 9:15 AM
> To: "hcp-users@humanconnectome.org" 
> Subject: [HCP-Users] Probabilistic tractography for dense connectome
>
> Dear HCP experts,
>
> I have a question about the probabilistic tractography command used for
> generating dense connectome (
> https://wustl.app.box.com/s/wna2cu94pqgt8zskg687mj8zlmfj1pq7). Are there
> any shared scripts for generating "pial.L.asc", "white.L.asc",
> 

Re: [HCP-Users] Creating New dscalar files

2019-05-13 Thread Timothy Coalson
Dense files have independent values for every vertex and voxel that is
used.  What you are describing (one value per ROI) is a parcellated file,
such as pscalar.  To use them, first make your ROIs into a dlabel file
(give each ROI a separate integer value, and then use wb_command
-cifti-label-import), then use -cifti-parcellate on any same-resolution
dscalar file to get a pscalar file, and then you can use wb_command
-cifti-convert -from-text to turn a text file of numbers into the pscalar
equivalent.  These files will display as expected on the surface and in the
volume, and can additionally be shown in the chart view.
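
A minimal sketch of that workflow (filenames are hypothetical):

  # 1) import integer ROI values as labels (label_list.txt pairs names with
  #    values and colors; see the -cifti-label-import usage for its format)
  wb_command -cifti-label-import rois.dscalar.nii label_list.txt rois.dlabel.nii
  # 2) parcellate any same-resolution dscalar to get a pscalar in this space
  wb_command -cifti-parcellate any.dscalar.nii rois.dlabel.nii COLUMN \
      template.pscalar.nii
  # 3) fill the pscalar with one value per ROI from a text file
  wb_command -cifti-convert -from-text values.txt template.pscalar.nii \
      new_values.pscalar.nii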

If you actually want a dense file where the values of all the vertices of
an ROI are stored separately as floating point, but all have the same
value, you should probably use matlab to do that.  We don't typically
generate files like this.

Tim


On Mon, May 13, 2019 at 11:42 AM Franzmeier, Nicolai Dr. <nicolai.franzme...@med.uni-muenchen.de> wrote:

> To whom it may concern,
> I have a brief question about creating new dscalar files based on a given
> brain parcellation for visualization with the connectome workbench.
> I want to use the attached dscalar template (covering 200 ROIs in total),
> and replace the existing ROI intensities with a new 200-element numeric
> vector.
> I was going through the online documentation but couldn’t find a solution.
> Could you help?
> Thanks in advance and best wishes!
> Nico
>



Re: [HCP-Users] Trouble with command line wb_view

2019-05-10 Thread Timothy Coalson
Open a new terminal or do "rehash" in the existing one, and see if that
works.  Shells cache which executables are in which directories, so they
don't have to search the filesystem every time they run something, and this
cleverness can be fooled by changing things without telling the shell about
it (apparently the "autorehash" tcsh option is supposed to prevent this
problem, and bash avoids it by scanning the filesystem whenever it doesn't
find a match in its internal lookup).

If that didn't help, do "ls -l /Applications/workbench/bin_macosx64/".  You
should see something like this:

$ ls -l workbench/bin_macosx64/
total 40
-rwxr-xr-x 1 tim velab   312 Aug  5  2015 wb_command
-rwxr-xr-x 1 tim velab   310 Aug  5  2015 wb_import
-rwxr-xr-x 1 tim velab 27622 Aug 28  2018 wb_shortcuts
-rwxr-xr-x 1 tim velab   306 Aug  5  2015 wb_view

If the "x"s or "r"s on the left of my example are "-"s for you, then you
need to add read and execute permissions with "chmod a+rx
/Applications/workbench/bin_macosx64/*".  If the files are missing, then it
wasn't extracted correctly.

Tim


On Fri, May 10, 2019 at 12:25 PM Michael Amlung wrote:

> I cannot get the command line tools to work for wb_view on Mac.
> > wb_view
> wb_view: Command not found.
>
> I have tried everything in the README.txt file, including adding the line
> to the path.
>
> I confirmed that my .cshrc looks like this:
> set PATH = ($PATH /Applications/workbench/bin_macosx64/)
>
> when I print the PATH, I see /Applications/workbench/bin_macosx64
>
> I confirmed I am running /bin/tcsh
>
> But for some reason it cannot locate the wb_view command. The program
> opens fine if I open it from the LaunchPad.
>
> Any suggestions?
>
> Michael Amlung, Ph.D.
>
> Email: michael.aml...@gmail.com
>



Re: [HCP-Users] matlab cifti functions dependency issue?

2019-05-09 Thread Timothy Coalson
Additionally, this may already be fixed in the latest master (as of 3 weeks
ago).

Tim


On Thu, May 9, 2019 at 1:47 PM Timothy Coalson  wrote:

> The quick solution is to add that path to your default matlab path, with
> the added benefit that you can then use ciftiopen and related in your own
> code.  Our setups always have a version of these functions in the default
> matlab path, which is probably why we missed this.
>
> Tim
>
>
> On Thu, May 9, 2019 at 1:08 PM Moataz Assem <
> moataz.as...@mrc-cbu.cam.ac.uk> wrote:
>
>> Hi,
>>
>>
>>
>> When running MSMAllPipelineBatch, it ultimately calls the matlab function
>> ComputeVN (in my case, running the matlab interpreted version) which then
>> crashes because it doesn’t recognize the “ciftiopen” function. Obviously it
>> can’t find the directory where the function is. I have already pointed
>> $MATLAB_GIFTI_LIB in SetUpHCPPipeline.sh to
>> …./HCPpipelines-4.0.0/global/matlab  but I suspect this isn’t used properly.
>>
>>
>>
>> The MSMAllPipelineBatch doesn’t ask for any related directory inputs.
>> When computeVN is called by SingleSubjectConcat.sh, there is no addpath
>> pointing to the cifti functions. So am I missing something and the
>> directory does get pointed to somewhere earlier?
>>
>>
>>
>> Thanks
>>
>>
>>
>> Moataz
>>
>



Re: [HCP-Users] matlab cifti functions dependency issue?

2019-05-09 Thread Timothy Coalson
The quick solution is to add that path to your default matlab path, with
the added benefit that you can then use ciftiopen and related in your own
code.  Our setups always have a version of these functions in the default
matlab path, which is probably why we missed this.
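
One way to do that from the shell, a sketch (the install location
/opt/HCPpipelines is an assumption, and MATLAB also needs the gifti library
on its path):

  export MATLABPATH=/opt/HCPpipelines/global/matlab:$MATLABPATH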

Tim


On Thu, May 9, 2019 at 1:08 PM Moataz Assem wrote:

> Hi,
>
>
>
> When running MSMAllPipelineBatch, it ultimately calls the matlab function
> ComputeVN (in my case, running the matlab interpreted version) which then
> crashes because it doesn’t recognize the “ciftiopen” function. Obviously it
> can’t find the directory where the function is. I have already pointed
> $MATLAB_GIFTI_LIB in SetUpHCPPipeline.sh to
> …./HCPpipelines-4.0.0/global/matlab  but I suspect this isn’t used properly.
>
>
>
> The MSMAllPipelineBatch doesn’t ask for any related directory inputs. When
> computeVN is called by SingleSubjectConcat.sh, there is no addpath pointing
> to the cifti functions. So am I missing something and the directory does
> get pointed to somewhere earlier?
>
>
>
> Thanks
>
>
>
> Moataz
>



Re: [HCP-Users] Reporting dense analysis results

2019-04-25 Thread Timothy Coalson
For just finding the overlap of some (positive-only) map with the parcels,
the script would likely be a lot simpler if you used -cifti-parcellate with
the "-method SUM" option (when doing so, I would also recommend using
vertex areas, so that the resulting numbers are surface-area integrals
rather than based on number of vertices).  You can then use -cifti-stats
SUM to get the total, and divide by that in -cifti-math to get percentages.
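
A minimal sketch of that sequence (filenames are hypothetical; the
midthickness surfaces supply the vertex areas):

  wb_command -cifti-parcellate map.dscalar.nii parcels.dlabel.nii COLUMN \
      sums.pscalar.nii -method SUM -spatial-weights \
      -left-area-surf L.midthickness.surf.gii \
      -right-area-surf R.midthickness.surf.gii
  wb_command -cifti-stats sums.pscalar.nii -reduce SUM
  # divide by the printed total in -cifti-math, e.g. if it was 1234.5:
  wb_command -cifti-math '100 * x / 1234.5' percent.pscalar.nii \
      -var x sums.pscalar.nii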

Sharing the data files of the results means that to some extent, tables may
not be as necessary.  I don't have a strong opinion here.  Personally, I
like figures, but I haven't done/used meta-analysis.

Tim


On Thu, Apr 25, 2019 at 8:50 AM Stevens, Michael <michael.stev...@hhchealth.org> wrote:

> Hi folks,
>
>
>
> Yesterday’s question/replies on reporting tables of pscalar results
> prompted us to ask about a related question – I’m wondering what HCP folks
> recommend in terms of the format of tabulating/reporting straightforward
> “activation results” for DENSE data?  I couldn’t find a prior listserv post
> that exactly addressed this question, nor did a couple passes through
> recently published literature using HCP methodology turn up a good example
> to follow.  Could be I’m just missing stuff…
>
>
>
> We’re finishing up analyses on a somewhat conceptually novel analysis that
> we think might be received at peer review better if we report the dense
> results.  So we sorta envision reporting a table of clusters/cluster peaks
> where we refer to the 2017 parcellation paper for annotations, e.g.,
> “Cluster 1 – Left IFSp (72%), Left IFJa (26%), Left IFSa (2%)”.  To get
> there, I’m picturing a do-able, yet somewhat awkward combination of cluster
> finding calls, label file references, ROI definitions, finding
> peaks/center-of-mass, and then a whole a bunch of –cifti-math operations to
> determine overlap of clusters vs. parcels… The number of steps/operations
> that would go into this is enough that I’m just brought up short thinking,
> “Wait, am I possibly missing something…”
>
>
>
> Before I start going down this path in coding something like this up, I
> thought I’d check two things:
>
>
>
> A) Is there a different conceptual approach altogether that you’d
> recommend considering for showcasing dense analysis results?  Our goal
> ultimately is to simply reinforce our results are fairly compatible with
> the demarcations of the 360-parcel atlas to remove a potential reviewer
> criticism (this analysis is some weird stuff… using spontaneous
> fluctuations of electrodermal signals as event-onsets for fMRI timeseries
> analyses… amazingly, it seemed to work, with pretty interesting results
> that mirror our connectivity analyses on the same data).  But if HCP has an
> entirely different approach to tabulating/summarizing dense results, we’d
> welcome being brought up-to-speed.
>
>
>
> B) The lazy part of me wonders… Has someone already coded up workbench
> function call or even a script for the various wb_commands needed that
> might already do this sort of thing with dense data?  Again, this seems so
> meat-and-potatoes for fMRI that we don’t want to re-invent the wheel here.
>
>
>
> Thanks,
>
> Mike
>
>
>
> *This e-mail message, including any attachments, is for the sole use of
> the intended recipient(s) and may contain confidential and privileged
> information. Any unauthorized review, use, disclosure, or distribution is
> prohibited. If you are not the intended recipient, or an employee or agent
> responsible for delivering the message to the intended recipient, please
> contact the sender by reply e-mail and destroy all copies of the original
> message, including any attachments. *
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] "activation" tables for reporting pscalar results

2019-04-24 Thread Timothy Coalson
We recommend sharing the results as data files (as mentioned, this is the
intent of BALSA), even if you choose to report MNI coordinates in the
text.  Something to keep in mind is that group average surfaces do not
behave like group average volume data: the surface gets smoothed out
wherever folding patterns aren't fully aligned, resulting in a surface that
does not approach gyral crowns or sulcal fundi (most notably with
functional alignment such as MSMAll - freesurfer-aligned surfaces will
average to something with more folding preserved, at the cost of functional
locality, but there are still locations with high variability in folding
patterns across subjects that will still get smoothed out on a group
average surface).  See supplementary material, figure S1, and figure S9
panel B2, from our paper on the effects of volume-based methods:

https://www.ncbi.nlm.nih.gov/pubmed/29925602

If meta-analysis of this sort is only intended to give a very rough idea of
location, even this may not be a deal breaker.  You can use wb_command
-surface-coordinates-to-metric to get the coordinates as data, use
-cifti-create-dense-from-template to convert that to cifti, and then use
-cifti-parcellate on that to get center of gravity coordinates of the
vertices used.  Note that these center of gravity coordinates could be a
distance away from the surface, due to curvature.
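
A sketch of that sequence, with placeholder file names (any existing
dscalar on the same mesh can serve as the template):

  wb_command -surface-coordinates-to-metric L.midthickness.32k_fs_LR.surf.gii L.coords.func.gii
  wb_command -surface-coordinates-to-metric R.midthickness.32k_fs_LR.surf.gii R.coords.func.gii
  wb_command -cifti-create-dense-from-template template.dscalar.nii coords.dscalar.nii \
    -metric CORTEX_LEFT L.coords.func.gii -metric CORTEX_RIGHT R.coords.func.gii
  wb_command -cifti-parcellate coords.dscalar.nii parcels.dlabel.nii COLUMN cog.pscalar.nii

The three maps of cog.pscalar.nii are then the mean x, y, and z coordinates
of each parcel's vertices.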

Tim


On Wed, Apr 24, 2019 at 11:06 AM Joseph Orr  wrote:

> True - these kinds of tools generally assume certain degrees of smoothing,
> which isn't the case with surface-based. And activation based meta-analysis
> will apply a kernel that will likely extend outside the brain for a surface
> activation that is not within a sulcus. I'd be curious to hear what those
> more familiar with meta-analytic methods think about how surface-based
> results can be incorporated with volumetric results.
> --
> Joseph M. Orr, Ph.D.
> Assistant Professor
> Department of Psychological and Brain Sciences
> Texas A&M Institute for Neuroscience
> Texas A&M University
> College Station, TX
>
>
> On Wed, Apr 24, 2019 at 11:00 AM Harms, Michael  wrote:
>
>>
>>
>> Well, that raises the question if surface-based results should just be
>> automatically “lumped in” with volume-based results by tools such as
>> neurosynth to begin with…
>>
>>
>>
>> --
>>
>> Michael Harms, Ph.D.
>>
>> ---
>>
>> Associate Professor of Psychiatry
>>
>> Washington University School of Medicine
>>
>> Department of Psychiatry, Box 8134
>>
>> 660 South Euclid Ave.  Tel: 314-747-6173
>>
>> St. Louis, MO  63110  Email: mha...@wustl.edu
>>
>>
>>
>> *From: *Joseph Orr 
>> *Date: *Wednesday, April 24, 2019 at 10:51 AM
>> *To: *"Harms, Michael" 
>> *Cc: *HCP Users 
>> *Subject: *Re: [HCP-Users] "activation" tables for reporting pscalar
>> results
>>
>>
>>
>> Well I am planning on doing that, but that doesn't necessarily help with
>> automated meta-analytic tools like neurosynth that mine for tables.
>>
>> --
>>
>> Joseph M. Orr, Ph.D.
>>
>> Assistant Professor
>>
>> Department of Psychological and Brain Sciences
>>
>> Texas A&M Institute for Neuroscience
>>
>> Texas A&M University
>>
>> College Station, TX
>>
>>
>>
>>
>>
>> On Wed, Apr 24, 2019 at 10:36 AM Harms, Michael  wrote:
>>
>>
>>
>> Why not simply report the parcel name and its values?  And consider
>> putting the scene on BALSA, so that others can easily access the data.
>>
>>
>>
>> Cheers,
>>
>> -MH
>>
>>
>>
>> --
>>
>> Michael Harms, Ph.D.
>>
>> ---
>>
>> Associate Professor of Psychiatry
>>
>> Washington University School of Medicine
>>
>> Department of Psychiatry, Box 8134
>>
>> 660 South Euclid Ave.  Tel: 314-747-6173
>>
>> St. Louis, MO  63110  Email: mha...@wustl.edu
>>
>>
>>
>> *From: * on behalf of Joseph Orr <
>> joseph@tamu.edu>
>> *Date: *Wednesday, April 24, 2019 at 10:06 AM
>> *To: *HCP Users 
>> *Subject: *[HCP-Users] "activation" tables for reporting pscalar results
>>
>>
>>
>> I am trying to determine the best approach for producing tables of
>> pscalar results. I haven't seen any papers reporting pscalar results that
>> have tables, but I anticipate reviewers wanting to see these, and tables
>> are critical for meta-analyses. Since there aren't peaks, I was thinking of
>> calculating the center of mass after converting the significant parcels to
>> a volume. Has anyone done this already for the Multi-Modal Parcellation? Or
>> is there a reason that I'm not thinking of that doing this is not ideal or
>> even not valid?
>>
>>
>>
>> Thanks,
>>
>> Joe
>>
>> --
>>
>> Joseph M. Orr, Ph.D.
>>
>> Assistant Professor
>>
>> Department of Psychological and Brain Sciences
>>
>> Texas A&M Institute for Neuroscience
>>
>> Texas A&M University
>>
>> College Station, TX
>>
>> ___
>> HCP-Users mailing list
>> 

Re: [HCP-Users] hp2000 filter not applied to hp2000_clean.nii.gz volume data for some (one?) subjects?

2019-04-23 Thread Timothy Coalson
Correction, the issue to follow is #107:

https://github.com/Washington-University/HCPpipelines/issues/107

Tim


On Tue, Apr 23, 2019 at 4:35 PM Harms, Michael  wrote:

>
>
> For users that want to follow this, please see:
>
> https://github.com/Washington-University/HCPpipelines/issues/108
>
>
>
> It has something to do with the fact that we needed to apply manual
> reclassification of the FIX output in that particular subject/run.
>
>
>
> Cheers,
>
> -MH
>
>
>
> --
>
> Michael Harms, Ph.D.
>
> ---
>
> Associate Professor of Psychiatry
>
> Washington University School of Medicine
>
> Department of Psychiatry, Box 8134
>
> 660 South Euclid Ave.  Tel: 314-747-6173
>
> St. Louis, MO  63110  Email: mha...@wustl.edu
>
>
>
> *From: * on behalf of Keith
> Jamison 
> *Date: *Tuesday, April 23, 2019 at 3:59 PM
> *To: *HCP Users 
> *Subject: *[HCP-Users] hp2000 filter not applied to hp2000_clean.nii.gz
> volume data for some (one?) subjects?
>
>
>
> For subject 204218, both REST1_LR and REST1_RL, I noticed a linear trend
> in the *_hp2000_clean.nii.gz NIFTI time series, but the
> hp2000_clean.dtseries.nii CIFTI files do not have this trend. See attached
> figures showing this issue for both REST1_LR and REST1_RL for 204218. The
> overall mean time series has a negative trend for NIFTI, but in the voxel
> time series on the left you can see that some have positive trend and some
> have negative. To test, I did run fslmaths-based filtering on
> hp2000_clean.nii.gz and I no longer see any linear trend.
>
> I tried one scan in one additional subject, 102311 REST1_LR, and did not
> see this linear trend in either NIFTI or CIFTI (also attached).
>
> Note: I did remove the overall mean for each voxel timecourse before
> plotting, and for the NIFTI I'm only showing gray matter voxels, as
> determined by downsampling aparc+aseg.nii.gz and excluding labels for
> WM,CSF,ventricles, and a few misc. I also tried looking at all non-zero
> voxels, as well as only those marked in
> RibbonVolumeToSurfaceMapping/goodvoxels.nii.gz, but the issue of linear
> trends is the same.
>
> Any idea what might be going on with this subject? I haven't tried this in
> anyone other than 204218 (bad) and 102311 (good).
>
> -Keith
>
>
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>
>
> --
>
> The materials in this message are private and may contain Protected
> Healthcare Information or other information of a sensitive nature. If you
> are not the intended recipient, be advised that any unauthorized use,
> disclosure, copying or the taking of any action in reliance on the contents
> of this information is strictly prohibited. If you have received this email
> in error, please immediately notify the sender via telephone or return mail.
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Parcellation

2019-04-23 Thread Timothy Coalson
We recommend mapping the individual cortical data to surfaces before doing
anything else with the data.  If you have fieldmaps, and high-res T1w and
T2w, you may be able to use the HCP pipelines to do this:

https://github.com/Washington-University/HCPpipelines

If you don't have these scans, another possibility is ciftify:

https://github.com/edickie/ciftify

Either of these will output your data in cifti format, and you can then use
wb_command -cifti-parcellate on it.
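
For example (a minimal sketch with placeholder names), once you have a
dtseries and a dlabel parcellation on the same mesh:

  wb_command -cifti-parcellate sub01.dtseries.nii parcels.dlabel.nii COLUMN sub01.ptseries.nii

which averages the timeseries across the vertices (and voxels) of each
parcel, giving one timecourse per parcel.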

You can accurately map the parcellation into an individual's volume space
using their surfaces, but that uses the same information needed to map the
subject's data to the surface anyway.  However, it is generally not
possible to accurately map the parcellation to group-averaged volume data,
because existing volume registration does not achieve the same alignment
precision across subjects that surface registration does, see our recent
paper:

https://www.ncbi.nlm.nih.gov/pubmed/29925602

Tim


On Tue, Apr 23, 2019 at 12:27 PM Briend, Frederic  wrote:

> Dear hcp-users members,
>
>
> I would like to use a parcellation (Glasser's atlas 360) to reduce the
> dimensionality of my images (resting-state images in .nii) from voxels to
> parcels (averaging my timeseries across each parcel).
>
> How is it possible to do that, as advised by Matthew here,
> with the Connectome Workbench’s wb_command -cifti-parcellate?
>
>
> Thanks a lot.
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Diffusion Preprocessing Failure

2019-04-22 Thread Timothy Coalson
I haven't used eddy, but since it looks like an output file, my first
thought is permissions - does that folder exist, and can the user you are
running the eddy job as write files to it?  Does a file with that name
exist, and not have write permissions?
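
A quick sanity check from the shell (with $EDDYDIR standing in for your
.../Diffusion/eddy path):

  ls -ld "$EDDYDIR"    # does the folder exist, and who can write to it?
  touch "$EDDYDIR/permtest" && rm "$EDDYDIR/permtest"    # can this user create files there?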

Tim


On Mon, Apr 22, 2019 at 11:39 AM Timothy Hendrickson 
wrote:

> Hello,
>
> I am attempting to run the diffusion preprocessing with HCP version 3.27.0
> and the GPU-enabled eddy; however, I receive an error that
> "eddy_unwarped_Neg" could not be opened, see below:
>
>  START: eddy_postproc
>
> JAC resampling has been used. Eddy Output is now combined.
>
> Image Exception : #22 :: ERROR: Could not open image
> /output_dir/sub-9276/ses-52
> 742/Diffusion/eddy/eddy_unwarped_Neg
>
> I'm looking for an intuition as to what could have happened.
>
> Best,
>
> -Tim
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] wb_command -metric-tfce

2019-04-22 Thread Timothy Coalson
For the most part, it exists because I was testing a new method for
computing the TFCE transform itself (it is an integral containing cluster
size which other utilities generally approximate by using many different
thresholds).  We do not currently do statistical testing within wb_command.
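
For reference, the transform in question (Smith and Nichols, 2009) is, for
each vertex or voxel p:

  TFCE(p) = integral from h = 0 to h_max of e(h)^E * h^H dh

where e(h) is the extent of the cluster containing p at threshold h, with
E = 0.5 and H = 2 as the usual defaults; most implementations approximate
this integral with a fixed set of thresholds.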

Tim


On Sun, Apr 21, 2019 at 7:00 AM Reza Rajimehr  wrote:

> Thanks Michael. We will use PALM. But what is the application of
> wb_command -metric-tfce then?
>
>
> On Sun, Apr 21, 2019 at 2:38 AM Harms, Michael  wrote:
>
>>
>>
>> Hi,
>>
>> We suggest you use PALM, since you need to use permutation to determine
>> the distribution of the TFCE metric.
>>
>>
>>
>> Cheers,
>>
>> -MH
>>
>>
>>
>> --
>>
>> Michael Harms, Ph.D.
>>
>> ---
>>
>> Associate Professor of Psychiatry
>>
>> Washington University School of Medicine
>>
>> Department of Psychiatry, Box 8134
>>
>> 660 South Euclid Ave.  Tel: 314-747-6173
>>
>> St. Louis, MO  63110  Email: mha...@wustl.edu
>>
>>
>>
>> *From: * on behalf of Reza
>> Rajimehr 
>> *Date: *Saturday, April 20, 2019 at 7:18 PM
>> *To: *hcp-users 
>> *Subject: *[HCP-Users] wb_command -metric-tfce
>>
>>
>>
>> Hi,
>>
>>
>>
>> We have curvature data from two groups of subjects in a common anatomical
>> space (MSMAll). We performed a univariate comparison between the two groups
>> using a t-test. We now have a map, which shows the vertex-wise curvature
>> difference between the two groups (the curvature difference is shown only
>> for vertices which have a significant difference). The next step is to do
>> cluster-wise correction using TFCE method. It looks like this command can
>> do what we want:
>>
>>
>>
>> https://www.humanconnectome.org/software/workbench-command/-metric-tfce
>>
>>
>>
>> However, we couldn’t find any example command, and its usage is a bit
>> unclear for us. For example, how should we specify the two groups?
>>
>>
>>
>> Any help would be appreciated.
>>
>>
>>
>> Note: Our analysis here is somewhat similar to the analysis in Figure 5B
>> in Van Essen et al. 2012 paper:
>>
>>
>>
>> https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3432236/pdf/bhr291.pdf
>>
>>
>>
>> Best,
>>
>> Reza
>>
>> ___
>> HCP-Users mailing list
>> HCP-Users@humanconnectome.org
>> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>>
>>
>> --
>>
>> The materials in this message are private and may contain Protected
>> Healthcare Information or other information of a sensitive nature. If you
>> are not the intended recipient, be advised that any unauthorized use,
>> disclosure, copying or the taking of any action in reliance on the contents
>> of this information is strictly prohibited. If you have received this email
>> in error, please immediately notify the sender via telephone or return mail.
>>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] DeDriftAndResamplePipeline error

2019-04-18 Thread Timothy Coalson
Make sure you have the whole pipelines repo for 4.0.0, do not try to mix
and match folders from different versions, and make sure your setup script
is pointed to the 4.0.0 version when running things from 4.0.0.  The
log_Warn function is defined inside global/scripts, and it should get
sourced automatically based on HCPPIPEDIR, so make sure that is set
correctly (pointed to the 4.0.0 version).

Tim


On Thu, Apr 18, 2019 at 1:39 PM Marta Moreno 
wrote:

> Thanks for your response. And sorry to bother again with this issue but I
> am still getting the following error: ReApplyFixMultiRunPipeline.sh: line
> 592: log_Warn: command not found
>
> Please find log files attached.
>
> Pipelines for MR+FIX, MSMAll and DeDriftAndResample are from version 4.0.0.
> PreFreeSurfer, FreeSurfer, PostFreeSurfer, fMRIVolume, fMRISurface are
> from version  3_22
> Since MR+FIX and MSMAll ran successfully, why should it be a version
> issue in ReApplyFixMultiRunPipeline.sh?
>
> I want to be sure this is a version issue because I have already run 
> PreFreeSurfer,
> FreeSurfer, PostFreeSurfer, fMRIVolume, fMRISurface version  3_22 on a
> sample of 30 patients pre/post tx.
>
> Thanks a lot for your help and patience.
>
> Leah.
>
>
>
> On Apr 15, 2019, at 9:39 PM, Timothy Coalson  wrote:
>
> I would also suggest changing your log level to INFO in wb_view's
> preferences (the wb_command option does not store the logging level change
> to preferences).  We should probably change the default level, or change
> the level of that volume coloring message.
>
> Tim
>
>
> On Mon, Apr 15, 2019 at 8:34 PM Timothy Coalson  wrote:
>
>> I have pushed a similar edit to reapply MR fix; please update to the
>> latest master.
>>
>> Tim
>>
>>
>> On Mon, Apr 15, 2019 at 8:27 PM Timothy Coalson  wrote:
>>
>>> They weren't instructions; I pushed an edit, and it was a different
>>> script.
>>>
>>> Tim
>>>
>>>
>>> On Mon, Apr 15, 2019 at 8:08 PM Glasser, Matthew 
>>> wrote:
>>>
>>>> Here is the error:
>>>>
>>>> readlink: illegal option -- f
>>>> usage: readlink [-n] [file ...]
>>>>
>>>> I believe Tim already gave you instructions for this.
>>>>
>>>> Also, the log_Warn line is again concerning as to whether you followed
>>>> the installation instructions and all version 4.0.0 files here.
>>>>
>>>> Matt.
>>>>
>>>> From: Marta Moreno 
>>>> Date: Monday, April 15, 2019 at 8:53 AM
>>>> To: Matt Glasser 
>>>> Cc: HCP Users , Timothy Coalson <
>>>> tsc...@mst.edu>, "Brown, Tim" 
>>>> Subject: Re: [HCP-Users] DeDriftAndResamplePipeline error
>>>>
>>>> I had to re-run DeDriftAndResamplePipeline twice because it was
>>>> searching for settings.sh in the wrong place, and now I am getting the
>>>> following error message:
>>>> ReApplyFixMultiRunPipeline.sh: line 586: log_Warn: command not found
>>>>
>>>> I am attaching log files.
>>>>
>>>> Does the folder containing fix1.067 need to include all the ICAFIX
>>>> files?
>>>>
>>>> Thanks a lot!
>>>>
>>>> Leah.
>>>>
>>>>
>>>> On Apr 15, 2019, at 12:19 AM, Marta Moreno 
>>>> wrote:
>>>>
>>>> It seems to be working now. Thanks a lot!
>>>>
>>>> Leah.
>>>>
>>>> On Apr 15, 2019, at 12:04 AM, Glasser, Matthew 
>>>> wrote:
>>>>
>>>> If you ran MR+FIX, you need to set these appropriately
>>>>
>>>> MRFixConcatName="NONE"
>>>> MRFixNames="NONE"
>>>>
>>>> And not set
>>>>
>>>> fixNames="RS_fMRI_1 RS_fMRI_2" #Space delimited list or NONE
>>>>
>>>>
>>>> https://github.com/Washington-University/HCPpipelines/blob/master/Examples/Scripts/DeDriftAndResamplePipelineBatch.sh
>>>>
>>>> Also it looks like line 124 needs an “s” on the end of the flag name to
>>>> read --multirun-fix-concat-names=${MRFixConcatName}
>>>>
>>>> Matt.
>>>>
>>>> From: Marta Moreno 
>>>> Date: Sunday, April 14, 2019 at 10:56 PM
>>>> To: Matt Glasser 
>>>> Cc: HCP Users 
>>>> Subject: Re: [HCP-Users] DeDriftAndResamplePipeline error
>>>>
>>>> Thanks a lot for your

Re: [HCP-Users] Assigning the results of -cifti-correlation to appropriate resting state networks

2019-04-17 Thread Timothy Coalson
I don't know which parcels are assigned to each network, but if you need to
know the current order of the parcels, wb_command -file-information will
show that.

If you have a dlabel file with the networks as labels, you can put that
through -cifti-parcellate to get each parcel labeled with its majority
network, and if you want a dlabel file containing the reordering you used
for the parcels, you can use -cifti-parcel-mapping-to-label.
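
For example (placeholder names; see the -cifti-reorder help for the exact
format of the reorder list, which I believe is one parcel index per line):

  # print the parcels in the order they occur in the file
  wb_command -file-information sub01.ptseries.nii
  # apply the same reordering to both dimensions of a parcellated connectome
  wb_command -cifti-reorder conn.pconn.nii ROW order.txt conn_tmp.pconn.nii
  wb_command -cifti-reorder conn_tmp.pconn.nii COLUMN order.txt conn_reordered.pconn.nii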

Tim


On Wed, Apr 17, 2019 at 10:32 AM Jayasekera, Dinal <
dinal.jayasek...@wustl.edu> wrote:

> Dear all,
>
>
> I have run a wb_command, specifically -cifti-parcellate, -cifti-reorder
> and -cifti-correlation, on some functional connectivity data to extract and
> plot an adjacency matrix on MATLAB. The matrix that is generated is a
> 360x360 (360 parcels x 360 parcels) matrix and I'm trying to figure
> out how to identify the parcels of the matrix that represent each resting
> state network (at least 5 RSNs for now).
>
>
> Does anyone have a file/GUI that can enable me to automatically identify
> each RSN's parcels on MATLAB?
>
>
> Kind regards,
> *Dinal Jayasekera *
>
> PhD Candidate | InSITE Fellow 
> Ammar Hawasli Lab 
> Department of Biomedical Engineering
>  | Washington University in St.
> Louis 
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] question about making RSFC matrices available

2019-04-16 Thread Timothy Coalson
We have done so partly because we haven't needed to put many data files on
github (it would not be simple for the pipelines to fetch
data-use-terms-protected files from BALSA, for instance).  He first asked
whether it was okay to put them on github, and the real answer seems to be
"ask your own institution, not us".

Tim


On Tue, Apr 16, 2019 at 8:00 PM Glasser, Matthew  wrote:

> Hi Tim,
>
> I guess we have treated such derivative results similarly and put them
> behind the HCP data use terms.  One easy solution would be to upload the
> results to the BALSA database and add the HCP data use terms.
>
> Matt.
>
> From:  on behalf of Timothy
> Coalson 
> Date: Tuesday, April 16, 2019 at 5:11 PM
> To: "Burgess, Gregory" 
> Cc: "Curtiss, Sandy" , 李婧玮 ,
> Thomas Yeo , "hcp-users@humanconnectome.org" <
> hcp-users@humanconnectome.org>
> Subject: Re: [HCP-Users] question about making RSFC matrices available
>
> However, he is not sharing the HCP data files themselves, but only results
> obtained from using them.  The data use terms only state that the *original
> data* must be distributed under the same terms.  Derived data appears to
> only be covered by "all relevant rules and regulations imposed by my
> institution", and that paragraph looks like it mainly exists to remind
> users that the data is not considered de-identified.
>
> Tim
>
>
> On Tue, Apr 16, 2019 at 4:56 PM Burgess, Gregory 
> wrote:
>
>> I believe that the Open Access Data Use terms (
>> https://www.humanconnectome.org/study/hcp-young-adult/document/wu-minn-hcp-consortium-open-access-data-use-terms)
>> require that anyone receiving the data must have first agreed to the Open
>> Access Data Use Terms. If my understanding is correct, that would mean that
>> you would need to verify that the recipient has accepted the terms before
>> sharing with them. I doubt that would be easy to manage on a public site.
>>
>> --Greg
>>
>> 
>> Greg Burgess, Ph.D.
>> Senior Scientist, Human Connectome Project
>> Washington University School of Medicine
>> Department of Psychiatry
>> Phone: 314-362-7864
>> Email: gburg...@wustl.edu 
>>
>> On Apr 16, 2019, at 1:24 PM, Thomas Yeo  wrote:
>>
>> Hi,
>>
>> Maybe this question has already been answered before, but we have been
>> working on the HCP data and have computed our own derivatives, e.g., FC
>> matrices of individual subjects.
>>
>> Is it ok to share these matrices with accompanying HCP subject IDs on our
>> personal github/website? If not, how do you suggest we can share these
>> derivatives?
>>
>> Thanks,
>> Thomas
>>
>> ___
>> HCP-Users mailing list
>> HCP-Users@humanconnectome.org
>> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>>
>>
>>
>> --
>>
>> The materials in this message are private and may contain Protected
>> Healthcare Information or other information of a sensitive nature. If you
>> are not the intended recipient, be advised that any unauthorized use,
>> disclosure, copying or the taking of any action in reliance on the contents
>> of this information is strictly prohibited. If you have received this email
>> in error, please immediately notify the sender via telephone or return mail.
>>
>> ___
>> HCP-Users mailing list
>> HCP-Users@humanconnectome.org
>> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>
>
> --
>
> The materials in this message are private and may contain Protected
> Healthcare Information or other information of a sensitive nature. If you
> are not the intended recipient, be advised that any unauthorized use,
> disclosure, copying or the taking of any action in reliance on the contents
> of this information is strictly prohibited. If you have received this email
> in error, please immediately notify the sender via telephone or return mail.
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] question about making RSFC matrices available

2019-04-16 Thread Timothy Coalson
However, he is not sharing the HCP data files themselves, but only results
obtained from using them.  The data use terms only state that the *original
data* must be distributed under the same terms.  Derived data appears to
only be covered by "all relevant rules and regulations imposed by my
institution", and that paragraph looks like it mainly exists to remind
users that the data is not considered de-identified.

Tim


On Tue, Apr 16, 2019 at 4:56 PM Burgess, Gregory  wrote:

> I believe that the Open Access Data Use terms (
> https://www.humanconnectome.org/study/hcp-young-adult/document/wu-minn-hcp-consortium-open-access-data-use-terms)
> require that anyone receiving the data must have first agreed to the Open
> Access Data Use Terms. If my understanding is correct, that would mean that
> you would need to verify that the recipient has accepted the terms before
> sharing with them. I doubt that would be easy to manage on a public site.
>
> --Greg
>
> 
> Greg Burgess, Ph.D.
> Senior Scientist, Human Connectome Project
> Washington University School of Medicine
> Department of Psychiatry
> Phone: 314-362-7864
> Email: gburg...@wustl.edu 
>
> On Apr 16, 2019, at 1:24 PM, Thomas Yeo  wrote:
>
> Hi,
>
> Maybe this question has already been answered before, but we have been
> working on the HCP data and have computed our own derivatives, e.g., FC
> matrices of individual subjects.
>
> Is it ok to share these matrices with accompanying HCP subject IDs on our
> personal github/website? If not, how do you suggest we can share these
> derivatives?
>
> Thanks,
> Thomas
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>
>
>
> --
>
> The materials in this message are private and may contain Protected
> Healthcare Information or other information of a sensitive nature. If you
> are not the intended recipient, be advised that any unauthorized use,
> disclosure, copying or the taking of any action in reliance on the contents
> of this information is strictly prohibited. If you have received this email
> in error, please immediately notify the sender via telephone or return mail.
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] DeDriftAndResamplePipeline error

2019-04-15 Thread Timothy Coalson
I have pushed a similar edit to reapply MR fix; please update to the latest
master.

Tim


On Mon, Apr 15, 2019 at 8:27 PM Timothy Coalson  wrote:

> They weren't instructions; I pushed an edit, and it was a different script.
>
> Tim
>
>
> On Mon, Apr 15, 2019 at 8:08 PM Glasser, Matthew 
> wrote:
>
>> Here is the error:
>>
>> readlink: illegal option -- f
>>
>> usage: readlink [-n] [file ...]
>>
>> I believe Tim already gave you instructions for this.
>>
>> Also, the log_Warn line is again concerning as to whether you followed
>> the installation instructions and all version 4.0.0 files here.
>>
>> Matt.
>>
>> From: Marta Moreno 
>> Date: Monday, April 15, 2019 at 8:53 AM
>> To: Matt Glasser 
>> Cc: HCP Users , Timothy Coalson <
>> tsc...@mst.edu>, "Brown, Tim" 
>> Subject: Re: [HCP-Users] DeDriftAndResamplePipeline error
>>
>> I had to re-run DeDriftAndResamplePipeline twice because it was searching
>> for settings.sh in the wrong place, and now I am getting the following
>> error message:
>> ReApplyFixMultiRunPipeline.sh: line 586: log_Warn: command not found
>>
>> I am attaching log files.
>>
>> Does the folder containing fix1.067 need to include all the ICAFIX files?
>>
>> Thanks a lot!
>>
>> Leah.
>>
>>
>> On Apr 15, 2019, at 12:19 AM, Marta Moreno 
>> wrote:
>>
>> It seems to be working now. Thanks a lot!
>>
>> Leah.
>>
>> On Apr 15, 2019, at 12:04 AM, Glasser, Matthew 
>> wrote:
>>
>> If you ran MR+FIX, you need to set these appropriately
>>
>> MRFixConcatName="NONE"
>> MRFixNames="NONE"
>>
>> And not set
>>
>> fixNames="RS_fMRI_1 RS_fMRI_2" #Space delimited list or NONE
>>
>>
>> https://github.com/Washington-University/HCPpipelines/blob/master/Examples/Scripts/DeDriftAndResamplePipelineBatch.sh
>>
>> Also it looks like line 124 needs an “s” on the end of the flag name to
>> read --multirun-fix-concat-names=${MRFixConcatName}
>>
>> Matt.
>>
>> From: Marta Moreno 
>> Date: Sunday, April 14, 2019 at 10:56 PM
>> To: Matt Glasser 
>> Cc: HCP Users 
>> Subject: Re: [HCP-Users] DeDriftAndResamplePipeline error
>>
>> Thanks a lot for your response.
>>
>> I am running v.4.0.0 now. And I have set up the script as follow:
>>
>> HighResMesh="164"
>> LowResMesh="32"
>> RegName="MSMAll_InitalReg_2_d40_WRN"
>>
>> DeDriftRegFiles="${HCPPIPEDIR}/global/templates/MSMAll/DeDriftingGroup.L.sphere.DeDriftMSMAll.164k_fs_LR.surf.gii@
>> ${HCPPIPEDIR}/global/templates/MSMAll/DeDriftingGroup.R.sphere.DeDriftMSMAll.164k_fs_LR.surf.gii"
>> ConcatRegName="MSMAll_Test"
>> Maps="sulc curvature corrThickness thickness"
>> MyelinMaps="MyelinMap SmoothedMyelinMap" #No _BC, this will be reapplied
>> MRFixConcatName="NONE"
>> MRFixNames="NONE"
>> #fixNames="rfMRI_REST1_LR rfMRI_REST1_RL rfMRI_REST2_LR rfMRI_REST2_RL"
>> #Space delimited list or NONE
>> fixNames="RS_fMRI_1 RS_fMRI_2" #Space delimited list or NONE
>> #dontFixNames="tfMRI_WM_LR tfMRI_WM_RL tfMRI_GAMBLING_LR
>> tfMRI_GAMBLING_RL tfMRI_MOTOR_LR tfMRI_MOTOR_RL tfMRI_LANGUAGE_LR
>> tfMRI_LANGUAGE_RL tfMRI_SOCIAL_LR tfMRI_SOCIAL_RL tfMRI_RELATIONAL_LR
>> tfMRI_RELATIONAL_RL tfMRI_EMOTION_LR tfMRI_EMOTION_RL" #Space delimited
>> list or NONE
>> dontFixNames="NONE"
>> SmoothingFWHM="2" #Should equal previous grayordinates smoothing (because
>> we are resampling from unsmoothed native mesh timeseries)
>> HighPass="0"
>> MotionRegression=TRUE
>> MatlabMode="1" #Mode=0 compiled Matlab, Mode=1 interpreted Matlab, Mode=2
>> octave
>> #MatlabMode="0" #Mode=0 compiled Matlab, Mode=1 interpreted Matlab,
>> Mode=2 octave
>>
>> But the script does not run and it is aborted with the following message:
>> DeDriftAndResamplePipeline.sh - ABORTING: unrecognized option:
>> --multirun-fix-concat-name=NONE
>>
>> I am attaching the log files.
>>
>> Leah.
>>
>>
>> On Apr 14, 2019, at 11:10 PM, Glasser, Matthew 
>> wrote:
>>
>> In this case you do run it with the individual fMRI names and that
>> doesn’t look like the version 4.0.0 example script...
>>
>> Matt.
>>
>> From:  on behalf of Marta Moreno <
>> mmorenoort...@icloud.com>
>> Date: Sund

Re: [HCP-Users] DeDriftAndResamplePipeline error

2019-04-15 Thread Timothy Coalson
They weren't instructions; I pushed an edit, and it was a different script.

Tim


On Mon, Apr 15, 2019 at 8:08 PM Glasser, Matthew  wrote:

> Here is the error:
>
> readlink: illegal option -- f
>
> usage: readlink [-n] [file ...]
>
> I believe Tim already gave you instructions for this.
>
> Also, the log_Warn line is again concerning as to whether you followed the
> installation instructions and all version 4.0.0 files here.
>
> Matt.
>
> From: Marta Moreno 
> Date: Monday, April 15, 2019 at 8:53 AM
> To: Matt Glasser 
> Cc: HCP Users , Timothy Coalson <
> tsc...@mst.edu>, "Brown, Tim" 
> Subject: Re: [HCP-Users] DeDriftAndResamplePipeline error
>
> I had to re-run DeDriftAndResamplePipeline twice because it was searching
> for settings.sh in the wrong place, and now I am getting the following
> error message:
> ReApplyFixMultiRunPipeline.sh: line 586: log_Warn: command not found
>
> I am attaching log files.
>
> Does the folder containing fix1.067 need to include all the ICAFIX files?
>
> Thanks a lot!
>
> Leah.
>
>
> On Apr 15, 2019, at 12:19 AM, Marta Moreno 
> wrote:
>
> It seems to be working now. Thanks a lot!
>
> Leah.
>
> On Apr 15, 2019, at 12:04 AM, Glasser, Matthew  wrote:
>
> If you ran MR+FIX, you need to set these appropriately
>
> MRFixConcatName="NONE"
> MRFixNames="NONE"
>
> And not set
>
> fixNames="RS_fMRI_1 RS_fMRI_2" #Space delimited list or NONE
>
>
> https://github.com/Washington-University/HCPpipelines/blob/master/Examples/Scripts/DeDriftAndResamplePipelineBatch.sh
>
> Also it looks like line 124 needs an “s” on the end of the flag name to
> read --multirun-fix-concat-names=${MRFixConcatName}
>
> Matt.
>
> From: Marta Moreno 
> Date: Sunday, April 14, 2019 at 10:56 PM
> To: Matt Glasser 
> Cc: HCP Users 
> Subject: Re: [HCP-Users] DeDriftAndResamplePipeline error
>
> Thanks a lot for your response.
>
> I am running v.4.0.0 now. And I have set up the script as follow:
>
> HighResMesh="164"
> LowResMesh="32"
> RegName="MSMAll_InitalReg_2_d40_WRN"
>
> DeDriftRegFiles="${HCPPIPEDIR}/global/templates/MSMAll/DeDriftingGroup.L.sphere.DeDriftMSMAll.164k_fs_LR.surf.gii@
> ${HCPPIPEDIR}/global/templates/MSMAll/DeDriftingGroup.R.sphere.DeDriftMSMAll.164k_fs_LR.surf.gii"
> ConcatRegName="MSMAll_Test"
> Maps="sulc curvature corrThickness thickness"
> MyelinMaps="MyelinMap SmoothedMyelinMap" #No _BC, this will be reapplied
> MRFixConcatName="NONE"
> MRFixNames="NONE"
> #fixNames="rfMRI_REST1_LR rfMRI_REST1_RL rfMRI_REST2_LR rfMRI_REST2_RL"
> #Space delimited list or NONE
> fixNames="RS_fMRI_1 RS_fMRI_2" #Space delimited list or NONE
> #dontFixNames="tfMRI_WM_LR tfMRI_WM_RL tfMRI_GAMBLING_LR tfMRI_GAMBLING_RL
> tfMRI_MOTOR_LR tfMRI_MOTOR_RL tfMRI_LANGUAGE_LR tfMRI_LANGUAGE_RL
> tfMRI_SOCIAL_LR tfMRI_SOCIAL_RL tfMRI_RELATIONAL_LR tfMRI_RELATIONAL_RL
> tfMRI_EMOTION_LR tfMRI_EMOTION_RL" #Space delimited list or NONE
> dontFixNames="NONE"
> SmoothingFWHM="2" #Should equal previous grayordinates smoothing (because
> we are resampling from unsmoothed native mesh timeseries)
> HighPass="0"
> MotionRegression=TRUE
> MatlabMode="1" #Mode=0 compiled Matlab, Mode=1 interpreted Matlab, Mode=2
> octave
> #MatlabMode="0" #Mode=0 compiled Matlab, Mode=1 interpreted Matlab, Mode=2
> octave
>
> But the script does not run and it is aborted with the following message:
> DeDriftAndResamplePipeline.sh - ABORTING: unrecognized option:
> --multirun-fix-concat-name=NONE
>
> I am attaching the log files.
>
> Leah.
>
>
> On Apr 14, 2019, at 11:10 PM, Glasser, Matthew  wrote:
>
> In this case you do run it with the individual fMRI names and that
> doesn’t look like the version 4.0.0 example script...
>
> Matt.
>
> From:  on behalf of Marta Moreno <
> mmorenoort...@icloud.com>
> Date: Sunday, April 14, 2019 at 10:06 PM
> To: HCP Users 
> Subject: [HCP-Users] DeDriftAndResamplePipeline error
>
> Dear Experts,
>
> I have run DeDriftAndResamplePipelineBatch.sh from from
> ${StudyFolder}/${Subject}/scripts after running MSMAII and getting the
> following error:
>
> While running:
> /Applications/workbench/bin_macosx64/../macosx64_apps/wb_command.app/Contents/MacOS/wb_command
> -metric-resample
> /Volumes/data/data3/NTTMS/NTTMS_s002/NTTMS_s002_170812/MNINonLinear/Results/RS_fMRI_MR/RS_fMRI_MR.L.native.func.gii
> /Volumes/data/data3/NTTMS/NTTMS_s002/NTTMS_s002_170812/MNINonLinear/Native/NTTMS_s002_170812.L.sphere.MSMAll

Re: [HCP-Users] Surface Parcelations Area

2019-04-12 Thread Timothy Coalson
The cleanest way to do it is to use -cifti-create-dense-from-template to
put the vertex area metrics into a cifti file (this may have already been
done; look at what files exist with "va" in the name), and then use
-cifti-parcellate on that with -method SUM.
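
A sketch with placeholder names (the vertex area metrics can be created
with -surface-vertex-areas if they don't already exist, and any existing
dscalar on the same mesh can serve as the template):

  wb_command -surface-vertex-areas L.midthickness.32k_fs_LR.surf.gii L.va.func.gii
  wb_command -surface-vertex-areas R.midthickness.32k_fs_LR.surf.gii R.va.func.gii
  wb_command -cifti-create-dense-from-template template.dscalar.nii va.dscalar.nii \
    -metric CORTEX_LEFT L.va.func.gii -metric CORTEX_RIGHT R.va.func.gii
  wb_command -cifti-parcellate va.dscalar.nii parcels.dlabel.nii COLUMN parcel_area.pscalar.nii -method SUM

Each value in parcel_area.pscalar.nii is then the summed vertex area (in
mm^2) of one parcel.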

Tim


On Fri, Apr 12, 2019 at 4:30 PM Kashyap, Amrit  wrote:

> Hey HCP users,
>
> I was wondering if you all knew of a quick method of getting the surface
> area of a cifti file (dlabel). I noticed there was a cifti-label-adjacency
> which gets you the boundary length between two areas, which is pretty neat
> and was hoping maybe there is a easy command that would give me the surface
> area of each parcellations,
>
> Thanks
>
> Amrit
>
> --
>
> This e-mail message (including any attachments) is for the sole use of
> the intended recipient(s) and may contain confidential and privileged
> information. If the reader of this message is not the intended
> recipient, you are hereby notified that any dissemination, distribution
> or copying of this message (including any attachments) is strictly
> prohibited.
>
> If you have received this message in error, please contact
> the sender by reply e-mail message and destroy all copies of the
> original message (including attachments).
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] connectome for monkeys // fiber trajectories

2019-04-11 Thread Timothy Coalson
Connectome workbench is agnostic to species, though there are some defaults
(identification symbol size) which default to a size suited to the human
brain.  We frequently use it with primate data.

Workbench can display probabilistic trajectories generated with fsl's
bedpostx/probtrackx tools (for the specific type of output that saves the
fiber orientations used per-seed and per-voxel, matrix4 I think), but I
don't think we have made a tutorial for it (and it still has some rough
edges, as it hasn't been a priority for us).  The wb_command
-convert-matrix4-to-workbench-sparse and -convert-fiber-orientations (or
-estimate-fiber-binghams for starting with just the direction samples,
though it is less accurate) commands are the starting points - once you have
converted the files to workbench formats with the expected extensions
(currently .trajTEMP.wbsparse and .fiberTEMP.nii), loaded them into
wb_view, and enabled them in features or layers (I don't recall exactly how),
clicking on a seed point will display the trajectories to/from it.

There is also wb_command -probtrackx-dot-convert, which allows converting
the other probtrackx matrix types to cifti files (can show how much each
voxel/vertex is used for a seed, but can't be displayed the way the
trajectory files can).

Tim


On Thu, Apr 11, 2019 at 9:23 AM DE CASTRO Vanessa 
wrote:

> Hi! I've started to work with the Connectome Workbench, and I was
> wondering if it is ready to use with monkeys as well, like Caret.
>
> And I also read in the tutorial that you are already working on a new
> feature: probabilistic fiber trajectories... how soon will it come?? :D
>
> Thank you very much for everything.
>
> Sincerely yours,
>
> *--*
>
> *Vanessa DeCastro, PhD*
>
> *Centre de Recherche Cerveau et Cognition - UMR 5549 - CNRS
> Pavillon Baudot CHU Purpan
> 31052 Toulouse Cedex 03, France *
>
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Some questions about the HCP structural processing pipeline

2019-04-09 Thread Timothy Coalson
To add some context, the _acpc_dc_restore_brain versions of the files are
*outputs* of the structural pipelines.  We do not run freesurfer on masked
T1w images.

Matt is asking for the exact arguments you provided to
FreeSurferPipeline.sh: please post the full command line that was run, the
one that contains "FreeSurferPipeline.sh" (not the ...Batch.sh).

Tim


On Tue, Apr 9, 2019 at 9:21 AM Glasser, Matthew  wrote:

> How did YOU call the FreeSurfer Pipeline in your subject with an issue?
>
> Matt.
>
> From: Aaron C 
> Date: Tuesday, April 9, 2019 at 8:42 AM
> To: Matt Glasser , "hcp-users@humanconnectome.org" <
> hcp-users@humanconnectome.org>
> Subject: Re: [HCP-Users] Some questions about the HCP structural
> processing pipeline
>
> Hi Matt,
>
> I called the FreeSurfer pipeline using the script
> "FreeSurferPipelineBatch.sh" in the "Examples" folder. Do you mean that
> these brain extracted T1w and T2w images "T1w_acpc_dc_restore_brain.nii.gz"
> and "T2w_acpc_dc_restore_brain.nii.gz" were actually not used in the
> FreeSurfer pipeline? Thank you.
>
> Aaron
>
> --
> *From:* Glasser, Matthew 
> *Sent:* Monday, April 8, 2019 10:50 PM
> *To:* Aaron C; hcp-users@humanconnectome.org
> *Subject:* Re: [HCP-Users] Some questions about the HCP structural
> processing pipeline
>
> The brain masked file should not be used for surface estimation.  How are
> you calling the FreeSurfer pipeline?
>
> Matt.
>
> From:  on behalf of Aaron C <
> aaroncr...@outlook.com>
> Date: Monday, April 8, 2019 at 9:41 PM
> To: "hcp-users@humanconnectome.org" 
> Subject: [HCP-Users] Some questions about the HCP structural processing
> pipeline
>
> Dear HCP experts,
>
> I have some questions about the HCP structural processing pipeline.
>
>1. The brain extracted file "T1w_acpc_dc_restore_brain.nii.gz" in the
>T1w folder has excessive voxels removed in the cortical surface, which
>resulted in erroneous pial surface estimation. Is there a way that I could
>adjust any parameters of brain extraction in the PreFreeSurfer.sh file to
>have a larger brain mask?
>2. There is another file "brainmask_fs.nii.gz". Is this the same mask
>derived from "T1w_acpc_dc_restore_brain.nii.gz" and
>"T2w_acpc_dc_restore_brain.nii.gz"?
>3. In the structural processing QC scene, is there a way that I could
>also display the boundary of the extracted brain alongside with pial and
>white surfaces in the coronal brain?
>
> Thank you.
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>
>
> --
>
> The materials in this message are private and may contain Protected
> Healthcare Information or other information of a sensitive nature. If you
> are not the intended recipient, be advised that any unauthorized use,
> disclosure, copying or the taking of any action in reliance on the contents
> of this information is strictly prohibited. If you have received this email
> in error, please immediately notify the sender via telephone or return mail.
>
>
> --
>
> The materials in this message are private and may contain Protected
> Healthcare Information or other information of a sensitive nature. If you
> are not the intended recipient, be advised that any unauthorized use,
> disclosure, copying or the taking of any action in reliance on the contents
> of this information is strictly prohibited. If you have received this email
> in error, please immediately notify the sender via telephone or return mail.
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Volumetric subcortical group-averaged data: what is the exact MNI template you used?

2019-04-05 Thread Timothy Coalson
The files themselves are in the pipelines repository, if that helps:

https://github.com/Washington-University/HCPpipelines/tree/master/global/templates

It is visually obvious that they are not left/right symmetric, assuming
that is what you were asking.

Tim


On Fri, Apr 5, 2019 at 4:54 PM Glasser, Matthew  wrote:

> I don’t know, you can ask on the FSL list.
>
> Matt.
>
> From: Xavier Guell Paradis 
> Date: Friday, April 5, 2019 at 4:49 PM
> To: Matt Glasser 
> Cc: Xavier Guell Paradis , "
> hcp-users@humanconnectome.org" 
> Subject: Re: [HCP-Users] Volumetric subcortical group-averaged data: what
> is the exact MNI template you used?
>
> This means that the template is asymmetric, not symmetric, correct?
> Thanks,
> Xavier.
>
> On Fri, Apr 5, 2019 at 5:48 PM Glasser, Matthew 
> wrote:
>
>> FSL’s MNI152.
>>
>> Matt.
>>
>> From:  on behalf of Xavier Guell
>> Paradis 
>> Date: Friday, April 5, 2019 at 4:46 PM
>> To: "hcp-users@humanconnectome.org" 
>> Subject: [HCP-Users] Volumetric subcortical group-averaged data: what is
>> the exact MNI template you used?
>>
>> Dear HCP experts,
>> I am interested in analyzing your group-averaged subcortical volumetric
>> data. My understanding is that your volumetric data is registered to MNI
>> space. I was wondering if you could let me know what specific MNI template
>> you used. I am especially interested in knowing whether it is a symmetric
>> or an asymmetric MNI template.
>> Thank you,
>> Xavier.
>>
>> ___
>> HCP-Users mailing list
>> HCP-Users@humanconnectome.org
>> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>>
>>
>> --
>>
>> The materials in this message are private and may contain Protected
>> Healthcare Information or other information of a sensitive nature. If you
>> are not the intended recipient, be advised that any unauthorized use,
>> disclosure, copying or the taking of any action in reliance on the contents
>> of this information is strictly prohibited. If you have received this email
>> in error, please immediately notify the sender via telephone or return mail.
>>
>
> --
>
> The materials in this message are private and may contain Protected
> Healthcare Information or other information of a sensitive nature. If you
> are not the intended recipient, be advised that any unauthorized use,
> disclosure, copying or the taking of any action in reliance on the contents
> of this information is strictly prohibited. If you have received this email
> in error, please immediately notify the sender via telephone or return mail.
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Glasser Parcellation MNI Coordinates

2019-04-05 Thread Timothy Coalson
You could use the group average MNI surfaces from connectomedb for
visualization.  If you need a single coordinate per parcel, you can use
wb_command -surface-coordinates-to-metric on the surfaces, combine those
metric files into a cifti file with -cifti-create-dense-from-template, and
use -cifti-parcellate on that.
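
If a plain text file of the per-parcel coordinates is more convenient for
generating the glass brain figures, the resulting pscalar can be dumped as
text (placeholder names):

  wb_command -cifti-convert -to-text cog.pscalar.nii cog.txt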

Tim


On Fri, Apr 5, 2019 at 3:24 PM Anita Sinha  wrote:

> To Whom It May Concern,
>
>
> I am using the Glasser parcellation for some fMRI analysis and wanted to know
> where I could access the MNI coordinates for the parcellation scheme, so I
> can generate glass brain figures to visualize the regions and their
> corresponding networks. If you could point me to where I could locate these
> coordinates, that would be greatly appreciated.
>
>
> Thank you for your time. I look forward to hearing from you soon.
>
>
> Regards,
>
>
> Anita
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] MSM binaries versions

2019-04-04 Thread Timothy Coalson
The 1.0 and 3.0 versions on github are nearly identical, that was just a
naming issue.

The version in FSL may be based on version 2, and is missing a library
needed for HOCR, so some options in v3 aren't available.  You should be
able to use the fsl versions of the executables other than msm (so,
msmresample, etc) with any version of msm.

I'm not sure about your other questions.

Tim


On Thu, Apr 4, 2019 at 1:29 PM Moataz Assem 
wrote:

> Hi,
>
>
>
> What is the recommended version of the MSM binaries to use? The repo
> (https://github.com/ecr05/MSM_HOCR/releases) has v1.0.0 and
> v3.0.0, and I was previously using v2 from here:
> https://www.doc.ic.ac.uk/~ecr05/MSM_HOCR_v2/
>
> Also what is the difference between these binaries and the ones downloaded
> with fsl? In other words, can I just point the MSMBINDIR in
> SetUpHCPPipeline.sh to ~/fsl/bin/, since it contains all the msm-related
> functions?
>
>
>
> Also, I would appreciate clarification on the list of compiled files that
> should exist in the directory pointed to by the MSMBINDIR variable.
>
>
>
> Thanks
>
>
>
> Moataz
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] A question about the HCP pipeline

2019-04-04 Thread Timothy Coalson
No, since the subcortical data needed to be in MNI space, we chose to use
MNI space surfaces for each subject so that we only needed to generate a
single motion-corrected volume timeseries.  Because the per-subject
processing uses the individual surfaces and the same warpfield for surface
and volume, everything lines up just as well as in native space - the main
difference is that the warpfield can locally change the sampling density in
the volume data.

Tim


On Thu, Apr 4, 2019 at 7:57 AM Aaron C  wrote:

> Dear HCP experts,
>
> For the HCP pipeline, is there a version that processes the data in native
> space? Thank you.
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] MMP in volume

2019-04-02 Thread Timothy Coalson
Our paper did in fact project it to the volume, taking into account the
non-overlap between subjects and the individual nature of the original
parcellations; the result is a label volume that is generally thinner than
typical cortex, with holes "missing" where no cortical area was more likely
than white matter or CSF.  See supplemental
figure S4 (which has a link to the data files).

If you want to use the fine-grained detail of our cortical parcellation, we
strongly recommend that you use processing methods that preserve a similar
level of detail and tissue accuracy, and for group analysis that generally
means surface-based methods.

Note that for a single individual, you can in fact go back and forth
between volume and surface without appreciable losses in localization (only
resampling losses, effectively).  The problem with group-average volume
cortical data is the fact that different subjects aren't all that well
aligned (even simple cortical overlap generally isn't great), and averaging
those voxels that are only somewhat aligned is where you lose your
localization.

There are also some low-variability parts of cortex like the insula and CeS
that volume alignment does reasonably with, and the subcortical gray matter
structures (which are not in the HCP MMP 1.0) are also well-aligned in the
volume.

Tim


On Tue, Apr 2, 2019 at 1:32 PM Mor Regev  wrote:

> Thanks Tim! I understand that, but can't it be back projected?
>
> Mor
>
> On Tue, Apr 2, 2019 at 2:23 PM Timothy Coalson  wrote:
>
>> The HCP MMP 1.0 parcellation could not have been made without using
>> surface-based methods, due to their increased accuracy in aligning
>> functional areas over existing volume-based registrations.  Volume-based
>> group data generally cannot have the cortical precision that the HCP MMP
>> 1.0 implies.  See our paper showing this:
>>
>> https://www.ncbi.nlm.nih.gov/pubmed/29925602
>>
>> Tim
>>
>>
>> On Tue, Apr 2, 2019 at 11:41 AM Mor Regev  wrote:
>>
>>> Hello,
>>>
>>> I would like to use Glasser's parcellation in volume space. Is there an
>>> available nifti I could use?
>>>
>>> Thanks,
>>> Mor
>>>
>>>
>>> --
>>> Dr. Mor Regev
>>> Montreal Neurological Institute
>>> McGill University
>>> 3801 University St
>>> <https://maps.google.com/?q=3801+University+St+Montreal,+QC+Canada+H3A2B4=gmail=g>
>>> Montreal, QC Canada H3A2B4
>>> <https://maps.google.com/?q=3801+University+St+Montreal,+QC+Canada+H3A2B4=gmail=g>
>>>
>>> ___
>>> HCP-Users mailing list
>>> HCP-Users@humanconnectome.org
>>> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>>>
>>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] MMP in volume

2019-04-02 Thread Timothy Coalson
The HCP MMP 1.0 parcellation could not have been made without using
surface-based methods, due to their increased accuracy in aligning
functional areas over existing volume-based registrations.  Volume-based
group data generally cannot have the cortical precision that the HCP MMP
1.0 implies.  See our paper showing this:

https://www.ncbi.nlm.nih.gov/pubmed/29925602

Tim


On Tue, Apr 2, 2019 at 11:41 AM Mor Regev  wrote:

> Hello,
>
> I would like to use Glasser's parcellation in volume space. Is there an
> available nifti I could use?
>
> Thanks,
> Mor
>
>
> --
> Dr. Mor Regev
> Montreal Neurological Institute
> McGill University
> 3801 University St
> 
> Montreal, QC Canada H3A2B4
> 
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Topup_AP dimension issue

2019-03-25 Thread Timothy Coalson
That looks like you cut off some brain tissue.  I'm not really sure what
your goal is here, but if you have images with different voxel sizes, what
you may actually need to do is to resample an image (flirt, applywarp, or
wb_command -volume-*-resample), and not crop it.
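
For example (a sketch with placeholder names): a centered crop from 128x128
to 96x96 would be

  fslroi input.nii.gz cropped.nii.gz 16 96 16 96 0 66

whereas resampling onto the grid of an existing 96x96x66 reference image
could look like

  applywarp --ref=reference_96.nii.gz --in=input_128.nii.gz --out=resampled.nii.gz --interp=spline

(with no warp supplied, applywarp simply resamples the input onto the
reference grid).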

Tim


On Sun, Mar 24, 2019 at 10:22 PM Marta Moreno 
wrote:

> Thanks for your response. The image is now cropped, image attached.
>
> Leah.
>
>
> ***
> Leah Moreno, PhD
> Research Scientist
> Division of Experimental Therapeutics
> Department of Psychiatry
> Columbia University Medical Center
> 1051 Riverside Drive, Unit 21
> New York, NY 10032
> phone: (914) 218-7311
> email: mm4...@cumc.columbia.edu
>
> On Mar 24, 2019, at 10:58 PM, Glasser, Matthew  wrote:
>
> 0 66 to include all slices and probably 0 1 for timepoints tacked onto the
> end.
>
> Matt.
>
> From: Marta Moreno 
> Date: Sunday, March 24, 2019 at 9:54 PM
> To: Matt Glasser 
> Cc: HCP Users 
> Subject: Re: [HCP-Users] Topup_AP dimension issue
>
> Thanks for your response. Still seems to be empty. I have attached both
> files into this email in case it helps.
>
> Leah.
>
>
>
> ***
> Leah Moreno, PhD
> Research Scientist
> Division of Experimental Therapeutics
> Department of Psychiatry
> Columbia University Medical Center
> 1051 Riverside Drive, Unit 21
> New York, NY 10032
> phone: (914) 218-7311
> email: mm4...@cumc.columbia.edu
>
> On Mar 24, 2019, at 10:47 PM, Glasser, Matthew  wrote:
>
> Not negative.
>
> Matt.
>
> From: Marta Moreno 
> Date: Sunday, March 24, 2019 at 9:45 PM
> To: Matt Glasser 
> Cc: HCP Users 
> Subject: Re: [HCP-Users] Topup_AP dimension issue
>
> Thanks for your response. Do you mean running this?: fslroi
> NTTMS_s037_181015_3T_SpinEchoFieldMap_AP_1.nii.gz test.nii.gz -16 96 -16 96
> 66 66
>
> If that is the case, the test file seems to be empty.
>
> Leah.
>
>
>
> On Mar 24, 2019, at 8:25 PM, Glasser, Matthew  wrote:
>
> Subtract 96 from 128 and then take half of that as your start and 96 as
> your length.
>
> Matt.
>
> From:  on behalf of Marta Moreno <
> mmorenoort...@icloud.com>
> Date: Sunday, March 24, 2019 at 6:40 PM
> To: HCP Users 
> Subject: [HCP-Users] Topup_AP dimension issue
>
> Dear Experts,
>
> Maybe a trivial question, but how can I change the Topup_AP dimensions
> from 128 128 66 to 96 96 66? Using fslroi, I am cutting off part of the brain.
>
> Thanks,
>
> Leah.
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>
>
> --
> The materials in this message are private and may contain Protected
> Healthcare Information or other information of a sensitive nature. If you
> are not the intended recipient, be advised that any unauthorized use,
> disclosure, copying or the taking of any action in reliance on the contents
> of this information is strictly prohibited. If you have received this email
> in error, please immediately notify the sender via telephone or return mail.
>
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] related parcellation

2019-03-24 Thread Timothy Coalson
The labels are used in the order of their keys, which is also how the
exported label table is ordered.  If your dlabel file has more label names
than there ended up being parcels, you can first use
-cifti-parcel-mapping-to-label to get a minimal dlabel file that exactly
matches the parcels mapping:

https://www.humanconnectome.org/software/workbench-command/-cifti-parcel-mapping-to-label

I believe -file-information will also print the parcel names in the order
they occur.
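For example, a minimal sketch with hypothetical filenames:

  # extract a dlabel file that exactly matches the parcels in the ptseries
  wb_command -cifti-parcel-mapping-to-label subject.ptseries.nii COLUMN \
      original.dlabel.nii matched.dlabel.nii
  # export the label table; rows appear in key order, matching the parcel order
  wb_command -cifti-label-export-table matched.dlabel.nii 1 parcel_names.txt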

Tim


On Sun, Mar 24, 2019 at 1:15 PM Glasser, Matthew  wrote:

> You can use wb_command -cifti-label-export-table or look inside the
> Workbench GUI.
>
> Matt.
>
> From:  on behalf of Marta Moreno <
> mmorenoort...@icloud.com>
> Date: Sunday, March 24, 2019 at 11:59 AM
> To: HCP Users 
> Subject: [HCP-Users] related parcellation
>
> Dear Experts,
>
> If I create a ‘pconn.nii’ from each subject through the commands wb_command
> -cifti-parcellate and -cifti-correlation to generate correlation matrix
> between parcels (struct 360*360). How can I know which number in raw and
> column correspond to parcel’s name in the dlabel.nii from 
> Q1-Q6_RelatedParcellation210.CorticalAreas_dil_Final_Final_Areas_Group_Colors.32k_fs_LR.dlabel.nii?
> p.e., correlation value between parcels 46 and a24.
>
> Thanks,
>
> Leah
>
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>
>
> --
>
> The materials in this message are private and may contain Protected
> Healthcare Information or other information of a sensitive nature. If you
> are not the intended recipient, be advised that any unauthorized use,
> disclosure, copying or the taking of any action in reliance on the contents
> of this information is strictly prohibited. If you have received this email
> in error, please immediately notify the sender via telephone or return mail.
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Resting state networks

2019-03-22 Thread Timothy Coalson
Inline replies.

Tim


On Fri, Mar 22, 2019 at 10:17 AM Claude Bajada 
wrote:

> Dear experts,
>
> Could I just confirm that the data that is found in:
>
>
> ${SubjectFolder}/MNINonLinear/Results/rfMRI_REST?_LR/rfMRI_REST?_LR_Atlas_MSMAll_hp2000_clean.dtseries.nii
>
> Is the resting state data using the MSMAll-aligned vertex coordinates?
>

Yes.


> and that these data can be used to then produce Resting state networks
>

Yes.


> (I realize that these are available but I just want to make sure I am
> understanding the data that I have). Further are the four subfolders in
> "Results" four separate runs that I can either concatenate or produce
> four different RSNs?
>

Yes, but the LR and RL runs have different locations of signal dropout, so
we generally recommend at least pairing an LR with an RL to increase the
overall coverage.


>
> Am I also correct in assuming that these data are not smoothed?
>

No, there is a small amount of added smoothing done (2mm FWHM) while
putting the data into ~2mm grayordinates space - this could be done without
any added smoothing, but for historical reasons we have kept it consistent
with earlier releases.  Of course, this amount doesn't approach the
smoothing done in many non-HCP analysis streams.


> Regards,
>
> Claude
>
>
>
>
> 
>
> 
> Forschungszentrum Juelich GmbH
> 52425 Juelich
> Registered office: Juelich
> Registered in the commercial register of the district court of Dueren, No. HR B 3498
> Chairman of the Supervisory Board: MinDir Dr. Karl Eugen Huthmacher
> Management: Prof. Dr.-Ing. Wolfgang Marquardt (Chairman),
> Karsten Beneke (Deputy Chairman), Prof. Dr.-Ing. Harald Bolt,
> Prof. Dr. Sebastian M. Schmidt
>
> 
>
> 
>
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Merging surface ROIs

2019-03-19 Thread Timothy Coalson
First, you'll need to export the label table of the original file (the
name, key value, and color values for each label, see -cifti-label-import),
with wb_command -cifti-label-export-table.  You'll need to either figure
out a key value (first number in each row of numbers) that hasn't been used
yet, or you can reuse the key value from one of the labels you are
removing.  Make an edited version of this label table, delete the lines for
the labels that you are removing, and add a line for your new label.

Then, you can turn those labels into ROIs with wb_command
-cifti-label-to-roi, and then use -cifti-math to combine them (I would do
something like "(a+b+c+d) > 0", just to be extra safe).  You can then use
-cifti-math on the dlabel file to change the value in that ROI to something
else, with something like "orig * (!roi) + roi * <new key value>".

However, you aren't done yet, as this new file still has the old label
table (with either the wrong name, or no name for your new label).  You
need to use -cifti-label-import on it (even though it is already a dlabel
file), and use that new text file you made that has the new label
information in it.
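Putting that together, a sketch of the full workflow (filenames, label names,
and the new key value of 999 are all hypothetical):

  wb_command -cifti-label-export-table areas.dlabel.nii 1 labels.txt
  # edit labels.txt by hand: delete the rows for 24a-24d, add a row for the
  # new "24" label with key 999, and save it as labels_new.txt
  for name in 24a 24b 24c 24d; do
      wb_command -cifti-label-to-roi areas.dlabel.nii roi_$name.dscalar.nii -name $name
  done
  wb_command -cifti-math '(a + b + c + d) > 0' roi_24.dscalar.nii \
      -var a roi_24a.dscalar.nii -var b roi_24b.dscalar.nii \
      -var c roi_24c.dscalar.nii -var d roi_24d.dscalar.nii
  wb_command -cifti-math 'orig * (!roi) + roi * 999' merged_values.dscalar.nii \
      -var orig areas.dlabel.nii -var roi roi_24.dscalar.nii
  wb_command -cifti-label-import merged_values.dscalar.nii labels_new.txt \
      area24_merged.dlabel.nii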

Tim


On Tue, Mar 19, 2019 at 4:55 PM Morin, Elyse  wrote:

> Dear HCP experts,
>
> I would like to know how to merge surface ROIs in a dlabel file to create
> a new label within that set.  For example, combining the labels 24a, 24b,
> 24c, 24d into a new area 24 label within that dlabel.nii file, which
> represents 24a-d merged.
>
> Does anyone have any experience/suggestions for how to go about this?
>
> Thank you,
> Elyse
>
> --
>
> This e-mail message (including any attachments) is for the sole use of
> the intended recipient(s) and may contain confidential and privileged
> information. If the reader of this message is not the intended
> recipient, you are hereby notified that any dissemination, distribution
> or copying of this message (including any attachments) is strictly
> prohibited.
>
> If you have received this message in error, please contact
> the sender by reply e-mail message and destroy all copies of the
> original message (including attachments).
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] A question about generating quality control scene file for the resting-state data

2019-03-18 Thread Timothy Coalson
I have occasionally seen the volume slice outline show something like that
before, but as far as I could tell, the surface was actually okay.  It may
just be a display bug in wb_view, but we haven't pinned it down.

Tim


On Mon, Mar 18, 2019 at 8:44 AM Aaron C  wrote:

> Dear HCP experts,
>
> I have a question about the quality control of rfMRI data processed by the
> HCP pipeline. Is there any shared script to create the quality control
> scene file ("rfMRI_1.scene") described in the HCP course practical material
> 5 (https://wustl.app.box.com/s/xfs2506iz6pa6t7bfhhkno3baphnppvy)?
>
> Also, for the attached figure (native space), is this indicating a problem
> in using the HCP structural preprocessing pipeline? Thank you.
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] rs-fMRI with-in subject comparison

2019-03-04 Thread Timothy Coalson
There isn't a dedicated command to get the parcel names, but they are in
the output of wb_command -file-information on the parcellated file, or you
can take them from -cifti-label-export-table on the dlabel file.

Tim


On Mon, Mar 4, 2019 at 1:55 AM Tali Weiss  wrote:

> I did
> wb_command -cifti-parcellate
> wb_command -cifti-convert -to-text (to be continued in matlab)
>
> 1. Where can I find the names of each of the 360 parcels?
> 2. I want to classify the parcels into networks (DMN, visual, ...); is there
> a script in HCP that does this?
> --
> *From:* Glasser, Matthew [glass...@wustl.edu]
> *Sent:* Sunday, March 03, 2019 5:12 PM
> *To:* Tali Weiss; hcp-users@humanconnectome.org
> *Subject:* Re: [HCP-Users] rs-fMRI with-in subject comparison
>
> 1.  The HCP-YA data were not variance normalized.
> 2.  wb_command -cifti-parcellate
> 3.  There isn’t a good sub-cortical parcellation like the cortical
> parcellation yet unfortunately.
>
> If you are comparing functional connectivity across runs within a subject,
> you don’t need to concatenate or variance normalize the runs.
>
> Matt.
>
> From: Tali Weiss 
> Date: Sunday, March 3, 2019 at 4:28 AM
> To: Matt Glasser , "hcp-users@humanconnectome.org" <
> hcp-users@humanconnectome.org>
> Subject: RE: [HCP-Users] rs-fMRI with-in subject comparison
>
> Thank you Mattew!
>
>
>
> 1. Download Packages: State fMRI FIX-Denoised (Compact)
>
> {Subject}_REST/MNINonLinear/Results/{fMRIName}/{fMRIName}
> _Atlas_MSMAll_hp2000_clean.dtseries.nii
>
> - Were the raw data z-scored (by the overall standard deviation) and then
> cleaned by sICA+FIX?
>
> - In wb there is a layer: .dynconn.nii. Is there one for each of the 4
> rs-scans of each subject?
>
>
>
> 2. I’m not sure which command I need to use to extract the timecourse of
> each parcel and then to compute correlations between all parcels of each
> network.
>
>
>
> input_label=
> Q1-Q6_RelatedValidation210.CorticalAreas_dil_Final_Final_Areas_Group_Colors.32k_fs_LR.dlabel.nii
>
> wb_command -cifti-all-labels-to-rois $input_label 1
> ROIvalidation210.dscalar.nii
>
>
>
> What do I need to do next?
>
>
>
> 3. I would also like to compute correlations between subcortical (volume) regions.
>
> I read your article
>
> https://www.ncbi.nlm.nih.gov/pubmed/29925602
>
>
>
> What is your recommendation for defining subcortical volumes?
>
> My analysis is a “within subject” paradigm (comparing scans from different
> days).
>
>
> --
> *From:* Glasser, Matthew [glass...@wustl.edu]
> *Sent:* Thursday, February 28, 2019 3:39 AM
> *To:* Tali Weiss; hcp-users@humanconnectome.org
> *Subject:* Re: [HCP-Users] rs-fMRI with-in subject comparison
>
> 1.  You would need to post the full path and filename.
> 2.  This will be handled by multi-run sICA+FIX in the future.  We will
> recommend all data be cleaned with sICA+FIX.  Really you want to be
> dividing by the unstructured noise standard deviation, rather than the
> overall standard deviation.
> 3.  Parcels have more statistical power than grayordinates.  The HCP’s
> multi-modal parcellation is here: https://balsa.wustl.edu/file/show/3VLx
>
> Matt.
>
> From:  on behalf of Tali Weiss <
> tali.we...@weizmann.ac.il>
> Date: Wednesday, February 27, 2019 at 7:26 AM
> To: "hcp-users@humanconnectome.org" 
> Subject: [HCP-Users] rs-fMRI with-in subject comparison
>
> Dear Prof. Smith,
>
> I really appreciate your help.
> I would like to compare the second rs-fMRI scans (from two different days)
> of the same subject.
>
> 1. When I open MSMAll_hp2000_clean.dtseries in WB, I also get a layer
> "dynconn",
> for example: rfMRI_REST1_LR_Atlas_MSMAll_hp2000_clean.dynconn.nii
> Is that a *group* average?
>
> 2. The *HCP Users FAQ* recommends: "demean and normalize the
> individual timeseries."
> wb_command -cifti-math '(x - mean) / stdev' 
> I am confused because it says demean *and* normalize. (A z-score
> includes demeaning; am I missing something?)
>
> My design is "within", so should I only demean? Or, because the scans are
> from different days, should I apply a z-score?
>
> 3. I believe that statistically there are not enough time points in one
> scan to use all grayordinates.
> Thus, I will need to choose parcels/ROIs.
> What are the best parcels/ROIs I can use? (I would like to focus on the
> attentional network, working memory, and the DMN.)
> Is there an easy way to get those ROIs from the tasks?
>
> Thank you
> Tali
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>
>
> --
>
> The materials in this message are private and may contain Protected
> Healthcare Information or other information of a sensitive nature. If you
> are not the intended recipient, be advised that any unauthorized use,
> disclosure, copying or the taking of any action in reliance on the contents
> of this information is strictly prohibited. If you have received this email
> in error, please 

Re: [HCP-Users] ICA FIX output missing

2019-02-26 Thread Timothy Coalson
That is saying that you don't have the matlab gifti library installed (or
it isn't on your matlab path).

Tim


On Tue, Feb 26, 2019 at 6:09 PM Leonardo Tozzi  wrote:

> Dear Michael,
>
>
>
> Thank you very much for all the consideration on the use of FIX for the
> task data.
>
> I have tried the addition you suggest. I think the command is detected,
> but I get the following error in tfMRI_EMOTION_RL_hp2000.ica/.fix.log:
>
>
>
> Undefined function or variable 'gifti'.
>
>
>
> Error in ciftiopen (line 31)
>
> cifti = gifti([tmpfile '.gii']);
>
>
>
> Error in fix_3_clean (line 46)
>
>   BO=ciftiopen('Atlas.dtseries.nii',WBC);
>
>
>
>
>
>
> Would you have any thoughts on this?
>
> Thank you,
>
>
>
>
>
> Leonardo Tozzi, MD, PhD
>
> Williams PanLab | Postdoctoral Fellow
>
> Stanford University | 401 Quarry Rd
>
> lto...@stanford.edu | (650) 5615738
>
>
>
>
>
> *From: *"Harms, Michael" 
> *Date: *Tuesday, February 26, 2019 at 7:37 AM
> *To: *"Glasser, Matthew" , Leonardo Tozzi <
> lto...@stanford.edu>, "hcp-users@humanconnectome.org" <
> hcp-users@humanconnectome.org>
> *Cc: *"Burgess, Gregory" 
> *Subject: *Re: [HCP-Users] ICA FIX output missing
>
>
>
>
>
> Hi Leonardo,
>
> Couple things:
>
>
>
> 1)  In the context of FIX, things get a little convoluted, since the FIX
> distribution has its own settings.sh file that needs to be set
> appropriately.  If you’ve hard-coded the FSL_FIX_WBC variable in that
> settings.sh file, then the location to wb_command in the
> Examples/Scripts/SetUpHCPPipeline.sh isn’t necessarily relevant.  In the
> settings.sh file for FIX on our cluster, we use the following construction:
>
>
>
> if [ -x "$(command -v wb_command)" ]; then
>     FSL_FIX_WBC=$(command -v wb_command)
> else
>     echo "ERROR in $0: wb_command (Workbench) must be in your path"
>     exit 1
> fi
>
> so that FIX *does* actually respect that location of wb_command that is
> already in your path.
>
>
>
> 2) Regarding MR-FIX and the TaskfMRIAnalysis scripts, while they may run
> after MR-FIX, there are two issues that need to be addressed yet:
>
> a) The temporal filter, which was presumably already applied during
> MR-FIX, gets applied again with TaskfMRILevel1.sh.  This script needs to be
> modified to be smarter regarding the temporal filtering (i.e., provide an
> option to NOT reapply the temporal filter).
>
> b) The space spanned by the noise regressors from FIX is not regressed out
> of the task regressor prior to the GLM, which means that variance removed
> during FIX can be reintroduced during the task GLM fitting (depending on
> the extent to which the space spanned by the noise regressors overlaps with
> the task GLM).
>
>
>
> Cheers,
>
> -MH
>
>
>
> --
>
> Michael Harms, Ph.D.
>
> ---
>
> Associate Professor of Psychiatry
>
> Washington University School of Medicine
>
> Department of Psychiatry, Box 8134
>
> 660 South Euclid Ave.    Tel: 314-747-6173
>
> St. Louis, MO  63110  Email: mha...@wustl.edu
>
>
>
> *From: *"Glasser, Matthew" 
> *Date: *Monday, February 25, 2019 at 6:53 PM
> *To: *Leonardo Tozzi , "Harms, Michael" <
> mha...@wustl.edu>, "hcp-users@humanconnectome.org" <
> hcp-users@humanconnectome.org>
> *Subject: *Re: [HCP-Users] ICA FIX output missing
>
>
>
> You’ll want to be using wb_command 1.3.2.  I am not aware of any
> modifications that are necessary to use the TaskfMRIAnalysis scripts on
> MR+FIX data and have analyzed hundreds of subjects after MR+FIX.
>
>
>
> As for this issue, is wb_command set properly here:
> https://github.com/Washington-University/HCPpipelines/blob/master/Examples/Scripts/SetUpHCPPipeline.sh
>
>
>
> What about on your ${PATH}?
>
>
>
> As for MR+FIX itself, we are only waiting on an FSL 6.0.1 release as
> testing has concluded successfully.
>
>
>
> Matt.
>
>
>
> *From: * on behalf of Leonardo
> Tozzi 
> *Date: *Monday, February 25, 2019 at 5:09 PM
> *To: *"Harms, Michael" , "hcp-users@humanconnectome.org"
> 
> *Subject: *Re: [HCP-Users] ICA FIX output missing
>
>
>
> Dear Michael,
>
>
>
> Thank you for pointing me to the logfiles.
>
> It seems like the script is not finding the directory where wb_command is.
> In my case, I am loading it as a module in my HPC cluster. I have also put
> its path in ICAFIX/fix1.067/settings.sh as follows:
>
>
>
> # Set this to the location of the HCP Workbench command for your platform
>
> FSL_FIX_WBC='/share/software/user/open/workbench/1.3.1/bin/wb_command';
>
>
>
> However, the script does not seem to “see” this path and instead uses the
> setting I was using on my local machine. In the logfile
> tfMRI_EMOTION_RL_hp2000.ica/.fix.log, I get the following error:
>
>
>
> /bin/bash: /Applications/workbench/bin_macosx64/wb_command: No such file
> or directory
>
>
>
> Is there another place in the scripts that is overriding my settings.sh?
>
> Concerning the length of the runs, I will look into the multirun
> 

Re: [HCP-Users] structural QC and medial wall

2019-02-25 Thread Timothy Coalson
A medial wall mask is used to mask out data for at least cifti files.  It
is hard to say for sure (the volume to surface mapping is more involved
than the closest vertex logic used in the GUI to identify a vertex), but I
would guess that both get masked out by the medial wall currently.  Future
registrations might make the edge of the medial wall mask closer to the
location where the cortical sheet ends (but that won't affect your second
question).

Tim


On Mon, Feb 25, 2019 at 9:48 AM Moataz Assem 
wrote:

> Hi,
>
>
>
> I have got a couple of questions about surface segmentation for cortex
> near the medial wall. I am attaching a couple of figs from structural data
> we collected using the HCP protocols.
>
>
>
> 1)  In fig1 attached, the crosshair (on the volume) points to a part of
> the cortical ribbon while its corresponding blue dot on the surface lies
> “inside” the medial wall. Does that mean those voxels are excluded from
> further analysis (i.e. are the corresponding functional voxels excluded)? Or
> is the medial wall there just for visual purposes?
>
> 2)  In fig 2 attached, there appears to be a mistake in the lateral
> ventricle where the choroid plexus (?) is misidentified as part of the
> cortical ribbon. Again the blue spot on the surface is within the medial
> wall. Would that mean this bit is excluded or will get “interpolated”
> somewhere?
>
>
>
> Thanks
>
>
>
> Moataz
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Surface area size of fsLR space

2019-02-22 Thread Timothy Coalson
The command wb_command -surface-vertex-areas will give you the area of each
vertex.  For vertex volume, you should use wb_command -surface-wedge-volume.

When comparing these kinds of measures, it is usually better to measure
them in an anatomically faithful space (such as the T1w space of each
subject, which is only rigidly registered).
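A minimal sketch, using hypothetical T1w-space paths:

  # one value per vertex, in mm^2, measured on the rigidly aligned surface
  wb_command -surface-vertex-areas \
      T1w/fsaverage_LR32k/subject.L.midthickness.32k_fs_LR.surf.gii \
      subject.L.midthickness_va.32k_fs_LR.shape.gii

Summing that metric over an ROI then gives the ROI's surface area.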

Tim


On Fri, Feb 22, 2019 at 8:29 PM Huang Taicheng  wrote:

> Hello,
>
> I'm trying to convert vertex counts to surface area in fs_LR space, because
> area, rather than vertex count, is comparable across data from different
> coordinate spaces (e.g. fsaverage, fs_LR). However, unlike FreeSurfer, I
> could not find any files like lh.area in the HCP folders. This paper (see
> reference) mentioned that all of the gray matter is sampled at a 2 mm
> average vertex spacing on the surface and as 2 mm voxels subcortically.
> Can I interpret this to mean that the area of each vertex is 4 mm2, and
> that vertices are isotropic? If my understanding is right, I calculate the
> gray matter volume of the left hemisphere as 259936 mm3, which is similar
> to the value calculated in fsaverage space (277114.15 mm3 for one of my
> subjects, from aseg.stats).
> I cannot be sure my understanding is correct; more information would help
> me do further analysis.
>
> Thanks in advance!
>
> Taicheng
>
> Reference:
> Glasser MF, et al. (2013) The minimal preprocessing pipelines for the
> human connectome project. Neuroimage 80:105-124.
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Confusion about Destrieux labels

2019-02-22 Thread Timothy Coalson
In that gifti file, the label table indicates that ??? is label 0, as is
recommended (it means things that haven't been labeled, such as the medial
wall).  The matlab gifti library must be shifting these values, possibly
because they are taken as indices into another matlab array (matlab doesn't
accept 0 as an array index).

However, the labels appear to go up to 75 (L_S_temporal_transverse) in the
gifti file.  I would expect that the freesurfer outputs already contain 75
labels.

Tim


On Fri, Feb 22, 2019 at 12:08 PM Reza Rajimehr  wrote:

> Hi,
>
> We have loaded the Destrieux parcellation of one of the HCP subjects
> (100206.L.aparc.a2009s.32k_fs_LR.label.gii) in Matlab. It includes 76
> labels. Label 1 is ??? and label 2 is G_and_S_frontomargin. However, the
> Destrieux paper describes 74 labels, and its label 1
> is G_and_S_frontomargin. Could you please explain this discrepancy?
>
> Thanks,
> Reza
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] unwarping 164k surface to subject-specific ACPC-aligned headcoordinate space

2019-02-21 Thread Timothy Coalson
I see, I didn't realize we generated the 32k versions in T1w space but not
the 164k.

The output of the command should be named with "midthickness", not
"sphere"; the shape will be the same as the input surface.  You will need
to use the midthickness surface from the T1w/Native folder as the input to
get the result you want (only rigidly aligned).  I think the
"100307.L.sphere.164k_fs_LR.surf.gii" sphere is just a copy of the standard
sphere, so it should work correctly, but I'm not sure.
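Concretely, under those assumptions, the corrected command would be:

  wb_command -surface-resample \
      100307/T1w/Native/100307.L.midthickness.native.surf.gii \
      100307/MNINonLinear/Native/100307.L.sphere.MSMAll.native.surf.gii \
      100307/MNINonLinear/100307.L.sphere.164k_fs_LR.surf.gii \
      BARYCENTRIC \
      100307/T1w/100307.L.midthickness.164k_fs_LR.surf.gii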

Tim


On Thu, Feb 21, 2019 at 11:00 AM CHAUMON Maximilien <
maximilien.chau...@icm-institute.org> wrote:

> Thank you!
>
> In the data I have downloaded, the only place where I find a 164k surface
> is in the MNINonLinear directory, so I assume some nonlinear transformation
> has been applied. The T1w directory has no such high-res surface.
>
> So this command should do what I want, right? (I split it over several
> lines for clarity.)
> wb_command -surface-resample \
> 100307/MNINonLinear/Native/100307.L.midthickness.native.surf.gii \
> 100307/MNINonLinear/Native/100307.L.sphere.MSMAll.native.surf.gii \
> 100307/MNINonLinear/100307.L.sphere.164k_fs_LR.surf.gii \
> BARYCENTRIC \
> 100307/T1w/100307.L.sphere.164k_fs_LR.surf.gii
>
> Many thanks for your help!
>
>
>
> Le mer. 20 févr. 2019 à 20:34, Timothy Coalson  a écrit :
>
>> Sorry, the recommended sphere for resampling any subject will of course
>> be that subject's version of that file, not specifically subject 100307's
>> sphere.
>>
>> Tim
>>
>>
>> On Wed, Feb 20, 2019 at 1:31 PM Timothy Coalson  wrote:
>>
>>> On Wed, Feb 20, 2019 at 8:03 AM CHAUMON Maximilien <
>>> maximilien.chau...@icm-institute.org> wrote:
>>>
>>>> Hello,
>>>>
>>>> I'm looking at fine changes in MEG forward leadfields and would like to
>>>> use the 164k meshes in each subject (I know 164k vertices are overkill, but
>>>> I need this high res rendering for one of my figures). I'm interested in
>>>> the actual original location of individual vertices in the brain in
>>>> subject-specific ACPC-aligned headcoordinate space. I don't want any non
>>>> linear spatial transformation applied to the mesh.
>>>> So I would like to use the 164k mesh with coordinates without the
>>>> nonlinear transformation that (as far as I understand) was applied to all
>>>> the 164k_fs_LR files. Is there an easy way to revert the non linear
>>>> transformation?
>>>>
>>>
>>> Only the surface files under the MNINonLinear folder have had any
>>> nonlinear anatomical warp applied.  The surface files under T1w all line up
>>> with the distortion corrected, rigidly aligned T1w image (we don't really
>>> keep scanner space around, and we often call this distortion corrected
>>> rigid alignment space "native volume space").  Surface
>>> registration/resampling does not deform the anatomy, it just tiles the same
>>> contour in 3D space with a new set of triangles.
>>>
>>> Note that averaging the coordinates of vertices across subjects will
>>> change their location, and this will affect geometry and "foldedness", to a
>>> degree depending on what registration was used.  As long as you stick with
>>> individual surfaces, you don't need to worry about this.
>>>
>>>
>>>> I would then use the file
>>>> {Subject}.{Hemi}.midthickness.164k_fs_LR.surf.gii and apply the inverse
>>>> transformation,
>>>>
>>>> Alternatively, is there a way to easily downsample
>>>> {Subject}.{Hemi}.midthickness.native.surf.gii to 164k vertices? is this
>>>> then in subject-specific ACPC-aligned headcoordinate space? how could I
>>>> move to that space?
>>>>
>>>
>>> This is what the existing 164k surfaces in the T1w folder already are.
>>> Native mesh is commonly ~130k for HCP scans, so 164k is actually a small
>>> upsampling.  We recommend the MSMAll versions, as the same vertex number
>>> across subjects is more often in the same area than for other registrations.
>>>
>>> For reference (since we have already dealt with this resampling for
>>> you), the recommended sphere to use for resampling from native mesh to any
>>> fs_LR mesh is "MNINonLinear/Native/100307.L.sphere.MSMAll.native.surf.gii"
>>> (and the R version, of course).  Ignore the fact that it is in the
>>> MNINonLinear folder, sphere surfaces don't have a volume space, that is
>>> just where it got put.  The standard spheres for fs_LR are in the pipelines
>>> under global/templates/standard_mesh_atlases/, read the readme file.  The
>>> command to do surface resampling is wb_command -surface-resample.
>>>
>>>
>>>> Does that make sense?
>>>>
>>>> Many thanks,
>>>> Max
>>>>
>>>> ___
>>>> HCP-Users mailing list
>>>> HCP-Users@humanconnectome.org
>>>> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>>>>
>>>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] unwarping 164k surface to subject-specific ACPC-aligned headcoordinate space

2019-02-20 Thread Timothy Coalson
Sorry, the recommended sphere for resampling any subject will of course be
that subject's version of that file, not specifically subject 100307's
sphere.

Tim


On Wed, Feb 20, 2019 at 1:31 PM Timothy Coalson  wrote:

> On Wed, Feb 20, 2019 at 8:03 AM CHAUMON Maximilien <
> maximilien.chau...@icm-institute.org> wrote:
>
>> Hello,
>>
>> I'm looking at fine changes in MEG forward leadfields and would like to
>> use the 164k meshes in each subject (I know 164k vertices are overkill, but
>> I need this high res rendering for one of my figures). I'm interested in
>> the actual original location of individual vertices in the brain in
>> subject-specific ACPC-aligned headcoordinate space. I don't want any non
>> linear spatial transformation applied to the mesh.
>> So I would like to use the 164k mesh with coordinates without the
>> nonlinear transformation that (as far as I understand) was applied to all
>> the 164k_fs_LR files. Is there an easy way to revert the non linear
>> transformation?
>>
>
> Only the surface files under the MNINonLinear folder have had any
> nonlinear anatomical warp applied.  The surface files under T1w all line up
> with the distortion corrected, rigidly aligned T1w image (we don't really
> keep scanner space around, and we often call this distortion corrected
> rigid alignment space "native volume space").  Surface
> registration/resampling does not deform the anatomy, it just tiles the same
> contour in 3D space with a new set of triangles.
>
> Note that averaging the coordinates of vertices across subjects will
> change their location, and this will affect geometry and "foldedness", to a
> degree depending on what registration was used.  As long as you stick with
> individual surfaces, you don't need to worry about this.
>
>
>> I would then use the file
>> {Subject}.{Hemi}.midthickness.164k_fs_LR.surf.gii and apply the inverse
>> transformation,
>>
>> Alternatively, is there a way to easily downsample
>> {Subject}.{Hemi}.midthickness.native.surf.gii to 164k vertices? is this
>> then in subject-specific ACPC-aligned headcoordinate space? how could I
>> move to that space?
>>
>
> This is what the existing 164k surfaces in the T1w folder already are.
> Native mesh is commonly ~130k for HCP scans, so 164k is actually a small
> upsampling.  We recommend the MSMAll versions, as the same vertex number
> across subjects is more often in the same area than for other registrations.
>
> For reference (since we have already dealt with this resampling for you),
> the recommended sphere to use for resampling from native mesh to any fs_LR
> mesh is "MNINonLinear/Native/100307.L.sphere.MSMAll.native.surf.gii" (and
> the R version, of course).  Ignore the fact that it is in the MNINonLinear
> folder, sphere surfaces don't have a volume space, that is just where it
> got put.  The standard spheres for fs_LR are in the pipelines
> under global/templates/standard_mesh_atlases/, read the readme file.  The
> command to do surface resampling is wb_command -surface-resample.
>
>
>> Does that make sense?
>>
>> Many thanks,
>> Max
>>
>> ___
>> HCP-Users mailing list
>> HCP-Users@humanconnectome.org
>> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>>
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] unwarping 164k surface to subject-specific ACPC-aligned headcoordinate space

2019-02-20 Thread Timothy Coalson
On Wed, Feb 20, 2019 at 8:03 AM CHAUMON Maximilien <
maximilien.chau...@icm-institute.org> wrote:

> Hello,
>
> I'm looking at fine changes in MEG forward leadfields and would like to
> use the 164k meshes in each subject (I know 164k vertices are overkill, but
> I need this high res rendering for one of my figures). I'm interested in
> the actual original location of individual vertices in the brain in
> subject-specific ACPC-aligned headcoordinate space. I don't want any non
> linear spatial transformation applied to the mesh.
> So I would like to use the 164k mesh with coordinates without the
> nonlinear transformation that (as far as I understand) was applied to all
> the 164k_fs_LR files. Is there an easy way to revert the non linear
> transformation?
>

Only the surface files under the MNINonLinear folder have had any nonlinear
anatomical warp applied.  The surface files under T1w all line up with the
distortion corrected, rigidly aligned T1w image (we don't really keep
scanner space around, and we often call this distortion corrected rigid
alignment space "native volume space").  Surface registration/resampling
does not deform the anatomy, it just tiles the same contour in 3D space
with a new set of triangles.

Note that averaging the coordinates of vertices across subjects will change
their location, and this will affect geometry and "foldedness", to a degree
depending on what registration was used.  As long as you stick with
individual surfaces, you don't need to worry about this.


> I would then use the file
> {Subject}.{Hemi}.midthickness.164k_fs_LR.surf.gii and apply the inverse
> transformation,
>
> Alternatively, is there a way to easily downsample
> {Subject}.{Hemi}.midthickness.native.surf.gii to 164k vertices? is this
> then in subject-specific ACPC-aligned headcoordinate space? how could I
> move to that space?
>

This is what the existing 164k surfaces in the T1w folder already are.
Native mesh is commonly ~130k for HCP scans, so 164k is actually a small
upsampling.  We recommend the MSMAll versions, as the same vertex number
across subjects is more often in the same area than for other registrations.

For reference (since we have already dealt with this resampling for you),
the recommended sphere to use for resampling from native mesh to any fs_LR
mesh is "MNINonLinear/Native/100307.L.sphere.MSMAll.native.surf.gii" (and
the R version, of course).  Ignore the fact that it is in the MNINonLinear
folder, sphere surfaces don't have a volume space, that is just where it
got put.  The standard spheres for fs_LR are in the pipelines
under global/templates/standard_mesh_atlases/, read the readme file.  The
command to do surface resampling is wb_command -surface-resample.


> Does that make sense?
>
> Many thanks,
> Max
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Separating Parcellated Files

2019-02-18 Thread Timothy Coalson
Parcellated files contain only one value per parcel (per map), so it isn't
a good idea to try to reconstitute them into a spatial map before
analysis.  I think the correct thing to do is to put them through PALM in a
way that doesn't use spatial information (no tfce, no smoothing, etc).

Tim


On Mon, Feb 18, 2019, 4:27 PM Timothy Hendrickson  wrote:

> Hello,
>
> I am attempting to perform an analysis with the tool PALM with parcellated
> surface data. Within the PALM wiki underneath the CIFTI examples portion it
> is suggested to separate cifti files into it's discrete pieces, so I tried
> to separate the parcellated file into the left and right hemisphere,
> however I get a failure.
>
> /home/lnpi14-raid1/timothy-data-lnpi14/DOWNLOADS/workbench/bin_rh_linux64/../exe_rh_linux64/wb_command
> -cifti-separate merged_MyelinMap_MSMSulc.Glasser.pscalar.nii ROW -metric
> CORTEX_LEFT merged_MyelinMap_MSMSulc.Glasser.L.func.gii -metric
> CORTEX_RIGHT merged_MyelinMap_MSMSulc.Glasser.R.func.gii
>
> ERROR: specified direction does not contain brain models
>
> While running:
> /home/lnpi14-raid1/timothy-data-lnpi14/DOWNLOADS/workbench/bin_rh_linux64/../exe_rh_linux64/wb_command
> -cifti-separate merged_MyelinMap_MSMSulc.Glasser.pscalar.nii COLUMN -metric
> CORTEX_LEFT merged_MyelinMap_MSMSulc.Glasser.L.func.gii -metric
> CORTEX_RIGHT merged_MyelinMap_MSMSulc.Glasser.R.func.gii
>
> ERROR: specified direction does not contain brain models
>
> Any suggestions?
>
> Thank you!
>
> -Tim
>
>
> Timothy Hendrickson
> Neuroimaging Analyst/Staff Scientist
> University of Minnesota Informatics Institute
> University of Minnesota
> Bioinformatics M.S. Candidate
> Office: 612-624-0783
> Mobile: 507-259-3434 (texts okay)
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Extracting native space coordinates for brain stimulation

2019-02-14 Thread Timothy Coalson
As long as the volume tab is view yoked, and the "move to identified
location" button is enabled, yes (both of these have preferences as to
whether they are the default, but the initial preference setting is to have
them enabled by default).  The mm coordinates will be the same, surface or
volume (with rounding-like error for finding the nearest voxel center), the
volume tab is only required if you want the voxel indices (which are
specifically from the lowest volume layer that is enabled).  If you only
need mm coordinates, then you don't need to switch tabs, the identification
window will show the coordinates of the nearest vertex to your click.

If you want to specifically identify coordinates on the skull near specific
surface features, you may want to make use of the volume tab - there is an
oblique mode where you can rotate the slices to approximate the closest
point on the skull.  However, please double check any coordinates obtained
from oblique mode, as it hasn't been tested as thoroughly (and has had some
quirks in the past).  Orthogonal mode and "All" view, and toggling the
slices on and off, may be useful in checking the coordinates.

You will, of course, need a way to transfer those coordinates back to the
actual subject for stimulation, since as recently discussed, the T1w
coordinates are not the original scanner coordinates, but distortion
corrected, rigidly-registered coordinates.

Tim


On Thu, Feb 14, 2019 at 11:34 AM Somers, David  wrote:

> We are attempting to define brain stimulation coordinates in native volume
> space based on an analysis performed on individual subject functional data
> preprocessed using the HCP preprocessing pipeline. We think we have figured
> out how to do this correctly, but would greatly appreciate confirmation or
> correction.
>
> Our understanding is that if we are only interested in locations within
> the cortical ribbon we do not need to perform any resampling and we can
> simply load the results along with the surfaces and volumes located in
> ${Subject}/T1w/.
>
> wb_view
>  ${SubjectFolder}/${Subject}/T1w/fsaverage_LR32k/${Subject}.32k_fs_LR.wb.spec
>  func.dscalar.nii
>
> Now if we select a point on the surface (e.g.
> ${Subject}.R.midthickness.32k_fs_LR.surf.gii) we should be able to go to
> the Volume tab and get voxel coordinates of the corresponding point in
> native volume space. Is this correct?
>
> Thanks in advance for your answer.
>
> David C. Somers, Ph.D.
> Professor & Chair
> Dept. of Psychological & Brain Sciences
> Boston University
>
>
>
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Individualized HCP parcel to MNI coord

2019-02-13 Thread Timothy Coalson
Sorry, one more...we do have a command to put the coordinates of a surface
file into a metric file, so you can compute center of gravity in a
surface-based way.  Use -surface-coordinates-to-metric, then either
-cifti-create-dense-from-template to put the coordinates into cifti,
followed by -cifti-parcellate to generate all the centers of gravity, or
use -gifti-label-to-roi and -metric-stats to take the center of gravity of
a single parcel.
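A sketch of the parcellated route, with hypothetical filenames:

  wb_command -surface-coordinates-to-metric L.midthickness.surf.gii L.coords.func.gii
  wb_command -surface-coordinates-to-metric R.midthickness.surf.gii R.coords.func.gii
  wb_command -cifti-create-dense-from-template template.dscalar.nii coords.dscalar.nii \
      -metric CORTEX_LEFT L.coords.func.gii -metric CORTEX_RIGHT R.coords.func.gii
  # with the default MEAN method, each parcel's maps become the average x, y,
  # and z of its vertices, i.e. its center of gravity
  wb_command -cifti-parcellate coords.dscalar.nii parcels.dlabel.nii COLUMN \
      centers.pscalar.nii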

Tim


On Wed, Feb 13, 2019 at 1:34 PM Timothy Coalson  wrote:

> Correction to the volume center of gravity part: if you convert to ROIs
> before mapping to volume, you will have soft-edge "ROIs" in the volume, as
> it accounts for partial voluming.  You can use these as the weights in
> -volume-weighted-stats (and you wouldn't need the -roi option) to take this
> into account, if you want.  If you have binary ROIs in the volume,
> -volume-stats with -roi is sufficient.  However, the choice of soft-edge
> ROIs would matter more to an approach that actually used the shape of the
> areas, rather than to a method that only uses center of gravity.
>
> Tim
>
>
> On Wed, Feb 13, 2019 at 1:26 PM Timothy Coalson  wrote:
>
>> The command you are looking for is -metric-to-volume-mapping for an ROI,
>> or -label-to-volume-mapping for the entire label file.  Note that these
>> take gifti inputs, so you will need -cifti-separate and possibly
>> -gifti-label-to-roi first (alternatively, -volume-label-to-roi
>> afterwards).  We don't have a command to give the center of gravity of an
>> ROI, though - if you make a volume file containing the coordinates of every
>> voxel (in matlab or octave), then you can use -volume-weighted-stats on it
>> with -roi and -mean and that will give you center of gravity.
>>
>> The problem we keep mentioning with MNI volume space is when it is used
>> for averaging voxel data across subjects, because current volume
>> registrations don't achieve the cross-subject functional correspondence
>> that even folding-based surface registration does.  It is valid to use
>> surfaces and volumes of a single individual in MNI space (we actually do
>> this with functional data, mostly for computational and storage reasons),
>> and these are provided in the MNINonLinear folder.
>>
>> However, be aware that our MNI space is nonlinear registered, and as such
>> it has local deformations of features compared to what the subject anatomy
>> really is.  Our "T1w" space is actually rigidly (rotation and translation
>> only) registered to an MNI template, so you can treat it mostly like a
>> linear MNI registration (however, note that the MNI templates and the
>> related 12-DOF registrations have different expansion ratios on different
>> axes compared to average human brain size).
>>
>> So, our "T1w" space is an anatomically accurate space for the subject,
>> but has been mostly aligned to the MNI coordinate system.  You will have to
>> work out whether the slightly-larger MNI coordinates are compensated for in
>> your software, and thus you would need to either "add in" the typical MNI
>> scaling to our "T1w" coordinates, or do your own affine registration to
>> MNI.  Alternatively, if you can back out the computed solution into a
>> subject-specific space before actually positioning the coils, then it may
>> not matter that what you put into the software isn't "really" MNI space.
>> Volume space issues are always a bunch of fun...
>>
>> Tim
>>
>>
>> On Wed, Feb 13, 2019 at 12:36 PM Stevens, Michael <
>> michael.stev...@hhchealth.org> wrote:
>>
>>> It also occurs to me that if someone has already worked out even roughly
>>> approximate locations for parcels (e.g., centers?) that  correspond to
>>> volume space coordinates in MNI-land, this would be “good enough” for me to
>>> finalize the grant proposal in the next day or so.  Admittedly, I’d rather
>>> learn which files/commands I can use to do the transformations directly so
>>> I can explore a bit with real data.  But I’d be really happy just to be
>>> able to have something that allows us to generate the typical tDCS field
>>> intensity mappings and electrode configuration for this grant proposal’s
>>> methods section.
>>>
>>>
>>>
>>> Thanks!
>>>
>>> Mike
>>>
>>>
>>>
>>>
>>>
>>> *From:* Glasser, Matthew [mailto:glass...@wustl.edu]
>>> *Sent:* Tuesday, February 12, 2019 10:58 PM
>>> *To:* Stevens, Michael; hcp-users@humanconnectome.org
>>> *Subject:* Re: [HCP

Re: [HCP-Users] Individualized HCP parcel to MNI coord

2019-02-13 Thread Timothy Coalson
Correction to the volume center of gravity part: if you convert to ROIs
before mapping to volume, you will have soft-edge "ROIs" in the volume, as
it accounts for partial voluming.  You can use these as the weights in
-volume-weighted-stats (and you wouldn't need the -roi option) to take this
into account, if you want.  If you have binary ROIs in the volume,
-volume-stats with -roi is sufficient.  However, the choice of soft-edge
ROIs would matter more to an approach that actually used the shape of the
areas, rather than to a method that only uses center of gravity.

Tim


On Wed, Feb 13, 2019 at 1:26 PM Timothy Coalson  wrote:

> The command you are looking for is -metric-to-volume-mapping for an ROI,
> or -label-to-volume-mapping for the entire label file.  Note that these
> take gifti inputs, so you will need -cifti-separate and possibly
> -gifti-label-to-roi first (alternatively, -volume-label-to-roi
> afterwards).  We don't have a command to give the center of gravity of an
> ROI, though - if you make a volume file containing the coordinates of every
> voxel (in matlab or octave), then you can use -volume-weighted-stats on it
> with -roi and -mean and that will give you center of gravity.
>
> The problem we keep mentioning with MNI volume space is when it is used
> for averaging voxel data across subjects, because current volume
> registrations don't achieve the cross-subject functional correspondence
> that even folding-based surface registration does.  It is valid to use
> surfaces and volumes of a single individual in MNI space (we actually do
> this with functional data, mostly for computational and storage reasons),
> and these are provided in the MNINonLinear folder.
>
> However, be aware that our MNI space is nonlinear registered, and as such
> it has local deformations of features compared to what the subject anatomy
> really is.  Our "T1w" space is actually rigidly (rotation and translation
> only) registered to an MNI template, so you can treat it mostly like a
> linear MNI registration (however, note that the MNI templates and the
> related 12-DOF registrations have different expansion ratios on different
> axes compared to average human brain size).
>
> So, our "T1w" space is an anatomically accurate space for the subject, but
> has been mostly aligned to the MNI coordinate system.  You will have to
> work out whether the slightly-larger MNI coordinates are compensated for in
> your software, and thus you would need to either "add in" the typical MNI
> scaling to our "T1w" coordinates, or do your own affine registration to
> MNI.  Alternatively, if you can back out the computed solution into a
> subject-specific space before actually positioning the coils, then it may
> not matter that what you put into the software isn't "really" MNI space.
> Volume space issues are always a bunch of fun...
>
> Tim
>
>
> On Wed, Feb 13, 2019 at 12:36 PM Stevens, Michael <
> michael.stev...@hhchealth.org> wrote:
>
>> It also occurs to me that if someone has already worked out even roughly
>> approximate locations for parcels (e.g., centers?) that  correspond to
>> volume space coordinates in MNI-land, this would be “good enough” for me to
>> finalize the grant proposal in the next day or so.  Admittedly, I’d rather
>> learn which files/commands I can use to do the transformations directly so
>> I can explore a bit with real data.  But I’d be really happy just to be
>> able to have something that allows us to generate the typical tDCS field
>> intensity mappings and electrode configuration for this grant proposal’s
>> methods section.
>>
>>
>>
>> Thanks!
>>
>> Mike
>>
>>
>>
>>
>>
>> *From:* Glasser, Matthew [mailto:glass...@wustl.edu]
>> *Sent:* Tuesday, February 12, 2019 10:58 PM
>> *To:* Stevens, Michael; hcp-users@humanconnectome.org
>> *Subject:* Re: [HCP-Users] Individualized HCP parcel to MNI coord
>>
>>
>>
>> EXTERNAL email from Outside HHC! Do NOT open attachments or click links
>> from unknown senders.
>>
>> If we leave out the “MNI coordinates” part of this question and think in
>> terms of the volume space coordinates of the individual being studied, it
>> is perfectly valid to bring results from the group surface back to an
>> individual’s physical volume space. Is a single coordinate what you need or
>> would an ROI be better?
>>
>>
>>
>> Matt.
>>
>>
>>
>> *From: * on behalf of "Stevens,
>> Michael" 
>> *Date: *Tuesday, February 12, 2019 at 9:47 PM
>> *To: *"hcp-users@humanconnectome.org" 
>> *Subject: *[HCP-Us

Re: [HCP-Users] Individualized HCP parcel to MNI coord

2019-02-13 Thread Timothy Coalson
The command you are looking for is -metric-to-volume-mapping for an ROI, or
-label-to-volume-mapping for the entire label file.  Note that these take
gifti inputs, so you will need -cifti-separate and possibly
-gifti-label-to-roi first (alternatively, -volume-label-to-roi
afterwards).  We don't have a command to give the center of gravity of an
ROI, though - if you make a volume file containing the coordinates of every
voxel (in matlab or octave), then you can use -volume-weighted-stats on it
with -roi and -mean and that will give you center of gravity.
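As a sketch (filenames and the label name are hypothetical):

  wb_command -cifti-separate parcels.dlabel.nii COLUMN \
      -label CORTEX_LEFT L.parcels.label.gii
  wb_command -gifti-label-to-roi L.parcels.label.gii L.IFSa.func.gii -name L_IFSa
  # map the surface ROI into the individual's volume space, constrained to
  # the cortical ribbon
  wb_command -metric-to-volume-mapping L.IFSa.func.gii \
      L.midthickness.32k_fs_LR.surf.gii T1w_restore.nii.gz L.IFSa.nii.gz \
      -ribbon-constrained L.white.32k_fs_LR.surf.gii L.pial.32k_fs_LR.surf.gii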

The problem we keep mentioning with MNI volume space is when it is used for
averaging voxel data across subjects, because current volume registrations
don't achieve the cross-subject functional correspondence that even
folding-based surface registration does.  It is valid to use surfaces and
volumes of a single individual in MNI space (we actually do this with
functional data, mostly for computational and storage reasons), and these
are provided in the MNINonLinear folder.

However, be aware that our MNI space is nonlinear registered, and as such
it has local deformations of features compared to what the subject anatomy
really is.  Our "T1w" space is actually rigidly (rotation and translation
only) registered to an MNI template, so you can treat it mostly like a
linear MNI registration (however, note that the MNI templates and the
related 12-DOF registrations have different expansion ratios on different
axes compared to average human brain size).

So, our "T1w" space is an anatomically accurate space for the subject, but
has been mostly aligned to the MNI coordinate system.  You will have to
work out whether the slightly-larger MNI coordinates are compensated for in
your software, and thus you would need to either "add in" the typical MNI
scaling to our "T1w" coordinates, or do your own affine registration to
MNI.  Alternatively, if you can back out the computed solution into a
subject-specific space before actually positioning the coils, then it may
not matter that what you put into the software isn't "really" MNI space.
Volume space issues are always a bunch of fun...

Tim


On Wed, Feb 13, 2019 at 12:36 PM Stevens, Michael <
michael.stev...@hhchealth.org> wrote:

> It also occurs to me that if someone has already worked out even roughly
> approximate locations for parcels (e.g., centers?) that  correspond to
> volume space coordinates in MNI-land, this would be “good enough” for me to
> finalize the grant proposal in the next day or so.  Admittedly, I’d rather
> learn which files/commands I can use to do the transformations directly so
> I can explore a bit with real data.  But I’d be really happy just to be
> able to have something that allows us to generate the typical tDCS field
> intensity mappings and electrode configuration for this grant proposal’s
> methods section.
>
>
>
> Thanks!
>
> Mike
>
>
>
>
>
> *From:* Glasser, Matthew [mailto:glass...@wustl.edu]
> *Sent:* Tuesday, February 12, 2019 10:58 PM
> *To:* Stevens, Michael; hcp-users@humanconnectome.org
> *Subject:* Re: [HCP-Users] Individualized HCP parcel to MNI coord
>
>
>
> EXTERNAL email from Outside HHC! Do NOT open attachments or click links
> from unknown senders.
>
> If we leave out the “MNI coordinates” part of this question and think in
> terms of the volume space coordinates of the individual being studied, it
> is perfectly valid to bring results from the group surface back to an
> individual’s physical volume space. Is a single coordinate what you need or
> would an ROI be better?
>
>
>
> Matt.
>
>
>
> *From: * on behalf of "Stevens,
> Michael" 
> *Date: *Tuesday, February 12, 2019 at 9:47 PM
> *To: *"hcp-users@humanconnectome.org" 
> *Subject: *[HCP-Users] Individualized HCP parcel to MNI coord
>
>
>
> Hi everyone,
>
>
>
> I know by now this is a tired subject that’s come up in several ways on
> the HCP listserv over the past few years.  But can y’all remind me the most
> straightforward way to go backwards from a given HCP parcel (e.g., IFSa)
> and find a reasonably close approximation in MNI space?
>
>
>
> The problems in working from “group level” surface data and the
> differences from subject to subject have been discussed here before.  But
> I’m pretty sure I’m not risking those pitfalls with what I’m planning
> here.  I’m finalizing a study design and I need a “starting point” to
> select precise brain stimulation coordinates for individuals.  This study
> formulation is based on several years’ worth of my fMRI data in HCP space
> that I’ve done some things with ICA and DCM to put together some
> interesting systems-level circuit maps.  Now, based on findings within HCP
> localized space, I want to use specific parcels to plan how best to
> modulate those circuits experimentally with tDCS.  The catch… of course… is
> that the neurotargeting software works in voxelwise space.  In practice, I
> suspect this won’t be too significant a hurdle.  I envision taking each
> 

Re: [HCP-Users] Sharing HCP-derived brain networks and graph-theoric measures

2019-02-07 Thread Timothy Coalson
For more detail, to import a parcel-by-parcel connectivity matrix into
cifti, you will need to make a parcel by parcel .pconn.nii file first - the
simplest approach is probably to take any dense cifti file (.dscalar.nii or
.dtseries.nii), run it through -cifti-parcellate to make a .pscalar.nii or
.ptseries.nii, then the result of that through -cifti-correlation to make a
.pconn.nii.

Also note that -cifti-convert -to-text should be able to produce a .csv by
specifying "-col-delim ,".  However, -cifti-convert only converts the data
values, it does not put any names or labels in the file (also, it expects
only data values in the input to -from-text).  The order of parcels is set
during -cifti-parcellate, determined by the key values of each label (which
in turn is set during -cifti-label-import).

Tim


On Thu, Feb 7, 2019 at 5:11 PM Timothy Coalson  wrote:

> BALSA does not require that files be "queryable".  It is scene-centric, so
> most of the files that get uploaded happen to be files that we know what to
> expect inside them, but it is designed such that this is an added
> convenience on top of storing the files, not as a requirement.
>
> Tim
>
>
> On Thu, Feb 7, 2019 at 4:58 PM Harms, Michael  wrote:
>
>>
>>
>> Supporting “arbitrary” formats requires putting a whole system in place
>> for describing the format, and making it queryable.   The good thing is
>> that if the pconn’s are available, users could use those as inputs to
>> derive their own graph-theoretic measures.
>>
>>
>>
>> Cheers,
>>
>> -MH
>>
>>
>>
>> --
>>
>> Michael Harms, Ph.D.
>>
>> ---
>>
>> Associate Professor of Psychiatry
>>
>> Washington University School of Medicine
>>
>> Department of Psychiatry, Box 8134
>>
>> 660 South Euclid Ave.    Tel: 314-747-6173
>>
>> St. Louis, MO  63110  Email: mha...@wustl.edu
>>
>>
>>
>> *From: * on behalf of Caio Seguin
>> 
>> *Date: *Thursday, February 7, 2019 at 4:52 PM
>> *To: *NEUROSCIENCE tim 
>> *Cc: *hcp-users 
>> *Subject: *Re: [HCP-Users] Sharing HCP-derived brain networks and
>> graph-theoretic measures
>>
>>
>>
>> Thanks Tim and Matt for the quick reply.
>>
>>
>>
>> Most of the files are NxN connectivity matrices, where N could denote,
>> for instance, ROIs from different parcellation schemes or resting-state
>> functional networks.
>>
>>
>>
>> Ok, so one option is to transform these matrices into cifti files and
>> share them through BALSA. On the one hand, this is a nice solution for it
>> solves the user-terms issue. On the other, it is a bit of a roundabout way to
>> store these files in the context of my manuscript. The matrices are used to
>> derive graph-theoretic measures about brain organization (rather than for
>> visualization purposes), so researchers interested in that would need to
>> convert the cifti files back to CSV.
>>
>>
>>
>> More generally, do you suggest any methods to share HCP-derived files in
>> an arbitrary format?
>>
>>
>>
>> Thanks in advance for the help.
>>
>>
>>
>> Best,
>>
>> Caio
>>
>>
>>
>>
>>
>> On Fri, 8 Feb 2019 at 06:19, Timothy Coalson
>> wrote:
>>
>> If your data is organized as a value per parcel/network, you should be
>> able to turn it into parcellated cifti files, which can be displayed in
>> wb_view (and therefore in scenes) as a matrix and/or as colored regions on
>> the surfaces and in the volume.
>>
>>
>>
>> See wb_command -cifti-parcellate (to make a template parcellated cifti
>> file you can use to import data into), -cifti-label-import (to get your
>> network ROIs into the format -cifti-parcellate wants), and -cifti-convert
>> (and its -from-text option, to read csv or other text data and output
>> cifti).
>>
>>
>>
>> Tim
>>
>>
>>
>>
>>
>> On Thu, Feb 7, 2019 at 7:05 AM Caio Seguin  wrote:
>>
>> Dear experts,
>>
>>
>>
>> I have used diffusion and resting-state functional MRI data from the HCP
>> to derive whole brain connectomes for individual participants. I used the
>> connectomes to compute graph-theoretic measures that are part of a
>> manuscript I am working on.
>>
>>
>>
>> My question concerns the sharing of these connectomes and graph-theoretic
>> measures. My current under

Re: [HCP-Users] Sharing HCP-derived brain networks and graph-theoretic measures

2019-02-07 Thread Timothy Coalson
BALSA does not require that files be "queryable".  It is scene-centric, so
most of the files that get uploaded happen to be files that we know what to
expect inside them, but it is designed such that this is an added
convenience on top of storing the files, not as a requirement.

Tim


On Thu, Feb 7, 2019 at 4:58 PM Harms, Michael  wrote:

>
>
> Supporting “arbitrary” formats requires putting a whole system in place
> for describing the format, and making it queryable.   The good thing is
> that if the pconn’s are available, users could use those as inputs to
> derive their own graph-theoretic measures.
>
>
>
> Cheers,
>
> -MH
>
>
>
> --
>
> Michael Harms, Ph.D.
>
> ---
>
> Associate Professor of Psychiatry
>
> Washington University School of Medicine
>
> Department of Psychiatry, Box 8134
>
> 660 South Euclid Ave.    Tel: 314-747-6173
>
> St. Louis, MO  63110  Email: mha...@wustl.edu
>
>
>
> *From: * on behalf of Caio Seguin <
> caioseg...@gmail.com>
> *Date: *Thursday, February 7, 2019 at 4:52 PM
> *To: *NEUROSCIENCE tim 
> *Cc: *hcp-users 
> *Subject: *Re: [HCP-Users] Sharing HCP-derived brain networks and
> graph-theoretic measures
>
>
>
> Thanks Tim and Matt for the quick reply.
>
>
>
> Most of the files are NxN connectivity matrices, where N could denote, for
> instance, ROIs from different parcellation schemes or resting-state
> functional networks.
>
>
>
> Ok, so one option is to transform these matrices into cifti files and
> share them through BALSA. On the one hand, this is a nice solution for it
> solves the user-terms issue. On the other, it is a bit of a roundabout way to
> store these files in the context of my manuscript. The matrices are used to
> derive graph-theoretic measures about brain organization (rather than for
> visualization purposes), so researchers interested in that would need to
> convert the cifti files back to CSV.
>
>
>
> More generally, do you suggest any methods to share HCP-derived files in
> an arbitrary format?
>
>
>
> Thanks in advance for the help.
>
>
>
> Best,
>
> Caio
>
>
>
>
>
> On Fri, 8 Feb 2019 at 06:19, Timothy Coalson
> wrote:
>
> If your data is organized as a value per parcel/network, you should be
> able to turn it into parcellated cifti files, which can be displayed in
> wb_view (and therefore in scenes) as a matrix and/or as colored regions on
> the surfaces and in the volume.
>
>
>
> See wb_command -cifti-parcellate (to make a template parcellated cifti
> file you can use to import data into), -cifti-label-import (to get your
> network ROIs into the format -cifti-parcellate wants), and -cifti-convert
> (and its -from-text option, to read csv or other text data and output
> cifti).
>
>
>
> Tim
>
>
>
>
>
> On Thu, Feb 7, 2019 at 7:05 AM Caio Seguin  wrote:
>
> Dear experts,
>
>
>
> I have used diffusion and resting-state functional MRI data from the HCP
> to derive whole brain connectomes for individual participants. I used the
> connectomes to compute graph-theoretic measures that are part of a
> manuscript I am working on.
>
>
>
> My question concerns the sharing of these connectomes and graph-theoretic
> measures. My current understanding is that sharing this data is ok as long
> as I make sure users abide by the HCP data usage terms. What are your
> suggestions on how to do this?
>
>
>
> I've seen BALSA proposed to this end, since it provides a built-in
> mechanism of user terms, but my files are CSV or .mat files rather than WB
> scenes.
>
>
>
> Thanks in advance for your help.
>
>
>
> Best regards,
>
> Caio Seguin
>
>
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>
>
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Sharing HCP-derived brain networks and graph-theoretic measures

2019-02-07 Thread Timothy Coalson
As I recall, BALSA also allows arbitrary additional files to be uploaded to
studies.  I'm not sure about the details of how to do this, though (there
is an "upload files" button in the "files" modal for a study you own, but
I'm not sure where those files end up).

Tim


On Thu, Feb 7, 2019 at 4:52 PM Caio Seguin  wrote:

> Thanks Tim and Matt for the quick reply.
>
> Most of the files are NxN connectivity matrices, where N could denote, for
> instance, ROIs from different parcellation schemes or resting-state
> functional networks.
>
> Ok, so one option is to transform these matrices into cifti files and
> share them through BALSA. On the one hand, this is a nice solution for it
> solves the user-terms issue. On the other, it is a bit of a roundabout way to
> store these files in the context of my manuscript. The matrices are used to
> derive graph-theoretic measures about brain organization (rather than for
> visualization purposes), so researchers interested in that would need to
> convert the cifti files back to CSV.
>
> More generally, do you suggest any methods to share HCP-derived files in
> an arbitrary format?
>
> Thanks in advance for the help.
>
> Best,
> Caio
>
>
> On Fri, 8 Feb 2019 at 06:19, Timothy Coalson
> wrote:
>
>> If your data is organized as a value per parcel/network, you should be
>> able to turn it into parcellated cifti files, which can be displayed in
>> wb_view (and therefore in scenes) as a matrix and/or as colored regions on
>> the surfaces and in the volume.
>>
>> See wb_command -cifti-parcellate (to make a template parcellated cifti
>> file you can use to import data into), -cifti-label-import (to get your
>> network ROIs into the format -cifti-parcellate wants), and -cifti-convert
>> (and its -from-text option, to read csv or other text data and output
>> cifti).
>>
>> Tim
>>
>>
>> On Thu, Feb 7, 2019 at 7:05 AM Caio Seguin  wrote:
>>
>>> Dear experts,
>>>
>>> I have used diffusion and resting-state functional MRI data from the HCP
>>> to derive whole brain connectomes for individual participants. I used the
>>> connectomes to compute graph-theoretic measures that are part of a
>>> manuscript I am working on.
>>>
>>> My question concerns the sharing of these connectomes and
>>> graph-theoretic measures. My current understanding is that sharing this
>>> data is ok as long as I make sure users abide by the HCP data usage terms.
>>> What are your suggestions on how to do this?
>>>
>>> I've seen BALSA proposed to this end, since it provides a built-in
>>> mechanism of user terms, but my files are CSV or .mat files rather than WB
>>> scenes.
>>>
>>> Thanks in advance for your help.
>>>
>>> Best regards,
>>> Caio Seguin
>>>
>>> ___
>>> HCP-Users mailing list
>>> HCP-Users@humanconnectome.org
>>> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>>>
>>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Sharing HCP-derived brain networks and graph-theoretic measures

2019-02-07 Thread Timothy Coalson
If your data is organized as a value per parcel/network, you should be able
to turn it into parcellated cifti files, which can be displayed in wb_view
(and therefore in scenes) as a matrix and/or as colored regions on the
surfaces and in the volume.

See wb_command -cifti-parcellate (to make a template parcellated cifti file
you can use to import data into), -cifti-label-import (to get your network
ROIs into the format -cifti-parcellate wants), and -cifti-convert (and its
-from-text option, to read csv or other text data and output cifti).

Tim


On Thu, Feb 7, 2019 at 7:05 AM Caio Seguin  wrote:

> Dear experts,
>
> I have used diffusion and resting-state functional MRI data from the HCP
> to derive whole brain connectomes for individual participants. I used the
> connectomes to compute graph-theoretic measures that are part of a
> manuscript I am working on.
>
> My question concerns the sharing of these connectomes and graph-theoretic
> measures. My current understanding is that sharing this data is ok as long
> as I make sure users abide by the HCP data usage terms. What are your
> suggestions on how to do this?
>
> I've seen BALSA proposed to this end, since it provides a built-in
> mechanism of user terms, but my files are CSV or .mat files rather than WB
> scenes.
>
> Thanks in advance for your help.
>
> Best regards,
> Caio Seguin
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Forbidden to download

2019-02-06 Thread Timothy Coalson
Did you agree to the "open access" data use terms before trying to
download?  The tutorial dataset is covered by these terms, but maybe it
doesn't make this obvious.

Tim


On Wed, Feb 6, 2019 at 5:31 PM Rosalia Dacosta Aguayo 
wrote:

> Dear HCP users,
>
> I registered on the HCP website. I installed what was needed following the pdf
> documentation. I also have firefox as my browser, but when I tried to
> download the tutorial dataset it tells me "forbidden".
>
> Any idea why this is happening and how to solve it?
>
> Yours sincerely,
> Rosalia
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Workbench tools installation in Ubuntu 18.04

2019-02-05 Thread Timothy Coalson
The simple solution is to delete or rename the "libz.so.1" file in the
libs_linux64 directory.
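
For example, assuming workbench was unpacked to /usr/local/workbench (adjust to
your install location):

cd /usr/local/workbench/libs_linux64
# renaming rather than deleting keeps a backup; the OS-supplied libz gets found instead
mv libz.so.1 libz.so.1.disabled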

What is going on is that an OS-supplied libpng is being pulled in, which
was compiled against a newer libz than we bundle in the linux distribution,
but the fact that libz was found in our bundled libraries prevents the
OS-supplied libz from being found.  The reason it was bundled in the
distribution is probably that distributions older than our build
environment didn't have a new enough libz.

Tim


On Fri, Feb 1, 2019 at 9:29 PM erik lee  wrote:

> Thank you very much! I will give that a try.
>
> Best,
> Erik
>
> On Fri, Feb 1, 2019 at 8:23 PM Takuya Hayashi 
> wrote:
>
>> We've recently encountered a similar problem in our Ubuntu 18.04 and solved
>> this by the followings:
>>
>> Download Zlib 1.2.9 from here:
>>
>> https://sourceforge.net/projects/libpng/files/zlib/1.2.9/zlib-1.2.9.tar.gz/download
>>
>> tar -xvf ~/Downloads/zlib-1.2.9.tar.gz
>> cd zlib-1.2.9
>> ./configure;
>> make;
>> make install;
>> cd ${your workbench dir}/libs_linux64
>> mv libz.so.1 libz.so.1.back
>> ln -sf /usr/local/lib/libz.so.1.2.9 libz.so.1
>>
>> You may need to use 'sudo' to run some of commands to avoid permission
>> issue.
>>
>> Takuya
>>
>> On Sat, Feb 2, 2019 at 10:06 Glasser, Matthew :
>>
>>> The person to address this is on vacation and will be back next week.
>>>
>>> Matt.
>>>
>>> From:  on behalf of erik lee <
>>> erik.lee...@gmail.com>
>>> Date: Friday, February 1, 2019 at 6:20 PM
>>> To: "hcp-users@humanconnectome.org" 
>>> Subject: [HCP-Users] Workbench tools installation in Ubuntu 18.04
>>>
>>> Hi,
>>>
>>> I am trying to install the Connectome Workbench software on my linux
>>> machine (Ubuntu 18.04). When I try to run wb_view or wb_command, I get the
>>> following error:
>>>
>>> /usr/local/workbench/bin_linux64/../exe_linux64/wb_view:
>>> /usr/local/workbench/bin_linux64/../libs_linux64/libz.so.1: version
>>> `ZLIB_1.2.9' not found (required by
>>> /usr/lib/x86_64-linux-gnu/libpng16.so.16)
>>> /usr/local/workbench/bin_linux64/../exe_linux64/wb_view:
>>> /usr/local/workbench/bin_linux64/../libs_linux64/libz.so.1: version
>>> `ZLIB_1.2.3.4' not found (required by
>>> /usr/lib/x86_64-linux-gnu/libpng16.so.16)
>>>
>>> Does anyone know where to find these packages or how to get around this
>>> error??
>>>
>>> Thanks in advance,
>>> Erik
>>>
>>> ___
>>> HCP-Users mailing list
>>> HCP-Users@humanconnectome.org
>>> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>>>
>>>
>>>
>>> ___
>>> HCP-Users mailing list
>>> HCP-Users@humanconnectome.org
>>> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>>>
>> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] A question about dtifit for the HCP data

2019-02-04 Thread Timothy Coalson
As another option for modeling the diffusion data, I would suggest fitting
more anatomically-realistic models to the data, such as a multiple crossing
fiber model (ball and sticks or similar, for instance with bedpostx).
However, I don't know if they currently provide an analogous measure to
kurtosis.
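
If you want to try that route, a minimal sketch (assuming FSL is installed, and
a directory with the standard bedpostx inputs: data.nii.gz,
nodif_brain_mask.nii.gz, bvals, and bvecs):

# fit a crossing-fiber ball and sticks model; very slow without the GPU version
bedpostx Diffusion/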

Tim


On Tue, Jan 29, 2019 at 6:42 PM Aaron C  wrote:

> Dear HCP experts,
>
> I saw some previous questions in the mailing list regarding dtifit, and
> some experts suggested that the simple tensor model is not appropriate at
> high b-values.  So I tried two different options, either using the
> "-kurt" flag or limiting the fitting to the b=1000 shell, but the results from
> these two options were not nearly the same. Would you please suggest if
> there is any good way to evaluate the difference and choose one from these
> two options? Thank you.
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Missing shared library for wb_command

2019-01-28 Thread Timothy Coalson
That is the error I would expect from launching the executable in
exe_linux.  If the script in bin_linux... was having problems, it should
say something like "/install/path/bin_linux64/../exe_linux64/wb_command".

Tim


On Mon, Jan 28, 2019, 4:43 PM Jayasekera, Dinal wrote:

> I've been trying to use the workbench commands to work on some of my data.
> I'm attempting to run the wb_command script in workbench/bin_linux64
>
>
> Kind regards,
> *Dinal Jayasekera *
>
> PhD Candidate | InSITE Fellow 
> Ammar Hawasli Lab 
> Department of Biomedical Engineering
>  | Washington University in St.
> Louis 
>
> --
> *From:* Glasser, Matthew
> *Sent:* Monday, January 28, 2019 5:41:12 PM
> *To:* Jayasekera, Dinal; hcp-users@humanconnectome.org
> *Subject:* Re: [HCP-Users] Missing shared library for wb_command
>
> You’ve been running wb_command in the pipelines.  Are you launching the
> wrong thing?
>
> Matt.
>
> From: "Jayasekera, Dinal" 
> Date: Monday, January 28, 2019 at 5:40 PM
> To: Matt Glasser , "hcp-users@humanconnectome.org" <
> hcp-users@humanconnectome.org>
> Subject: Re: [HCP-Users] Missing shared library for wb_command
>
> Matt,
>
>
> This is a problem that came up when I started running wb_command. I've
> previously only been using wb_view.
>
>
> Kind regards,
> *Dinal Jayasekera *
>
> PhD Candidate | InSITE Fellow 
> Ammar Hawasli Lab 
> Department of Biomedical Engineering
>  | Washington University in St.
> Louis 
>
> --
> *From:* Glasser, Matthew
> *Sent:* Monday, January 28, 2019 5:39:08 PM
> *To:* Jayasekera, Dinal; hcp-users@humanconnectome.org
> *Subject:* Re: [HCP-Users] Missing shared library for wb_command
>
> Did this suddenly start happening?  You have been running workbench
> successfully for a while.
>
> Matt.
>
> From:  on behalf of "Jayasekera,
> Dinal" 
> Date: Monday, January 28, 2019 at 5:37 PM
> To: "hcp-users@humanconnectome.org" 
> Subject: [HCP-Users] Missing shared library for wb_command
>
> Dear all,
>
>
> I've been trying to run wb_command but I continue to keep getting this
> error:
>
>
> ./wb_command: error while loading shared libraries: libOSMesa.so.6: cannot
> open shared object file: No such file or directory
>
>
> I've looked online but can't seem to find much advice about what
> supporting libraries I may be missing. Has anyone encountered a similar
> error?
>
>
> Kind regards,
> *Dinal Jayasekera *
>
> PhD Candidate | InSITE Fellow 
> Ammar Hawasli Lab 
> Department of Biomedical Engineering
>  | Washington University in St.
> Louis 
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Defining functional ROI masks

2019-01-28 Thread Timothy Coalson
On Mon, Jan 28, 2019, 1:16 PM Shana Adise wrote:

> Hello,
>
> Thank you in advance for your help! We would like to create functionally
> defined ROIs that are thresholded based on our F-map from PALM. Could you
> please clarify the files and inputs needed to go into -cifti-find-clusters?
>
>
> 1. <cifti>
> - Can the input file be the F-map generated by
> PALM? (eg. Results.dscalar.nii)
>

It can be anything; this command isn't specific to any statistic (but
expects larger values to be more important).

> 2. <surface-value-threshold>
> - Can this be an F or t statistic value?
>

It is an arbitrary value, chosen by the user to be relevant to the values
in the provided input file.

> 3. <surface-minimum-area>
> - Can this be a Cohen’s d value or does it have to
> be the cluster size in mm^2?
>

Area is only measured in mm^2.

> 4. <volume-value-threshold> and <volume-minimum-size>
> - Do you need these inputs if you are only using
> surface data?
>

They must be provided, but if there is no voxel data, their values don't
matter.

> 5. [-left-surface] and [-right-surface]
> - Should these midthickness files be an average
> across all subjects or can one subject’s midthickness.32k_fs_LR.surf.gii
> be used?
>

If you are running the command on an individual subject's data, then that
individual's surfaces are appropriate.

> *Following this command, we would use cifti-weighted-stats to extract the
> mean betas for each ROI for each subject.
>
> 6. Would we need to use cifti-label-import prior to running
> cifti-weighted-stats?
>

Yes, but you must also run -cifti-all-labels-to-roi on the label file.
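
Putting the pieces together, a rough sketch with hypothetical file names
(label_list.txt is a file you write yourself, see the -cifti-label-import help
for its format, and check the -cifti-weighted-stats help for the exact -roi
behavior if you try to use a multi-map ROI file directly):

wb_command -cifti-label-import clusters.dscalar.nii label_list.txt clusters.dlabel.nii
# one binary ROI map per label
wb_command -cifti-all-labels-to-roi clusters.dlabel.nii 1 rois.dscalar.nii
# weighted mean of the betas within each ROI, one ROI map at a time
nmaps=$(wb_command -file-information rois.dscalar.nii -only-number-of-maps)
for ((i = 1; i <= nmaps; ++i)); do
    wb_command -cifti-merge one_roi.dscalar.nii -cifti rois.dscalar.nii -column $i
    wb_command -cifti-weighted-stats betas.dscalar.nii -spatial-weights \
        -left-area-surf L.midthickness.32k_fs_LR.surf.gii \
        -right-area-surf R.midthickness.32k_fs_LR.surf.gii \
        -roi one_roi.dscalar.nii -mean
done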

> Thank you for clarifying!
> Shana
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Question: How to transform MNINonLinear-derived dscalar surface to T1w space?

2019-01-24 Thread Timothy Coalson
Surface data is different - we don't actually put surface coordinates into
dscalar, or any other cifti files (or metric files).  In our data, the
surface coordinates are only contained in .surf.gii files.  Getting the
surface data into "Native" volume space is as simple as using the surfaces
in the T1w folder, rather than the ones in MNINonLinear.  The same data
file is used.

The FAQ you found is actually about going between different surface meshes,
not coordinate spaces.  Put simply, freesurfer uses ~40,000 vertices in
their most relevant standard mesh, while our version uses ~32,000 vertices,
and those instructions are about how to jump that gap.  They don't actually
change the coordinates of any blob in the data or fold of the surface.

However, the subcortical data is in voxels, not surfaces, and voxel-based
data is tied to a specific volume space (and are MNINonLinear in our
standard grayordinates space).  You can use wb_command -cifti-separate with
the -volume-all option to extract all the voxel-based data, and then use
the appropriate volume warpfield to resample it, however this will give a
masked version, and the resampling will blur the edges of the mask somewhat
- if this is an issue, you should probably dilate the extracted subcortical
data before resampling it (and also resample the mask, and remask it
afterwards).
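
A rough sketch of that volume part, with hypothetical file names (the warpfield
and T1w volume come from the subject's MNINonLinear/xfms and T1w folders):

wb_command -cifti-separate data.dscalar.nii COLUMN \
    -volume-all data_MNI.nii.gz -roi roi_MNI.nii.gz
# dilate so the resampling doesn't blur in zeros from outside the mask
wb_command -volume-dilate data_MNI.nii.gz 10 WEIGHTED data_MNI_dil.nii.gz
wb_command -volume-warpfield-resample data_MNI_dil.nii.gz standard2acpc_dc.nii.gz \
    T1w_acpc_dc_restore.nii.gz TRILINEAR data_native.nii.gz -fnirt MNI152_T1_2mm.nii.gz
# resample roi_MNI.nii.gz the same way (ENCLOSING_VOXEL) and use it to re-mask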

Tim


On Thu, Jan 24, 2019 at 12:25 PM David Warren 
wrote:

> Hi all,
>
>   What is the procedure for transforming scalar surfaces in the
> MNINonLinear space back to a subject’s (processed) T1w space using
> wb_command tools?
>
>   Briefly, I am working with an independently-collected, HCP-like
> dataset which has been processed with the HCP Pipelines tools.  Based on
> processed outputs, I have a dscalar surface under MNINonLinear (in the
> fsaverage_LR32k) space that is congruent with surfaces appropriate for the
> MNINonLinear/ T1w_restore.nii.gz volume.  I would like to transform the
> dscalar surface so that it is instead congruent with surfaces appropriate
> for the T1w/T1w_acpc*.nii.gz volumes.
>
>   The PDF guide referenced on the wiki (
> https://wiki.humanconnectome.org/display/PublicData/HCP+Users+FAQ#HCPUsersFAQ-9.HowdoImapdatabetweenFreeSurferandHCP?)
> has some useful hints, but seems intended for transformations moving in the
> other direction.  Any help would be greatly appreciated.
>
>  Please let me know if I can provide any additional information.  Best,
>
> Dave
>
>
> --
>
> *David E. Warren, PhDAssistant Professor*
> Department of Neurological Sciences
>
> *University of Nebraska Medical Center*
> 988440 Nebraska Medical Center
> Omaha, NE 68198-8440
>
> phone 402.559.5805 | fax 402.559.3441
> david.war...@unmc.edu
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Sharing HCP derivatives

2019-01-18 Thread Timothy Coalson
BALSA was designed to support data use terms, not only does it have the HCP
data use terms available for easy selection when submitting a study, it
also allows entering other data use terms (so it can support non-HCP
datasets that require agreement to terms).

The HCP data use terms may not necessarily be transitive to derived data -
as I understand it, they basically say that it is up to your IRB (or
similar authority) as to what (if any) data use terms you should require on
your files derived from HCP data.  Obviously, if you used restricted data
in generating them, things would get a bit more complicated.

As for appropriateness, BALSA somewhat expects each dataset to be
associated with a publication.  It is intended mostly for highly-processed
results (of the kind shown in figures in a publication).  Additional data
beyond that may be okay, depending on its size.

Tim


On Fri, Jan 18, 2019 at 4:11 PM Ariel Rokem  wrote:

> Hello HCPers,
>
> Is BALSA appropriate for sharing detailed derivatives of HCP data? For
> example, if I've calculated maps of DKI metrics for all of the subjects in
> the HCP dMRI dataset, and am interested in sharing these maps with the
> broader community, together with an assessment and comparison with DTI
> metrics, would it be appropriate to share these individual-level maps
> through BALSA?
>
> In particular, since it shares credentials with ConnectomeDB, would BALSA
> take care of making sure that potential users of the data can access it
> only once they have agreed to the terms and conditions of use of the HCP
> dataset?
>
> Thanks!
>
> Ariel Rokem
> The University of Washington eScience Institute
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] CIFTI to Matlab

2019-01-17 Thread Timothy Coalson
Try doing:

system('<path>/wb_command');

Except using the exact string you are providing to ciftiopen's second
argument.  You should get the usage information for wb_command if the path
is correct and things are working.

Tim


On Thu, Jan 17, 2019 at 4:38 PM Anita Sinha  wrote:

> Matt/Michael,
>
>
> When I run g = gifti, I get g = gifti object 1-by-1. I have confirmed that
> I have the correct file paths to the wb_command, however, it seems to
> duplicate part of a file path string, which is why it cannot find the file.
> Is this a known issue?
>
>
> This is what I'm running:
>
>
> y = '/.../.../CM.dtseries.nii';
> ciftiFile = ciftiopen(y,'Z:/.../workbench/bin_windows64/wb_command');
> CM = ciftiFile.cdata;
>
> and receive the same error:
>
> Error using read_gifti_file_standalone (line 20)
> [GIFTI] Loading of XML file
> C:\Users\AMS217\AppData\Local\Temp\tp366ef3da_4687_4d6e_9b4a_bb7f0ce6fd8e.gii
> failed.
>
> Error in gifti (line 100)
> this = read_gifti_file_standalone(varargin{1},giftistruct);
>
> Error in ciftiopen (line 34)
> cifti = gifti([tmpfile '.gii']);
>
>
> Do I need to convert it to char or something so it is read in correctly?
>
>
> Regards,
>
>
> Anita Sinha
>
> Biomedical Engineering Graduate Student
>
> University of Wisconsin-Madison
>
> amsi...@wisc.edu
> --
> *From:* Glasser, Matthew 
> *Sent:* Thursday, January 17, 2019 4:27:17 PM
> *To:* Harms, Michael; Anita Sinha; hcp-users@humanconnectome.org
> *Subject:* Re: [HCP-Users] CIFTI to Matlab
>
> That could also be due to it not finding the file or not finding
> wb_command.
>
> Matt.
>
> From:  on behalf of "Harms,
> Michael" 
> Date: Thursday, January 17, 2019 at 4:05 PM
> To: Anita Sinha , "hcp-users@humanconnectome.org" <
> hcp-users@humanconnectome.org>
> Subject: Re: [HCP-Users] CIFTI to Matlab
>
>
>
> Hi,
>
> It sounds like you don’t have the gifti library properly installed.
>
>
>
> Within matlab, what happens when you type
>
> g = gifti
>
>
>
> Do you get a gifti structure?
>
>
>
> Cheers,
>
> -MH
>
>
>
>
>
> --
>
> Michael Harms, Ph.D.
>
> ---
>
> Associate Professor of Psychiatry
>
> Washington University School of Medicine
>
> Department of Psychiatry, Box 8134
>
> 660 South Euclid Ave.Tel: 314-747-6173
>
> St. Louis, MO  63110  Email: mha...@wustl.edu
>
>
>
> *From: * on behalf of Anita Sinha <
> amsi...@wisc.edu>
> *Date: *Thursday, January 17, 2019 at 3:37 PM
> *To: *"hcp-users@humanconnectome.org" 
> *Subject: *[HCP-Users] CIFTI to Matlab
>
>
>
> To Whom It May Concern,
>
>
>
> I am attempting to open up a dtseries.nii CIFTI file into Matlab to
> extract the time series matrix from rs-fMRI data. I have downloaded
> workbench and gifti-1.8 and am following the directions outlined in
> https://wiki.humanconnectome.org/pages/viewpage.action?pageId=63963178 in
> #2 "How do you get CIFTI files into Matlab, but it isn't working.
>
>
>
> When I run this command:
>
> cii = ciftiopen('path/to/file','path/to/wb_command'); CIFTIdata = cii.cdata
>
>
>
> with the appropriate file paths, I keep receiving this error:
>
>
>
> Error using read_gifti_file_standalone (line 20)
>
> [GIFTI] Loading of XML file
> C:\Users\...\AppData\Local\Temp\tp904b5934_f040_4259_8a21_0b10e15aecc8.gii
> failed.
>
>
>
> Error in gifti (line 100)
>
> this = read_gifti_file_standalone(varargin{1},giftistruct);
>
>
>
> Error in ciftiopen (line 34)
>
> cifti = gifti([tmpfile '.gii']);
>
>
>
> I have added workbench and gifti to the path and saved everything in the
> same directory to mitigate file directory mismatch, but cannot past this
> error.
>
>
>
> Could you provide some help on how to resolve this?
>
>
>
> Thank you for your time.
>
>
>
> Regards,
>
>
>
> Anita
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>
>
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>
>

Re: [HCP-Users] How can the diffusion HCP data be registered to the MNI space?

2019-01-17 Thread Timothy Coalson
The strongest directional information in the diffusion data is in the white
matter, so I assume you are computing some measure from the scans and
specifically want to study its value only in gray matter?

The main command for this purpose is wb_command -volume-to-surface-mapping,
and we recommend the ribbon mapping method (to capture the entire signal
between white and pial surfaces, and downweight partial volumed voxels).
We do not recommend downsampling the voxel data before this step; it would
be better to use the 1.25mm^3 data directly.  If you want the subcortical
gray matter in our standard cifti space, then you should make a downsampled
version for only that purpose (and still using the original 1.25mm^3 volume
in the -volume-to-surface-mapping part).  Note that you should generally do
a small amount of smoothing (on the order of the larger voxel size) before
downsampling volume files, in order to not effectively waste SNR, because
the volume resampling methods are point estimates.
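
For example, for the left hemisphere of one subject (a sketch with hypothetical
file names, using the native-mesh surfaces from the subject's T1w folder):

wb_command -volume-to-surface-mapping FA.nii.gz \
    Subject.L.midthickness.native.surf.gii FA.L.func.gii \
    -ribbon-constrained Subject.L.white.native.surf.gii Subject.L.pial.native.surf.gii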

You can look through the fMRISurface pipeline for how we go about doing
this for fMRI data.

Tim


On Thu, Jan 17, 2019 at 10:42 AM Yael Shavit  wrote:

> It seems that the diffusion HCP data is in native space in a resolution of
> 1.25mm^3. I am interested in projecting it to surface, in 2mm^3. What
> would be the best way to do so, preferably using the HCP Pipelines?
> Thank you in advance.
>
> Yael Shavit
> Tel Aviv University
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] About wb_command -cifti-resample

2019-01-15 Thread Timothy Coalson
As for your second question, the transform between MNINonLinear and
"Native" space (actually, undistorted, rigid registered) is a nonlinear
warpfield, not a 4x4 matrix (affine).  As I recall, we ignore the 4x4
matrices in gifti surface files (.surf.gii), as they have caused more
trouble than good.  To get a native-space surface, the best way is to start
with a native-space surface in the resampling, which are available inside
the T1w folder, rather than MNINonLinear.

Another terminology trap to warn of in advance: "native space" and "native
mesh" are completely unrelated concepts: "native space" means the volume
space, that is, what coordinates you will find an anatomical feature at.
"Native mesh" is just whatever topology (number of triangles and how they
are connected) comes out of freesurfer's segmentation and tessellation of
the surfaces - these concepts are orthogonal, we can and do provide (among
other things) native space fs_LR mesh surfaces, and MNINonLinear space
native mesh surfaces.

Tim


On Tue, Jan 15, 2019 at 8:33 AM Rigel Wang <
rigel.w...@neuroinformatics-collaboratory.org> wrote:

> Hi HCP team,
>
> Thanks a lot for your dedication to science.
>
> I am using your HCP-S900 dataset and Connectome Workbench for research.
>
> Specifically, in order to compare the E/MEG inverse solution, we are
> expecting to get the low resolution resting state fMRI on native space
> surface.
> We are trying to use HCP-S900 dataset for a test.
>
> For example, we want to downsample the
> *105923\MNINonLinear\Result\rfMRI_REST1_LR\rfMRI_REST1_LR_Atlas_hp2000_clean.dtseries.nii.*
>
> First, I use the wb_command -surface-resample in matlab to downsample
> the anatomical data (
> *system([wb_command, ' -surface-resample ',' ',file.highResSurface{hemi},'
> ',file.highResSphere{hemi},' ',file.lowResSphere{hemi},' ', res_method,'
> ',file.lowResSurface{hemi}]);) *to get a low resolution surface, which is
> expected to be around 3k vertices for each hemisphere. PS: downsampling
> the surface and projecting back to native space is for calculating the
> E/MEG leadfield.
>
> *Then I am trying to use the wb_command -cifti-resample to downsample the
> functional data(*
> *system([wb_command, ' -cifti-resample ',' ',file.highResCii,'
> ',res_src_direction,' ',cifti_templete,' ',res_template_direction,'
> ',res_surface_method,' ', res_volume_method,' ',file.lowResCii]);) . *
>
> *Question:*
>
> 1. I don't know what I should put in the <cifti-template> for wb_command
> -cifti-resample?
>
>
> 2. Can I use the same projection matrix from the
> \105923\MNINonLinear\fsaverage_LR32k\105923.R.{pial or
> white}.32k_fs_LR.surf.gii file, which is a 4x4 matrix, for projecting the
> low-resolution 3k_fs_LR output of -surface-resample back to
> individual native space?
>
>
> Thank you!
> Best,
> Rigel
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] 答复: 答复: About wb_command -cifti-resample

2019-01-15 Thread Timothy Coalson
Cifti files are defined not only by their resolution, but also by ROIs that
exclude uninteresting or redundant locations, in particular the medial wall
vertices and white matter voxels for fMRI.  This is the reason that
-cifti-resample needs a template, to define what is included/excluded.

I'm not all that familiar with the HCP MEG data, and the outputs may not be
in cifti format.  You can get the cortical data from a cifti file in
single-hemisphere gifti format using -cifti-separate, and you can use
-metric-resample on those, which is a simpler command (because the gifti
format does not have a provision for excluding vertices from the data
file).  You can also get the subcortical volume data using the same
command, though for volume downsampling you should smooth before resampling
in order not to waste signal (the volume resampling algorithms currently
available in wb_command are point estimates, unlike the adaptive
barycentric surface resampling).
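
For example, for the left cortex (a rough sketch; the 4k sphere and
midthickness files stand in for whatever low-resolution mesh you generate):

wb_command -cifti-separate rfMRI_REST1_LR_Atlas_hp2000_clean.dtseries.nii COLUMN \
    -metric CORTEX_LEFT rest.L.32k.func.gii
wb_command -metric-resample rest.L.32k.func.gii L.sphere.32k_fs_LR.surf.gii \
    L.sphere.4k_fs_LR.surf.gii ADAP_BARY_AREA rest.L.4k.func.gii \
    -area-surfs L.midthickness.32k_fs_LR.surf.gii L.midthickness.4k_fs_LR.surf.gii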

If the MEG data does provide cifti format files, you should be able to use
any of them as the template in -cifti-resample, as I would expect them to
all use the same defining ROIs.

Tim


On Tue, Jan 15, 2019 at 10:08 AM Rigel Wang <
rigel.w...@neuroinformatics-collaboratory.org> wrote:

> Hi Jan-Mathijs,
> Thank you for your interest.
> I want to resample the high spatial resolution fMRI signal to a low
> spatial resolution to compare with the MEG signal in source space, which is
> for comparing different M/EEG inverse methods.
> So I want to convert the two modalities' signals into the same individual
> coordinate system at low spatial resolution.
> But I have never done downsampling with HCP CIFTI files before. I want to use
> the Connectome Workbench command line for the downsampling, which avoids
> converting to other platforms' formats (for example, freesurfer's mri_surf2surf
> needs recon-all to segment the T1w into their format again before
> downsampling).
> However, the -cifti-resample <cifti-template> argument asks for a CIFTI
> format file as a target template.
> I don't know what it should be.
> I have put the command's usage description below.
> Thank you!
> Best,
> Rigel
>
>
> RESAMPLE A CIFTI FILE TO A NEW CIFTI SPACE
>wb_command -cifti-resample
>    <cifti-in> - the cifti file to resample
>    <direction> - the direction of the input that should be resampled, ROW or
>       COLUMN
>    <cifti-template> - a cifti file containing the cifti space to resample to
>    <template-direction> - the direction of the template to use as the
>       resampling space, ROW or COLUMN
>    <surface-method> - specify a surface resampling method
>    <volume-method> - specify a volume interpolation method
>    <cifti-out> - output - the output cifti file
>
>    [-surface-largest] - use largest weight instead of weighted average or
>       popularity when doing surface resampling
>
>    [-volume-predilate] - dilate the volume components before resampling
>       <dilate-mm> - distance, in mm, to dilate
>       [-nearest] - use nearest value dilation
>       [-weighted] - use weighted dilation (default)
>          [-exponent] - specify exponent in weighting function
>             <exponent> - exponent 'n' to use in (1 / (distance ^ n)) as the
>                weighting function (default 2)
>
>    [-surface-postdilate] - dilate the surface components after resampling
>       <dilate-mm> - distance, in mm, to dilate
>       [-nearest] - use nearest value dilation
>       [-linear] - use linear dilation
>       [-weighted] - use weighted dilation (default for non-label data)
>          [-exponent] - specify exponent in weighting function
>             <exponent> - exponent 'n' to use in (area / (distance ^ n)) as
>                the weighting function (default 2)
>
>    [-affine] - use an affine transformation on the volume components
>       <affine-file> - the affine file to use
>       [-flirt] - MUST be used if affine is a flirt affine
>          <source-volume> - the source volume used when generating the affine
>          <target-volume> - the target volume used when generating the affine
>
>    [-warpfield] - use a warpfield on the volume components
>       <warpfield> - the warpfield to use
>       [-fnirt] - MUST be used if using a fnirt warpfield
>          <source-volume> - the source volume used when generating the
>             warpfield
>
>    [-left-spheres] - specify spheres for left surface resampling
>       <current-sphere> - a sphere with the same mesh as the current left
>          surface
>       <new-sphere> - a sphere with the new left mesh that is in register
>          with the current sphere
>
>    [-left-area-surfs] - specify left surfaces to do vertex area correction
>       based on
>       <current-area> - a relevant left anatomical surface with current mesh
>       <new-area> - a relevant left anatomical surface with new mesh
>
>    [-left-area-metrics] - specify left vertex area metrics to do area
>       correction based on
>       <current-area> - a metric file with vertex areas for the current mesh
>       <new-area> - a metric file with vertex areas for the new mesh
>
>    [-right-spheres] - specify spheres for right surface resampling

Re: [HCP-Users] Inquiry of how to process HCP structural data

2019-01-10 Thread Timothy Coalson
For display, another possibility is to put the analysis results for all
areas into a parcellated cifti file, which will show each area in a color
representing the value for that area.
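
For example, once you have a template .pscalar.nii in the desired parcellation
(from -cifti-parcellate), importing one value per area is a one-liner; the text
file holds one value per line, in the template's parcel order (file names
hypothetical):

wb_command -cifti-convert -from-text area_values.txt template.pscalar.nii results.pscalar.nii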

Tim


On Thu, Jan 10, 2019 at 6:17 AM Glasser, Matthew  wrote:

>
>
>1. These would not be with the HCP's parcellation, but rather with
>    FreeSurfer’s gyral and sulcal parcellation.  The values are in the
>    subject’s physical brain space, not after registration to MNI space.
>2. You could load the label file in Connectome Workbench and turn off
>the other labels in the features toolbox.
>
> Matt.
>
> From: ndyeung 
> Date: Wednesday, January 9, 2019 at 10:00 PM
> To: Matt Glasser 
> Subject: Re: Inquiry of how to process HCP structural data
>
> Dear Dr Glasser,
>
>
> This is Andy from University of Hong Kong. I have read your lecture slide (
> https://wustl.app.box.com/s/k7hf1bv3vaaav04ge3hdhvlh92ikvwok) on HCP
> project, and would like to try my hand at Parcellated (i.e. area-wise)
> Analyses: answering the question “what brain areas are ...” (e.g. MNI
> data table) with correlational analyses of brain areas (GMV - thickness
> & area?)/behavioral score (e.g. taste intensity). My ROIs include
> taste-related areas, e.g. insula, orbitofrontal cortex, caudate,
> anterior cingulate, thalamus, etc.
>
>
> I have downloaded the 3T preprocessed structural data, but am confused by
> how to run my wanted analysis. Which file(s) should I use from
> the MNINonLinear folder, and what programs can I use to run the correlation?
>
>
> Or actually I don't need to download the MRI data, since the csv data
> already contains the computed GMV data, so I can directly correlate with
> behavioral scores? However, in this way I still have 2 questions:
>
> 1. The parcellated subcortical Volumes and cortical surface Thickness &
> Area were calculated after passing data via standard HCP minimal pipeline.
> Does it mean these values are normalized? Adjusted for individual brain
> size and any other factors?
>
> 2. Let's say R insula thickness & area correlate to taste intensity score.
> How can I produce a figure showing parcellated R insula (highlighted red or
> something) while the rest of the brain is grey?
>
>
> Thank you very much for your time and guidance. I look forward to hearing
> back from you.
>
>
> Best regards,
>
> Andy Yeung, BDS, PhD
>
> HKU Faculty of Dentistry
>
>
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] generating a CIFTI file with a different resolution

2019-01-07 Thread Timothy Coalson
The quick and dirty way is to resample the 1.6mm Atlas_ROIs.1.60.nii.gz
file to 1.25mm with wb_command -volume-affine-resample and the enclosing
voxel method.  You can then use that file in -cifti-create-dense-scalar,
along with cortical ROIs for each surface (see L.atlasroi.*.shape.gii in
the same grayordinate folders).
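
A rough sketch (identity.mat is a hypothetical text file holding a 4x4 identity
matrix, since only the voxel grid is changing; other names are placeholders):

wb_command -volume-affine-resample Atlas_ROIs.1.60.nii.gz identity.mat \
    diffusion_space.nii.gz ENCLOSING_VOXEL Atlas_ROIs.1.25.nii.gz
wb_command -cifti-create-dense-scalar measure.dscalar.nii \
    -volume measure_vol.nii.gz Atlas_ROIs.1.25.nii.gz \
    -left-metric measure.L.func.gii -roi-left L.atlasroi.32k_fs_LR.shape.gii \
    -right-metric measure.R.func.gii -roi-right R.atlasroi.32k_fs_LR.shape.gii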

However, this resampling will produce somewhat jagged edges of the
subcortical ROIs.  The subcortical labels come from freesurfer
segmentations originally (of Conte69, I believe), which I believe exist at
a higher resolution than 1.6mm, so it should be possible to construct a
better subcortical definition by starting from those.

Tim


On Sat, Jan 5, 2019 at 7:12 PM Sina Mansour L. <
sina.mansour.lakou...@gmail.com> wrote:

> The measure would be a simple scalar (assigning a single value to any
> vertex on the surface and voxel in the volume). However, the resolution
> would be 1.25mm as the volumes are created from the diffusion image. (I'm
> working with the HCP test retest data)
>
> In other words, is there any command that can generate a CIFTI file with a
> different volume resolution?
>
> On Sat, 5 Jan 2019 at 6:46 am, Timothy Coalson  wrote:
>
>> It is possible, though it makes comparisons to the existing 2mm cifti
>> data more challenging.  For instance, we have a 1.6mm space for our 7T
>> data, the files defining it are here:
>>
>>
>> https://github.com/Washington-University/HCPpipelines/tree/master/global/templates/170494_Greyordinates
>>
>> Making yet another resolution would start by resampling these kinds of
>> files.  Resampling your data into an existing cifti space is often
>> preferable, though, so it would be good to know what resolution you are
>> aiming for.
>>
>> Tim
>>
>>
>> On Fri, Jan 4, 2019 at 12:30 AM Sina Mansour L. <
>> sina.mansour.lakou...@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> I was wondering if there's a way to make a CIFTI file with a different
>>> resolution? Especially in the volume part of the subcortical regions.
>>> Basically, I'm interested in a projection of measures calculated for
>>> DWI images into a format that only shows the cortical surfaces and
>>> subcortical volumes (like the .dscalar.nii), but I wasn't sure what would
>>> be the right way to do so.
>>>
>>> Thanks,
>>> Sina
>>>
>> ___
>>>
>>
>>> HCP-Users mailing list
>>> HCP-Users@humanconnectome.org
>>> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>>>
>>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] generating a CIFTI file with a different resolution

2019-01-04 Thread Timothy Coalson
It is possible, though it makes comparisons to the existing 2mm cifti data
more challenging.  For instance, we have a 1.6mm space for our 7T data, the
files defining it are here:

https://github.com/Washington-University/HCPpipelines/tree/master/global/templates/170494_Greyordinates

Making yet another resolution would start by resampling these kinds of
files.  Resampling your data into an existing cifti space is often
preferable, though, so it would be good to know what resolution you are
aiming for.

Tim


On Fri, Jan 4, 2019 at 12:30 AM Sina Mansour L. <
sina.mansour.lakou...@gmail.com> wrote:

> Hi,
>
> I was wondering if there's a way to make a CIFTI file with a different
> resolution? Especially in the volume part of the subcortical regions.
> Basically, I'm interested in a projection of measures calculated for
> DWI images into a format that only shows the cortical surfaces and
> subcortical volumes (like the .dscalar.nii), but I wasn't sure what would
> be the right way to do so.
>
> Thanks,
> Sina
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Warping volume atlas and X, Y, Z of single point with standard2acpc_dc.nii.gz

2019-01-03 Thread Timothy Coalson
Yes, even if you did that and got it into the 2mm space correctly in FSL
conventions (this is not easy), the absolute warp file will have near-zero
values in one of the corners, even though the voxel that should have values
close to 0 should generally be towards the center of the volume.

Tim


On Thu, Jan 3, 2019 at 6:32 PM Glasser, Matthew  wrote:

> I believe that FSL convertwarp converts between relative and absolute
> conventions, though the FSL coordinates issue might prevent that from being
> helpful.
>
> Matt.
>
> From: Timothy Coalson 
> Date: Thursday, January 3, 2019 at 2:30 PM
> To: Matt Glasser 
> Cc: Georgios Michalareas , "hcp-users@humanconnectome.org"
> 
> Subject: Re: [HCP-Users] Warping volume atlas and X, Y, Z of single point
> with standard2acpc_dc.nii.gz
>
> On Thu, Jan 3, 2019 at 10:47 AM Glasser, Matthew 
> wrote:
>
>> 1) FSL does not respect Workbench’s header info, so the labels get
>> removed.  You might need to use wb_command -volume-resample or copy over
>> the header info.
>>
>
> Yes, use wb_command -volume-warpfield-resample and the enclosing voxel
> method.
>
>
>> 2) FSL has a utility to do this I believe (std2imgcoord), but it will be
>> slow (we used to use this for warping surfaces before we had the
>> wb_command. Not sure about the error message.
>>
>
> The wb_command error message means what it says - the warpfield is a
> volume file, and any coordinates that are outside of its bounding box have
> a completely unknown warp, because there is no voxel in that location to
> tell it what the displacement vector is.
>
> There isn't a direct way to put a coordinate through a warpfield in
> workbench, but since you specifically want to transform a voxel grid
> through it, you can get there with a few tricks.  First, convert the
> "native to standard" warpfield to "world" convention with wb_command
> -convert-warpfield - this convention gives at each voxel, the displacement
> that should be added to the coordinate (in proper nifti mm coordinates,
> rather than FSL's difficult conventions).  You can then resample this to
> the 2mm space with wb_command -volume-affine-resample (use an identity
> affine, and probably trilinear resampling) to get the 2mm voxel grid you
> want.  This is close, but the last step will need to be done manually,
> because the warpfield gives the relative displacement, but you want the
> absolute coordinate.  So, in each voxel, you need to add the coordinates of
> the center of the voxel - there is no command to do this, so do it in
> matlab, or python, or...
>
>
>> Matt.
>>
>> On 1/3/19, 10:31 AM, "hcp-users-boun...@humanconnectome.org on behalf of
>> Georgios Michalareas" > g...@ae.mpg.de> wrote:
>>
> >Hi and Happy New Year,
>> >
>> >
>> >I have 2 questions regarding warping a volume atlas  and X,Y,Z
>> >coordinates of single points with the  standard2acpc_dc.nii.gz warp.
>> >
>> >Excuse me If my questions are too naive. Here they are:
>> >
>> >
>> >Question 1:
>> >
>> >==
>> >
>> >I would like to transform the volume atlas file
>> >
>> >MNINonLinear/ROIs/Atlas_ROIs.2.nii.gz
>> >
>> >from standard space to native space using the warp
>> >
>> >MNINonLinear/xfms/standard2acpc_dc.nii.gz
>> >
>> >I tried using "applywarp"
>> >
>> >'applywarp  --in=Atlas_ROIs.2.nii.gz  --ref=T1_native.nii.gz
>> >--out=WARPED_Atlas_Native.nii.gz
>>
>> >--warp=/mnt/beegfs/home/georgios.michalareas/workspace/projects/matthprob_
>> >MEG/data/smri/raw_nifti/BRS27/MNINonLinear/xfms/standard2acpc_dc.nii.gz
>> >--interp=spline'
>> >
> >but in the resulting nifti there is no atlas label information.
>> >
>> >
>> >Question 2:
>> >
>> >==
>> >
>> >I would like to take the X,Y,Z coordinates of any voxel J in the 2mm
>> >template volume
>> >
>> >HCPpipelines-3.27.0/global/templates/MNI152_T1_2mm.nii.gz
>> >
>> >apply the standard2acpc_dc.nii.gz  warping to them and get the new X,Y,Z
>> >coordinates of J in the native space.
>> >
>> >The reason I want to do this is to make a regular reference 2 mm grid
>> >from the corrdinates of the voxels of the 2mm template volume and warp
>> >each of the grid vertices to native space.
>> >
>> >I made a surf.gii

Re: [HCP-Users] Warping volume atlas and X, Y, Z of single point with standard2acpc_dc.nii.gz

2019-01-03 Thread Timothy Coalson
On Thu, Jan 3, 2019 at 10:47 AM Glasser, Matthew  wrote:

> 1) FSL does not respect Workbench’s header info, so the labels get
> removed.  You might need to use wb_command -volume-resample or copy over
> the header info.
>

Yes, use wb_command -volume-warpfield-resample and the enclosing voxel
method.


> 2) FSL has a utility to do this I believe (std2imgcoord), but it will be
> slow (we used to use this for warping surfaces before we had the
> wb_command. Not sure about the error message.
>

The wb_command error message means what it says - the warpfield is a volume
file, and any coordinates that are outside of its bounding box have a
completely unknown warp, because there is no voxel in that location to tell
it what the displacement vector is.

There isn't a direct way to put a coordinate through a warpfield in
workbench, but since you specifically want to transform a voxel grid
through it, you can get there with a few tricks.  First, convert the
"native to standard" warpfield to "world" convention with wb_command
-convert-warpfield - this convention gives at each voxel, the displacement
that should be added to the coordinate (in proper nifti mm coordinates,
rather than FSL's difficult conventions).  You can then resample this to
the 2mm space with wb_command -volume-affine-resample (use an identity
affine, and probably trilinear resampling) to get the 2mm voxel grid you
want.  This is close, but the last step will need to be done manually,
because the warpfield gives the relative displacement, but you want the
absolute coordinate.  So, in each voxel, you need to add the coordinates of
the center of the voxel - there is no command to do this, so do it in
matlab, or python, or...
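
A sketch of the first two steps, assuming the usual HCP file names
(acpc_dc2standard.nii.gz, T1w_acpc_dc.nii.gz; identity.mat is a hypothetical
text file holding a 4x4 identity matrix):

# fnirt-convention warp -> world convention (displacements in mm)
wb_command -convert-warpfield -from-fnirt acpc_dc2standard.nii.gz T1w_acpc_dc.nii.gz \
    -to-world acpc_dc2standard_world.nii.gz
# resample the displacement field onto the 2mm MNI voxel grid
wb_command -volume-affine-resample acpc_dc2standard_world.nii.gz identity.mat \
    MNI152_T1_2mm.nii.gz TRILINEAR displacements_2mm.nii.gz
# last step: add each voxel's center coordinate to its displacement, in matlab/python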


> Matt.
>
> On 1/3/19, 10:31 AM, "hcp-users-boun...@humanconnectome.org on behalf of
> Georgios Michalareas"  g...@ae.mpg.de> wrote:
>
> >Hi and Happy New Year,
> >
> >
> >I have 2 questions regarding warping a volume atlas  and X,Y,Z
> >coordinates of single points with the  standard2acpc_dc.nii.gz warp.
> >
> >Excuse me If my questions are too naive. Here they are:
> >
> >
> >Question 1:
> >
> >==
> >
> >I would like to transform the volume atlas file
> >
> >MNINonLinear/ROIs/Atlas_ROIs.2.nii.gz
> >
> >from standard space to native space using the warp
> >
> >MNINonLinear/xfms/standard2acpc_dc.nii.gz
> >
> >I tried using "applywarp"
> >
> >'applywarp  --in=Atlas_ROIs.2.nii.gz  --ref=T1_native.nii.gz
> >--out=WARPED_Atlas_Native.nii.gz
> >--warp=/mnt/beegfs/home/georgios.michalareas/workspace/projects/matthprob_
> >MEG/data/smri/raw_nifti/BRS27/MNINonLinear/xfms/standard2acpc_dc.nii.gz
> >--interp=spline'
> >
> >but in the resulting nifti there is no atlas label information.
> >
> >
> >Question 2:
> >
> >==
> >
> >I would like to take the X,Y,Z coordinates of any voxel J in the 2mm
> >template volume
> >
> >HCPpipelines-3.27.0/global/templates/MNI152_T1_2mm.nii.gz
> >
> >apply the standard2acpc_dc.nii.gz  warping to them and get the new X,Y,Z
> >coordinates of J in the native space.
> >
> >The reason I want to do this is to make a regular reference 2 mm grid
> >from the coordinates of the voxels of the 2mm template volume and warp
> >each of the grid vertices to native space.
> >
> >I made a surf.gii file containing a pseudo-surface which had as vertices
> >the coordinates of all voxels of the 2mm template MNI152_T1_2mm.nii.gz ,
> >and a dummy single face:
> >
> >g=
> >faces: [1 2 3]
> >  mat: [4×4 double]
> > vertices: [902629×3 single]
> >
> >I then applied
> >
> >wb_command -surface-apply-warpfield  mrigrid.surf.gii
> >Tfm_native2standard.nii.gz  WARPED2NATIVE_mrigrid.surf.gii -fnirt
> >Tfm_standard2native.nii.gz
> >
> >but I got the following error
> >
> >While running:
> >/mnt/beegfs/home/georgios.michalareas/workspace/toolboxes/other/workbench-
> >v1.3.2/workbench/bin_rh_linux64/../exe_rh_linux64/wb_command
> >-surface-apply-warpfield mrigrid.surf.gii Tfm_native2standard.nii.gz
> >WARPED2NATIVE_mrigrid.surf.gii -fnirt Tfm_standard2native.nii.gz
> >
> >ERROR: surface exceeds the bounding box of the warpfield
> >
> >Is there a way to just transform any X,Y,Z of standard space to the
> >corresponding X,Y,Z of Native Space.
> >
> >
> >Thank you very much for your help
> >
> >Best
> >
> >Giorgos
> >
> >
> >
> >
> >
> >
> >
> >
> >--
> >
> >Dr. Georgios Michalareas
> >Neuroscience Department
> >Max Planck Institute for Empirical Aesthetics
> >
> >email: g...@aesthetics.mpg.de
> >phone: +49 69 8300479-325
> >
> >
> >___
> >HCP-Users mailing list
> >HCP-Users@humanconnectome.org
> >http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>
>

Re: [HCP-Users] Question about workbench calculation

2018-12-28 Thread Timothy Coalson
If you mean you just want to combine the magnitudes (the main output of
-cifti-gradient) across timepoints, that isn't hard.  There isn't a built
in option to do it, but you can do it afterwards by -cifti-math to square
everything, -cifti-reduce to sum across time, and then -cifti-math to
square root afterwards.  L2 norm may be a candidate for adding to
-cifti-reduce.
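
For example (file names hypothetical, starting from the magnitude output of
-cifti-gradient on the timeseries):

wb_command -cifti-math 'x * x' grad_sq.dtseries.nii -var x grad.dtseries.nii
wb_command -cifti-reduce grad_sq.dtseries.nii SUM grad_sumsq.dscalar.nii
wb_command -cifti-math 'sqrt(x)' grad_l2.dscalar.nii -var x grad_sumsq.dscalar.nii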

If you want the partials separated by spatial direction (dx, dy, dz), that
is a bit more involved to combine across timepoints, as the vector output
option has 3x as many maps, and they are interleaved (dx1, dy1, dz1, dx2,
dy2, etc).  You would need to build some long -cifti-separate commands to
make a file containing (dx1, dx2, dx3, etc), and similar for y and z.  In
hindsight, outputting 3 files from that option would probably have been a
better design.

Tim


On Fri, Dec 28, 2018 at 1:07 PM Montez, David 
wrote:

> Hi HCP workbench folks,
>
> I’m trying to perform a particular calculation on a resting state data set
> using wb_command and, if possible, would like some advice on how to go
> about it most directly.
>
> I’d like to compute the magnitude of a gradient at each point
> (voxel/vertex) on a surface and volume for a case in which each
> voxel/vertex corresponds to a vector valued function.
>
> To be explicit consider a volumetric data set V(x,y,z,t), where x,y,z are
> spatial coordinates and t is a time point. This could be just a typical
> resting state scan. Let  0 ≤ t ≤ m, where m is the last time point.
>
> For a given location, x0, y0,z0, the magnitude of the partial vector
> derivative dV/dx is given by:
>
> note: vector magnitudes are denoted by || ||, which is just the usual L2 norm
>
> || dV/dx || = || V(x0+dx, y0, z0, t=0,1,2,...,m) - V(x0, y0, z0,
> t=0,1,2,...,m) ||
>
> This is basically the magnitude of the vector difference of time series
> for two adjacent voxels
>
> It would appear that the workbench command -cifti-gradient is the
> appropriate starting place, but it is unclear how the command should be set
> up since computing the L2-norm would require calculating the sum-of-squares
> of the differences across all time points and the only option that I see is
> available is -average-output, which would not serve my purpose, as it would
> simply perform the arithmetic mean across the time points of the differences.
>
> This would be a relatively easy thing to implement in volume space, but
> I’d like to take advantage of working in surface-land for all of the usual
> reasons.
>
> Can you advise on what would be the most straightforward way to perform
> the calculation for || dV/dx || as outlined above.
>
> Many thanks!
>
> David Montez
>
>
>
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] export workbench scenes as vector graphics

2018-12-27 Thread Timothy Coalson
Scene files are not a vector format internally, they store the entire GUI
state of wb_view (loaded files, window sizes, tab types and order, order of
layers in tabs, palette settings), and when they are displayed (or captured
via wb_command), the data files are loaded and all the display logic in
workbench is used to rasterize the result, effectively as a bitmap (which
then gets compressed into the chosen raster image format).  Vector graphics
are instead collections of lines, curves, text, and other simple
primitives, which are most suited for things like diagrams and charts
(though in many formats, PDF in particular, you can embed raster
graphics).  Vector formats are not suitable for general purpose image
storage, such as photographs or nontrivial 3D renderings.

The annotations themselves, without whatever they are displayed on top of,
are quite similar to vector graphics.  I don't know if they would be useful
by themselves.

Tim


On Thu, Dec 27, 2018 at 5:59 PM Ely, Benjamin  wrote:

> Thanks Matt, thought so but wanted to make sure. That should work for now.
>
> CC-ing John Harwell since he had a detailed answer for my last wb_view
> question. John, do you know if this would this be feasible to add as a
> feature? As Matt noted, the scenes already contain the image/annotation
> data in something equivalent to a vector format internally.
>
> Thank you!
> -Ely
>
> From: "Glasser, Matthew" 
> Date: Thursday, December 27, 2018 at 6:38 PM
> To: Benjamin Ely , HCP User List <
> hcp-users@humanconnectome.org>
> Subject: Re: [HCP-Users] export workbench scenes as vector graphics
>
> Hi Ely,
>
> Not at this time.  You can export at whatever resolution you want,
> however, so you could think of the scene as the vector graphics file and
> export at the desired resolution for a given purpose.
>
> Matt.
>
> From:  on behalf of "Ely,
> Benjamin" 
> Date: Thursday, December 27, 2018 at 5:17 PM
> To: HCP User List 
> Subject: [HCP-Users] export workbench scenes as vector graphics
>
> Hi everyone,
>
> I’ve created a number of figure panels using workbench viewer and would
> like to export them to vector-format graphics files (e.g. EPS or PDF) for
> use in a manuscript. Is there a way to do this? While there are many
> file-type options in the wb_view Capture Image menu and the wb_command
> -show–scene command, all of them appear to be raster formats, which aren’t
> ideal for use in publications. If this isn’t currently an option, would it
> be possible to implement for the next version release? I’m using v1.3.2 of
> workbench on Mac OSX v10.13.6, btw.
>
> Thanks and happy new year!
> -Ely
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
> 
>
>
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Wb-command Problem

2018-12-27 Thread Timothy Coalson
To expand on Jenn's answer, one way to see what wb_command does on windows
is to open "command prompt", cd to the "bin_windows64" folder where you
unzipped workbench, and then type "wb_command" and press enter.
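
For example, assuming workbench was unzipped to C:\workbench (adjust to your
actual location):

cd C:\workbench\bin_windows64
wb_command

This should print the usage information in the window, rather than opening
and immediately closing a window.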

If you do as the README.txt suggests and add that folder onto your PATH
environment variable, you will then be able to use "wb_command" in the
command prompt no matter what folder you are in.

Note, however, that "wb_shortcuts" is a bash script, and windows (unlike
mac and linux) does not understand bash scripts by default.  One solution
to this is Cygwin:

https://www.cygwin.com/

There is also "windows subsystem for linux", however it will need to use
the *linux* distribution of workbench, rather than the one built for
windows.

Note that the HCP pipelines, in addition to being written in bash, also use
other software (mainly freesurfer and FSL), which generally don't support
windows directly (instead they may say to run a linux virtual machine).
These software are mainly developed on linux because computing clusters
generally run linux (to reduce/avoid licensing hassles, to have more
control over the OS, and for better support for scripting).

Tim


On Thu, Dec 27, 2018 at 11:53 AM Elam, Jennifer  wrote:

> Hi Ben,
>
> You are not intended to double click wb_command or wb_import as
> applications, they are utilities that are used from a terminal or script
> (generally, after adding their location to your $PATH, per the install
> instructions in README.txt).  Double clicking them from a folder window is
> expected to make a window that closes immediately.
>
>
> Best,
>
> Jenn
>
>
> Jennifer Elam, Ph.D.
> Scientific Outreach, Human Connectome Project
> Washington University School of Medicine
> Department of Neuroscience, Box 8108
> 660 South Euclid Avenue
> St. Louis, MO 63110
> 314-362-9387
> e...@wustl.edu
> www.humanconnectome.org
>
> --
> *From:* hcp-users-boun...@humanconnectome.org <
> hcp-users-boun...@humanconnectome.org> on behalf of Benjamin Bao <
> benjamin9...@gmail.com>
> *Sent:* Thursday, December 27, 2018 9:41:06 AM
> *To:* hcp-users@humanconnectome.org
> *Subject:* [HCP-Users] Wb-command Problem
>
> Hello,
> I have recently downloaded the workbench program on my Windows 10
> system. When I was trying to set a minimum cluster size, I read that I
> need to use the wb_command program. However, immediately after I try to
> open the program, it shuts off. I am wondering how I should proceed?
>I would really appreciate your help!
>
> Sincerely
> Ben Bao
>
> Nipissing University
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Time series data

2018-12-19 Thread Timothy Coalson
We generally do use timeseries for single-subject analysis.  The only
involvement of ICA there is in cleaning up things like artifacts - think of
it as using ICA to identify nuisance regressors.  The end result is still a
timeseries, but with greatly reduced artifacts.

You can use wb_command -cifti-parcellate to take a dtseries file and take
the within-ROI average timeseries of every area in a dlabel file.
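
For example (file names hypothetical):

wb_command -cifti-parcellate cleaned.dtseries.nii parcels.dlabel.nii COLUMN \
    parcellated.ptseries.nii

The output .ptseries.nii contains one averaged timeseries per label in the
dlabel file.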

The main thing I know of where we actually use ICA/PCA components as the
data to be analyzed is in MIGP, mainly as a way to do things that would
normally require concatenating the timeseries of all subjects, which would
take too much memory.

Tim


On Wed, Dec 19, 2018 at 5:46 PM Rakib Al-Fahad (ralfahad) <
ralfa...@memphis.edu> wrote:

> Matt,
>
>
>
> I agree with your point.
>
>
>
> My specialization is in signal processing and machine learning. I am not
> sure about ICA based time series. If we consider 100 ICA components, how
> can we define each node name for graph theoretical measures? For example,
> the paper ‘Chronnectomic patterns and neural flexibility underlie executive
> function” [Jason et al. NeuroImage 147 (2017): 861-871] talked about 100
> components. They define DNN, subcortical, frontal, etc. networks. Can you
> give me any reference that can guide which component belongs to which
> network or how to name them?
>
>
>
> If somehow, I discover connectivity between Component_1 and Component_3 is
> significantly related to some behavior, how can I express it in
> neuroscience term? I believe my question is clear now.
>
>
>
>
>
> Thanks
>
> Rakib
>
>
>
> *From: *"Glasser, Matthew" 
> *Date: *Wednesday, December 19, 2018 at 5:28 PM
> *To: *"Rakib Al-Fahad (ralfahad)" , "
> HCP-Users@humanconnectome.org" 
> *Subject: *Re: [HCP-Users] Time series data
>
>
>
> What neuroscience questions are ICA timeseries unable to answer?
> Particularly that gyral/sulcal folding-based parcellation would be able to
> answer?  If you want a neuroanatomical parcellation into cortical areas,
> this is available here:
>
>
>
> https://balsa.wustl.edu/file/show/3VLx
>
>
>
> Matt.
>
>
>
> *From: * on behalf of "Rakib
> Al-Fahad (ralfahad)" 
> *Date: *Wednesday, December 19, 2018 at 4:40 PM
> *To: *"HCP-Users@humanconnectome.org" 
> *Subject: *[HCP-Users] Time series data
>
>
>
> Hello All,
>
>
>
> I want to analyze time series data and dynamic brain connectivity from
> rfMRI data. I don’t want to use ICA based time series because they cannot
> answer a lot of neuroscience questions. I prefer ROI based analysis. A 45+ ROI
> template would be useful (e.g. FreeSurfer template). Is it possible to use
> workbench on processed data to extract ROI based time series, or I have to
> run FreeSurfer on data? Please help me with some guidance and reference.
>
>
>
>
> [image: M logo]
>
> Rakib Al-Fahad
> Ph.D. Candidate
> Electrical and Computer Engineering
> The University of Memphis
> 901.279.4128
>
>
>
>
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>
>
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] How to display an atlas created by FreeSurfer using workbench

2018-12-10 Thread Timothy Coalson
The color and alpha numbers should be integers from 0 to 255.
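
For reference, the label-list text file alternates a name line with a "key
red green blue alpha" line, for example (names and colors made up):

Region_1
1 255 0 0 255
Region_2
2 0 255 0 255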

Tim


On Mon, Dec 10, 2018 at 6:56 PM Zaixu Cui 
wrote:

> Hi Tim,
>
> Cool. Thank you so much. It works. I just used the command 'wb_command
> -metric-label-import 120292_lh.func.gii '' 120292_lh.label.gii'.
>
> I set the second parameter as null and the color was created as default.
> Is there an example of the label-list text file? I used the attached
> file but it failed. Are there any problems with this file?
>
> Thank you so much.
>
> Best wishes
> -
> Zaixu
>
>
>
> On Mon, 10 Dec 2018 at 19:02, Timothy Coalson  wrote:
>
>> If you save those vectors of values as .func.gii files (maybe this is how
>> you made the .func.gii files you have?), you can use wb_command
>> -metric-label-import to turn them into .label.gii files:
>>
>>
>> https://www.humanconnectome.org/software/workbench-command/-metric-label-import
>>
>> To get them onto HCP surfaces, see FAQ #9 here:
>>
>>
>> https://wiki.humanconnectome.org/display/PublicData/HCP+Users+FAQ#HCPUsersFAQ-9.HowdoImapdatabetweenFreeSurferandHCP
>> ?
>>
>> Tim
>>
>>
>> On Mon, Dec 10, 2018 at 5:50 PM Zaixu Cui 
>> wrote:
>>
>>> Hi, developers,
>>>
>>> Thank you so much for developing such a fantastic software. I am trying
>>> to visualize an atlas in FreeSurfer fsaverage5 space, but failed to create
>>> *.label.gii file for my atlas.
>>>
>>> I have successfully converted fsaverage5 template into
>>> *.midthickness.surf.gii and converted my atlas into *.func.gii file. Then,
>>> workbench can display the atlas as a metric map.
>>>
>>> However, I need to display it as an atlas, with each region a different
>>> color. So, I tried to convert the atlas into *.label.gii. My atlas is a
>>> 10242*1 vector for left hemisphere and a 10242*1 vector for right
>>> hemisphere. The values in the atlas range from 1 to 100. I tried a lot, but
>>> finally failed. Is there a manual on how to convert a vector into a
>>> *.label.gii file that is accepted by workbench?
>>>
>>> Thank you so much.
>>> Best
>>> -
>>> Zaixu Cui, Ph.D.
>>> Postdoc fellow at Department of Psychiatry, Perelman School of Medicine
>>> University of Pennsylvania
>>> E-mail: zaixu...@gmail.com
>>>  zaixu@pennmedicine.upenn.edu
>>>
>>> ___
>>> HCP-Users mailing list
>>> HCP-Users@humanconnectome.org
>>> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>>>
>>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] How to display an atlas created by FreeSurfer using workbench

2018-12-10 Thread Timothy Coalson
If you save those vectors of values as .func.gii files (maybe this is how
you made the .func.gii files you have?), you can use wb_command
-metric-label-import to turn them into .label.gii files:

https://www.humanconnectome.org/software/workbench-command/-metric-label-import

To get them onto HCP surfaces, see FAQ #9 here:

https://wiki.humanconnectome.org/display/PublicData/HCP+Users+FAQ#HCPUsersFAQ-9.HowdoImapdatabetweenFreeSurferandHCP
?

Tim


On Mon, Dec 10, 2018 at 5:50 PM Zaixu Cui 
wrote:

> Hi, developers,
>
> Thank you so much for developing such a fantastic software. I am trying to
> visualize an atlas in FreeSurfer fsaverage5 space, but failed to create
> *.label.gii file for my atlas.
>
> I have successfully converted fsaverage5 template into
> *.midthickness.surf.gii and converted my atlas into *.func.gii file. Then,
> workbench can display the atlas as a metric map.
>
> However, I need to display it as an atlas, with each region a different
> color. So, I tried to convert the atlas into *.label.gii. My atlas is a
> 10242*1 vector for left hemisphere and a 10242*1 vector for right
> hemisphere. The values in the atlas range from 1 to 100. I tried a lot, but
> finally failed. Is there a manual on how to convert a vector into a
> *.label.gii file that is accepted by workbench?
>
> Thank you so much.
> Best
> -
> Zaixu Cui, Ph.D.
> Postdoc fellow at Department of Psychiatry, Perelman School of Medicine
> University of Pennsylvania
> E-mail: zaixu...@gmail.com
>  zaixu@pennmedicine.upenn.edu
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] average dconn from individual dconns

2018-11-28 Thread Timothy Coalson
wb_command -cifti-help has a cheat sheet (and other useful explanation):

...
   The common types of cifti files and the mapping types they use are:
  dconn: ROW is dense, COLUMN is dense
  dscalar: ROW is scalars, COLUMN is dense
  dtseries: ROW is series, COLUMN is dense
...

"dense" is the mapping type that represents vertices and voxels, is the one
you need to modify.  If you want to do it to the dconns, then you will need
to do it twice, because both dimensions are dense.
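
For example, to exclude locations from both dimensions of a dconn (ROI file
names hypothetical):

wb_command -cifti-restrict-dense-map in.dconn.nii COLUMN temp.dconn.nii \
    -left-roi L.roi.shape.gii -right-roi R.roi.shape.gii
wb_command -cifti-restrict-dense-map temp.dconn.nii ROW out.dconn.nii \
    -left-roi L.roi.shape.gii -right-roi R.roi.shape.gii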

You can look through the hcp-users history here:

https://www.mail-archive.com/hcp-users@humanconnectome.org/

Tim



On Wed, Nov 28, 2018 at 8:44 PM Kenley, Jeanette  wrote:

> Thank you Tim and Matt for all the information.
>
>
> I currently am not using the HCP and unfortunately do not have the time to
> reprocess the data.
>
>
> I greatly appreciate the detailed explanation of the metric files and the
> difference between that and the label files.  I am very new to surfaces and
> so everything is helpful!
>
>
> Just a couple more questions please.
>
> How do I know whether I want to use the 'COLUMN' or 'ROW' option?
>
> Is there a message board that I can look through other previously asked
> questions for future help?
>
>
> Thanks again,
>
> Jeanette
>
>
> --
> *From:* Glasser, Matthew
> *Sent:* Wednesday, November 28, 2018 6:47 PM
> *To:* NEUROSCIENCE tim; Kenley, Jeanette
> *Cc:* hcp-users; Kaplan, Sydney
> *Subject:* Re: [HCP-Users] average dconn from individual dconns
>
> To be more specific: In the HCP we use a technique called MIGP to make
> group fMRI data and generate dense connectomes from that.  Concatenated
> dense timeseries get very large when you have a lot of subjects and
> timepoints.
>
> Matt.
>
> From:  on behalf of Timothy
> Coalson 
> Date: Wednesday, November 28, 2018 at 2:54 PM
> To: "Kenley, Jeanette" 
> Cc: hcp-users , "Kaplan, Sydney" <
> sydney.kap...@wustl.edu>
> Subject: Re: [HCP-Users] average dconn from individual dconns
>
> The HCP pipelines deliberately resample the subcortical data in such a way
> that the subcortical voxels used in each subject are the same, this is how
> we handle the problem you are having.
>
> If you concatenate your timeseries across subjects before correlation, you
> don't need to generate a dconn for every subject, so this strategy uses
> less disk space (but you have to be careful about normalization before
> concatenation).
>
> Label files are not ROI files, though they might work in this case (you
> could instead use the .shape.gii files here:
> https://github.com/Washington-University/HCPpipelines/tree/master/global/templates/91282_Greyordinates).
> An ROI file is supposed to be a metric file (.func.gii, .shape.gii) of just
> 1's and 0's (the actual logic used is "greater than zero").  Label files
> can be used as metric files, the warning is just to let you know that you
> did something unusual.  I'd need to see the actual error message to comment
> on it.
>
> If you don't want to go back to the timeseries (or reprocess with the HCP
> pipelines), -cifti-restrict-dense-map may be the simplest solution.
>
> Tim
>
>
> On Wed, Nov 28, 2018 at 2:00 PM Kenley, Jeanette 
> wrote:
>
> I am still new to the wb_command suite and trying to understand how to
> best use them.
>
>
> I have created an individual cifti (dconn.nii) for each of my subjects in
> 32kfslr and would like to make an average.
>
> I would like to use
>
> wb_command -cifti-average output.dconn.nii -cifti subject1.dconn.nii -cift
> subject2.dconn.nii . . .-cifti subjectN.dconn.nii
>
>
> But this results in a dimension mismatch since the sub cortical portions
> of the individual subjects are different.
>
>
> Is there a way to only grab the surface data from each of the individual
> dconns so I do not get the dimension error.
>
> I tried using
>
> wb_command -cifti-restrict-dense-map  subjectN.dconn.nii 'ROW/COLUMN'
> subjectN_output.dconn.nii -left-roi $left -right-roi $right
>
>
> where the left and right rois are:
>
> subjectN.L.aparc.32k_fs_LR.label.gii
>
> subjectN.R.aparc.32k_fs_LR.label.gii
>
>
> but since I am newer to the wb_commands, I am not sure what each of the
> inputs should be or if this is the correct command.
>
>
> This also resulted in an error.
> WARNING: Metric File: subjectN.L.aparc.32k_fs_LR.label.gii contains data
> array with NIFTI_INTENT_LABEL !!!
>
> WARNING: Metric File: subjectN.R.aparc.32k_fs_LR.label.gii contains data
> array with NIFTI_INTENT_LABEL !!!
>
> As well as an error when I tried to use the file saying something about
> fields not being the same size.

Re: [HCP-Users] average dconn from individual dconns

2018-11-28 Thread Timothy Coalson
The HCP pipelines deliberately resample the subcortical data in such a way
that the subcortical voxels used in each subject are the same, this is how
we handle the problem you are having.

If you concatenate your timeseries across subjects before correlation, you
don't need to generate a dconn for every subject, so this strategy uses
less disk space (but you have to be careful about normalization before
concatenation).
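
A sketch of that strategy (file names hypothetical - and per the caveat
above, consider variance-normalizing each subject's timeseries before
merging):

wb_command -cifti-merge all_subjects.dtseries.nii \
    -cifti sub1.dtseries.nii -cifti sub2.dtseries.nii -cifti sub3.dtseries.nii
wb_command -cifti-correlation all_subjects.dtseries.nii group.dconn.nii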

Label files are not ROI files, though they might work in this case (you
could instead use the .shape.gii files here:
https://github.com/Washington-University/HCPpipelines/tree/master/global/templates/91282_Greyordinates).
An ROI file is supposed to be a metric file (.func.gii, .shape.gii) of just
1's and 0's (the actual logic used is "greater than zero").  Label files
can be used as metric files, the warning is just to let you know that you
did something unusual.  I'd need to see the actual error message to comment
on it.

If you don't want to go back to the timeseries (or reprocess with the HCP
pipelines), -cifti-restrict-dense-map may be the simplest solution.

Tim


On Wed, Nov 28, 2018 at 2:00 PM Kenley, Jeanette  wrote:

> I am still new to the wb_command suite and trying to understand how to
> best use them.
>
>
> I have created an individual cifti (dconn.nii) for each of my subjects in
> 32kfslr and would like to make an average.
>
> I would like to use
>
> wb_command -cifti-average output.dconn.nii -cifti subject1.dconn.nii -cift
> subject2.dconn.nii . . .-cifti subjectN.dconn.nii
>
>
> But this results in a dimension mismatch since the sub cortical portions
> of the individual subjects are different.
>
>
> Is there a way to only grab the surface data from each of the individual
> dconns so I do not get the dimension error.
>
> I tried using
>
> wb_command -cifti-restrict-dense-map  subjectN.dconn.nii 'ROW/COLUMN'
> subjectN_output.dconn.nii -left-roi $left -right-roi $right
>
>
> where the left and right rois are:
>
> subjectN.L.aparc.32k_fs_LR.label.gii
>
> subjectN.R.aparc.32k_fs_LR.label.gii
>
>
> but since I am newer to the wb_commands, I am not sure what each of the
> inputs should be or if this is the correct command.
>
>
> This also resulted in an error.
> WARNING: Metric File: subjectN.L.aparc.32k_fs_LR.label.gii contains data
> array with NIFTI_INTENT_LABEL !!!
>
> WARNING: Metric File: subjectN.R.aparc.32k_fs_LR.label.gii contains data
> array with NIFTI_INTENT_LABEL !!!
>
> As well as an error when I tried to use the file saying something about
> fields not being the same size.
>
>
> Or is there a way to generate the surface dconns via command line without
> the subcortical information so I don't get the dimension mismatch error
> when I try to make an average?
>
>
> Any help is greatly appreciated.  And specific example file names are also
> very much appreciated.
>
>
> Thanks!
>
>
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] HCP parcellation template

2018-11-28 Thread Timothy Coalson
You can display the resampled metric on the surfaces from the group-average
HCP "subject" to ensure that things are where they are supposed to be
(note, resampling both the data and the surfaces of a subject is
recommended, but will always look reasonable even if the wrong spheres were
used - using the group average surfaces as a reference ensures that your
data aligns with other data that is known to be resampled correctly).

The HCP MMP v1.0 parcellation doesn't include subcortical data, so if that
is the parcellation you want to use, you can ignore subcortical data for
now.  You can use wb_command -cifti-create-dense-from-template to make a
cifti version of your data (you can use the parcellation .dlabel.nii file
as the template).
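
For example, once your data are resampled to the 32k_fs_LR meshes (file names
hypothetical; the template can be the parcellation .dlabel.nii itself):

wb_command -cifti-create-dense-from-template parcellation.dlabel.nii \
    my_data.dscalar.nii \
    -metric CORTEX_LEFT my_data.L.32k_fs_LR.func.gii \
    -metric CORTEX_RIGHT my_data.R.32k_fs_LR.func.gii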

If you want to also look at the subcortical data in cifti format, then we
would recommend reanalyzing your data using the HCP pipelines instead of
FSFAST.  If you haven't acquired the scans that the HCP pipelines require
(high-res T2, fieldmaps), there is ciftify:

https://github.com/edickie/ciftify

This can probably be followed by the task analysis scripts in the HCP
pipelines.

Tim


On Wed, Nov 28, 2018 at 11:20 AM Zhi Li  wrote:

> Hi Tim,
>
> Thank you very much for your kind suggestion. After the preprocessing of
> FSFAST I got the left and right surface-based data in 'nii.gz' format.
> Following the instructions I converted them to .gii format with
> mris_convert first, then to func.gii with the command "wb_command
> -metric-resample  ADAP_BARY_AREA
>  -area-metrics  ". I am new to the HCP
> pipeline, may I ask what I missed and how I could further convert the
> "func.gii" to the "cifti format"? And how to check if this resample is
> right? The cifti file should also include the subcortical regions, right?
> Do I need to combine the left and right surface and subcortical volume into
> one cifti file? Sorry for the very basic questions. Thank you very much.
>
> Best wishes,
>
> Zhi Li
>
> On Tue, 27 Nov 2018 at 16:09, Timothy Coalson  wrote:
>
>> The HCP MMP v1.0 parcellation is defined on MSMAll-registered surfaces.
>> I am not familiar with FSFAST, but if it gives you surface-based data, you
>> should be able to resample the data or parcellation so that they are on the
>> same mesh, following these instructions:
>>
>>
>> https://wiki.humanconnectome.org/display/PublicData/HCP+Users+FAQ#HCPUsersFAQ-9.HowdoImapdatabetweenFreeSurferandHCP
>> ?
>>
>> If you resample/convert your timeseries data to cifti .dtseries.nii in
>> our standard grayordinates, then the wb_command -cifti-parcellate command
>> can extract average timeseries of all the ROIs in our parcellation (or any
>> parcellation represented in our standard grayordinates).
>>
>> Note that we do not recommend using any group-average volume methods for
>> human cortical data, because volume-based registration doesn't deal
>> particularly well with intersubject differences in cortical folding (or in
>> functional location compared to folds):
>>
>> https://www.ncbi.nlm.nih.gov/pubmed/29925602
>>
>> Tim
>>
>>
>>
>> On Mon, Nov 26, 2018 at 5:28 PM Zhi Li  wrote:
>>
>>> Hello HCP experts,
>>>
>>> I would like to extract time series from the FSFAST-preprocessed fMRI
>>> data (acquired with traditional scanning protocol) with the HCP average
>>> template, may I know where I could find it (annotation format) ? I read
>>> from another post that I could use mri_aparc2aseg to map it to the
>>> anatomical volume. Do you have any other suggestions on extracting time
>>> series with the HCP rois? Thank you very much.
>>>
>>> Best wishes,
>>>
>>> Zhi Li
>>>
>>> ___
>>> HCP-Users mailing list
>>> HCP-Users@humanconnectome.org
>>> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>>>
>>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] HCP parcellation template

2018-11-27 Thread Timothy Coalson
The HCP MMP v1.0 parcellation is defined on MSMAll-registered surfaces.  I
am not familiar with FSFAST, but if it gives you surface-based data, you
should be able to resample the data or parcellation so that they are on the
same mesh, following these instructions:

https://wiki.humanconnectome.org/display/PublicData/HCP+Users+FAQ#HCPUsersFAQ-9.HowdoImapdatabetweenFreeSurferandHCP
?

If you resample/convert your timeseries data to cifti .dtseries.nii in our
standard grayordinates, then the wb_command -cifti-parcellate command can
extract average timeseries of all the ROIs in our parcellation (or any
parcellation represented in our standard grayordinates).

Note that we do not recommend using any group-average volume methods for
human cortical data, because volume-based registration doesn't deal
particularly well with intersubject differences in cortical folding (or in
functional location compared to folds):

https://www.ncbi.nlm.nih.gov/pubmed/29925602

Tim



On Mon, Nov 26, 2018 at 5:28 PM Zhi Li  wrote:

> Hello HCP experts,
>
> I would like to extract time series from the FSFAST-preprocessed fMRI data
> (acquired with traditional scanning protocol) with the HCP average
> template, may I know where I could find it (annotation format) ? I read
> from another post that I could use mri_aparc2aseg to map it to the
> anatomical volume. Do you have any other suggestions on extracting time
> series with the HCP rois? Thank you very much.
>
> Best wishes,
>
> Zhi Li
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Number of cortical vertices in cifti and gifti files

2018-11-20 Thread Timothy Coalson
Surface registration (whether left/right or otherwise) is by vertices, not
by cifti indices, and is therefore a separate issue from the medial wall
masks (I would bet that the actual mismatches between the two medial walls
is more than 20 vertices - the Venn diagram of the sets of included
vertices is likely nonzero in all categories).  You should think of cifti
indices as a "packed" representation of the data, and if you need to do
anything spatial with cifti data, you generally need to "unpack" it first,
do the operation, and then repack it.  This is how wb_command operates
internally for spatial cifti operations.  If you don't need to do a spatial
operation, then the simplest option is generally to put all the data into
cifti files using the same brainordinates, and then you can use them
indexwise.

If we made our standard dscalar/dtseries use the whole surface, but dconn
exclude the medial wall, then we couldn't do indexwise math between those
cifti files (and doing correlation gradient on it would give us a dscalar
with the medial wall excluded again).  Having more of our cifti files use
the same indices as each other still seems like the better option.

If you prefer for your use, you can make a template cifti file that does
not exclude the medial wall, and use -cifti-create-dense-from-template to
expand any standard cifti file to full-surface.  Doing something similar
for the subcortical voxels, however, would result in much larger files (and
-cifti-create-dense-from-template respects subcortical structure boundaries
when the input is cifti, so you would need a small script involving
-cifti-separate if you wanted a full in-order FOV representation of the
voxels).
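
For example, a full-surface template could be made once and then reused (a
sketch; the "blank" inputs are any full-mesh 32k metric files, names
hypothetical):

wb_command -cifti-create-dense-scalar template_full.dscalar.nii \
    -left-metric blank.L.32k_fs_LR.func.gii \
    -right-metric blank.R.32k_fs_LR.func.gii
wb_command -cifti-create-dense-from-template template_full.dscalar.nii \
    expanded.dscalar.nii -cifti standard_91k.dscalar.nii

Omitting -roi-left/-roi-right in -cifti-create-dense-scalar is what makes the
mapping cover the whole surface, medial wall included.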

Tim


On Tue, Nov 20, 2018 at 6:30 AM Reza Rajimehr  wrote:

> Sorry ... I should have said 20 vertices difference.
>
>
> On Tue, Nov 20, 2018 at 3:52 PM Reza Rajimehr  wrote:
>
>> ... 29716 indices in RH and 29696 indices in LH of cifti file.
>>
>>
>> On Tue, Nov 20, 2018 at 3:43 PM Reza Rajimehr  wrote:
>>
>>> But there are 10 more vertices in right hemisphere than left hemisphere
>>> ...
>>>
>>>
>>> On Tue, Nov 20, 2018 at 2:50 PM Glasser, Matthew 
>>> wrote:
>>>
>>>> Left and right are registered.
>>>>
>>>> Matt.
>>>>
>>>> From: Reza Rajimehr 
>>>> Date: Tuesday, November 20, 2018 at 2:04 AM
>>>> To: Matt Glasser 
>>>> Cc: Timothy Coalson , hcp-users <
>>>> hcp-users@humanconnectome.org>
>>>>
>>>> Subject: Re: [HCP-Users] Number of cortical vertices in cifti and
>>>> gifti files
>>>>
>>>> Thanks Matt! Yes, having symmetric medial wall and having registered
>>>> left and right hemispheres would be helpful and important.
>>>>
>>>>
>>>> On Tue, Nov 20, 2018 at 5:54 AM Glasser, Matthew 
>>>> wrote:
>>>>
>>>>> We don’t want the non-greymatter medial wall in CIFTI; however, the
>>>>> medial wall may undergo revision in the future and making it symmetric
>>>>> might be more convenient, though this would might require rerunning the
>>>>> left/right registration, which remains in all HCP Pipelines derived
>>>>> standard mesh CIFTI and GIFTI files from the original landmark-based
>>>>> registration published in Van Essen et al 2012 Cerebral Cortex.
>>>>>
>>>>> Matt.
>>>>>
>>>>> From:  on behalf of Timothy
>>>>> Coalson 
>>>>> Date: Monday, November 19, 2018 at 2:57 PM
>>>>> To: Reza Rajimehr 
>>>>> Cc: hcp-users 
>>>>>
>>>>> Subject: Re: [HCP-Users] Number of cortical vertices in cifti and
>>>>> gifti files
>>>>>
>>>>> External interoperation with GIFTI and NIFTI data is the purpose of
>>>>> the -cifti-separate command, it is much simpler than using
>>>>> -cifti-export-dense-mapping.
>>>>>
>>>>> A central goal of CIFTI was the exclusion of non-interesting locations
>>>>> from the file (this becomes more important with dense connectivity files,
>>>>> and is a far larger effect for voxel structures (avoiding filling out the
>>>>> entire FOV)).  We do not expect people to use the
>>>>> -cifti-export-dense-mapping command just for file format interoperability
>>>>> (but only for special circumstances, basically if spatial relationship
>>>>> information must be obtained in software without full cifti support,
>>>>> without leaving the cifti format).

Re: [HCP-Users] Number of cortical vertices in cifti and gifti files

2018-11-19 Thread Timothy Coalson
External interoperation with GIFTI and NIFTI data is the purpose of the
-cifti-separate command, it is much simpler than using
-cifti-export-dense-mapping.
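
For example (file names hypothetical):

wb_command -cifti-separate in.dscalar.nii COLUMN \
    -metric CORTEX_LEFT left.func.gii \
    -metric CORTEX_RIGHT right.func.gii \
    -volume-all subcortical.nii.gz

This produces ordinary GIFTI metric files and a NIFTI volume that other
software can read directly.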

A central goal of CIFTI was the exclusion of non-interesting locations from
the file (this becomes more important with dense connectivity files, and is
a far larger effect for voxel structures (avoiding filling out the entire
FOV)).  We do not expect people to use the -cifti-export-dense-mapping
command just for file format interoperability (but only for special
circumstances, basically if spatial relationship information must be
obtained in software without full cifti support, without leaving the cifti
format).

Tim


On Mon, Nov 19, 2018 at 2:43 PM Reza Rajimehr  wrote:

> Thanks! The file that we looked into was an MSMSulc file. I guess that the
> difference of 10 indices/vertices between left and right hemis exists for
> an MSMAll file as well, but I haven’t checked it yet.
>
> One comment: The medial wall has been left out in CIFTI files because the
> medial wall is not cortical grey matter. This is legitimate, however, it
> introduces confusions when communicating between CIFTI and GIFTI files. It
> would be nice if HCP developers, in future, consider a version of CIFTI
> with medial wall included.
>
>
> On Mon, Nov 19, 2018 at 11:53 PM Timothy Coalson  wrote:
>
>> The left and right hemisphere are intended to be in register, though I
>> don't recall what effort was put into this in MSMAll (maybe only the
>> dedrifting to sulc).  I believe the left and right hemisphere medial wall
>> masks were generated separately without trying to synchronize across
>> hemispheres, and used early on, and for ongoing compatibility we stayed
>> with the same masks.
>>
>> Tim
>>
>>
>> On Mon, Nov 19, 2018 at 11:51 AM Reza Rajimehr 
>> wrote:
>>
>>> We successfully used -cifti-export-dense-mapping to get the mapping
>>> from cifti indices to surface vertices (all indices are zero-based).
>>>
>>> wb_command -cifti-export-dense-mapping
>>> 100408_tfMRI_WM_level2_hp200_s2.dscalar.nii COLUMN -surface CORTEX_LEFT
>>> leftcortex.txt
>>>
>>> wb_command -cifti-export-dense-mapping
>>> 100408_tfMRI_WM_level2_hp200_s2.dscalar.nii COLUMN -surface CORTEX_RIGHT
>>> rightcortex.txt
>>>
>>> In the text files, there are 29696 left hemisphere indices and 29716
>>> right hemisphere indices. I always thought that left and right hemispheres
>>> are in register in the standard space/mesh, and I was expecting to see the
>>> same number of indices/vertices (exactly 29706) for both hemispheres. But
>>> apparently this is not the case! Any reason for this?
>>>
>>>
>>> On Fri, Nov 16, 2018 at 11:58 PM Timothy Coalson  wrote:
>>>
>>>> The easiest to use (especially if your goal is to match other cifti
>>>> files) is generally -cifti-create-dense-from-template.  It will even turn a
>>>> 59k surface-only cifti into a standard 91282 cifti (or vice versa, if you
>>>> are so inclined).
>>>>
>>>> Yes, -cifti-export-dense-mapping will give you the cifti index to gifti
>>>> vertex mapping as a text file, if you want to do things the hard way.
>>>>
>>>> Tim
>>>>
>>>>
>>>> On Fri, Nov 16, 2018 at 2:06 PM Harms, Michael 
>>>> wrote:
>>>>
>>>>>
>>>>>
>>>>> See the various -cifti-create-* commands.
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> --
>>>>>
>>>>> Michael Harms, Ph.D.
>>>>>
>>>>> ---
>>>>>
>>>>> Associate Professor of Psychiatry
>>>>>
>>>>> Washington University School of Medicine
>>>>>
>>>>> Department of Psychiatry, Box 8134
>>>>>
>>>>> 660 South Euclid Ave
>>>>> <https://maps.google.com/?q=660+South+Euclid+Ave=gmail=g>.
>>>>> Tel: 314-747-6173
>>>>>
>>>>> St. Louis, MO  63110  Email: mha...@wustl.edu
>>>>>
>>>>>
>>>>>
>>>>> *From: *Reza Rajimehr 
>>>>> *Date: *Friday, November 16, 2018 at 1:50 PM
>>>>> *To: *"Harms, Michael" 
>>>>> *Cc: *"Glasser, Matthew" , "
>>>>> hcp-users@humanconnectome.org" 
>>>>> *Subject: *Re: [HCP-Users]

Re: [HCP-Users] Number of cortical vertices in cifti and gifti files

2018-11-19 Thread Timothy Coalson
The left and right hemisphere are intended to be in register, though I
don't recall what effort was put into this in MSMAll (maybe only the
dedrifting to sulc).  I believe the left and right hemisphere medial wall
masks were generated separately without trying to synchronize across
hemispheres, and used early on, and for ongoing compatibility we stayed
with the same masks.

Tim


On Mon, Nov 19, 2018 at 11:51 AM Reza Rajimehr  wrote:

> We successfully used -cifti-export-dense-mapping to get the mapping from
> cifti indices to surface vertices (all indices are zero-based).
>
> wb_command -cifti-export-dense-mapping
> 100408_tfMRI_WM_level2_hp200_s2.dscalar.nii COLUMN -surface CORTEX_LEFT
> leftcortex.txt
>
> wb_command -cifti-export-dense-mapping
> 100408_tfMRI_WM_level2_hp200_s2.dscalar.nii COLUMN -surface CORTEX_RIGHT
> rightcortex.txt
>
> In the text files, there are 29696 left hemisphere indices and 29716 right
> hemisphere indices. I always thought that left and right hemispheres are in
> register in the standard space/mesh, and I was expecting to see the same
> number of indices/vertices (exactly 29706) for both hemispheres. But
> apparently this is not the case! Any reason for this?
>
>
> On Fri, Nov 16, 2018 at 11:58 PM Timothy Coalson  wrote:
>
>> The easiest to use (especially if your goal is to match other cifti
>> files) is generally -cifti-create-dense-from-template.  It will even turn a
>> 59k surface-only cifti into a standard 91282 cifti (or vice versa, if you
>> are so inclined).
>>
>> Yes, -cifti-export-dense-mapping will give you the cifti index to gifti
>> vertex mapping as a text file, if you want to do things the hard way.
>>
>> Tim
>>
>>
>> On Fri, Nov 16, 2018 at 2:06 PM Harms, Michael  wrote:
>>
>>>
>>>
>>> See the various -cifti-create-* commands.
>>>
>>>
>>>
>>>
>>>
>>> --
>>>
>>> Michael Harms, Ph.D.
>>>
>>> ---
>>>
>>> Associate Professor of Psychiatry
>>>
>>> Washington University School of Medicine
>>>
>>> Department of Psychiatry, Box 8134
>>>
>>> 660 South Euclid Ave.  Tel: 314-747-6173
>>>
>>> St. Louis, MO  63110  Email: mha...@wustl.edu
>>>
>>>
>>>
>>> *From: *Reza Rajimehr 
>>> *Date: *Friday, November 16, 2018 at 1:50 PM
>>> *To: *"Harms, Michael" 
>>> *Cc: *"Glasser, Matthew" , "
>>> hcp-users@humanconnectome.org" 
>>> *Subject: *Re: [HCP-Users] Number of cortical vertices in cifti and
>>> gifti files
>>>
>>>
>>>
>>> Is there a command to convert LH (or RH) gifti file to hcp cifti file?
>>>
>>>
>>>
>>>
>>>
>>> On Fri, Nov 16, 2018 at 10:55 PM Harms, Michael 
>>> wrote:
>>>
>>>
>>>
>>> No, it is more complicated than that.
>>>
>>>
>>>
>>> I believe what you need is -cifti-export-dense-mapping
>>>
>>>
>>>
>>> Cheers,
>>>
>>> -MH
>>>
>>>
>>>
>>> --
>>>
>>> Michael Harms, Ph.D.
>>>
>>> ---
>>>
>>> Associate Professor of Psychiatry
>>>
>>> Washington University School of Medicine
>>>
>>> Department of Psychiatry, Box 8134
>>>
>>> 660 South Euclid Ave.  Tel: 314-747-6173
>>>
>>> St. Louis, MO  63110  Email: mha...@wustl.edu
>>>
>>>
>>>
>>> *From: * on behalf of Reza
>>> Rajimehr 
>>> *Date: *Friday, November 16, 2018 at 1:17 PM
>>> *To: *"Glasser, Matthew" 
>>> *Cc: *"hcp-users@humanconnectome.org" 
>>> *Subject: *Re: [HCP-Users] Number of cortical vertices in cifti and
>>> gifti files
>>>
>>>
>>>
>>> Thanks! Can I simply say that:
>>>
>>>
>>>
>>> For left hemisphere:
>>>
>>> vertex number in cifti = vertex number (up to 29706) in LH gifti
>>>
>>>
>>>
>>> For right hemisphere:
>>>
>>> vertex number in cifti = vertex number (up to 29706) in RH gifti + 29706

Re: [HCP-Users] Debugging IcaFIxProcessingBatch.sh

2018-11-16 Thread Timothy Coalson
.fix.log says that the specified path for the cifti matlab functions is
wrong (nonexistent directory).

Tim


On Fri, Nov 16, 2018 at 2:58 PM Jayasekera, Dinal <
dinal.jayasek...@wustl.edu> wrote:

> Tim,
>
>
> I implemented your version of hcp_fix that you sent but I continue to get
> the same error about missing files:
>
>
> rfMRI_REST1_PA_hp2000.ica/reg:
> total 1492
> drwxrwxr-x 2 functionalspinelab functionalspinelab4096 Nov 16 13:21 .
> drwxrwxr-x 6 functionalspinelab functionalspinelab4096 Nov 16 13:21 ..
> lrwxrwxrwx 1 functionalspinelab functionalspinelab  19 Nov 16 13:21
> example_func.nii.gz -> ../mean_func.nii.gz
> lrwxrwxrwx 1 functionalspinelab functionalspinelab  36 Nov 16 13:21
> highres.nii.gz -> ../../../../T1w_restore_brain.nii.gz
> -rw-rw-r-- 1 functionalspinelab functionalspinelab 149 Nov 16 13:21
> highres2example_func.mat
> -rw-rw-r-- 1 functionalspinelab functionalspinelab 1456377 Nov 16 13:22
> veins.nii.gz
> -rw-rw-r-- 1 functionalspinelab functionalspinelab   57137 Nov 16 13:22
> veins_exf.nii.gz
> lrwxrwxrwx 1 functionalspinelab functionalspinelab  25 Nov 16 13:21
> wmparc.nii.gz -> ../../../../wmparc.nii.gz
> hcp_fix: INFORM: functionmotionconfounds log file is to be named:
> .fix.functionmotionconfounds.log instead of .fix.log
> hcp_fix: DEBUG: current folder
> /home/functionalspinelab/Desktop/Dinal/mystudy/NSI_11/MNINonLinear/Results/rfMRI_REST1_PA/rfMRI_REST1_PA_hp2000.ica/reg
> hcp_fix: INFORM: running FIX
> hcp_fix: INFORM: About to run:
> /home/functionalspinelab/Desktop/Dinal/Applications/fix1.067/fix
> rfMRI_REST1_PA_hp2000.ica
> /home/functionalspinelab/Desktop/Dinal/Applications/fix1.067/training_files/HCP_hp2000.RData
> 10 -m -h 2000
> FIX Feature extraction for Melodic output directory:
> rfMRI_REST1_PA_hp2000.ica
>  create edge masks
>  run FAST
>  registration of standard space masks
>  extract features
> FIX Classifying components in Melodic directory: rfMRI_REST1_PA_hp2000.ica
> using training file:
> /home/functionalspinelab/Desktop/Dinal/Applications/fix1.067/training_files/HCP_hp2000.RData
> and threshold 10
> FIX Applying cleanup using cleanup file:
> rfMRI_REST1_PA_hp2000.ica/fix4melview_HCP_hp2000_thr10.txt and motion
> cleanup set to 1
> Could not find a supported file with prefix
> "rfMRI_REST1_PA_hp2000.ica/filtered_func_data_clean"
> Could not find a supported file with prefix
> "rfMRI_REST1_PA_hp2000.ica/filtered_func_data_clean_vn"
>
> Would this be an issue with hcp_fix or some other supporting program?
>
>
> Kind regards,
> *Dinal Jayasekera*
>
> PhD Candidate | InSITE Fellow
> Ammar Hawasli Lab
> Department of Biomedical Engineering | Washington University in St. Louis
>
> --
> *From:* Timothy Coalson 
> *Sent:* Tuesday, November 13, 2018 4:50:07 PM
> *To:* Jayasekera, Dinal
> *Cc:* Glasser, Matthew; Dierker, Donna; hcp-users@humanconnectome.org
> *Subject:* Re: [HCP-Users] Debugging IcaFIxProcessingBatch.sh
>
> Yes, it needs a new argument after bandpass, TRUE or FALSE for whether to
> do motion regression.
>
> Tim
>
>
> On Tue, Nov 13, 2018 at 4:19 PM, Jayasekera, Dinal <
> dinal.jayasek...@wustl.edu> wrote:
>
> Tim,
>
>
> I forgot to mention this but with the addition of the motion regression
> argument to the hcp_fix script you linked, some changes have to be made to
> the order and number of positional arguments being called by 
> ./IcaFixProcessingBatch.sh
> right?
>
>
> Kind regards,
> *Dinal Jayasekera*
>
> PhD Candidate | InSITE Fellow
> Ammar Hawasli Lab
> Department of Biomedical Engineering | Washington University in St. Louis
>
> --
> *From:* Timothy Coalson 
> *Sent:* Monday, November 12, 2018 4:04:33 PM
> *To:* Jayasekera, Dinal
> *Cc:* Glasser, Matthew; Dierker, Donna; hcp-users@humanconnectome.org
>
> *Subject:* Re: [HCP-Users] Debugging IcaFIxProcessingBatch.sh
>
> The path he gave was not intended to be pasted in, the script doesn't know
> about those variables, and the hcp_fix script is still using sh rather than
> bash.  Try the version from here:
>
> https://github.com/coalsont/Pipelines/tree/reapply_inprogress
>
> Tim
>
>
> On Mon, Nov 12, 2018 at 3:42 PM, Jayasekera, Dinal <
> dinal.jayasek...@wustl.edu> wrote:
>
> Dear Matt,
>
>
> So changing that line did fix the error with the 0.7mm brain mask.
> However, there seems to be an additional error:
>
>
> hcp_fix: INFORM: functionmotionconfounds log file is to be named:
> .fix.functionmotionconfounds.log instead of .fix.log
> hcp_fix: DEBUG: current folder
> /home/functionalspinel

Re: [HCP-Users] Number of cortical vertices in cifti and gifti files

2018-11-16 Thread Timothy Coalson
The easiest to use (especially if your goal is to match other cifti files)
is generally -cifti-create-dense-from-template.  It will even turn a 59k
surface-only cifti into a standard 91282 cifti (or vice versa, if you are
so inclined).

Yes, -cifti-export-dense-mapping will give you the cifti index to gifti
vertex mapping as a text file, if you want to do things the hard way.

Tim


On Fri, Nov 16, 2018 at 2:06 PM Harms, Michael  wrote:

>
>
> See the various -cifti-create-* commands.
>
>
>
>
>
> --
>
> Michael Harms, Ph.D.
>
> ---
>
> Associate Professor of Psychiatry
>
> Washington University School of Medicine
>
> Department of Psychiatry, Box 8134
>
> 660 South Euclid Ave.  Tel: 314-747-6173
>
> St. Louis, MO  63110  Email: mha...@wustl.edu
>
>
>
> *From: *Reza Rajimehr 
> *Date: *Friday, November 16, 2018 at 1:50 PM
> *To: *"Harms, Michael" 
> *Cc: *"Glasser, Matthew" , "
> hcp-users@humanconnectome.org" 
> *Subject: *Re: [HCP-Users] Number of cortical vertices in cifti and gifti
> files
>
>
>
> Is there a command to convert LH (or RH) gifti file to hcp cifti file?
>
>
>
>
>
> On Fri, Nov 16, 2018 at 10:55 PM Harms, Michael  wrote:
>
>
>
> No, it is more complicated than that.
>
>
>
> I believe what you need is -cifti-export-dense-mapping
>
>
>
> Cheers,
>
> -MH
>
>
>
> --
>
> Michael Harms, Ph.D.
>
> ---
>
> Associate Professor of Psychiatry
>
> Washington University School of Medicine
>
> Department of Psychiatry, Box 8134
>
> 660 South Euclid Ave.  Tel: 314-747-6173
>
> St. Louis, MO  63110  Email: mha...@wustl.edu
>
>
>
> *From: * on behalf of Reza
> Rajimehr 
> *Date: *Friday, November 16, 2018 at 1:17 PM
> *To: *"Glasser, Matthew" 
> *Cc: *"hcp-users@humanconnectome.org" 
> *Subject: *Re: [HCP-Users] Number of cortical vertices in cifti and gifti
> files
>
>
>
> Thanks! Can I simply say that:
>
>
>
> For left hemisphere:
>
> vertex number in cifti = vertex number (up to 29706) in LH gifti
>
>
>
> For right hemisphere:
>
> vertex number in cifti = vertex number (up to 29706) in RH gifti + 29706
>
>
>
> Or the mapping is more complicated than this?
>
>
>
>
>
> On Fri, Nov 16, 2018 at 9:56 PM Glasser, Matthew 
> wrote:
>
> That is correct, the medial wall is kept out.  Usually when I want to do
> that I split the CIFTI file into hemispheric GIFTI files, but perhaps there
> is a good way to load in a specific mapping based on something we can
> output from wb_command.
>
>
>
> Matt.
>
>
>
> *From: * on behalf of Reza
> Rajimehr 
> *Date: *Friday, November 16, 2018 at 10:17 AM
> *To: *"hcp-users@humanconnectome.org" 
> *Subject: *[HCP-Users] Number of cortical vertices in cifti and gifti
> files
>
>
>
> Hi,
>
>
>
> A cifti file has 91282 vertices/voxels, a combined LR cifti file has 59412
> vertices, and an individual hemisphere gifti file has 32492 vertices. So
> the number of cortical vertices in cifti files is less than the number of
> cortical vertices in gifti files (left hemi vertices + right hemi vertices
> = 64984). Looks like this is related to not having medial wall vertices in
> the cifti files, right?
>
>
>
> We have loaded these files in Matlab. Now we want to know which vertex in
> right (or left) hemisphere gifti file corresponds to which vertex in the
> cifti file. How can we achieve this?
>
>
>
> Best,
>
> Reza
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>
>
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>
>
>
>

Re: [HCP-Users] Binary label from surface

2018-11-16 Thread Timothy Coalson
The purpose of the surface ROIs in -cifti-create-label is to prevent the
cifti file from needing to represent data inside the medial wall (where it
would generally be nonsensical).  Since you haven't converted the data to
any kind of label format yet, it is not the command you want - putting the
data into cifti before turning it into label format avoids a particular
quirk of gifti label format, so it is what I would recommend for your case.

First, use -cifti-create-dense-from-template, with a standard 91282
grayordinate file as the template, and your surface ROIs as the data
files.  This will give you a cifti file with your ROIs in it (but because
of the mapping to surface step, they are probably not binary).  Then,
threshold this file with -cifti-math (you probably want something like 'x >
0.5'), and finally use -cifti-label-import.
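
As a sketch of those three steps (the template is any standard
91282-grayordinate cifti file; other names hypothetical):

wb_command -cifti-create-dense-from-template template_91k.dscalar.nii \
    net_raw.dscalar.nii \
    -metric CORTEX_LEFT sub_mask_L.func.gii \
    -metric CORTEX_RIGHT sub_mask_R.func.gii
wb_command -cifti-math 'x > 0.5' net_bin.dscalar.nii -var x net_raw.dscalar.nii
wb_command -cifti-label-import net_bin.dscalar.nii network_labels.txt \
    Network1.dlabel.nii

where network_labels.txt pairs a name line with a "key R G B A" line, e.g.:

Network1
1 255 0 0 255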

Are these cortical volume-based cluster results from individual subjects?
If not, it should be beneficial if you map the individual fMRI data to the
individual's own surfaces, and then doing cross-subject cortical analysis
using surfaces:

https://www.ncbi.nlm.nih.gov/pubmed/29925602

Tim


On Fri, Nov 16, 2018 at 12:35 PM Leonardo Tozzi  wrote:

> To Whom It May Concern,
>
>
>
> I apologise in advance since this is probably a trivial question, but I
> tried browsing the list and couldn’t find a clear answer. I would like to
> obtain a binary dlabel for a surface. The overarching goal is to create a
> network mask converting it from some volumetric clusters.
>
> So far I have done:
>
>
>
> wb_command -volume-to-surface-mapping $maskdir/$mask.nii.gz
> $subsdir/$sub/MNINonLinear/fsaverage_LR32k/${sub}.R.midthickness.32k_fs_LR.surf.gii
> ${sub}_${mask}_R.func.gii -ribbon-constrained
> $subsdir/$sub/MNINonLinear/fsaverage_LR32k/${sub}.R.white.32k_fs_LR.surf.gii
> $subsdir/$sub/MNINonLinear/fsaverage_LR32k/${sub}.R.pial.32k_fs_LR.surf.gii
>
>
>
> To map the network clusters in MNI space to the individual grey matter
> surface (for left and right). I get two .func.gii files (one for the left
> cortex, one for the right).
>
> But then if I do:
>
>
>
> wb_command -cifti-create-label ${sub}_${mask}.dlabel.nii  -left-label
> $subsdir/$sub/MNINonLinear/fsaverage_LR32k/${sub}.L.aparc.32k_fs_LR.label.gii
> -roi-left ${sub}_${mask}_L.func.gii -right-label
> $subsdir/$sub/MNINonLinear/fsaverage_LR32k/${sub}.R.aparc.32k_fs_LR.label.gii
> -roi-right ${sub}_${mask}_R.func.gii
>
>
>
> I get a parcellation whose labels correspond for example to the Freesurfer
> atlas. Is there a way to simply have all the greyordinates of my network be
> 1 and have my own custom label, say “Network1”? I could change the values
> in matlab but I was wondering if there is a way to do it “properly” in
> wb_command.
>
> Thank you,
>
>
>
>
>
>
>
>
>
> Leonardo Tozzi, MD, PhD
>
> Williams PanLab | Postdoctoral Fellow
>
> Stanford University | 401 Quarry Rd
>
> lto...@stanford.edu | (650) 5615738
>
>
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Debugging IcaFIxProcessingBatch.sh

2018-11-13 Thread Timothy Coalson
Yes, it needs a new argument after the bandpass argument: TRUE or FALSE,
for whether to do motion regression.
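
For example (the paths here are placeholders, and the exact argument order
- input timeseries, high-pass value, then the new flag - is an assumption
based on the above):

hcp_fix ${StudyFolder}/${Subject}/MNINonLinear/Results/rfMRI_REST1_PA/rfMRI_REST1_PA.nii.gz 2000 TRUE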

Tim


On Tue, Nov 13, 2018 at 4:19 PM, Jayasekera, Dinal <
dinal.jayasek...@wustl.edu> wrote:

> Tim,
>
>
> I forgot to mention this but with the addition of the motion regression
> argument to the hcp_fix script you linked, some changes have to be made to
> the order and number of positional arguments being called by 
> ./IcaFixProcessingBatch.sh
> right?
>
>
> Kind regards,
> *Dinal Jayasekera*
>
> PhD Candidate | InSITE Fellow
> Ammar Hawasli Lab
> Department of Biomedical Engineering | Washington University in St. Louis
>
> --
> *From:* Timothy Coalson 
> *Sent:* Monday, November 12, 2018 4:04:33 PM
> *To:* Jayasekera, Dinal
> *Cc:* Glasser, Matthew; Dierker, Donna; hcp-users@humanconnectome.org
>
> *Subject:* Re: [HCP-Users] Debugging IcaFIxProcessingBatch.sh
>
The path he gave was not intended to be pasted in verbatim; the script
doesn't know about those variables, and the hcp_fix script is still using
sh rather than bash.  Try the version from here:
>
> https://github.com/coalsont/Pipelines/tree/reapply_inprogress
>
> Tim
>
>
> On Mon, Nov 12, 2018 at 3:42 PM, Jayasekera, Dinal <
> dinal.jayasek...@wustl.edu> wrote:
>
> Dear Matt,
>
>
> So changing that line did fix the error with the 0.7mm brain mask.
> However, there seems to be an additional error:
>
>
> hcp_fix: INFORM: functionmotionconfounds log file is to be named:
> .fix.functionmotionconfounds.log instead of .fix.log
> hcp_fix: DEBUG: current folder /home/functionalspinelab/Deskt
> op/Dinal/mystudy/NSI_12/MNINonLinear/Results/rfMRI_REST1_PA/
> rfMRI_REST1_PA_hp2000.ica/reg
> /home/functionalspinelab/Desktop/Dinal/Pipelines/ICAFIX/hcp_fix: 159:
> /home/functionalspinelab/Desktop/Dinal/Pipelines/ICAFIX/hcp_fix:
> //MNINonLinear/brainmask_fs.nii.gz: not found
> hcp_fix: INFORM: running FIX
> /home/functionalspinelab/Desktop/Dinal/Pipelines/ICAFIX/hcp_fix: 171:
> /home/functionalspinelab/Desktop/Dinal/Pipelines/ICAFIX/hcp_fix: [[: not
> found
> hcp_fix: INFORM: About to run: /home/functionalspinelab/Deskt
> op/Dinal/Applications/fix1.066/fix rfMRI_REST1_PA_hp2000.ica
> /home/functionalspinelab/Desktop/Dinal/Applications/fix1.
> 066/training_files/HCP_hp2000.RData 10 -m -h 2000
> FIX Feature extraction for Melodic output directory:
> rfMRI_REST1_PA_hp2000.ica
>  create edge masks
>  run FAST
>  registration of standard space masks
>  extract features
> FIX Classifying components in Melodic directory: rfMRI_REST1_PA_hp2000.ica
> using training file: /home/functionalspinelab/Deskt
> op/Dinal/Applications/fix1.066/training_files/HCP_hp2000.RData and
> threshold 10
> FIX Applying cleanup using cleanup file: 
> rfMRI_REST1_PA_hp2000.ica/fix4melview_HCP_hp2000_thr10.txt
> and motion cleanup set to 1
> Could not find a supported file with prefix "rfMRI_REST1_PA_hp2000.ica/fil
> tered_func_data_clean"
> Could not find a supported file with prefix "rfMRI_REST1_PA_hp2000.ica/fil
> tered_func_data_clean_vn"
>
> This is the line in hcp_fix which relates to these two files:
>
>
> $FSLDIR/bin/immv ${fmri}.ica/filtered_func_data_clean ${fmri}_clean
> $FSLDIR/bin/immv ${fmri}.ica/filtered_func_data_clean_vn ${fmri}_clean_vnf
>
> I initially thought the specified path to these two files were incorrect
> but it turns out these files were never created. Are these files meant to
> be created by hcp_fix?
>
>
> Kind regards,
> *Dinal Jayasekera*
>
> PhD Candidate | InSITE Fellow
> Ammar Hawasli Lab
> Department of Biomedical Engineering | Washington University in St. Louis
>
> --
> *From:* Glasser, Matthew
> *Sent:* Friday, November 9, 2018 7:36:59 PM
> *To:* NEUROSCIENCE tim
> *Cc:* Jayasekera, Dinal; Dierker, Donna; hcp-users@humanconnectome.org
>
> *Subject:* Re: [HCP-Users] Debugging IcaFIxProcessingBatch.sh
>
> We should change that line to use this file:
>
> ${StudyFolder}/${Subject}/MNINonLinear/brainmask_fs.nii.gz
>
> Matt.
>
> From: Timothy Coalson 
> Date: Friday, November 9, 2018 at 7:18 PM
> To: Matt Glasser 
> Cc: "Jayasekera, Dinal" , "Dierker, Donna" <
> do...@wustl.edu>, "hcp-users@humanconnectome.org" <
> hcp-users@humanconnectome.org>
> Subject: Re: [HCP-Users] Debugging IcaFIxProcessingBatch.sh
>
> Here is the line from hcp_fix where it uses a 0.7mm mask:
>
> $FSLDIR/bin/fslmaths veins -div `$FSLDIR/bin/fslstats veins -k
> ${FSL_FIXDIR}/mask_files/hcp_0.7mm_brain_mask -P 50` -mul 2.18 -thr 10
> -min 50 -div 50 veins
>
> https://github.com/Washington-University/HCPpipelines/blob/master/ICAFIX/hcp_fix#L159

Re: [HCP-Users] Convert subject-specific ICA node maps to volumetric space?

2018-11-13 Thread Timothy Coalson
-cifti-convert does not do this; it only dumps the matrix into a different
file format as-is, so the spatial relationships are not accessible from its
output.

If you are only interested in subcortical/cerebellum data, that is trivial
to extract from cifti as a volumetric nifti using -cifti-separate with
-volume-all.  In the future, when resolution increases sufficiently, we
hope to move the cerebellum data to surface representation also.
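
For example (assuming the node maps are in a dscalar cifti file; names are
placeholders):

wb_command -cifti-separate node_maps.dscalar.nii COLUMN -volume-all node_maps_subcortex.nii.gz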

Group volume-registered space has significant drawbacks, in that it does
not align cortical functional areas to the degree that surface/cifti
processing can.  This is because much of the cortex has variable folding
between subjects, which volume registration can't unfold to try to match,
and because functional areas vary somewhat in their position relative to
these folds.  See our recent paper:

https://www.ncbi.nlm.nih.gov/pubmed/29925602

This actually means that mapping the cifti data back into the volume should
have better cross-subject functional correspondence than computing group
maps from the original volumetric data - the drawback, however, is that the
folding patterns can't be preserved (since they are different across
subjects), and you will end up with a much less folded cortical ribbon with
oddly sharp boundaries (and perfect correlation along the normal vector of
the surface).

Processing the data in cifti format will use less memory, be more correct
to the nature of the data, and make it more obvious that you aren't really
using a volumetric processing stream.  Matlab and python both have support
for reading and writing cifti files (though in matlab there isn't good
support for spatial relationships), as well as a few other languages (the
cifti code from workbench has been made into a c++ library, though you will
also need a gifti library to make full use of spatial relationships).
wb_command can do several types of spatial processing operations on cifti,
and -cifti-convert will allow you to feed the data through virtually any
tool that doesn't use spatial neighborhood information.
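
A sketch of that -cifti-convert round trip (file names are placeholders;
add -reset-timepoints or -reset-scalars to the -from-nifti step if the tool
changes the number or meaning of the frames):

wb_command -cifti-convert -to-nifti data.dtseries.nii data_fake.nii.gz
<run the temporal-only tool on data_fake.nii.gz, producing tool_out.nii.gz>
wb_command -cifti-convert -from-nifti tool_out.nii.gz data.dtseries.nii result.dtseries.nii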

Tim


On Tue, Nov 13, 2018 at 10:45 AM, Nicola Toschi 
wrote:

> Thank you Steve,
>
> I was hoping to get them out of the CIFTI versions (also so I don't have
> to reprocess the 4D FIX data for all subjects). Maybe with cifti-convert?
> The brain would not be fully covered here i realise but much better than
> nothing.
>
> Alternatively, do you think there is a chance of obtaining them somehow
> (since they have been calculated by you guys anyway)? Can i put in a
> request somewhere maybe?
>
> Thank you in advance!
>
> Nicola
>
>
> On 11/13/2018 5:13 PM, Steve Smith wrote:
>
> Hi - we calculated them in order to estimate the volumetric group maps.
> However AFAIK we didn't bundle them in the PTN release.
> You can easily get them though by regressing node timeseries into 4D
> FIX-cleaned data.
>
> Cheers.
>
>
> On 13 Nov 2018, at 16:09, Nicola Toschi  wrote:
>
> Dear HCP list and experts,
>
> I was wondering if the subject-specific node maps from the latest PTN
> release (which are available in CIFTI format) are also available in
> volumetric space (which would aid the specific analysis we are running).
>
> Or if not, maybe a pointer on how to convert them?
>
> Thank you very much in advance!
>
> nicola
>
> ---
> This email has been checked for viruses by Avast antivirus software.
> https://www.avast.com/antivirus
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>
>
>
> 
> ---
> Stephen M. Smith, Professor of Biomedical Engineering
> Head of Analysis,  WIN (FMRIB) Oxford
>
> FMRIB, JR Hospital, Headington, Oxford  OX3 9DU, UK
> +44 (0) 1865 610470
> st...@fmrib.ox.ac.ukhttp://www.fmrib.ox.ac.uk/~steve
> 
> ---
>
> Stop the cultural destruction of Tibet 
>
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
>

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Debugging IcaFIxProcessingBatch.sh

2018-11-12 Thread Timothy Coalson
The path he gave was not intended to be pasted in verbatim; the script
doesn't know about those variables, and the hcp_fix script is still using
sh rather than bash.  Try the version from here:

https://github.com/coalsont/Pipelines/tree/reapply_inprogress

Tim


On Mon, Nov 12, 2018 at 3:42 PM, Jayasekera, Dinal <
dinal.jayasek...@wustl.edu> wrote:

> Dear Matt,
>
>
> So changing that line did fix the error with the 0.7mm brain mask.
> However, there seems to be an additional error:
>
>
> hcp_fix: INFORM: functionmotionconfounds log file is to be named:
> .fix.functionmotionconfounds.log instead of .fix.log
> hcp_fix: DEBUG: current folder /home/functionalspinelab/
> Desktop/Dinal/mystudy/NSI_12/MNINonLinear/Results/rfMRI_
> REST1_PA/rfMRI_REST1_PA_hp2000.ica/reg
> /home/functionalspinelab/Desktop/Dinal/Pipelines/ICAFIX/hcp_fix: 159:
> /home/functionalspinelab/Desktop/Dinal/Pipelines/ICAFIX/hcp_fix:
> //MNINonLinear/brainmask_fs.nii.gz: not found
> hcp_fix: INFORM: running FIX
> /home/functionalspinelab/Desktop/Dinal/Pipelines/ICAFIX/hcp_fix: 171:
> /home/functionalspinelab/Desktop/Dinal/Pipelines/ICAFIX/hcp_fix: [[: not
> found
> hcp_fix: INFORM: About to run: /home/functionalspinelab/
> Desktop/Dinal/Applications/fix1.066/fix rfMRI_REST1_PA_hp2000.ica
> /home/functionalspinelab/Desktop/Dinal/Applications/
> fix1.066/training_files/HCP_hp2000.RData 10 -m -h 2000
> FIX Feature extraction for Melodic output directory:
> rfMRI_REST1_PA_hp2000.ica
>  create edge masks
>  run FAST
>  registration of standard space masks
>  extract features
> FIX Classifying components in Melodic directory: rfMRI_REST1_PA_hp2000.ica
> using training file: /home/functionalspinelab/Desktop/Dinal/Applications/
> fix1.066/training_files/HCP_hp2000.RData and threshold 10
> FIX Applying cleanup using cleanup file: rfMRI_REST1_PA_hp2000.ica/
> fix4melview_HCP_hp2000_thr10.txt and motion cleanup set to 1
> Could not find a supported file with prefix "rfMRI_REST1_PA_hp2000.ica/
> filtered_func_data_clean"
> Could not find a supported file with prefix "rfMRI_REST1_PA_hp2000.ica/
> filtered_func_data_clean_vn"
>
> This is the line in hcp_fix which relates to these two files:
>
>
> $FSLDIR/bin/immv ${fmri}.ica/filtered_func_data_clean ${fmri}_clean
> $FSLDIR/bin/immv ${fmri}.ica/filtered_func_data_clean_vn ${fmri}_clean_vnf
>
> I initially thought the specified path to these two files were incorrect
> but it turns out these files were never created. Are these files meant to
> be created by hcp_fix?
>
>
> Kind regards,
> *Dinal Jayasekera*
>
> PhD Candidate | InSITE Fellow
> Ammar Hawasli Lab
> Department of Biomedical Engineering | Washington University in St. Louis
>
> --
> *From:* Glasser, Matthew
> *Sent:* Friday, November 9, 2018 7:36:59 PM
> *To:* NEUROSCIENCE tim
> *Cc:* Jayasekera, Dinal; Dierker, Donna; hcp-users@humanconnectome.org
>
> *Subject:* Re: [HCP-Users] Debugging IcaFIxProcessingBatch.sh
>
> We should change that line to use this file:
>
> ${StudyFolder}/${Subject}/MNINonLinear/brainmask_fs.nii.gz
>
> Matt.
>
> From: Timothy Coalson 
> Date: Friday, November 9, 2018 at 7:18 PM
> To: Matt Glasser 
> Cc: "Jayasekera, Dinal" , "Dierker, Donna" <
> do...@wustl.edu>, "hcp-users@humanconnectome.org" <
> hcp-users@humanconnectome.org>
> Subject: Re: [HCP-Users] Debugging IcaFIxProcessingBatch.sh
>
> Here is the line from hcp_fix where it uses a 0.7mm mask:
>
> $FSLDIR/bin/fslmaths veins -div `$FSLDIR/bin/fslstats veins -k
> ${FSL_FIXDIR}/mask_files/hcp_0.7mm_brain_mask -P 50` -mul 2.18 -thr 10
> -min 50 -div 50 veins
>
> https://github.com/Washington-University/HCPpipelines/blob/
> master/ICAFIX/hcp_fix#L159
>
> So, the script hardcodes a path to a 0.7mm mask.
>
> Tim
>
>
> On Fri, Nov 9, 2018 at 7:13 PM, Glasser, Matthew 
> wrote:
>
> Yes you probably need to have bash instead of a non-bash shell.  Perhaps
> fixing that will solve the problem.  As far as the resolution of the T1w
> and T2w, presumably so long as they are the same resolution everything
> should work.  We have tested on 0.8mm human data and 0.5mm monkey data.
>
> Matt.
>
> From: Timothy Coalson 
> Date: Friday, November 9, 2018 at 3:53 PM
> To: "Jayasekera, Dinal" , Matt Glasser <
> glass...@wustl.edu>
> Cc: "Dierker, Donna" , "hcp-users@humanconnectome.org" <
> hcp-users@humanconnectome.org>
>
> Subject: Re: [HCP-Users] Debugging IcaFIxProcessingBatch.sh
>
> This is the first problem I see in the text you pasted in the email:
>
> Mask and image must be the same size
>
> This looks like 

Re: [HCP-Users] Diffusion connectivity matrix with cortical and subcortical parcellation

2018-11-09 Thread Timothy Coalson
wb_command -cifti-export-dense-mapping may be useful for getting the voxel
seeds in the correct order (for the surface, you may be able to just use
the file from the HCP Pipelines global/templates/91282_Greyordinates
folder).
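
For example, to write out the voxels (with their cifti indices) in the
order they appear in a standard grayordinates file (names are
placeholders):

wb_command -cifti-export-dense-mapping template_91282.dscalar.nii COLUMN -volume-all voxel_mapping.txt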

I seem to recall that previously we did this in 3 probtrackx runs, and
concatenated them back together afterwards.  But, the best way to do this
mostly depends on what probtrackx is capable of, which I don't really know.

Tim


On Fri, Nov 9, 2018 at 7:24 PM, Leonardo Tozzi  wrote:

> Dear Matt,
>
>
>
> In my previous call, I did get a matrix called fdt_matrix1.dot, which is
> large (1.33 GB). I guess this is a dense matrix, but then I wonder again,
> there might be something wrong there as well with the positioning of the
> subcortical structures, since I used them as seeds.
>
> So I could make the same call as before, with the –avoid flag for the CSF
> but I guess I would still at least need to input the GrayOrdinates.txt
> which is still a list of seeds (-x option). How would I obtain the seed
> list for the grayordinates?
> Thank you,
>
>
>
>
>
> Leonardo Tozzi, MD, PhD
>
> Williams PanLab | Postdoctoral Fellow
>
> Stanford University | 401 Quarry Rd
>
> lto...@stanford.edu | (650) 5615738
>
>
>
>
>
> *From: *"Glasser, Matthew" 
> *Date: *Friday, November 9, 2018 at 5:17 PM
> *To: *Leonardo Tozzi , NEUROSCIENCE tim <
> tsc...@mst.edu>
> *Cc: *Stamatios Sotiropoulos ,
> hcp-users 
>
> *Subject: *Re: [HCP-Users] Diffusion connectivity matrix with cortical
> and subcortical parcellation
>
>
>
> I think that --omatrix1 always outputs a dense matrix.
>
>
>
> Matt.
>
>
>
> *From: *Leonardo Tozzi 
> *Date: *Friday, November 9, 2018 at 7:15 PM
> *To: *Timothy Coalson 
> *Cc: *Matt Glasser , Stamatios Sotiropoulos <
> stamatios.sotiropou...@ndcn.ox.ac.uk>, hcp-users <
> hcp-users@humanconnectome.org>
> *Subject: *Re: [HCP-Users] Diffusion connectivity matrix with cortical
> and subcortical parcellation
>
>
>
> Dear Timothy,
>
>
>
> Exactly, the goal was to have a structural connectome that has the same
> parcels as a functional one and covers the whole brain.
>
> The problem with the surface ROI is that I would be missing the
> subcortical structures, which I would like to retain. Maybe there is a
> simpler way of doing this that I am missing. I could compute a dense
> connectome and parcellate it in a second step maybe? I thought that doing a
> ROI to ROI approach would be simpler, but I might have been mistaken.
>
> Looking at the tutorial document, it seems I can obtain a dense connectome
> with the following call:
>
> probtrackx2 --samples=../T1w/Diffusion.bedpostX/merged
> --mask=../T1w/Diffusion.bedpostX/nodif_brain_mask
> --xfm=xfms/standard2acpc_dc --invxfm=xfms/acpc_dc2standard
> --seedref=T1w_restore.2.nii.gz --loopcheck --forcedir -c 0.2 --sampvox=2
> --randfib=1 --stop=Connectome/stop --wtstop=Connectome/wtstop
> –forcefirststep --waypoints=ROIs/Whole_Brain_Trajectory_ROI_2 -x
> Connectomes/GrayOrdinates.txt --omatrix1 --dir=Connectomes
>
> I am just wondering, what are these inputs and how would I obtain them:
> --stop=Connectome/stop, --wtstop=Connectome/wtstop,
> waypoints=ROIs/Whole_Brain_Trajectory_ROI_2 and
> Connectomes/GrayOrdinates.txt ?
>
>
>
> Thank you very much,
>
>
>
>
>
> Leonardo Tozzi, MD, PhD
>
> Williams PanLab | Postdoctoral Fellow
>
> Stanford University | 401 Quarry Rd
>
> lto...@stanford.edu | (650) 5615738
>
>
>
>
>
> *From: *Timothy Coalson 
> *Date: *Friday, November 9, 2018 at 5:02 PM
> *To: *Leonardo Tozzi 
> *Cc: *"Glasser, Matthew" , Stamatios Sotiropoulos <
> stamatios.sotiropou...@ndcn.ox.ac.uk>, hcp-users <
> hcp-users@humanconnectome.org>
> *Subject: *Re: [HCP-Users] Diffusion connectivity matrix with cortical
> and subcortical parcellation
>
>
>
> So, it gets a little complicated, because you have to be careful about
> what order the different sections of seeds were put together in.  I don't
> know how specifying multiple ROIs to probtrackx works, keeping it simple
> and doing a single combined surface ROI that covers all the areas you want
> is more likely to be usable.  The likely reason for the volume loop is
> because in cifti, the different subcortical structures are stored as
> separate sections, but the entire used part of a surface is stored
> contiguously in vertex order.
>
>
>
> I am still missing the big picture here: why do you want to use labels to
> constrain the tractography?  Is what you actually want an all parcels by
> all parcels matrix?

Re: [HCP-Users] Diffusion connectivity matrix with cortical and subcortical parcellation

2018-11-09 Thread Timothy Coalson
I don't know how to get probtrackx to output a parcels by parcels matrix,
or even if it can.  If the output files are large (meaning they contain
per-vertex and per-voxel tracks), then the ROIs have not helped you.

Making a dense connectome and then parcellating it will definitely work,
and won't take more computation, but it will (at least temporarily) take
more disk space (and IO time).
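
The parcellation step would look something like this (file names are
placeholders, and this assumes the dense connectome has already been
assembled into a .dconn.nii; a dconn gets parcellated once along each
dimension):

wb_command -cifti-parcellate dense.dconn.nii parcels.dlabel.nii COLUMN temp.pdconn.nii
wb_command -cifti-parcellate temp.pdconn.nii parcels.dlabel.nii ROW parcellated.pconn.nii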

Tim


On Fri, Nov 9, 2018 at 7:15 PM, Leonardo Tozzi  wrote:

> Dear Timothy,
>
>
>
> Exactly, the goal was to have a structural connectome that has the same
> parcels as a functional one and covers the whole brain.
>
> The problem with the surface ROI is that I would be missing the
> subcortical structures, which I would like to retain. Maybe there is a
> simpler way of doing this that I am missing. I could compute a dense
> connectome and parcellate it in a second step maybe? I thought that doing a
> ROI to ROI approach would be simpler, but I might have been mistaken.
>
> Looking at the tutorial document, it seems I can obtain a dense connectome
> with the following call:
>
> probtrackx2 --samples=../T1w/Diffusion.bedpostX/merged
> --mask=../T1w/Diffusion.bedpostX/nodif_brain_mask
> --xfm=xfms/standard2acpc_dc --invxfm=xfms/acpc_dc2standard
> --seedref=T1w_restore.2.nii.gz --loopcheck --forcedir -c 0.2 --sampvox=2
> --randfib=1 --stop=Connectome/stop --wtstop=Connectome/wtstop
> –forcefirststep --waypoints=ROIs/Whole_Brain_Trajectory_ROI_2 -x
> Connectomes/GrayOrdinates.txt --omatrix1 --dir=Connectomes
>
> I am just wondering, what are these inputs and how would I obtain them:
> --stop=Connectome/stop, --wtstop=Connectome/wtstop,
> waypoints=ROIs/Whole_Brain_Trajectory_ROI_2 and
> Connectomes/GrayOrdinates.txt ?
>
>
>
> Thank you very much,
>
>
>
>
>
> Leonardo Tozzi, MD, PhD
>
> Williams PanLab | Postdoctoral Fellow
>
> Stanford University | 401 Quarry Rd
>
> lto...@stanford.edu | (650) 5615738
>
>
>
>
>
> *From: *Timothy Coalson 
> *Date: *Friday, November 9, 2018 at 5:02 PM
>
> *To: *Leonardo Tozzi 
> *Cc: *"Glasser, Matthew" , Stamatios Sotiropoulos <
> stamatios.sotiropou...@ndcn.ox.ac.uk>, hcp-users <
> hcp-users@humanconnectome.org>
> *Subject: *Re: [HCP-Users] Diffusion connectivity matrix with cortical
> and subcortical parcellation
>
>
>
> So, it gets a little complicated, because you have to be careful about
> what order the different sections of seeds were put together in.  I don't
> know how specifying multiple ROIs to probtrackx works, keeping it simple
> and doing a single combined surface ROI that covers all the areas you want
> is more likely to be usable.  The likely reason for the volume loop is
> because in cifti, the different subcortical structures are stored as
> separate sections, but the entire used part of a surface is stored
> contiguously in vertex order.
>
>
>
> I am still missing the big picture here: why do you want to use labels to
> constrain the tractography?  Is what you actually want an all parcels by
> all parcels matrix?
>
>
>
> Tim
>
>
>
>
>
> On Fri, Nov 9, 2018 at 6:37 PM, Leonardo Tozzi 
> wrote:
>
> Dear Timothy,
>
>
>
> Thank you very much for your quick response.
>
> To clarify some points: some ROIs were surface based and some voxel based.
> To create them, I followed the steps I outlined along this thread, which I
> am summarizing below:
>
>
>
> # creating cortical labels
>
> wb_command -cifti-separate merged.dlabel.nii COLUMN -label CORTEX_LEFT
> cortL.label.gii
>
> wb_command -cifti-separate merged.dlabel.nii COLUMN -label CORTEX_RIGHT
> cortR.label.gii
>
>
>
> # creating subcortical labels
>
> wb_command -cifti-separate merged.dlabel.nii COLUMN -volume-all
> subcort.nii.gz
>
>
>
> # creating the cortical ROIs
>
> for region in R_V1_ROI R_MST_ROI (etc.)
>
> do
>
> wb_command -gifti-label-to-roi cortR.gii $PWD/ROIs/${region}.func.gii
> -name $region
>
> wb_command -gifti-label-to-roi cortL.gii $PWD/ROIs/${region}.func.gii
> -name $region
>
> done
>
>
>
> # creating the subcortical ROIs
>
> for region in L_Amygdala R_Amygdala (etc.)
>
> do
>
> wb_command -volume-label-to-roi subcort.nii.gz 
> $PWD/ROIs_subcort/${region}.nii.gz
> -name $region
>
> done
>
>
>
> # converting cortical ROIs to ASCII
>
> for region in R_V1_ROI R_MST_ROI (etc.)
>
> do
>
> surf2surf -i /Users/leonardotozzi/Desktop/DiffusionConnectivityTest/
> conn008/MNINonLinear/fsaverage_LR32k/conn008.R.white.32k_fs_LR.surf.gii
> -o $PWD/ROIs_cort/$region.asc --values=$
