I don't know what 3dDeconvolve does, but if you used it to try to do
anything spatial (spatial sharpening, PSF, etc), then the files produced by
wb_command -cifti-convert would be entirely inappropriate. The only
meaningful operations that can be done on -cifti-convert "volume-ish" files
are temporal.
Sorry, in my second paragraph I meant "breaking the usual convention of the
scene file XML".
Tim
On Fri, Jun 21, 2019 at 2:28 PM Timothy Coalson wrote:
The paths inside a scene file's XML are supposed to be relative to the
location of the scene file. You are generally expected to keep the scene
file in a directory near the data it refers to, so that the generated
relative paths don't have to traverse much of your filesystem structure.
You can us
(i) You can use wb_command -cifti-label-export-table on the dlabel file to
get the order of the parcels in a fixed format, though there are extra
lines and numbers in the output text file. 360 parcels makes for a rather
long table; you might consider a matrix figure instead, and only mention the
highligh
Could you clarify what exactly you tried to download? I don't know of any
non-human data in connectomedb, either.
Tim
On Tue, Jun 18, 2019 at 9:17 AM DE CASTRO Vanessa
wrote:
> Good morning, I'm trying to download the example data to run the HCP
> Pipelines, and also the ones related to the n
You cannot have multiple label keys with the same name in a single label
map.
There are two very separate things that label volumes can do in cifti
create commands: one is to set the name of a particular subset of voxels to
group them into a "structure" (which has a limited set of available values
Tractography visualization is somewhat rough around the edges. What was
the full probtrackx command you used? Do you have bingham parameter
volumes for the fiber orientations (mean, stdev, theta, phi, psi, ka, kb),
or only the fiber orientation sample volumes?
Tim
On Fri, Jun 7, 2019 at 7:48 A
"wb_command -gifti-help" is intended to help explain these file formats:
https://www.humanconnectome.org/software/workbench-command/-gifti-help
There are other -*-help options for other formats, and other aspects of
wb_command, like how to read the command usage info:
https://www.humanconnectome
Also, if you literally want spatial gradient magnitude, you can run
-cifti-gradient on the correct dimension of the pdconn (or dpconn) file.
Tim
On Mon, Jun 3, 2019 at 8:14 PM Timothy Coalson wrote:
> If you just want to look at them first, you can load them into wb_view.
> Depend
>>
>>
>>
>> --
>>
>> Joseph M. Orr, Ph.D.
>>
>> Assistant Professor
>>
>> Department of Psychological and Brain Sciences
>>
>> Texas A&M Institute for Neuroscience
>>
>> Texas A&M University
>>
>>
>
> On Mon, Jun 3, 2019
In particular, if you are only tracking the parcels x parcels matrix, using
the 59k surfaces should make effectively no difference. They mostly exist
to try to capture the higher resolution fMRI data. Even if you were
capturing per-vertex tractography counts, the uncertainty in the
probabilistic
ld
> come out from that. Anyway I'm not sure I understand why would it fail. Is
> it something related to rounding coordinates or working directly in image
> space?
>
> Regards,
> Jaime
>
> On Tue, May 28, 2019 at 8:46 PM, Timothy Coalson ()
> wrote:
>
>>
If you need to preserve the mm coordinates of the ROI, I would not trust
FSL's resampling with an identity transform to get it right, because that
will produce a different shift depending on what coordinates a particular
corner voxel is at in each image (as I understand it, FSL's conventions
come f
This may not be related to your particular problem, but you need to have
FSL 6.0.1 for some of the pipelines (MR FIX in particular). Using fslhd on
the BothPhases and Mask files should give others on the list some
information to work with.
Tim
On Tue, May 21, 2019 at 8:18 AM Simon Wein <
simon.
You can also do that with wb_command -cifti-parcellate using "-method SUM".
Tim
On Fri, May 17, 2019 at 11:04 AM Aaron C wrote:
> Hi Stam and Matt,
>
> I have one more question about this. I got the dense connectome and would
> like to calculate structural connectivity between the parcels in M
Dense files have independent values for every vertex and voxel that is
used. What you are describing (one value per ROI) is a parcellated file,
such as pscalar. To use them, first make your ROIs into a dlabel file
(give each ROI a separate integer value, and then use wb_command
-cifti-label-impor
Open a new terminal or run "rehash" in the existing one, and see if that
works - shells cache which executables live in which directories, so they
don't have to search the filesystem every time they run something, and that
cache can go stale when things change without the shell being told.
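A minimal bash sketch of working with that cache (paths are hypothetical; "hash -r" is bash's equivalent of csh/zsh "rehash"):

```shell
# Put a new executable into a directory and add it to PATH (hypothetical paths)
mkdir -p /tmp/pathdemo
printf '#!/bin/sh\necho hello\n' > /tmp/pathdemo/hello_demo
chmod +x /tmp/pathdemo/hello_demo
export PATH="/tmp/pathdemo:$PATH"
hash -r        # clear bash's remembered command locations (csh/zsh: rehash)
hello_demo     # prints "hello"
```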
Additionally, this may already be fixed in the latest master (as of 3 weeks
ago).
Tim
On Thu, May 9, 2019 at 1:47 PM Timothy Coalson wrote:
The quick solution is to add that path to your default matlab path, with
the added benefit that you can then use ciftiopen and related in your own
code. Our setups always have a version of these functions in the default
matlab path, which is probably why we missed this.
Tim
On Thu, May 9, 2019
For just finding the overlap of some (positive-only) map with the parcels,
the script would likely be a lot simpler if you used -cifti-parcellate with
the "-method SUM" option (when doing so, I would also recommend using
vertex areas, so that the resulting numbers are surface-area integrals
rather
We recommend sharing the results as data files (as mentioned, this is the
intent of BALSA), even if you choose to report MNI coordinates in the
text. Something to keep in mind is that group average surfaces do not
behave like group average volume data, the surface gets smoothed out
wherever foldin
Correction, the issue to follow is #107:
https://github.com/Washington-University/HCPpipelines/issues/107
Tim
On Tue, Apr 23, 2019 at 4:35 PM Harms, Michael wrote:
>
>
> For users that want to follow this, please see:
>
> https://github.com/Washington-University/HCPpipelines/issues/108
>
>
>
We recommend mapping the individual cortical data to surfaces before doing
anything else with the data. If you have fieldmaps, and high-res T1w and
T2w, you may be able to use the HCP pipelines to do this:
https://github.com/Washington-University/HCPpipelines
If you don't have these scans, anoth
I haven't used eddy, but since it looks like an output file, my first
thought is permissions - does that folder exist, and can the user you are
running the eddy job as write files to it? Does a file with that name
exist, and not have write permissions?
Tim
On Mon, Apr 22, 2019 at 11:39 AM Timot
For the most part, it exists because I was testing a new method for
computing the TFCE transform itself (it is an integral containing cluster
size which other utilities generally approximate by using many different
thresholds). We do not currently do statistical testing within wb_command.
Tim
O
> sample of 30 patients pre/post tx.
>
> Thanks a lot for your help and patience.
>
> Leah.
>
>
>
> On Apr 15, 2019, at 9:39 PM, Timothy Coalson wrote:
>
> I would also suggest changing your log level to INFO in wb_view,
> preferences (the wb_command option does
I don't know which parcels are assigned to each network, but if you need to
know the current order of the parcels, wb_command -file-information will
show that.
If you have a dlabel file with the networks as labels, you can put that
through -cifti-parcellate to get each parcel labeled with its majo
database and add the HCP data use terms.
>
> Matt.
>
> From: on behalf of Timothy
> Coalson
> Date: Tuesday, April 16, 2019 at 5:11 PM
> To: "Burgess, Gregory"
> Cc: "Curtiss, Sandy" , 李婧玮 ,
> Thomas Yeo , "hcp-users@humanconnectome.org" &
However, he is not sharing the HCP data files themselves, but only results
obtained from using them. The data use terms only state that the *original
data* must be distributed under the same terms. Derived data appears to
only be covered by "all relevant rules and regulations imposed by my
instit
I have pushed a similar edit to reapply MR fix, please update to the latest
master.
Tim
On Mon, Apr 15, 2019 at 8:27 PM Timothy Coalson wrote:
> They weren't instructions, I pushed an edit, and it was a different script.
>
> Tim
>
>
> On Mon, Apr 15, 2019 at 8:08 PM Gl
uctions for this.
>
> Also, the log_Warn line is again concerning as to whether you followed the
> installation instructions and all version 4.0.0 files here.
>
> Matt.
>
> From: Marta Moreno
> Date: Monday, April 15, 2019 at 8:53 AM
> To: Matt Glasser
> Cc: HCP
The cleanest way to do it is to use -cifti-create-dense-from-template to
put the vertex area metrics into a cifti file (this may have already been
done, look at what files exist with "va" in the name), and then use
-cifti-parcellate on that with -method SUM.
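As a sketch, with hypothetical filenames (CORTEX_LEFT/CORTEX_RIGHT are the cifti structure names), the two steps might look like:

```shell
# Pack the per-vertex area ("va") metrics into a cifti file matching a template
wb_command -cifti-create-dense-from-template template.dscalar.nii va.dscalar.nii \
    -metric CORTEX_LEFT L.midthickness_va.shape.gii \
    -metric CORTEX_RIGHT R.midthickness_va.shape.gii
# Sum vertex areas within each parcel to get parcel surface areas
wb_command -cifti-parcellate va.dscalar.nii parcels.dlabel.nii COLUMN \
    parcel_areas.pscalar.nii -method SUM
```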
Tim
On Fri, Apr 12, 2019 at 4:30 PM K
Connectome Workbench is agnostic to species, though some defaults (such as
the identification symbol size) are sized for the human brain. We
frequently use it with primate data.
Workbench can display probabilistic trajectories generated with fsl's
bedpostx/probtrackx tools (for
To add some context, the _acpc_dc_restore_brain versions of the files are
*outputs* of the structural pipelines. We do not run freesurfer on masked
T1w images.
Matt is asking for the exact arguments you provided to
FreeSurferPipeline.sh, by providing the full command line that was run that
contai
The files themselves are in the pipelines repository, if that helps:
https://github.com/Washington-University/HCPpipelines/tree/master/global/templates
It is visually obvious that they are not left/right symmetric, assuming
that is what you were asking.
Tim
On Fri, Apr 5, 2019 at 4:54 PM Glass
You could use the group average MNI surfaces from connectomedb for
visualization. If you need a single coordinate per parcel, you can use
wb_command -surface-coordinates-to-metric on the surfaces, combine those
metric files into a cifti file with -cifti-create-dense-from-template, and
use -cifti-p
The 1.0 and 3.0 versions on github are nearly identical, that was just a
naming issue.
The version in FSL may be based on version 2, and is missing a library
needed for HOCR, so some options in v3 aren't available. You should be
able to use the fsl versions of the executables other than msm (so,
No, since the subcortical data needed to be in MNI space, we chose to use
MNI space surfaces for each subject so that we only needed to generate a
single motion-corrected volume timeseries. Because the per-subject
processing uses the individual surfaces and the same warpfield for surface
and volum
d?
>
> Mor
>
> On Tue, Apr 2, 2019 at 2:23 PM Timothy Coalson wrote:
The HCP MMP 1.0 parcellation could not have been made without using
surface-based methods, due to their increased accuracy in aligning
functional areas over existing volume-based registrations. Volume-based
group data generally cannot have the cortical precision that the HCP MMP
1.0 implies. See
That looks like you cut off some brain tissue. I'm not really sure what
your goal is here, but if you have images with different voxel sizes, what
you may actually need to do is to resample an image (flirt, applywarp, or
wb_command -volume-*-resample), and not crop it.
Tim
On Sun, Mar 24, 2019
The labels are used in the order of their keys, which is also how the
exported label table is ordered. If your dlabel file has more label names
than there ended up being parcels, you can first use
-cifti-parcel-mapping-to-label to get a minimal dlabel file that exactly
matches the parcels mapping:
Inline replies.
Tim
On Fri, Mar 22, 2019 at 10:17 AM Claude Bajada
wrote:
> Dear experts,
>
> Could I just confirm that the data that is found in:
>
>
> ${SubjectFolder}/MNINinLinear/Results/rfMRI_REST?_LR/rfMRI_REST?_LR_Atlas_MSMAll_hp2000_clean.dtseries.nii
>
> Is the resting state data usin
First, you'll need to export the label table of the original file (the
name, key value, and color values for each label, see -cifti-label-import),
with wb_command -cifti-label-export-table. You'll need to either figure
out a key value (first number in each row of numbers) that hasn't been used
yet
I have occasionally seen the volume slice outline show something like that
before, but as far as I could tell, the surface was actually okay. It may
just be a display bug in wb_view, but we haven't pinned it down.
Tim
On Mon, Mar 18, 2019 at 8:44 AM Aaron C wrote:
> Dear HCP experts,
>
> I ha
There isn't a dedicated command to get the parcel names, but they are in
the output of wb_command -file-information on the parcellated file, or you
can take them from -cifti-label-export-table on the dlabel file.
Tim
On Mon, Mar 4, 2019 at 1:55 AM Tali Weiss wrote:
> i did
> wb_command -cifti-
That is saying that you don't have the matlab gifti library installed (or
it isn't on your matlab path).
Tim
On Tue, Feb 26, 2019 at 6:09 PM Leonardo Tozzi wrote:
> Dear Michael,
>
>
>
> Thank you very much for all the consideration on the use of FIX for the
> task data.
>
> I have tried the a
A medial wall mask is used to mask out data for at least cifti files. It
is hard to say for sure (the volume to surface mapping is more involved
than the closest vertex logic used in the GUI to identify a vertex), but I
would guess that both get masked out by the medial wall currently. Future
reg
The command wb_command -surface-vertex-areas will give you the area of each
vertex. For vertex volume, you should use wb_command -surface-wedge-volume.
When comparing these kinds of measures, it is usually better to measure
them in an anatomically faithful space (such as the T1w space of each
sub
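A sketch of the two commands, with hypothetical filenames for one subject's T1w-space left hemisphere:

```shell
# Per-vertex surface area on the subject's midthickness surface
wb_command -surface-vertex-areas subject.L.midthickness.native.surf.gii \
    subject.L.va.shape.gii
# Per-vertex gray matter "wedge" volume between the white and pial surfaces
wb_command -surface-wedge-volume subject.L.white.native.surf.gii \
    subject.L.pial.native.surf.gii subject.L.wedgevol.shape.gii
```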
In that gifti file, the label table indicates that ??? is label 0, as is
recommended (it means things that haven't been labeled, such as the medial
wall). The matlab gifti library must be shifting these values, possibly
because they are taken as indices into another matlab array (matlab doesn't
ac
onLinear/100307.L.sphere.164k_fs_LR.surf.gii \
> BARYCENTRIC \
> 100307/T1w/100307.L.sphere.164k_fs_LR.surf.gii
>
> Many thanks for your help!
>
>
>
> On Wed, Feb 20, 2019 at 8:34 PM, Timothy Coalson wrote:
Sorry, the recommended sphere for resampling any subject will of course be
that subject's version of that file, not specifically subject 100307's
sphere.
Tim
On Wed, Feb 20, 2019 at 1:31 PM Timothy Coalson wrote:
On Wed, Feb 20, 2019 at 8:03 AM CHAUMON Maximilien <
maximilien.chau...@icm-institute.org> wrote:
> Hello,
>
> I'm looking at fine changes in MEG forward leadfields and would like to
> use the 164k meshes in each subject (I know 164k vertices are overkill, but
> I need this high res rendering for
Parcellated files contain only one value per parcel (per map), so it isn't
a good idea to try to reconstitute them into a spatial map before
analysis. I think the correct thing to do is to put them through PALM in a
way that doesn't use spatial information (no tfce, no smoothing, etc).
Tim
On M
As long as the volume tab is view yoked, and the "move to identified
location" button is enabled, yes (both of these have preferences as to
whether they are the default, but the initial preference setting is to have
them enabled by default). The mm coordinates will be the same, surface or
volume (
-parcellate to generate all the centers of gravity, or
use -gifti-label-to-roi and -metric-stats to take the center of gravity of
a single parcel.
Tim
On Wed, Feb 13, 2019 at 1:34 PM Timothy Coalson wrote:
> Correction to the volume center of gravity part: if you convert to ROIs
> before m
On Wed, Feb 13, 2019 at 1:26 PM Timothy Coalson wrote:
The command you are looking for is -metric-to-volume-mapping for an ROI, or
-label-to-volume-mapping for the entire label file. Note that these take
gifti inputs, so you will need -cifti-separate and possibly
-gifti-label-to-roi first (alternatively, -volume-label-to-roi
afterwards). We don't hav
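Assuming hypothetical filenames and a parcel named 'MY_PARCEL', the single-ROI route might look like:

```shell
# Pull the left-hemisphere label map out of the cifti dlabel file
wb_command -cifti-separate parcels.dlabel.nii COLUMN \
    -label CORTEX_LEFT L.parcels.label.gii
# Make a binary ROI from one label
wb_command -gifti-label-to-roi L.parcels.label.gii L.roi.func.gii -name 'MY_PARCEL'
# Map the ROI into the volume through the cortical ribbon
wb_command -metric-to-volume-mapping L.roi.func.gii L.midthickness.surf.gii \
    template_volume.nii.gz L.roi.nii.gz \
    -ribbon-constrained L.white.surf.gii L.pial.surf.gii
```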
file (also, it expects
only data values in the input to -from-text). The order of parcels is set
during -cifti-parcellate, determined by the key values of each label (which
in turn is set during -cifti-label-import).
Tim
On Thu, Feb 7, 2019 at 5:11 PM Timothy Coalson wrote:
> BALSA does not
interested in that would need to
> convert the cifti files back to CSV.
>
>
>
> More generally, do you suggest any methods to share HCP-derived files in
> an arbitrary format?
>
>
>
> Thanks in advance for the help.
>
>
>
> Best,
>
> Caio
>
>
>
rbitrary format?
>
> Thanks in advance for the help.
>
> Best,
> Caio
>
>
> On Fri, Feb 8, 2019 at 6:19 AM, Timothy Coalson
> wrote:
If your data is organized as a value per parcel/network, you should be able
to turn it into parcellated cifti files, which can be displayed in wb_view
(and therefore in scenes) as a matrix and/or as colored regions on the
surfaces and in the volume.
See wb_command -cifti-parcellate (to make a temp
Did you agree to the "open access" data use terms before trying to
download? The tutorial dataset is covered by these terms, but maybe it
doesn't make this obvious.
Tim
On Wed, Feb 6, 2019 at 5:31 PM Rosalia Dacosta Aguayo
wrote:
> Dear HCP users,
>
> I registered to the HCP web. I installed
The simple solution is to delete or rename the "libz.so.1" file in the
libs_linux64 directory.
What is going on is that an OS-supplied libpng is being pulled in, which
was compiled against a newer libz than we bundle in the linux distribution,
but the fact that libz was found in our bundled librar
As another option for modeling the diffusion data, I would suggest fitting
more anatomically-realistic models to the data, such as a multiple crossing
fiber model (ball and sticks or similar, for instance with bedpostx).
However, I don't know if they currently provide an analogous measure to
kurtos
That is the error I would expect from launching the executable in
exe_linux. If the script in bin_linux... was having problems, it should
say something like "/install/path/bin_linux64/../exe_linux64/wb_command".
Tim
On Mon, Jan 28, 2019, 4:43 PM Jayasekera, Dinal wrote: I've been trying to use the wo
On Mon, Jan 28, 2019, 1:16 PM Shana Adise wrote: Hello,
>
> Thank you in advance for your help! We would like to create functionally
> defined ROIs that are thresholded based on our F-map from PALM. Could you
> please clarify the files and inputs needed to go into -cifti-find-clusters?
>
>
> 1.
>
Surface data is different - we don't actually put surface coordinates into
dscalar, or any other cifti files (or metric files). In our data, the
surface coordinates are only contained in .surf.gii files. Getting the
surface data into "Native" volume space is as simple as using the surfaces
in the
BALSA was designed to support data use terms: not only does it have the HCP
data use terms available for easy selection when submitting a study, it
also allows entering other data use terms (so it can support non-HCP
datasets that require agreement to terms).
The HCP data use terms may not necessa
Try doing:
system('/wb_command');
Except using the exact string you are providing to ciftiopen's second
argument. You should get the usage information for wb_command if the path
is correct and things are working.
Tim
On Thu, Jan 17, 2019 at 4:38 PM Anita Sinha wrote:
> Matt/Michael,
>
>
> W
The strongest directional information in the diffusion data is in the white
matter, so I assume you are computing some measure from the scans and
specifically want to study its value only in gray matter?
The main command for this purpose is wb_command -volume-to-surface-mapping,
and we recommend t
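A sketch with hypothetical filenames, assuming the recommended method is -ribbon-constrained (the one used by the HCP fMRI pipelines):

```shell
# Average voxel values within the cortical ribbon around each vertex
wb_command -volume-to-surface-mapping my_measure.nii.gz \
    subject.L.midthickness.native.surf.gii my_measure.L.func.gii \
    -ribbon-constrained subject.L.white.native.surf.gii \
    subject.L.pial.native.surf.gii
```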
As for your second question, the transform between MNINonLinear and
"Native" space (actually, undistorted, rigid registered) is a nonlinear
warpfield, not a 4x4 matrix (affine). As I recall, we ignore the 4x4
matrices in gifti surface files (.surf.gii), as they have caused more
trouble than good.
Cifti files are defined not only by their resolution, but also by ROIs that
exclude uninteresting or redundant locations, in particular the medial wall
vertices and white matter voxels for fMRI. This is the reason that
-cifti-resample needs a template, to define what is included/excluded.
I'm not
For display, another possibility is to put the analysis results for all
areas into a parcellated cifti file, which will show each area in a color
representing the value for that area.
Tim
On Thu, Jan 10, 2019 at 6:17 AM Glasser, Matthew wrote:
>
>
>1. These would not be with the HCP's parc
e resolution
> would be 1.25mm as the volumes are created from the diffusion image. (I'm
> working with the HCP test retest data)
>
> In other words, is there any command that can generate a CIFTI file with a
> different volume resolution?
>
> On Sat, 5 Jan 2019 at 6:46
It is possible, though it makes comparisons to the existing 2mm cifti data
more challenging. For instance, we have a 1.6mm space for our 7T data, the
files defining it are here:
https://github.com/Washington-University/HCPpipelines/tree/master/global/templates/170494_Greyordinates
Making yet ano
Thu, Jan 3, 2019 at 6:32 PM Glasser, Matthew wrote:
> I believe that FSL convertwarp converts between relative and absolute
> conventions, though the FSL coordinates issue might prevent that from being
> helpful.
>
> Matt.
>
> From: Timothy Coalson
> Date: Thursday, Ja
On Thu, Jan 3, 2019 at 10:47 AM Glasser, Matthew wrote:
> 1) FSL does not respect Workbench’s header info, so the labels get
> removed. You might need to use wb_command -volume-resample or copy over
> the header info.
>
Yes, use wb_command -volume-warpfield-resample and the enclosing voxel
meth
If you mean you just want to combine the magnitudes (the main output of
-cifti-gradient) across timepoints, that isn't hard. There isn't a built
in option to do it, but you can do it afterwards by -cifti-math to square
everything, -cifti-reduce to sum across time, and then -cifti-math to
square ro
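Those three steps, sketched with hypothetical filenames:

```shell
# Square every element
wb_command -cifti-math 'x * x' grad_sq.dtseries.nii -var x grad.dtseries.nii
# Sum across the series dimension
wb_command -cifti-reduce grad_sq.dtseries.nii SUM grad_sumsq.dscalar.nii
# Square root of the sum gives the combined magnitude
wb_command -cifti-math 'sqrt(x)' grad_rss.dscalar.nii -var x grad_sumsq.dscalar.nii
```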
Scene files are not a vector format internally, they store the entire GUI
state of wb_view (loaded files, window sizes, tab types and order, order of
layers in tabs, palette settings), and when they are displayed (or captured
via wb_command), the data files are loaded and all the display logic in
w
To expand on Jenn's answer, one way to see what wb_command does on windows
is to open "command prompt", cd to the "bin_windows64" folder where you
unzipped workbench, and then type "wb_command" and press enter.
If you do as the README.txt suggests and add that folder onto your PATH
environment var
We generally do use timeseries for single-subject analysis. The only
involvement of ICA there is in cleaning up things like artifacts - think of
it as using ICA to identify nuisance regressors. The end result is still a
timeseries, but with greatly reduced artifacts.
You can use wb_command -cift
.gii'.
>
> I set the second parameter as null and the color was created as default.
> Is there an example of the text file of the label-list file. I used the
> file as attachment but failed. Is there any problems with this file?
>
> Thank you so much.
>
> Best wishes
> -
>
If you save those vectors of values as .func.gii files (maybe this is how
you made the .func.gii files you have?), you can use wb_command
-metric-label-import to turn them into .label.gii files:
https://www.humanconnectome.org/software/workbench-command/-metric-label-import
To get them onto HCP s
NEUROSCIENCE tim; Kenley, Jeanette
> *Cc:* hcp-users; Kaplan, Sydney
> *Subject:* Re: [HCP-Users] average dconn from individual dconns
>
> To be more specific: In the HCP we use a technique called MIGP to make
> group fMRI data and generate dense connectomes from that. Concat
The HCP pipelines deliberately resample the subcortical data in such a way
that the subcortical voxels used in each subject are the same, this is how
we handle the problem you are having.
If you concatenate your timeseries across subjects before correlation, you
don't need to generate a dconn for
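A sketch of the concatenate-then-correlate route, with hypothetical filenames (whether to demean or variance-normalize each subject first is a separate choice):

```shell
# Concatenate two subjects' dense timeseries along time
wb_command -cifti-merge group.dtseries.nii \
    -cifti sub1.dtseries.nii -cifti sub2.dtseries.nii
# One correlation over the concatenated data gives a group dense connectome
wb_command -cifti-correlation group.dtseries.nii group.dconn.nii -fisher-z
```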
> right? The cifti file should also include the subcortical regions, right?
> Do I need to combine the left and right surface and subcortcial volume into
> one cifti file? Sorry for the very basic questions. Thank you very much.
>
> Best wishes,
>
> Zhi Li
>
> On Tue, 27 Nov 201
The HCP MMP v1.0 parcellation is defined on MSMAll-registered surfaces. I
am not familiar with FSFAST, but if it gives you surface-based data, you
should be able to resample the data or parcellation so that they are on the
same mesh, following these instructions:
https://wiki.humanconnectome.org/
>>> But there are 10 more vertices in right hemisphere than left hemisphere
>>> ...
>>>
>>>
>>> On Tue, Nov 20, 2018 at 2:50 PM Glasser, Matthew
>>> wrote:
>>>
>>>> Left and right are registered.
>>>>
>
> would be nice if HCP developers, in future, consider a version of CIFTI
> with medial wall included.
>
>
> On Mon, Nov 19, 2018 at 11:53 PM Timothy Coalson wrote:
>
>> The left and right hemisphere are intended to be in register, though I
>> don't recall
But
> apparently this is not the case! Any reason for this?
>
>
> On Fri, Nov 16, 2018 at 11:58 PM Timothy Coalson wrote:
>
>> The easiest to use (especially if your goal is to match other cifti
>> files) is generally -cifti-create-dense-from-template. It will even tu
e other supporting program?
>
>
> Kind regards,
> *Dinal Jayasekera*
>
> PhD Candidate | InSITE Fellow
> Ammar Hawasli Lab
> Department of Biomedical Engineering | Washington University in St. Louis
>
> --
> *From:* Timothy Coalson
> *Sen
The easiest to use (especially if your goal is to match other cifti files)
is generally -cifti-create-dense-from-template. It will even turn a 59k
surface-only cifti into a standard 91282 cifti (or vice versa, if you are
so inclined).
Yes, -cifti-export-dense-mapping will give you the cifti index
The purpose of the surface ROIs in -cifti-create-label is to prevent the
cifti file from needing to represent data inside the medial wall (where it
would generally be nonsensical). Since you haven't converted the data to
any kind of label format yet, it is not the command you want - putting the
da
Hawasli Lab
> Department of Biomedical Engineering | Washington University in St. Louis
>
> --
> *From:* Timothy Coalson
> *Sent:* Monday, November 12, 2018 4:04:33 PM
> *To:* Jayasekera, Dinal
> *Cc:* Glasser, Matthew; Dierker, Donna
-cifti-convert does not do this, it only dumps the matrix into different
file formats as-is, the spatial relationships are not accessible from its
output.
If you are only interested in subcortical/cerebellum data, that is trivial
to extract from cifti as a volumetric nifti using -cifti-separate wi
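That extraction might look like this, with hypothetical filenames (the optional -roi output marks which voxels the cifti file actually covers):

```shell
# Extract all subcortical structures as a NIfTI volume, plus a voxel ROI
wb_command -cifti-separate data.dtseries.nii COLUMN \
    -volume-all subcortical.nii.gz -roi subcortical_roi.nii.gz
```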
> *To:* NEUROSCIENCE tim
> *Cc:* Jayasekera, Dinal; Dierker, Donna; hcp-users@humanconnectome.org
>
> *Subject:* Re: [HCP-Users] Debugging IcaFIxProcessingBatch.sh
>
> We should change that line to use this file:
>
> ${StudyFolder}/${Subject}/MNINonLinear/brainmask_fs.nii.g
ember 9, 2018 at 5:17 PM
> *To: *Leonardo Tozzi , NEUROSCIENCE tim <
> tsc...@mst.edu>
> *Cc: *Stamatios Sotiropoulos ,
> hcp-users
>
> *Subject: *Re: [HCP-Users] Diffusion connectivity matrix with cortical
> and subcortical parcellation
>
>
>
> I think that -
nnectomes/GrayOrdinates.txt ?
>
>
>
> Thank you very much,
>
>
>
>
>
> Leonardo Tozzi, MD, PhD
>
> Williams PanLab | Postdoctoral Fellow
>
> Stanford University | 401 Quarry Rd