of FSL options. Do
> you know if there is anyone who might be able to help with this?
>
> Thanks again,
> -hp
>
> *From:* Timothy Coalson
> *Sent:* Friday, March 16, 2018 2:15 PM
> *To:* Glasser, Matthew
> *Cc:* HERACLES PANAGIOTIDES ; hcp-users@humanconnectome.org
> *Subject:*
we can do both; an effect size map and then .95-1 in the corr p map?
>
>
> On Mar 16, 2018, at 4:59 PM, Timothy Coalson <tsc...@mst.edu> wrote:
>
Load both of the volumes into wb_view (high-res mask and low-res fMRI - you
may want to separate out a single frame from the fMRI to keep memory usage
down) and see if they align with each other (if they don't display
correctly, turn on oblique volume drawing mode). If they do align, the
answer
Since the extent that passes significance tests is dependent on number of
subjects and other statistical power considerations, we instead recommend
viewing the effect size (beta) map. You can overlay outlines of what
passed the significance threshold by making that into a label file with
Yes, the newest pipelines used an option that isn't in the 1.2.3 release.
Grab the "dev_latest" zip file that matches your OS from here:
http://brainvis.wustl.edu/workbench/
Tim
On Mon, Mar 12, 2018 at 5:57 PM, Viessmann, Olivia M. <
oviessm...@mgh.harvard.edu> wrote:
> Hello,
>
>
> I am
When we compute parcellated connectivity, we first compute the average
timeseries within the parcels, and then correlate those, as it vastly
reduces the impact of noise. If we first computed the correlations, and
then averaged them within parcels, we would be losing a huge amount of
power.
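For illustration, the order of operations described above can be sketched in numpy (the array names and shapes here are hypothetical; real cifti IO would go through something like ciftiopen rather than raw arrays):

```python
import numpy as np

def parcellated_connectivity(dtseries, parcel_labels):
    """Average the timeseries within each parcel first, then correlate
    the parcel-mean timeseries (the order of operations described above).

    dtseries:      (n_timepoints, n_grayordinates) array
    parcel_labels: (n_grayordinates,) integer parcel label per grayordinate
    """
    parcels = np.unique(parcel_labels)
    # One mean timeseries per parcel: shape (n_timepoints, n_parcels)
    means = np.column_stack(
        [dtseries[:, parcel_labels == p].mean(axis=1) for p in parcels]
    )
    # Correlate the averaged timeseries: (n_parcels, n_parcels)
    return np.corrcoef(means, rowvar=False)
```

Averaging before correlating lets independent noise cancel within each parcel; correlating first and then averaging the r values throws that benefit away.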
The
Unfortunately, it is worse than that - even ignoring the individual
variability issue (which should not be ignored), a small change in MNI
coordinate can jump from one bank of a sulcus to the other, so having only
the center coordinate of a cluster makes this a badly posed problem (and
thus
> volume-to-surface
> mapping you describe? I'll whip up a quick script to loop through about
> 120 datasets from this R01 project and let you know how well it works.
>
>
>
> Mike
>
>
>
>
>
> *From:* Timothy Coalson [mailto:tsc...@mst.edu]
> *Sent:* Friday, February 23
t know if I can run
> probtrackx using it.
>
> Regards,
> Kamal
>
> From: Timothy Coalson <tsc...@mst.edu>
> Date: Wednesday, February 28, 2018 at 6:56 PM
> To: "Glasser, Matthew" <glass...@wustl.edu>
> Cc: "Shadi, Kamal" <kamal.shad...@g
Workbench does this internally, but it doesn't currently have a command to
output it (the number of neighbors varies per vertex, so it could be hard
to make efficient use of in something like matlab). The .surf.gii file has
this information in it, in the triangles array, so if you load it, you
Also beware, since one of the dimensions is white matter voxels, the dconn
from -convert-matrix4-to-matrix2 will be larger than a standard 91k dconn
(78GB based on the information you posted, versus ~30GB for a 91k dconn).
Surface-to-surface tractography would normally be done with a different
The 6dof transform using the MNI template is only to get the orientation of
the structural images to be more predictable - we actually refer to the
result of this transform as "native volume space", because it is somewhat
more useful for our purposes than the scanner coordinates. It is based
only
The weights are to be able to place the border point somewhere other than
at a vertex. The three vertices are one of the triangles in the surface
file, and the weights represent where to place the point on that triangle.
If you want a border point at a vertex, the weight for that vertex should
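A minimal sketch of that weighted placement, assuming the point is the weight-normalized average of the triangle's vertex coordinates (the exact normalization workbench applies is an assumption here):

```python
import numpy as np

def border_point(triangle_vertices, weights):
    """Place a border point on a surface triangle as the weight-normalized
    average of the triangle's three vertex coordinates.

    triangle_vertices: (3, 3) array, xyz of the triangle's vertices
    weights:           (3,) weights; a single nonzero weight puts the
                       point exactly at that vertex
    """
    v = np.asarray(triangle_vertices, dtype=float)
    w = np.asarray(weights, dtype=float)
    return (w[:, None] * v).sum(axis=0) / w.sum()
```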
Surface-based methods may boost your statistical power enough (by better
alignment, exclusion of irrelevant tissue, and smoothing that doesn't cross
sulcal banks, if you decide you need smoothing) that you may not need to
rely as much on existing ROIs. Parcel-based statistics have a lot of
power,
The error message appears to be from syntax in a bash script, so it should
be possible to track it down.
If you put "set -x" at the top of the script it fails in and rerun it, it
will tell you each command before executing it, which should help narrow
down which line is the problem. If the line
On the technical end, cifti dtseries are arranged such that a row is a
timecourse - you will need to select columns rather than rows. wb_command
-cifti-merge can do this.
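As a plain numpy illustration of that layout (the array here is synthetic, not real cifti IO, and the kept indices are hypothetical):

```python
import numpy as np

# A toy matrix laid out like a dtseries as described above: each row is
# one grayordinate's timecourse, so timepoints run along the columns.
dtseries = np.arange(12).reshape(3, 4)  # 3 grayordinates, 4 timepoints

# Keeping only certain timepoints (say, the ones flagged by the EVs)
# therefore means selecting columns, not rows:
keep = [0, 2]
subset = dtseries[:, keep]
```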
On the math side, I don't know the details needed to sync the EV files with
the dtseries (whether the 0s from the EVs match
pfields but cannot seem to
> find them could you point me in the right direction/directory?
>
> Cheers,
>
> Claude
>
> On il-Ħamis, 22 ta Fra, 2018 01:07 , Timothy Coalson wrote:
>
> It may be better to use the individual subject's "native space"
> definitions. The
It may be better to use the individual subject's "native space"
definitions. The files in the T1w folder are in what we refer to as native
volume space (it is actually rigidly-aligned MNI space, but rigid alignment
preserves shape, so it can be used as if it were distortion-corrected
scanner
It appears that .obj is a fairly simple text format (
https://en.wikipedia.org/wiki/Wavefront_.obj_file), so you could hack it
together in shell, if nothing else. You can use wb_command -gifti-convert
to convert to ASCII encoding, and that would allow you to get the
coordinate and triangle arrays.
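As an illustration (not using real gifti IO), once you have the coordinate and triangle arrays, assembling .obj text is straightforward; note that .obj vertex indices are 1-based:

```python
import numpy as np

def arrays_to_obj(coords, triangles):
    """Build Wavefront .obj text from a coordinate array (n_vertices x 3)
    and a 0-based triangle index array (n_triangles x 3).  .obj indices
    are 1-based, hence the +1 on the face lines."""
    lines = ["v {:g} {:g} {:g}".format(*xyz) for xyz in np.asarray(coords)]
    lines += ["f {:d} {:d} {:d}".format(*(tri + 1))
              for tri in np.asarray(triangles)]
    return "\n".join(lines) + "\n"
```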
orkbench? I didn't
> find it on the HCP website. Thank you.
>
>
> Aaron
> --
> *From:* Timothy Coalson <tsc...@mst.edu>
> *Sent:* Wednesday, February 14, 2018 4:53:54 PM
> *To:* Aaron C
> *Cc:* hcp-users@humanconnectome.org
> *Subject:* Re: [HCP-
Not currently, no, though it could be added. It is undefined between left
and right hemispheres, so the obvious format would be a metric file, rather
than cifti grayordinates. Our uses of geodesic distance are generally
limited to short distances; we haven't really found it useful to know the
Once you have done it once (or if you already have another cifti file that
has the correct setup), you can use -cifti-create-dense-from-template to do
this to another file in one step.
Tim
On Thu, Feb 8, 2018 at 7:53 PM, Glasser, Matthew wrote:
> You need to export the
Something that just bit me: you should probably specify "--rel" on
convertwarp and applywarp, because wb_command doesn't currently understand
the absolute warpfield convention that seems to be the default.
Tim
On Fri, Feb 2, 2018 at 10:07 PM, Timothy Coalson <tsc...@mst.edu>
z <
mois...@fmrib.ox.ac.uk> wrote:
> I had the same problem in the past... try:
> *c3d_affine_tool -itk ANTSmat.txt -ref myref -src mysrc -ras2fsl -o
> FSLmat.txt *
>
> However, as Tim said, this tool only converts the affine, not the warp.
>
> Moises.
>
>
>
> On 2
I haven't finished figuring this out, but it appears that the affine file's
"Parameters" array contains the 12 affine components you'd find in a normal
affine file. The warpfield does not seem to include the affine, so you
need to convert and use both of them, which may be why what you did looked
No, scene files are highly GUI-centric, taking captures of what is in a
scene is somewhat secondary. The number of display options for files (and
when we add new display features) would make it very cumbersome to make and
maintain a command to do the equivalent without the GUI.
You can set up a
Inline comments.
Tim
On Mon, Jan 29, 2018 at 9:51 AM, Xavier Guell Paradis
wrote:
> Dear HCP experts,
> I have a dlabel file that labels a particular nucleus of the left thalamus
> (thalamusnucleus.dlabel.nii). I would like to convert this dlabel file to a
> whole-brain
For specifically wb_command, there is online documentation, though it is
just another representation of the various help information you can get
from running wb_command:
https://www.humanconnectome.org/software/workbench-command
If you are referring to the HCP Course, trying to cover all of the
If the CIFTI XML specifies that it is "Version=1", then that is expected,
though in some sense it was an error. The CIFTI-1 specification
unfortunately specified that the nifti header must be used incorrectly.
This was changed in the CIFTI-2 specification.
Tim
On Fri, Jan 19, 2018 at 5:09 PM,
It is possible, first you need a cortical-only cifti file to use as a
template, if you don't have one, you can make one from the original ROI
files. The files are in the pipelines repo:
https://github.com/Washington-University/Pipelines/tree/master/global/templates/91282_Greyordinates
Use this
ditional effort.
> I managed to get the Nearest-Neighbour-like mapping I wanted done in
> Matlab. When I display this on the group average surfaces, it also looks
> pretty good. I just want to check that the workbench steps I have taken are
> correct.
>
> Best wishe
Get the "dev_latest" zip file that says "mac64" from here:
http://brainvis.wustl.edu/workbench/
Tim
On Fri, Jan 12, 2018 at 1:41 PM, Glasser, Matthew
wrote:
> You need the development version of Connectome Workbench, which Tim can
> point you to. We will soon be getting
To further explain: "latest master" means you should *not* go to the
"releases" page on github. Instead, go to the main page,
https://github.com/Washington-University/Pipelines , and click the green
"clone or download" button on the upper right above the file list.
Tim
On Tue, Jan 9, 2018 at
To clarify further, until a surface registration is done, there is no
obvious relationship between the vertices of one subject and another (and
the surfaces even have different numbers of vertices). After registration
and resampling to a standard mesh (MSMAll and fs_LR 32k are options for
these
To expand on the smoothing issue a bit, even 4mm FWHM spatial smoothing in
the volume causes substantial signal mixing between areas on opposite sides
of sulci, which sounds like a particularly bad idea for ICA. As I
understand it, group ICA shouldn't care much about spatial noise,
especially
The fs_LR 32k spheres use a resolution (vertex spacing) that is suitable
for 2mm fMRI data, but it sounds like you are using structural-resolution
voxels. As Matt says, I would put the fs_LR surface into your volume
space, and do only a single mapping, because nearest neighbor or enclosing
voxel
You can set map names on data files with wb_command -set-map-names.
However, as Matt says, we would rather support the ciftiread/ciftiwrite
functions for fMRI data.
Tim
On Mon, Dec 18, 2017 at 2:56 PM, A Nunes wrote:
> Hi HCP community,
>
> I have some parcellations
.surf.gii files only store the geometry of the cortex; they do not store
activation data. We generally suggest .func.gii rather than .shape.gii:
the two formats are identical, but the .func.gii name is more appropriate
for fMRI data.
To get data from the volume onto the surface, you need to have
Since tractography is done on individual subject data, you don't want a
version of the parcellation that was mapped to the volume with the group
average surfaces (like the one you linked to). You can do per-subject
mapping yourself with wb_command -label-to-volume-mapping, using either the
group
This was a bug in the code. Please use the version from here instead:
https://github.com/Washington-University/wb_shortcuts
Tim
On Mon, Dec 11, 2017 at 7:50 AM, Dev vasu <
vasudevamurthy.devulapa...@gmail.com> wrote:
> Dear Sir,
>
> When i am using wb_shortcuts for my analysis , i am
For cifti files, you need to use -cifti-* commands, so you want to use
-cifti-label-import instead. Most wb_command operations require a very
specific file format for input.
You can run it without a label-list-file (as the help info says, use a pair
of quotes instead of a filename: ""), which
enough to be a problem.
Tim
On Wed, Dec 6, 2017 at 4:55 PM, Timothy Coalson <tsc...@mst.edu> wrote:
> Another possibility for a quick and dirty approximation to what you want,
> you may be able to use transparency of layers to blend together coloring
> from the different componen
Another possibility for a quick and dirty approximation to what you want,
you may be able to use transparency of layers to blend together coloring
from the different components, rather than manually mixing them
beforehand. However, you would need to play with the palettes, and it
couldn't be used
First, use -cifti-math to do the thresholding you want (if you have
negative p-values for deactivations, you could add something like " + 2 *
((x < 0) && (x > -0.05))"):
$ wb_command -cifti-math '(x > 0) && (x < 0.05)' above_thresh.dscalar.nii
-var x
Then, import it as a label file (this basic
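For illustration, here is the same thresholding logic in plain numpy (the p-value array is made up; -cifti-math applies this elementwise across the cifti file):

```python
import numpy as np

# Hypothetical p-value map; negative values encode deactivations.
pvals = np.array([0.001, 0.2, -0.01, 0.04, 0.0])

# '(x > 0) && (x < 0.05)': 1 where a positive p-value passes threshold
above = ((pvals > 0) & (pvals < 0.05)).astype(int)

# Adding '2 * ((x < 0) && (x > -0.05))' gives deactivations label value 2
with_deact = above + 2 * ((pvals < 0) & (pvals > -0.05)).astype(int)
```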
The wb_shortcuts command is actually just a bash script. However, it calls
other utilities, and that function specifically calls both "mris_convert"
and "wb_command". Since it is complaining about "libgomp" which is
involved in parallelization, it may in fact be an issue with wb_command, as
I
Using -cifti-separate will get you the 2mm definitions of these structures,
in particular using "-volume-all -label ", but the same
information is already available in the pipelines repository:
"How much deformation" is a tricky subject, and we may very well mean
something different than you do when you say it.
The short version: surface registration does not cause anatomical
deformation. Any registered, resampled anatomical surface will still line
up perfectly with the volume it
You can resample cifti data to a different resolution without splitting it
up first, use wb_command -cifti-resample . You will need to specify both
-left-spheres and -right-spheres.
Tim
On Fri, Nov 3, 2017 at 1:26 PM, Jan Mathijs Schoffelen <
jm.schoffe...@gmail.com> wrote:
> Dear Julia,
>
>
Volume-based alignment of cortex is significantly worse than surface-based
cortical alignment, so we have not released a volume-based version of the
parcellation (because it can't be equivalent to using MSMAll), see this
previous thread:
The -*-label-import commands do not convert from separate ROIs to
single-map label format, because ROIs could overlap while labels are not
allowed to. You need to first create a file with a single map of different
integers for each ROI, before using these commands.
First, you need to ensure that
For a quick introduction to CIFTI, see FAQ #1:
https://wiki.humanconnectome.org/display/PublicData/HCP+Users+FAQ
You may also want to read wb_command -cifti-help; while some of it is
redundant with the FAQ, it also explains things you need to know about when
using cifti commands in wb_command:
rfMRI_REST1_LR_hp2000_clean.nii.gz is a volume file, not a cifti file.
Cifti files end in things like .dtseries.nii. See wb_command -cifti-help:
https://www.humanconnectome.org/software/workbench-command/-cifti-help
Tim
On Mon, Oct 23, 2017 at 4:17 PM, hercp wrote:
> I am
I just want to see what the mean myelination in these exact areas that
> turned out to be significant is in the groups I compared. Hope it makes
> sense.
>
> Thanks again!
>
> Lisa
>
> On 18 October 2017 at 01:12, Timothy Coalson <tsc...@mst.edu> wrote:
>
>> T
The -cifti-parcellate command is currently the easier way to do this, if
your ROIs don't overlap. The -roi option to -cifti-stats currently uses
only the first map, and only tests for whether the value is greater than 0,
so probably the first map in your roi input is actually all positive. Also
oud...@tcd.ie>
wrote:
> Hi Tim,
>
> Was there a response to the question about sumsDB requesting login
> information? I also ran into that problem a couple of days ago.
>
> Best wishes,
>
> Sean
>
> Sean Froudist-Walsh
> Postdoc
> New York University
>
> O
The s...@brainvis.wustl.edu address is still active, but it is a manual
process. As David gave you the information, I don't plan on redundantly
responding to the email to sums@brainvis.
Tim
On Mon, Oct 16, 2017 at 11:40 AM, Mars, R.B. (Rogier)
wrote:
> Hi David,
>
>
Nibabel should support cifti files now, though it may be fairly basic:
https://github.com/nipy/nibabel
I don't know of other projects for cifti in python.
Tim
On Wed, Oct 11, 2017 at 11:18 PM, Aaron Crank
wrote:
> Dear HCP experts,
>
>
> I have a question about
It was probably killed because it ran out of memory. The best solution is
to use a machine with more memory.
A workaround, but one that will take a LOT more time to run on each subject
(unless you happen to already have fast SSDs), is to add swap space to your
system, which will allow it to
That looks like a generic matlab error related to the amount of memory the
computer has. The memory requirements for some of these scripts are fairly
high (the machines we often use have 64GB of memory).
Tim
On Mon, Oct 9, 2017 at 9:58 AM, Sang-Young Kim wrote:
> Dear
he rows that did not belong to cortical vertices, and then did the math
> operations.
> Thank you very much,
> Xavier.
> ----------
> *From:* Timothy Coalson [tsc...@mst.edu]
> *Sent:* Wednesday, October 04, 2017 4:09 PM
> *To:* Xavier Guell Paradis
> *Cc:*
erenced.
>
> Peace,
>
> Matt.
>
> From: Timothy Coalson <tsc...@mst.edu>
> Date: Friday, October 6, 2017 at 5:30 PM
> To: "Gopalakrishnan, Karthik" <gkart...@gatech.edu>
> Cc: Matt Glasser <glass...@wustl.edu>, "hcp-users@humanconnecto
want to
use different seeding strategies.
Tim
On Fri, Oct 6, 2017 at 6:24 PM, Glasser, Matthew <glass...@wustl.edu> wrote:
> --ompl option in probtrackx2.
>
> Peace,
>
> Matt.
>
> From: "Gopalakrishnan, Karthik" <gkart...@gatech.edu>
> Date: Friday, Oc
appropriate.
>
> Peace,
>
> Matt.
>
> From: <hcp-users-boun...@humanconnectome.org> on behalf of Timothy
> Coalson <tsc...@mst.edu>
> Date: Tuesday, October 3, 2017 at 6:30 PM
> To: "Gopalakrishnan, Karthik" <gkart...@gatech.edu>
> Cc: &q
One of the methods to capture images (render pixmap) is not supported well
by intel GPUs. We use nvidia GPUs in the linux computers we run workbench
on, and they generally work well for us.
One workaround is to switch the image capture method in wb_view ->
Preferences -> OpenGL to "Grab Frame
nformation in the MMP paper:
https://www.ncbi.nlm.nih.gov/pubmed/27437579
Tim
On Wed, Oct 4, 2017 at 1:38 PM, Romuald Janik <romuald.ja...@gmail.com>
wrote:
> Thanks for the detailed explanations!
>
> I have just one follow up question regarding this point:
>
> On Tue, Oct 3, 20
That is a new option that is not yet in a released version of workbench,
and was probably not intended to be pushed to the pipelines repository.
You can use github to find the version before that line was added, and use
that instead. Alternatively, you can probably just comment out that line
and
You can use -cifti-replace-structure with the -volume-all option and a
volume file of zeroes. If you have a lot of maps (or you want to do
something similar to a long dtseries), you can make a cifti file with 1's
in the surfaces and 0's in voxels, and use -cifti-math to multiply them
together.
Since ROIs are not points, distance between them becomes a trickier
question. Since areas are connected through white matter rather than gray
matter, that also implies that the easy ways to calculate distance may not
be all that biologically relevant. This would point to using tractography
to
Inline comments.
On Tue, Oct 3, 2017 at 12:49 PM, Romuald Janik
wrote:
> Hi,
>
> I have recently started to look at the HCP single subject rfMRI data and I
> have a couple of beginner questions:
>
> 1) What is the difference between the various files (I downloaded just
We do have a version of the 7 networks labels as the first map in this file:
https://balsa.wustl.edu/file/show/Q2xn
It currently does not use a medial wall mask, so extra steps may be needed
for using it with HCP data.
I don't know if either current matlab method of cifti file IO has much
The -cifti-create-dense-* commands are indeed involved in manually making
cifti versions of your data, though you would also need things like
-volume-to-surface-mapping, as well as FreeSurfer to generate the subject
surfaces - the HCP pipelines are designed for automating these steps for
HCP and
I am not familiar with the .w format, but it sounds like mris_convert is
not actually converting it to gifti format. The second error from the .mgh
method sounds like it has the surface geometry in the file, which must be
removed - workbench requires geometry files (.surf.gii) and data files
Workbench uses OpenMP for parallelization, and is therefore controlled via
its environment variables. OMP_NUM_THREADS is probably the one you are
looking for. We will add this information to the help info of wb_command
in a future release.
Tim
On Fri, Sep 8, 2017 at 2:47 PM, Reinder Vos de
We do not apply that kind of narrow temporal filtering (we basically only
do a detrend), you will need to do such things yourself.
Tim
On Thu, Sep 7, 2017 at 9:27 PM, hercp wrote:
> I am looking at rfMRI data. Are there files that are already
> time-filtered between .01 and .08
The commands in wb_command are designed for scripting flexibility: they
each do a small, low-level operation, to be chained together to achieve
various tasks. However, they mainly output data files, there isn't much
for text output currently.
You could use -cifti-parcellate to parcellate your
Technically, what should matter more is what frequencies dominate the
correlation in the data of interest - if they are lower enough in frequency
than your sample rate (and therefore slice timing spread), then each
latency should produce reasonable maps. However, this also means that when
you
Creating a label file from ROIs is a bit more complicated than a single
command (label files automatically have a guarantee that the areas don't
overlap, but arbitrary ROIs can overlap).
From what you currently have, one way to get to a new dlabel file is to
concatenate the ROIs you want to use
d smoothing to get
> things up to 3mm FWHM.
>
> Matt.
>
> From: Timothy Coalson <tsc...@mst.edu>
> Date: Friday, September 1, 2017 at 3:02 PM
> To: Matt Glasser <glass...@wustl.edu>
> Cc: "Harms, Michael" <mha...@wustl.edu>, Daria Jensen <
Using that transform that flips X around the origin should put left on
right and right on left, yes. Note, however, that the left and right
parcel ROIs are not the same, so you will have to figure out how to handle
the mismatched edges.
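As a sketch, the X-flip is just a diagonal 4x4 affine applied to homogeneous mm coordinates (the point here is made up for illustration):

```python
import numpy as np

# A 4x4 affine that flips X around the origin, swapping left and right
# in world (mm) coordinates as described above.
flip_x = np.diag([-1.0, 1.0, 1.0, 1.0])

point = np.array([10.0, -5.0, 3.0, 1.0])  # homogeneous mm coordinate
flipped = flip_x @ point                   # X is negated, Y/Z unchanged
```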
I would not suggest -volume-parcel-resampling. If you want
I believe the pipelines generally expect isotropic resolution, yes. I
don't think there is currently a way to use that variable to represent a
non-isotropic resolution.
However, there may be few (or no) steps that actually interpret that
variable as a number and generate a new volume space using
In workbench, we do not do things this way at all. Putting both surfaces
into a single file is neither necessary nor recommended, even for
visualization. wb_view will happily display both left and right surfaces
in a single tab, just load the surfaces as separate left and right gifti
files.
*Cc:* hcp-users@humanconnectome.org
> *Subject:* Re: [HCP-Users] How can I convert subcortical nifti files to
> dlabel files?
>
> It should work if you skip the last step and use the dlabel file.
>
> Peace,
>
> Matt.
>
> From: Xavier Guell Paradis <xavie...@mit.edu
I'm assuming you want them to match a standard grayordinate space, so that
they can be compared across subjects.
The simple way that doesn't account for residual subject differences in
subcortical locations is to first resample the data to the appropriate
resolution/orientation MNI space (222 for
colin are the best we can do
at present for displaying the cerebellum as a surface.
> Regards,
> Claude
>
> On 03.08.2017 02:05, Timothy Coalson wrote:
>
> On Wed, Aug 2, 2017 at 6:02 PM, James Morrow <james.mor...@monash.edu>
> wrote:
>
>> Thanks Tim and Matt fo
oss gifti surfaces to create a
> so-called "average surface"
>
> Can I ask then, is averaging the data associated with vertices from
> individual subjects and plotting the result on a template surface (eg colin
> or a just using an individual as a template) also problematic?
I don't have a code snippet for you, but there is a separate "brain model"
for each subcortical structure, which contains voxel indices. The "index
map" that contains that brain model has the volume space information to
turn them into coordinates in mm.
Note that for the cortex, CIFTI files do not contain vertex coordinates;
those are stored in the separate .surf.gii surface files.
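A sketch of that voxel-index-to-mm conversion, assuming the volume space information is expressed as a standard 4x4 voxel-to-mm affine (the example affine in the test is typical of a 2mm MNI grid, not taken from a specific file):

```python
import numpy as np

def ijk_to_mm(affine, ijk):
    """Convert voxel indices (as stored in a cifti volume brain model)
    to mm coordinates with the volume space's 4x4 voxel-to-mm affine."""
    ijk = np.atleast_2d(ijk)
    homog = np.column_stack([ijk, np.ones(len(ijk))])  # homogeneous coords
    return (affine @ homog.T).T[:, :3]
```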
Hold on, you have gone past the point.
CIFTI files *DO NOT CONTAIN* coordinates for surface vertices. Do not look
for them in the header, they are not there. They are in gifti surface
files (.surf.gii) that we provide alongside the cifti files.
What Anand has done is use the fieldtrip matlab
As some additional explanation, the reason there are multiple files in the
unprocessed data is that we did not acquire all the directions in a single
scan (that would have been excessively long). Instead, they are split
across several scans, which are separate files before preprocessing. The
we do not take these coordinates through a volume warpfield
twice - using the coordinates from the existing native mesh surfaces
already in that volume space is much simpler.
> The question is:
>
>- Is this now a correct interpretation?
>
> Regards,
>
> Claude
>
> On 24.07.
On Mon, Jul 24, 2017 at 8:40 AM, Claude Bajada
wrote:
> Dear all,
>
> I have a two questions about the surfaces that live in the HCP data
> folders:
>
>1. MNINonLinear/fsaverage/
>2. T1w/fsaverageLR32k/
>
> I assume you mean the fsaverage_LR32k folder within each
On Mon, Jul 17, 2017 at 6:21 PM, Regner, Michael <
michael.reg...@ucdenver.edu> wrote:
> Hello Matt and HCP Community,
>
>
>
> Thank you for the helpful e-mail and words of encouragement. We are very
> encouraged by the data / results we can view in the Connectome Workbench;
> however,
> on behalf of David Hartman <
> dhartman1...@gmail.com>
> Date: Monday, July 17, 2017 at 6:24 PM
> To: Timothy Coalson <tsc...@mst.edu>
> Cc: "hcp-users@humanconnectome.org" <hcp-users@humanconnectome.org>
> Subject: Re: [HCP-Users] mapping HCP data int
as the first map:
https://balsa.wustl.edu/file/show/Q2xn
Tim
On Fri, Jul 14, 2017 at 2:42 PM, Timothy Coalson <tsc...@mst.edu> wrote:
> We have a version of the 17-network Yeo parcellation here:
>
> https://balsa.wustl.edu/W8wK
>
> 163842 sounds like a freesurfer resolution
The -cifti-separate command is for getting sections of a cifti file as
gifti (single hemisphere) or nifti (volume) files, and it is generally not
a good idea to run it on dconns. The -cifti-merge command is how you would
do this on the command line, and you would give it the row index (though if
Connectome Workbench currently does not have tools designed for doing
statistical inference. PALM is what we recommend, and it has support for
CIFTI files.
You can also use wb_command -cifti-convert -to-nifti to feed CIFTI data
into FSL tools, if they don't use spatial information. Note that
Symbolic links don't require the folders to be on the same filesystem. The
'operation not supported' error seems unusual for making a symbolic link.
Aside from the usual things to check (folder write permissions, whether
there is already something there with that name), also know that windows
and
You can combine the ROIs into a single map (with a different integer for
each) with -cifti-reduce using INDEXMAX. You should then also mask out the
areas that aren't in any ROI by using -cifti-reduce MAX on your ROIs file,
and then using -cifti-math with 'index * (mask > 0)'. Finally, you need
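A numpy sketch of that INDEXMAX-plus-mask logic, for illustration (that INDEXMAX reports 1-based map indices is an assumption here):

```python
import numpy as np

def rois_to_labels(rois):
    """Combine a stack of ROI maps into one integer label map.

    rois: (n_maps, n_grayordinates) array, one ROI per map
    Returns the 1-based index of the strongest map per grayordinate
    (the INDEXMAX step), zeroed wherever no ROI is present (the MAX
    mask and 'index * (mask > 0)' steps).
    """
    rois = np.asarray(rois, dtype=float)
    index = rois.argmax(axis=0) + 1   # analogue of -cifti-reduce INDEXMAX
    mask = rois.max(axis=0) > 0       # analogue of -cifti-reduce MAX
    return index * mask
```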
Then when I'm trying to visualise it, I'm just loading it in and trying to
> display over the atlas I'm using.
>
> Not sure if there's more info I can give.
>
> Thanks again for the help.
>
> Best wishes,
>
> Sean
>
>
> On 21 June 2017 at 11:42, Timothy Coalson <t