On Thu, 16 Aug 2018, Harms, Michael wrote:
>Sorry, but not to my knowledge. We use ‘dcm2niix’ currently for DICOM to
>NIFTI conversion (‘dcm2niix’ generates nice .json files containing a bunch
>of relevant parameters of the scan). That particular step is pretty
>straightforward.
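Since dcm2niix writes those JSON sidecars next to the NIfTI output, picking the scan parameters back up in Python needs only the standard library. A minimal sketch; the field names (`RepetitionTime`, `EchoTime`) are typical dcm2niix/BIDS sidecar keys, and the file name here is a stand-in written on the spot for demonstration:

```python
import json

def read_sidecar(path):
    """Load a dcm2niix-generated JSON sidecar into a plain dict."""
    with open(path) as f:
        return json.load(f)

# Demonstration with a stand-in sidecar (a real one sits next to the .nii.gz):
sample = {"RepetitionTime": 0.72, "EchoTime": 0.0331, "Manufacturer": "Siemens"}
with open("sub-01_task-rest_bold.json", "w") as f:
    json.dump(sample, f)

params = read_sidecar("sub-01_task-rest_bold.json")
print(params["RepetitionTime"])  # -> 0.72
```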
On Wed, 28 Feb 2018, Shankar Tumati wrote:
> Hello experts,
> I would like to get a list of neighbors of each vertex in a 32k surface file.
> How can I do this with workbench? If not with workbench, is there another way
> to get this info?
A quick one: could we rely on the neighborhood information in
http://nipy.org/nibabel/reference/nibabel.cifti2.html ?
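For the vertex-neighbor question, one route outside Workbench is to read the surface's triangle array (e.g. via nibabel, where a loaded `.surf.gii`'s data arrays include the triangle topology -- that loading step is assumed, not shown) and build the adjacency yourself. A minimal pure-Python sketch over a toy triangle list:

```python
def vertex_neighbors(triangles, n_vertices):
    """Map each vertex index to the sorted list of vertices
    sharing a triangle edge with it."""
    neighbors = [set() for _ in range(n_vertices)]
    for a, b, c in triangles:
        neighbors[a].update((b, c))
        neighbors[b].update((a, c))
        neighbors[c].update((a, b))
    return [sorted(s) for s in neighbors]

# Toy mesh: two triangles sharing the edge (1, 2)
tris = [(0, 1, 2), (1, 3, 2)]
print(vertex_neighbors(tris, 4))
# -> [[1, 2], [0, 2, 3], [0, 1, 3], [1, 2]]
```

For a 32k surface the same function applies unchanged to the full triangle array; it is O(number of triangles).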
On October 12, 2017 12:18:32 AM EDT, Aaron Crank wrote:
>Dear HCP experts,
>
>
>I have a question about loading CIFTI files in Python. Would you
>please suggest whether there are any well-established tools in the
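nibabel's cifti2 module (the page linked in the earlier thread) can read CIFTI-2 files directly. A minimal sketch, assuming a reasonably recent nibabel is installed; the file name is hypothetical, and the exact accessor names reflect my understanding of nibabel's image API:

```python
def load_cifti(path):
    """Load a CIFTI-2 file (e.g. a .dtseries.nii) and return (data, header).

    Assumes nibabel is installed; nib.load dispatches on the file's
    CIFTI-2 extension and returns a Cifti2Image.
    """
    import nibabel as nib  # deferred so the sketch parses without nibabel present
    img = nib.load(path)
    data = img.get_fdata()  # e.g. timepoints x greyordinates for a dtseries
    return data, img.header

# Usage (hypothetical file):
# data, hdr = load_cifti('sub-01.dtseries.nii')
# print(data.shape)
```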
Thank you Michael and Jennifer again!
On Tue, 06 Dec 2016, Elam, Jennifer wrote:
>To add to what Mike Harms just wrote, it still sounds like you are
>thinking of the packages as data bundles for groups of subjects.
Yes, indeed -- that was probably the main reason for a bit of the disconnect.
On Tue, 06 Dec 2016, Elam, Jennifer wrote:
>A listing of the unpacked files available by subject, organized by
>modality and processing level, is available in Appendix 3 of the
>Reference Manual.
>The files are listed there as they unpack into a standard directory
>structure.
On Tue, 06 Dec 2016, Hodge, Michael wrote:
> I've attached unzip -l output for the packages of a couple of subjects. One
> has MEG data in addition to the standard 3T data, and the other has 7T data
> in addition to the 3T data, so you can see what's in the packages.
Thank you Michael!
Dear HCP gurus,
db.humanconnectome.org/ provides convenient bundles of subjects/data.
Is it possible to obtain the lists of files (I guess as a subset
of the files within the hcp-openaccess/HCP or hcp-openaccess/HCP_900 S3 buckets)
which come within each bundle, without downloading all those bundles?
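The `unzip -l` listings Michael attached can be turned into plain file lists with a short script. A minimal sketch; the archive name and paths below are illustrative, and the column layout assumed is standard `unzip -l` output:

```python
import re

# Matches unzip -l body lines: "  <length>  <date> <time>   <name>"
# (skips the header, separator, and totals lines)
_LINE = re.compile(r"^\s*\d+\s+\d{2,4}-\d{2}-\d{2,4}\s+\d{2}:\d{2}\s+(.+?)\s*$")

def files_in_listing(unzip_l_output):
    """Extract just the file paths from `unzip -l` text."""
    return [m.group(1) for line in unzip_l_output.splitlines()
            if (m := _LINE.match(line))]

listing = """\
Archive:  100307_3T_Structural_preproc.zip
  Length      Date    Time    Name
---------  ---------- -----   ----
     1234  2016-12-06 10:00   100307/MNINonLinear/T1w_restore.nii.gz
     5678  2016-12-06 10:00   100307/MNINonLinear/T2w_restore.nii.gz
---------                     -------
     6912                     2 files
"""
print(files_in_listing(listing))
# -> ['100307/MNINonLinear/T1w_restore.nii.gz', '100307/MNINonLinear/T2w_restore.nii.gz']
```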
Dear Developers of the workbench,
in the Volume view, where e.g. all 3 panes are visible, it would be
great if there were an easy way to point to the location that defines
which slices should be visible.
I did find a previous question/discussion
On Mon, 07 Apr 2014, Jennifer Elam wrote:
The latest Workbench beta 0.85 release has not yet gone live, but
is set to occur later today, hopefully by 5pm CDT. Look for the
announcement, download the new version and check to see if the problem
still occurs.
ah - awesome --
On Mon, 07 Apr 2014, Timothy Coalson wrote:
As per the announcement, yes, it does include the source (on github,
[1]https://github.com/Washington-University/workbench, see releases for
the code matching the released binaries), with some basic instructions for
building (src/README
On Sat, 08 Mar 2014, David Van Essen wrote:
As a quick update, we are aiming for the next workbench release (v0.85;
cifti-2 compatibility), including source code, to occur in about a month.
That is great -- thanks for the update!
It would be especially nice if the source release would be
On Sat, 01 Mar 2014, Russ Poldrack wrote:
hi all - it turns out that gifti files created by nibabel.gifti.giftiio
are not readable by connectome workbench
A reciprocally related question, to which I have not gotten an answer in my
previous inquiries: are (or will be?) the sources of connectome workbench
On Tue, 20 Aug 2013, Marcus, Dan wrote:
There are a couple of additional data access methods that we'll be
implementing down the road. Working with the INCF, we'll be making the
data accessible over INCF Dataspace. We will also be putting the packages
up on the Amazon cloud.
Thank you Dan