Hi Tim,

Thank you for the help with the workbench and the code.  Yes, it is Mac OS X.  
Sorry for the mix-up.


Also, I can get the first two commands to work for "MEAN" and "STDEV", but the 
last command gives me an error message:


wb_command -volume-math '((x - mean) / stdev)' normalized.nii.gz -fixnan 0 -var 
x <input> -var mean mean.nii.gz -repeat -var stdev stdev.nii.gz -repeat


ERROR: volume file for variable 'mean' has 1 subvolume(s), but previous volume 
files have 1200 subvolume(s) requested to be used


Any help would be appreciated.


best,

Jason






Jason S. Nomi, Ph.D.
Post-Doctoral Researcher (BCCL Lab)
Department of Psychology
University of Miami
5151 San Amaro Drive: Room 114A
Website: http://www.psy.miami.edu/bccl/
Email: jxn...@miami.edu
________________________________
From: Timothy Coalson <tsc...@mst.edu>
Sent: Monday, October 27, 2014 5:38 PM
To: Nomi, Jason
Cc: hcp-users@humanconnectome.org
Subject: Re: [HCP-Users] Using Connectome Workbench Commands

Inline replies.

Tim

On Mon, Oct 27, 2014 at 10:26 AM, Nomi, Jason 
<jxn...@miami.edu> wrote:
Dear Experts,

I am trying to use the workbench commands but cannot get them to work.  After 
opening the “wb_command” file in my linux terminal, I get the message printed 
below.

Linux terminal?  The output shows that you successfully executed the Mac OS X 
binary; is there a Linux system involved?

However, none of the commands work and nothing comes up when I type in the 
command by itself to get help.

I have tried all the various combinations.  With the “-”, without it.  With 
“wb_command” to start, without it, etc.

Try entering this at the command line:

/Applications/workbench/bin_macosx64/wb_command -volume-reduce

If you want to use it without entering that path each time, you'll need to add 
"/Applications/workbench/bin_macosx64/" to your PATH environment variable.  How 
to do this depends on what shell you are using.
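For example, with bash (the default shell on OS X at the time), one common approach is to append an export line to ~/.bash_profile — the file name and shell are assumptions, so adjust for whatever shell you actually use:

```shell
# Put the Workbench binaries directory on PATH (bash example; the
# startup file ~/.bash_profile is an assumption -- other shells differ).
# Append this line to ~/.bash_profile, then open a new terminal or
# run: source ~/.bash_profile
export PATH="/Applications/workbench/bin_macosx64:$PATH"

# Check whether the shell can now find wb_command
command -v wb_command || echo "wb_command not found on PATH"
```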

Also, I would like to conduct variance normalization on some of the .nii files 
from the ICA-Fix dataset.  Would using the command, “-volume-reduce" with the 
VARIANCE operation accomplish this?

No, among other things, variance is nonlinear with the spread of the data.  If 
the files you are looking at end in .dtseries.nii, they are not volume files, 
but rather CIFTI files (there are other 2-part extensions that signify CIFTI 
files, but ICA-FIX should be using dtseries).  This thread contains the steps 
for normalizing (including demeaning) along timeseries while in CIFTI format:

http://www.mail-archive.com/hcp-users@humanconnectome.org/msg00444.html

Similar commands apply if you are actually dealing with volume file timeseries:

wb_command -volume-reduce <input> MEAN mean.nii.gz
wb_command -volume-reduce <input> STDEV stdev.nii.gz
wb_command -volume-math '(x - mean) / stdev' normalized.nii.gz -fixnan 0 -var x 
<input> -var mean mean.nii.gz -repeat -var stdev stdev.nii.gz -repeat
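If -volume-math reports that the variables have mismatched subvolume counts, it can help to check how many subvolumes (timepoints) each file actually contains before running it. This sketch assumes your Workbench build provides the -file-information subcommand (check -list-commands if unsure):

```shell
# Print metadata for each input, including its subvolume count
# (-file-information is assumed to be present in your build;
# verify with: wb_command -list-commands)
wb_command -file-information mean.nii.gz
wb_command -file-information stdev.nii.gz

# mean.nii.gz and stdev.nii.gz should each report 1 subvolume, while the
# raw timeseries reports one subvolume per timepoint (e.g. 1200); that
# mismatch is why -repeat is needed on the single-subvolume variables.
```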

Thank you very much for your help.

best,
Jason











UM-46JNG5RP:~ admin$ /Applications/workbench/bin_macosx64/wb_command ; exit &

Why do you have "; exit &" on the end?

Connectome Workbench
Version: 1.0
Qt Compiled Version: 4.8.3
Qt Runtime Version: 4.8.3
commit: dfd2086d37612ccf2369b85b5f5f0f5987369339
commit date: 2014-09-09 13:23:57 -0500
Compiler: clang2++ (/usr/local/clang-openmp-opt/llvm/build/Release/bin)
Compiler Version:
Compiled Debug: NO
Operating System: Apple OSX

Information options:
   -help                 print this help info
   -arguments-help       explain how to read the help info for subcommands
   -version              print version information only
   -list-commands        print all non-information (processing) subcommands
   -all-commands-help    print all non-information (processing) subcommands and
                            their help info - VERY LONG

Global options (can be added to any command):
   -disable-provenance   don't generate provenance info in output files

If the first argument is not recognized, all processing commands that start
   with the argument are displayed





_______________________________________________
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users
