Re: [HCP-Users] Acquire text file with averaged volume time series for subcortical structures from cifti file

2015-09-01 Thread Nomi, Jason
That was the problem - sorry about that. I have extracted the mean time series 
for the left amygdala now.


Thanks so much for the help!



I have one final question:


I assume that this ROI comes from the FreeSurfer subcortical segmentation?







From: Timothy Coalson 
Sent: Tuesday, September 1, 2015 9:20 PM
To: Nomi, Jason
Cc: hcp-users@humanconnectome.org
Subject: Re: [HCP-Users] Acquire text file with averaged volume time series for 
subcortical structures from cifti file

Your second command is different from the one I posted.  Specifically, the 
-volume option to -cifti-create-dense-from-template has a bug in v1.1 and 
v1.1.1 that causes it to do the wrong thing even when it doesn't error; use 
-volume-all instead (which does not take a structure name).
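With that substitution, the second command (keeping the same filenames used earlier in this thread) would be:

```shell
# -volume-all takes no structure name; it maps the whole-volume ROI file
# into the cifti volume space of the template
wb_command -cifti-create-dense-from-template 30_min.dtseries.nii \
    roi_left_amygdala.dscalar.nii -volume-all roi_left_amygdala.nii.gz
```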

Tim


On Tue, Sep 1, 2015 at 7:08 PM, Nomi, Jason <jxn...@miami.edu> wrote:


When I do the third step, I now get the error:


ERROR: roi column is empty




The first command:

wb_command -cifti-separate 30_min.dtseries.nii COLUMN -volume AMYGDALA_LEFT 
output_left_amygdala.nii.gz -roi roi_left_amygdala.nii.gz

The second command:

wb_command -cifti-create-dense-from-template 30_min.dtseries.nii 
roi_left_amygdala.dscalar.nii -volume AMYGDALA_LEFT roi_left_amygdala.nii.gz

The third command:

wb_command -cifti-stats 30_min.dtseries.nii -reduce MEAN -roi 
roi_left_amygdala.dscalar.nii




Thanks for your patience on this Tim -




From: Timothy Coalson <tsc...@mst.edu>
Sent: Tuesday, September 1, 2015 8:12 PM

To: Nomi, Jason
Cc: hcp-users@humanconnectome.org
Subject: Re: [HCP-Users] Acquire text file with averaged volume time series for 
subcortical structures from cifti file

That is a bug in -cifti-create-dense-from-template we hadn't caught, sorry.  To 
work around it, remove the -crop option from the -cifti-separate command 
(warning: this may use a lot of memory; since this method doesn't need the 
data volume output, you could first take only the first frame with -cifti-merge 
before running -cifti-separate), and then use the -volume-all option to 
-cifti-create-dense-from-template instead of the -volume option, like so:

#get the roi in the full volume space
wb_command -cifti-separate tfMRI_EMOTION_LR_Atlas.dtseries.nii COLUMN -volume 
AMYGDALA_LEFT output_left_amygdala.nii.gz -roi roi_left_amygdala.nii.gz

#turn the ROI into cifti
wb_command -cifti-create-dense-from-template 
tfMRI_EMOTION_LR_Atlas.dtseries.nii roi_left_amygdala.dscalar.nii -volume-all 
roi_left_amygdala.nii.gz

#stats prints a number per column to standard output
wb_command -cifti-stats tfMRI_EMOTION_LR_Atlas.dtseries.nii -reduce MEAN -roi 
roi_left_amygdala.dscalar.nii
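The first-frame trick mentioned above could be done before the -cifti-separate step; a sketch (`first_frame.dtseries.nii` is a hypothetical output name):

```shell
# Keep only column 1 of the timeseries; this method only needs the -roi
# output of -cifti-separate, so the data volume can come from a tiny file
wb_command -cifti-merge first_frame.dtseries.nii -cifti 30_min.dtseries.nii -column 1
```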

Tim


On Tue, Sep 1, 2015 at 9:57 AM, Nomi, Jason <jxn...@miami.edu> wrote:

Hi Tim,


When doing the second step (turn the ROI into cifti), I get the following error:


ERROR: input volume doesn't match volume space and dimensions in CIFTI




The code I am using for the first step:

wb_command -cifti-separate 30_min.dtseries.nii COLUMN -volume AMYGDALA_LEFT 
output_left_amygdala.nii.gz -roi roi_left_amygdala.nii.gz -crop

The code I am using for the second step:

wb_command -cifti-create-dense-from-template 30_min.dtseries.nii 
roi_left_amygdala.dscalar.nii -volume AMYGDALA_LEFT roi_left_amygdala.nii.gz 
-from-cropped


Any advice on what is happening?

Jason





From: Timothy Coalson <tsc...@mst.edu>
Sent: Monday, August 31, 2015 9:12 PM

To: Nomi, Jason
Cc: hcp-users@humanconnectome.org
Subject: Re: [HCP-Users] Acquire text file with averaged volume time series for 
subcortical structures from cifti file

That file contains all the voxels of the structure, plus zero voxels out to the 
bounding box of the structure, so you probably also need the ROI volume to get 
it to do the right thing.  I'm not all that familiar with FSL tools.
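If you do try the FSL route, restricting fslmeants to the ROI might look like this sketch (filenames assumed from the earlier commands in this thread):

```shell
# fslmeants averages over the nonzero voxels of the -m mask,
# writing one mean value per time point to the -o text file
fslmeants -i output_left_amygdala.nii.gz -m roi_left_amygdala.nii.gz -o left_amygdala_mean.txt
```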

The tfMRI_EMOTION_LR_Atlas.dtseries.nii file is the same as the input in the 
first step.  The roi_left_amygdala.dscalar.nii file is written out by the 
second step, for use in the third step.

Tim


On Mon, Aug 31, 2015 at 7:07 PM, Nomi, Jason <jxn...@miami.edu> wrote:

Thank you for the quick reply Tim!


I was wondering if I could simplify things by applying the FSL command 
"fslmeants" to the output from the first step, on the 
"output_left_amygdala.nii.gz" file?  That should get me the mean of all the 
voxels included in that file right?  If that file contains only the voxels of 
the left amygdala, then I figure that would get me the averaged time series.


If that is not possible, I am unclear where the file in the second step, 
"tfMRI_EMOTION_LR_Atlas.dtseries.nii roi_left_amygdala.dscalar.nii", comes from.


Thanks for the help -


Jason


__

Re: [HCP-Users] Acquire text file with averaged volume time series for subcortical structures from cifti file

2015-09-01 Thread Nomi, Jason

When I do the third step, I now get the error:


ERROR: roi column is empty




The first command:

wb_command -cifti-separate 30_min.dtseries.nii COLUMN -volume AMYGDALA_LEFT 
output_left_amygdala.nii.gz -roi roi_left_amygdala.nii.gz

The second command:

wb_command -cifti-create-dense-from-template 30_min.dtseries.nii 
roi_left_amygdala.dscalar.nii -volume AMYGDALA_LEFT roi_left_amygdala.nii.gz

The third command:

wb_command -cifti-stats 30_min.dtseries.nii -reduce MEAN -roi 
roi_left_amygdala.dscalar.nii




Thanks for your patience on this Tim -




From: Timothy Coalson 
Sent: Tuesday, September 1, 2015 8:12 PM
To: Nomi, Jason
Cc: hcp-users@humanconnectome.org
Subject: Re: [HCP-Users] Acquire text file with averaged volume time series for 
subcortical structures from cifti file

That is a bug in -cifti-create-dense-from-template we hadn't caught, sorry.  To 
work around it, remove the -crop option from the -cifti-separate command 
(warning: this may use a lot of memory; since this method doesn't need the 
data volume output, you could first take only the first frame with -cifti-merge 
before running -cifti-separate), and then use the -volume-all option to 
-cifti-create-dense-from-template instead of the -volume option, like so:

#get the roi in the full volume space
wb_command -cifti-separate tfMRI_EMOTION_LR_Atlas.dtseries.nii COLUMN -volume 
AMYGDALA_LEFT output_left_amygdala.nii.gz -roi roi_left_amygdala.nii.gz

#turn the ROI into cifti
wb_command -cifti-create-dense-from-template 
tfMRI_EMOTION_LR_Atlas.dtseries.nii roi_left_amygdala.dscalar.nii -volume-all 
roi_left_amygdala.nii.gz

#stats prints a number per column to standard output
wb_command -cifti-stats tfMRI_EMOTION_LR_Atlas.dtseries.nii -reduce MEAN -roi 
roi_left_amygdala.dscalar.nii

Tim


On Tue, Sep 1, 2015 at 9:57 AM, Nomi, Jason <jxn...@miami.edu> wrote:

Hi Tim,


When doing the second step (turn the ROI into cifti), I get the following error:


ERROR: input volume doesn't match volume space and dimensions in CIFTI




The code I am using for the first step:

wb_command -cifti-separate 30_min.dtseries.nii COLUMN -volume AMYGDALA_LEFT 
output_left_amygdala.nii.gz -roi roi_left_amygdala.nii.gz -crop

The code I am using for the second step:

wb_command -cifti-create-dense-from-template 30_min.dtseries.nii 
roi_left_amygdala.dscalar.nii -volume AMYGDALA_LEFT roi_left_amygdala.nii.gz 
-from-cropped


Any advice on what is happening?

Jason





From: Timothy Coalson <tsc...@mst.edu>
Sent: Monday, August 31, 2015 9:12 PM

To: Nomi, Jason
Cc: hcp-users@humanconnectome.org
Subject: Re: [HCP-Users] Acquire text file with averaged volume time series for 
subcortical structures from cifti file

That file contains all the voxels of the structure, plus zero voxels out to the 
bounding box of the structure, so you probably also need the ROI volume to get 
it to do the right thing.  I'm not all that familiar with FSL tools.

The tfMRI_EMOTION_LR_Atlas.dtseries.nii file is the same as the input in the 
first step.  The roi_left_amygdala.dscalar.nii file is written out by the 
second step, for use in the third step.

Tim


On Mon, Aug 31, 2015 at 7:07 PM, Nomi, Jason <jxn...@miami.edu> wrote:

Thank you for the quick reply Tim!


I was wondering if I could simplify things by applying the FSL command 
"fslmeants" to the output from the first step, on the 
"output_left_amygdala.nii.gz" file?  That should get me the mean of all the 
voxels included in that file right?  If that file contains only the voxels of 
the left amygdala, then I figure that would get me the averaged time series.


If that is not possible, I am unclear where the file in the second step, 
"tfMRI_EMOTION_LR_Atlas.dtseries.nii roi_left_amygdala.dscalar.nii", comes from.


Thanks for the help -


Jason



From: Timothy Coalson <tsc...@mst.edu>
Sent: Monday, August 31, 2015 6:45 PM
To: Nomi, Jason
Cc: hcp-users@humanconnectome.org
Subject: Re: [HCP-Users] Acquire text file with averaged volume time series for 
subcortical structures from cifti file

-cifti-separate can't do any averaging; that isn't its purpose.  Instead, you 
can use -cifti-stats and the ROI of the amygdala to do such an average:

#what you already did, for the purpose of getting the ROI
wb_command -cifti-separate tfMRI_EMOTION_LR_Atlas.dtseries.nii COLUMN -volume 
AMYGDALA_LEFT output_left_amygdala.nii.gz -roi roi_left_amygdala.nii.gz -crop

#turn the ROI into cifti
wb_command -cifti-create-dense-from-template 
tfMRI_EMOTION_LR_Atlas.dtseries.nii roi_left_amygdala.dscalar.nii -volume 
AMYGDALA_LEFT roi_left_amygdala.nii.gz -from-cropped

#stats prints a number per column to standard output
wb_command -cifti-stats tfMRI_EMOTION_LR_Atlas.dtseries.nii -reduce MEAN -roi 
roi_left_amygdala.dscalar.nii

Re: [HCP-Users] Acquire text file with averaged volume time series for subcortical structures from cifti file

2015-09-01 Thread Nomi, Jason
Hi Tim,


When doing the second step (turn the ROI into cifti), I get the following error:


ERROR: input volume doesn't match volume space and dimensions in CIFTI




The code I am using for the first step:

wb_command -cifti-separate 30_min.dtseries.nii COLUMN -volume AMYGDALA_LEFT 
output_left_amygdala.nii.gz -roi roi_left_amygdala.nii.gz -crop

The code I am using for the second step:

wb_command -cifti-create-dense-from-template 30_min.dtseries.nii 
roi_left_amygdala.dscalar.nii -volume AMYGDALA_LEFT roi_left_amygdala.nii.gz 
-from-cropped


Any advice on what is happening?

Jason





From: Timothy Coalson 
Sent: Monday, August 31, 2015 9:12 PM
To: Nomi, Jason
Cc: hcp-users@humanconnectome.org
Subject: Re: [HCP-Users] Acquire text file with averaged volume time series for 
subcortical structures from cifti file

That file contains all the voxels of the structure, plus zero voxels out to the 
bounding box of the structure, so you probably also need the ROI volume to get 
it to do the right thing.  I'm not all that familiar with FSL tools.

The tfMRI_EMOTION_LR_Atlas.dtseries.nii file is the same as the input in the 
first step.  The roi_left_amygdala.dscalar.nii file is written out by the 
second step, for use in the third step.

Tim


On Mon, Aug 31, 2015 at 7:07 PM, Nomi, Jason <jxn...@miami.edu> wrote:

Thank you for the quick reply Tim!


I was wondering if I could simplify things by applying the FSL command 
"fslmeants" to the output from the first step, on the 
"output_left_amygdala.nii.gz" file?  That should get me the mean of all the 
voxels included in that file right?  If that file contains only the voxels of 
the left amygdala, then I figure that would get me the averaged time series.


If that is not possible, I am unclear where the file in the second step, 
"tfMRI_EMOTION_LR_Atlas.dtseries.nii roi_left_amygdala.dscalar.nii", comes from.


Thanks for the help -


Jason



From: Timothy Coalson <tsc...@mst.edu>
Sent: Monday, August 31, 2015 6:45 PM
To: Nomi, Jason
Cc: hcp-users@humanconnectome.org
Subject: Re: [HCP-Users] Acquire text file with averaged volume time series for 
subcortical structures from cifti file

-cifti-separate can't do any averaging; that isn't its purpose.  Instead, you 
can use -cifti-stats and the ROI of the amygdala to do such an average:

#what you already did, for the purpose of getting the ROI
wb_command -cifti-separate tfMRI_EMOTION_LR_Atlas.dtseries.nii COLUMN -volume 
AMYGDALA_LEFT output_left_amygdala.nii.gz -roi roi_left_amygdala.nii.gz -crop

#turn the ROI into cifti
wb_command -cifti-create-dense-from-template 
tfMRI_EMOTION_LR_Atlas.dtseries.nii roi_left_amygdala.dscalar.nii -volume 
AMYGDALA_LEFT roi_left_amygdala.nii.gz -from-cropped

#stats prints a number per column to standard output
wb_command -cifti-stats tfMRI_EMOTION_LR_Atlas.dtseries.nii -reduce MEAN -roi 
roi_left_amygdala.dscalar.nii

If you were using a surface structure, you should use -cifti-weighted-stats 
instead as the last step with -spatial-weights to account for differences in 
vertex area.
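For the surface case, that last step might look like this sketch (the midthickness surface filenames and the ROI name here are assumptions):

```shell
# Vertex areas computed from the midthickness surfaces weight the average,
# so large and small vertices contribute proportionally
wb_command -cifti-weighted-stats tfMRI_EMOTION_LR_Atlas.dtseries.nii \
    -spatial-weights \
        -left-area-surf L.midthickness.32k_fs_LR.surf.gii \
        -right-area-surf R.midthickness.32k_fs_LR.surf.gii \
    -roi my_surface_roi.dscalar.nii \
    -mean
```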

Tim


On Mon, Aug 31, 2015 at 1:33 PM, Nomi, Jason <jxn...@miami.edu> wrote:

Dear Experts,


I am trying to acquire the averaged volume time series for subcortical 
structures in text file form.


From this post 
http://www.mail-archive.com/hcp-users@humanconnectome.org/msg01184.html , I 
assume that doing these commands:


wb_command -cifti-separate tfMRI_EMOTION_LR_Atlas.dtseries.nii COLUMN
-volume AMYGDALA_LEFT output_left_amygdala.nii.gz -roi
roi_left_amygdala.nii.gz -crop


gives me a nifti file with *all* the voxels from the left amygdala 
(output_left_amygdala.nii.gz).


Is there a way to get the *averaged* time series for all left amygdala voxels 
into a nifti file using the -cifti-separate command?


If so, I suppose that I could then use the -nifti-information command to 
extract the time series from that averaged nifti file into a text file.


Or, is there another way that I should do this?


Thanks in advance!


Jason





___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users

Re: [HCP-Users] Acquire text file with averaged volume time series for subcortical structures from cifti file

2015-08-31 Thread Nomi, Jason
Thank you for the quick reply Tim!


I was wondering if I could simplify things by applying the FSL command 
"fslmeants" to the output from the first step, on the 
"output_left_amygdala.nii.gz" file?  That should get me the mean of all the 
voxels included in that file right?  If that file contains only the voxels of 
the left amygdala, then I figure that would get me the averaged time series.


If that is not possible, I am unclear where the file in the second step, 
"tfMRI_EMOTION_LR_Atlas.dtseries.nii roi_left_amygdala.dscalar.nii", comes from.


Thanks for the help -


Jason



From: Timothy Coalson 
Sent: Monday, August 31, 2015 6:45 PM
To: Nomi, Jason
Cc: hcp-users@humanconnectome.org
Subject: Re: [HCP-Users] Acquire text file with averaged volume time series for 
subcortical structures from cifti file

-cifti-separate can't do any averaging; that isn't its purpose.  Instead, you 
can use -cifti-stats and the ROI of the amygdala to do such an average:

#what you already did, for the purpose of getting the ROI
wb_command -cifti-separate tfMRI_EMOTION_LR_Atlas.dtseries.nii COLUMN -volume 
AMYGDALA_LEFT output_left_amygdala.nii.gz -roi roi_left_amygdala.nii.gz -crop

#turn the ROI into cifti
wb_command -cifti-create-dense-from-template 
tfMRI_EMOTION_LR_Atlas.dtseries.nii roi_left_amygdala.dscalar.nii -volume 
AMYGDALA_LEFT roi_left_amygdala.nii.gz -from-cropped

#stats prints a number per column to standard output
wb_command -cifti-stats tfMRI_EMOTION_LR_Atlas.dtseries.nii -reduce MEAN -roi 
roi_left_amygdala.dscalar.nii

If you were using a surface structure, you should use -cifti-weighted-stats 
instead as the last step with -spatial-weights to account for differences in 
vertex area.

Tim


On Mon, Aug 31, 2015 at 1:33 PM, Nomi, Jason <jxn...@miami.edu> wrote:

Dear Experts,


I am trying to acquire the averaged volume time series for subcortical 
structures in text file form.


From this post 
http://www.mail-archive.com/hcp-users@humanconnectome.org/msg01184.html , I 
assume that doing these commands:


wb_command -cifti-separate tfMRI_EMOTION_LR_Atlas.dtseries.nii COLUMN
-volume AMYGDALA_LEFT output_left_amygdala.nii.gz -roi
roi_left_amygdala.nii.gz -crop


gives me a nifti file with *all* the voxels from the left amygdala 
(output_left_amygdala.nii.gz).


Is there a way to get the *averaged* time series for all left amygdala voxels 
into a nifti file using the -cifti-separate command?


If so, I suppose that I could then use the -nifti-information command to 
extract the time series from that averaged nifti file into a text file.


Or, is there another way that I should do this?


Thanks in advance!


Jason









[HCP-Users] Acquire text file with averaged volume time series for subcortical structures from cifti file

2015-08-31 Thread Nomi, Jason
Dear Experts,


I am trying to acquire the averaged volume time series for subcortical 
structures in text file form.


From this post 
http://www.mail-archive.com/hcp-users@humanconnectome.org/msg01184.html , I 
assume that doing these commands:


wb_command -cifti-separate tfMRI_EMOTION_LR_Atlas.dtseries.nii COLUMN
-volume AMYGDALA_LEFT output_left_amygdala.nii.gz -roi
roi_left_amygdala.nii.gz -crop


gives me a nifti file with *all* the voxels from the left amygdala 
(output_left_amygdala.nii.gz).


Is there a way to get the *averaged* time series for all left amygdala voxels 
into a nifti file using the -cifti-separate command?


If so, I suppose that I could then use the -nifti-information command to 
extract the time series from that averaged nifti file into a text file.


Or, is there another way that I should do this?


Thanks in advance!


Jason







[HCP-Users] auto-correlations

2015-08-25 Thread Nomi, Jason
Dear Experts,


I was just reading in the Smith et al. (2013) paper about correcting for 
autocorrelations (footnote 7) and was wondering if there were any type of 
corrections you would advise.


I am aware that I should demean and then normalize the individual time series 
before concatenating across sessions 
(https://wiki.humanconnectome.org/display/PublicData/HCP+Users+FAQ - point 3), 
but I am wondering if there are any suggestions about mixture-model corrections 
or pre-whitening.
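The demean-and-normalize step can be sketched in cifti form like this (all filenames here are assumptions; the -select 1 1 -repeat broadcasting options follow the pattern used on the HCP list and should be checked against the wb_command help):

```shell
# Sketch: demean and variance-normalize one run before concatenation.
# -select 1 1 -repeat reuses the single mean/stdev map for every time point.
wb_command -cifti-reduce rfMRI_REST1_LR_Atlas.dtseries.nii MEAN run1_mean.dscalar.nii
wb_command -cifti-reduce rfMRI_REST1_LR_Atlas.dtseries.nii STDEV run1_stdev.dscalar.nii
wb_command -cifti-math '(x - mean) / stdev' run1_norm.dtseries.nii -fixnan 0 \
    -var x rfMRI_REST1_LR_Atlas.dtseries.nii \
    -var mean run1_mean.dscalar.nii -select 1 1 -repeat \
    -var stdev run1_stdev.dscalar.nii -select 1 1 -repeat
```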


I am interested in creating a parcellated connectome using the surface based 
data based off of an ROI atlas and was wondering if concerns about 
autocorrelations would be a problem when exploring dynamic functional 
connections between the ROI time series.


Thanks in advance!


Jason








Re: [HCP-Users] Creating a parcellated connectome from a label file

2015-04-27 Thread Nomi, Jason

I was able to extract the time series for each label from the .ptseries.nii 
file in the terminal window using the wb_command -nifti-information with the 
-print-matrix option and then I exported that to a single text file.


Thanks Matt!!


Jason






From: Glasser, Matthew 
Sent: Sunday, April 26, 2015 11:57 PM
To: Nomi, Jason; hcp-users@humanconnectome.org
Subject: Re: [HCP-Users] Creating a parcellated connectome from a label file

wb_command -cifti-parcellate is indeed the first step.  For the next step, you 
could load the resulting .ptseries.nii file into matlab, or use wb_command 
-nifti-information with the -print-matrix option to print the matrix to the 
terminal window, which you could redirect to a text file.  I don't know if this 
is in the released version yet, but the development version can use wb_command 
-cifti-convert with the -to-text option to convert the file to text.
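Concretely, the redirect described above might look like this (the filenames are assumptions):

```shell
# -print-matrix writes the parcel time series values to standard output;
# the shell redirect captures them in a plain text file
wb_command -nifti-information output.ptseries.nii -print-matrix > parcel_timeseries.txt
```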

Peace,

Matt.

From: Nomi, Jason <jxn...@miami.edu>
Date: Sunday, April 26, 2015 at 3:53 PM
To: hcp-users@humanconnectome.org
Subject: [HCP-Users] Creating a parcellated connectome from a label file


Dear Experts,


I am trying to create a parcellated connectome from a label file based off of 
the Gordon et al., 2014 (Cerebral Cortex) paper.  They have provided a 
dlabel.nii file consisting of 333 parcels in both the left (162 parcels) and 
right (172 parcels) hemispheres 
(http://www.nil.wustl.edu/labs/petersen/Resources.html).


Eventually, I would like to acquire the time series for subcortical and 
cerebellar areas for a whole brain parcellated connectome, but this would be 
the first step.


I would like to get a series of text files (or even better: a single text file 
with each parcel/ROI as a single column) for each subject based off of the 
label file provided by Gordon et al.


I am not sure exactly what to do here.  I have been reading through the posts 
on the forum and assume I might perhaps start with the -cifti-parcellate 
command:


wb_command -cifti-parcellate <cifti-in> <label-file> COLUMN <cifti-out>


I assume (hope?) the output file contains the average time series for each 
parcel as defined from the gordon_LR.dlabel.nii file??


However, I am unsure how I would acquire the time series information in text 
file form from this output file.


Am I even in the right ballpark here??


Any help would be appreciated as I have not worked with surface data yet.


Thanks in advance!


best,

Jason










[HCP-Users] Creating a parcellated connectome from a label file

2015-04-26 Thread Nomi, Jason
Dear Experts,


I am trying to create a parcellated connectome from a label file based off of 
the Gordon et al., 2014 (Cerebral Cortex) paper.  They have provided a 
dlabel.nii file consisting of 333 parcels in both the left (162 parcels) and 
right (172 parcels) hemispheres 
(http://www.nil.wustl.edu/labs/petersen/Resources.html).


Eventually, I would like to acquire the time series for subcortical and 
cerebellar areas for a whole brain parcellated connectome, but this would be 
the first step.


I would like to get a series of text files (or even better: a single text file 
with each parcel/ROI as a single column) for each subject based off of the 
label file provided by Gordon et al.


I am not sure exactly what to do here.  I have been reading through the posts 
on the forum and assume I might perhaps start with the -cifti-parcellate 
command:


wb_command -cifti-parcellate <cifti-in> <label-file> COLUMN <cifti-out>



I assume (hope?) the output file contains the average time series for each 
parcel as defined from the gordon_LR.dlabel.nii file??


However, I am unsure how I would acquire the time series information in text 
file form from this output file.


Am I even in the right ballpark here??


Any help would be appreciated as I have not worked with surface data yet.


Thanks in advance!


best,

Jason







Re: [HCP-Users] Parcellated Connectome

2015-04-12 Thread Nomi, Jason
Thank you Steve!


Yes - I was speaking about the group-ICA spatial maps.  I have noticed that I 
can threshold the group-ICA spatial maps for each component at a much higher 
level than other ICAs that I have done on non-HCP data.


Your explanation about strong CNR makes sense.


I am still a little unclear about the relationship of instantiating a strong 
threshold on the group-ICA spatial maps relative to the time series.


For example, if I threshold a component's spatial map at a lower level, more 
areas of activation will naturally show up.  Does the time series represent all 
voxels in the spatial map when there is no thresholding?  Or, does the time 
series represent only the strongest voxels of activation?


Thus, when I apply a strong threshold for image presentation to "clean up" the 
image a little, does the time series also include those voxels that are not 
visible due to high thresholding?


Thanks again!


Jason



From: Stephen Smith 
Sent: Saturday, April 11, 2015 3:29 AM
To: Nomi, Jason
Cc: hcp-users@humanconnectome.org
Subject: Re: [HCP-Users] Parcellated Connectome

Hi - there are many factors that affect overall scaling - more below:


On 10 Apr 2015, at 14:22, Nomi, Jason <jxn...@miami.edu> wrote:


Dear Experts,


I have noticed that the time-series for individual subjects from the dual 
regression output in the parcellated connectome (100 comp ICA) has a much 
larger range than I am used to seeing.


The range for time series values are approximately -800 to 800 while dual 
regression outputs that I have conducted myself are usually around -5 to 5.


I also notice that I can set the threshold much higher for the independent 
components when isolating activation compared to dual regression analyses that 
I have done myself. This "cleans up" the component representation substantially.


My questions are:


1) Is there a particular reason for this large increase in ranges?


In this case it is most likely because we set the max of the group maps used in 
dual regression stage 1 to be 1.  This causes the output timeseries to have 
larger scaling - but the overall scaling is arbitrary anyway.



2) Does the larger threshold for component activation have any influence on the 
time series that is being produced?  Does the time series from the dual 
regression output only represent the areas from the independent component with 
the most intense activation?  I would like to ensure that my presentation of 
component images using a much higher threshold is actually representative of 
the time series that I am analyzing.


Do you mean the group-ICA spatial maps or the maps output by dualreg stage 2?

The group-ICA maps have high peaks (compared with the background scaling) for a 
couple of reasons:  a) because there are so many subjects being combined that  
the ICA components are strong, and b) the group-PCA reduction has removed a lot 
of unstructured noise before the PCA+ICA step.  But despite the maps having 
strong "CNR", they are still valid maps.

Cheers, Steve.





Thanks!


Jason






---
Stephen M. Smith, Professor of Biomedical Engineering
Associate Director,  Oxford University FMRIB Centre

FMRIB, JR Hospital, Headington, Oxford  OX3 9DU, UK
+44 (0) 1865 222726  (fax 222717)
st...@fmrib.ox.ac.uk
http://www.fmrib.ox.ac.uk/~steve
---

Stop the cultural destruction of Tibet <http://smithinks.net>







[HCP-Users] Dense Connectome Time Courses

2015-03-04 Thread Nomi, Jason
Dear HCP Experts,


I have a question regarding the text files found in the dense connectome 
release.  Are the columns found in the text files for the node time courses 
ordered in the same way as the melodic_IC_sum.nii.gz file?  That is, I assume 
that column 1 in the text file represents the time course for component 0 in 
the melodic file, column 2 = component 1, column 3 = component 2, etc, etc.


Also, for the 4800 time points listed in the text file, how does the ordering 
of the resting state acquisition scans go?  I know that there are four 15 
minute resting state scans (1200 volumes each), 2 L-R and 2 R-L acquired.  But, 
how are they ordered in the text file?


Are the first 2400 volumes L-R and R-L?  Or, are the first 2400 volumes both 
L-R, or both R-L?


I would like to cut the file in half for an analysis (30 minute resting state 
scan) but want to ensure that I use a counterbalanced L-R and R-L acquisition 
parameter.


Thank you for the help -


Jason






Re: [HCP-Users] Using Connectome Workbench Commands

2014-10-28 Thread Nomi, Jason
Hi Tim,


You are absolutely right.  I was missing the -repeat after the -var option for 
mean for my own command.  The code works perfectly now.


Thanks!


Jason




From: Timothy Coalson 
Sent: Tuesday, October 28, 2014 5:47 PM
To: Nomi, Jason
Cc: hcp-users@humanconnectome.org
Subject: Re: [HCP-Users] Using Connectome Workbench Commands

That is the error I would expect if you removed the -repeat after the -var 
option for mean (yes, it needs 2 -repeat options, they each associate with one 
-var option) - could you paste the command you actually ran?

Tim


On Tue, Oct 28, 2014 at 11:30 AM, Nomi, Jason <jxn...@miami.edu> wrote:

Hi Tim,


Thank you for the help with the workbench and the code.  Yes, it is Mac OS X.  
Sorry for the mix-up.


Also, I can get the first two commands to work for "MEAN" and "STDEV", but the 
last command gives me an error message:


wb_command -volume-math '((x - mean) / stdev)' normalized.nii.gz -fixnan 0 -var 
x <timeseries> -var mean mean.nii.gz -repeat -var stdev stdev.nii.gz -repeat


ERROR: volume file for variable 'mean' has 1 subvolume(s), but previous volume 
files have 1200 subvolume(s) requested to be used


Any help would be appreciated.


best,

Jason






Jason S. Nomi, Ph.D.
Post-Doctoral Researcher (BCCL Lab)
Department of Psychology
University of Miami
5151 San Amaro Drive: Room 114A
Website: http://www.psy.miami.edu/bccl/
Email: jxn...@miami.edu

From: Timothy Coalson <tsc...@mst.edu>
Sent: Monday, October 27, 2014 5:38 PM
To: Nomi, Jason
Cc: hcp-users@humanconnectome.org
Subject: Re: [HCP-Users] Using Connectome Workbench Commands

Inline replies.

Tim

On Mon, Oct 27, 2014 at 10:26 AM, Nomi, Jason <jxn...@miami.edu> wrote:
Dear Experts,

I am trying to use the workbench commands but cannot get them to work.  After 
opening the “wb_command” file in my linux terminal, I get the message printed 
below.

Linux terminal?  The output shows that you successfully executed the Mac OS X 
binary; is there a Linux system involved?

However, none of the commands work and nothing comes up when I type in the 
command by itself to get help.

I have tried all the various combinations: with the “-” and without it, with 
“wb_command” at the start and without it, etc.

Try entering this at the command line:

/Applications/workbench/bin_macosx64/wb_command -volume-reduce

If you want to use it without entering that path each time, you'll need to add 
"/Applications/workbench/bin_macosx64/" to your PATH environment variable.  How 
to do this depends on what shell you are using.
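As a concrete illustration of the PATH change (shown here as a Python sketch, since shells differ; in bash the equivalent one-liner would be `export PATH="/Applications/workbench/bin_macosx64:$PATH"`, typically added to ~/.bash_profile — the directory is the default Mac install location mentioned above):

```python
import os

# Prepend the Workbench binaries directory to PATH for this process
# and any child processes it launches.
wb_dir = "/Applications/workbench/bin_macosx64"
os.environ["PATH"] = wb_dir + os.pathsep + os.environ.get("PATH", "")

# Child processes (e.g. a launched shell) would now resolve the bare
# name 'wb_command' by searching this updated PATH.
print(wb_dir in os.environ["PATH"])  # True
```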

Also, I would like to conduct variance normalization on some of the .nii files 
from the ICA-Fix dataset.  Would using the command, “-volume-reduce" with the 
VARIANCE operation accomplish this?

No, among other things, variance is nonlinear with the spread of the data.  If 
the files you are looking at end in .dtseries.nii, they are not volume files, 
but rather CIFTI files (there are other 2-part extensions that signify CIFTI 
files, but ICA-FIX should be using dtseries).  This thread contains the steps 
for normalizing (including demeaning) along timeseries while in CIFTI format:

http://www.mail-archive.com/hcp-users@humanconnectome.org/msg00444.html

Similar commands apply if you are actually dealing with volume file timeseries:

wb_command -volume-reduce <input> MEAN mean.nii.gz
wb_command -volume-reduce <input> STDEV stdev.nii.gz
wb_command -volume-math '(x - mean) / stdev' normalized.nii.gz -fixnan 0 -var x 
<input> -var mean mean.nii.gz -repeat -var stdev stdev.nii.gz -repeat
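For readers less familiar with the math: per voxel, the three commands above compute a temporal mean, a temporal standard deviation, and the z-scored series (x - mean) / stdev. A minimal pure-Python illustration of just the arithmetic, with made-up numbers (whether wb_command's STDEV reduce uses the n or n - 1 denominator is not confirmed here; the sample standard deviation with n - 1 is assumed):

```python
import math

def normalize_timeseries(ts):
    """Z-score one voxel's timeseries: (value - mean) / stdev."""
    n = len(ts)
    mean = sum(ts) / n
    # Sample standard deviation (n - 1 denominator) assumed here.
    stdev = math.sqrt(sum((x - mean) ** 2 for x in ts) / (n - 1))
    return [(x - mean) / stdev for x in ts]

ts = [10.0, 12.0, 14.0, 12.0, 10.0]  # made-up 5-frame series
z = normalize_timeseries(ts)
print(abs(sum(z)) < 1e-9)  # the z-scored series sums to ~0 -> True
```

The two -repeat options in the third command correspond to the fact that mean and stdev each contribute a single frame that is reused at every timepoint of x.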

Thank you very much for your help.

best,
Jason


UM-46JNG5RP:~ admin$ /Applications/workbench/bin_macosx64/wb_command ; exit &

Why do you have "; exit &" on the end?

Connectome Workbench
Version: 1.0
Qt Compiled Version: 4.8.3
Qt Runtime Version: 4.8.3
commit: dfd2086d37612ccf2369b85b5f5f0f5987369339
commit date: 2014-09-09 13:23:57 -0500
Compiler: clang2++ (/usr/local/clang-openmp-opt/llvm/build/Release/bin)
Compiler Version:
Compiled Debug: NO
Operating System: Apple OSX

Information options:
   -help                 print this help info
   -arguments-help       explain how to read the help info for subcommands
   -version              print version information only
   -list-commands        print all non-information (processing) subcommands
   -all-commands-help    print all non-information (processing) subcommands and
                         their help info - VERY LONG

Global options (can be added to any command):
   -disable-provenance   don't generate provenance info in output files

If the first argument is not recognized, all processing commands that start
   with the argument are displayed








[HCP-Users] Using Connectome Workbench Commands

2014-10-27 Thread Nomi, Jason
Dear Experts,

I am trying to use the workbench commands but cannot get them to work.  After 
opening the “wb_command” file in my linux terminal, I get the message printed 
below.

However, none of the commands work and nothing comes up when I type in the 
command by itself to get help.

I have tried all the various combinations: with the “-” and without it, with 
“wb_command” at the start and without it, etc.

Also, I would like to conduct variance normalization on some of the .nii files 
from the ICA-Fix dataset.  Would using the command, “-volume-reduce" with the 
VARIANCE operation accomplish this?

Thank you very much for your help.

best,
Jason


UM-46JNG5RP:~ admin$ /Applications/workbench/bin_macosx64/wb_command ; exit &
Connectome Workbench
Version: 1.0
Qt Compiled Version: 4.8.3
Qt Runtime Version: 4.8.3
commit: dfd2086d37612ccf2369b85b5f5f0f5987369339
commit date: 2014-09-09 13:23:57 -0500
Compiler: clang2++ (/usr/local/clang-openmp-opt/llvm/build/Release/bin)
Compiler Version:
Compiled Debug: NO
Operating System: Apple OSX

Information options:
   -help                 print this help info
   -arguments-help       explain how to read the help info for subcommands
   -version              print version information only
   -list-commands        print all non-information (processing) subcommands
   -all-commands-help    print all non-information (processing) subcommands and
                         their help info - VERY LONG

Global options (can be added to any command):
   -disable-provenance   don't generate provenance info in output files

If the first argument is not recognized, all processing commands that start
   with the argument are displayed







[HCP-Users] More questions about Fix-preprocessed rs-fMRI data

2014-10-25 Thread Nomi, Jason

Dear Experts,


I have a follow up question regarding this earlier thread about 
Fix-preprocessed rs-fMRI data.

http://www.mail-archive.com/hcp-users@humanconnectome.org/msg00596.html


There are 4 runs of the FIX-preprocessed data, each consisting of a 15-minute 
resting-state .nii volume file:

Rest1: LR

Rest1: RL

Rest2: LR

Rest2: RL


From the earlier thread it seems like all 4 runs should be concatenated?  In 
other words, should I combine all four 15-minute runs into a single 60-minute 
run?  Or should I merge them to make a single 15-minute run?


I do not understand what to do with the 4 runs.


There is a PNAS paper where they simply chose one of the 15-minute runs (Rest2: 
LR): http://www.pnas.org/content/111/28/10341.full.pdf



Is this acceptable?  Or, is it absolutely necessary to 
combine/concatenate/merge (I don't know what the proper term/operation is 
here!) the 4 runs?
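On terminology: "concatenate" here would mean joining the runs end-to-end along the time axis, so the frame counts add; "average/merge" would combine corresponding timepoints and keep the original length. A schematic in pure Python (the figure of 1200 frames per run, i.e. roughly 15 minutes at the HCP TR of 0.72 s, is stated as an assumption for illustration):

```python
# Four resting-state runs, each with 1200 timepoints
# (roughly 15 min at TR = 0.72 s in the HCP protocol).
run_frames = 1200
runs = [[0.0] * run_frames for _ in range(4)]  # Rest1 LR/RL, Rest2 LR/RL

# Concatenating along time yields one 4800-frame (~60-minute) series;
# averaging the runs instead would keep a single 1200-frame series.
concatenated = [frame for run in runs for frame in run]
print(len(concatenated))  # 4800
```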


best,

Jason
