[HCP-Users] Questions regarding values in .dscalar.nii file

2017-03-20 Thread Irisqql0922
Dear HCP teams, I am extracting signals from an individual level-2 analysis, and I wonder what the values in files ending with dscalar.nii (e.g., 100307_tfMRI_WM_level2_hp200_s2.dscalar.nii) represent. Are they z-values? I can't find that in the manual. Regards, Qinqin Lee

[HCP-Users] Questions regarding coordinate information of vertices

2017-03-20 Thread Irisqql0922
Dear HCP teams, I am trying to threshold surface data from a *.dscalar.nii file, and after that I don't know how to link the data remaining in the matrix to its spatial coordinates. Is there any matrix containing information that links a vertex index in the matrix to its spatial location? If there is such a matrix…

Re: [HCP-Users] Questions regarding coordinate information of vertices

2017-03-20 Thread Irisqql0922
can I find it. Thanks again, Qinqin Lee. On 03/20/2017 20:03, Glasser, Matthew wrote: In individual subjects, you can use the midthickness surface coordinates as the 3D coordinates. Peace, Matt. From: on behalf of Irisqql0922 Date: Monday, March 20, 2017 at 4:19 AM To: hcp-users Su…
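Matt's suggestion can be sketched in a few lines: after thresholding a map, look up each surviving row index in the per-vertex coordinate array of the subject's midthickness surface. In real use, `coords` would come from the subject's *.midthickness.surf.gii (e.g. via nibabel) and `values` from the dscalar map; here both are tiny synthetic stand-ins, so the names and numbers below are purely illustrative.

```python
# Sketch: map surviving vertex indices after thresholding to 3D coordinates.
# In practice, `coords` would be loaded from the midthickness GIFTI surface
# (e.g. nibabel.load("L.midthickness.surf.gii").darrays[0].data) and
# `values` from the dscalar map; both are synthetic stand-ins here.

coords = [  # one (x, y, z) triple per vertex, in mm
    (-10.0, 20.0, 5.0),
    (-12.5, 18.0, 6.5),
    (-11.0, 21.0, 4.0),
    (-13.0, 19.5, 7.0),
]
values = [1.2, 3.4, 0.8, 2.6]  # one scalar per vertex, same vertex order

threshold = 2.0
# keep (index, coordinates) pairs for vertices above threshold
surviving = [(i, coords[i]) for i, v in enumerate(values) if v > threshold]

for idx, (x, y, z) in surviving:
    print(f"vertex {idx}: value={values[idx]}, xyz=({x}, {y}, {z})")
```

The key assumption is that row i of the dscalar map and vertex i of the surface refer to the same vertex, which holds only when the map's brain model ordering matches the surface; for CIFTI files with medial-wall vertices excluded, the vertex list in the CIFTI header must be consulted first.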

Re: [HCP-Users] Questions regarding coordinate information of vertices

2017-03-20 Thread Irisqql0922
look here and consider whether this command might help: https://www.humanconnectome.org/documentation/workbench-command/command-all-commands-help.html -cifti-rois-from-extrema You might have to manipulate your ROI to narrow it down first. Donna > On Mar 20, 2017, at 7:23 AM, Irisqql0922 wrote…

Re: [HCP-Users] Questions regarding coordinate information of vertices

2017-03-20 Thread Irisqql0922
/nipy/nibabel Tim On Mon, Mar 20, 2017 at 8:39 AM, Irisqql0922 wrote: Sorry, I thought Matt had misunderstood my words. OK, I will check the file *midthickness*.surf.gii. Thank you Donna, and thank you Matt ^_^ Qinqin Lee On 03/20/2017 21:25, Dierker, Donna wrote: Hi Qinqin, As Matt said…

[HCP-Users] info about head motion and individual brain size

2017-04-20 Thread Irisqql0922
Dear HCP teams, I need parameters concerning head motion (e.g., FD) for my task-fMRI analysis, and data on subjects' brain size for analyzing structural data (both will be used in regression). So I wonder whether there is any file on AWS or ConnectomeDB that includes this information (I cannot find…

[HCP-Users] Question concerning SNR info

2017-05-01 Thread Irisqql0922
Dear HCP teams, I notice that you have run the ICA-FIX denoising process on the rfMRI data, but it seems there is no such process for the task fMRI. Now I am working on the WM data and want to find related SNR data for some further analysis. So I wonder if there is any file containing SNR values (vertex…
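No vertex-wise SNR file is confirmed in this thread, but a common stand-in is temporal SNR (tSNR): the mean over time divided by the standard deviation over time, per vertex. A pure-Python sketch on a synthetic timeseries (rows = timepoints, columns = vertices; in real use the array would come from the run's dtseries data):

```python
import math

def tsnr(timeseries):
    """Temporal SNR per column: temporal mean / (population) temporal std.

    `timeseries` is a list of rows (timepoints), each a list of per-vertex
    values. Columns with zero variance get a tSNR of float('inf').
    """
    n_t = len(timeseries)
    n_v = len(timeseries[0])
    out = []
    for v in range(n_v):
        col = [row[v] for row in timeseries]
        mean = sum(col) / n_t
        var = sum((x - mean) ** 2 for x in col) / n_t
        out.append(mean / math.sqrt(var) if var > 0 else float("inf"))
    return out

# Two vertices over four timepoints: one noisy, one perfectly constant.
ts = [
    [100.0, 50.0],
    [102.0, 50.0],
    [ 98.0, 50.0],
    [100.0, 50.0],
]
print(tsnr(ts))
```

Note that tSNR computed on task data includes task-evoked variance in the denominator; computing it on residuals after the GLM fit is the usual way around that.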

[HCP-Users] using Amazon S3 on NITRC-CE

2017-05-09 Thread Irisqql0922
Dear HCP teams, I am now trying to use Amazon S3 on NITRC-CE, and I am stuck mounting data from the hcp-openaccess bucket onto my instance. After I filled in the blanks, the system told me I had mounted it successfully. But when I check the folder through the terminal, it shows nothing below it. I…

Re: [HCP-Users] using Amazon S3 on NITRC-CE

2017-05-09 Thread Irisqql0922
minal like "sudo mount /s3mnt" and see what error you get. You may have entered the authentication incorrectly, or requested a new authentication token but then entered the old, expired one. Tim On Tue, May 9, 2017 at 2:12 AM, Irisqql0922 wrote: Dear HCP teams, I…

[HCP-Users] Problem mounting the 1200-release data on NITRC-CE

2017-05-15 Thread Irisqql0922
Dear HCP teams, I'm sorry to bother you again with the same problem. I used the default options and mounted the data successfully. But when I checked /s3/hcp, I found that the data in it covers only 900 subjects. Obviously, it's not the latest 1200-release data. Since I want to analyse the latest version of…

Re: [HCP-Users] Problem mounting the 1200-release data on NITRC-CE

2017-05-15 Thread Irisqql0922
e 900 subjects release and the 1200 subjects release. If the HCP_900 listing works but the HCP_1200 listing does not, then we will need to arrange for you to get different credentials. Tim On 05/15/2017 08:48 AM, Irisqql0922 wrote: Dear HCP teams, I'm sorry to bother you again with the same problem…

Re: [HCP-Users] Problem mounting the 1200-release data on NITRC-CE

2017-05-16 Thread Irisqql0922
s3fs#hcp-openaccess:/HCP_1200 and then stopped and restarted the instance. Then I used the command mount /s3/hcp. It worked!! But I still don't know why I failed when I used s3fs to mount the data directly. Best, Qinqin Li On 05/16/2017 02:43, Timothy Coalson wrote: On Mon, May 15, 2017 at 8:48…
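For reference, the working approach described above (an "s3fs#bucket" line plus a mount point, then a plain mount) corresponds to an /etc/fstab entry along these lines. This is an assumed sketch based on common s3fs-fuse usage: the mount point, credentials-file path, and option set are guesses, not the NITRC-CE image's verified configuration.

```shell
# /etc/fstab entry (older s3fs-fuse "s3fs#bucket" device syntax);
# the passwd_file path and options here are assumptions:
s3fs#hcp-openaccess:/HCP_1200  /s3/hcp  fuse  _netdev,allow_other,ro,passwd_file=/etc/passwd-s3fs  0  0

# then mount it (or stop and restart the instance, as described above):
sudo mount /s3/hcp
```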

Re: [HCP-Users] Problem mounting the 1200-release data on NITRC-CE

2017-05-16 Thread Irisqql0922
s, but the HCP_1200 listing does not, then we will need to arrange for you to get different credentials. Tim On 05/15/2017 08:48 AM, Irisqql0922 wrote: Dear HCP teams, I'm sorry to bother you again with the same problem. I used the default options and mounted the data successfully. But when I checked…

[HCP-Users] Calculating the beta and t value of a new contrast

2019-01-10 Thread Irisqql0922
e Key Laboratory of Cognitive Neuroscience and Learning | Irisqql0922 | irisqql0...@163.com | (Signature customized by NetEase Mail Master) ___ HCP-Users mailing list HCP-Users@humanconnectome.org http://lists.humanconnectome.org/mailman/listinfo/hcp-users
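Only the signature of this last message survives, so the sender's actual approach is unknown; what the subject line names, though, is the standard OLS computation beta = (X'X)⁻¹X'y and t = c'beta / sqrt(c'(X'X)⁻¹c · σ̂²) with σ̂² = RSS/(n − p). A self-contained sketch for a two-regressor design, with entirely synthetic design matrix, data, and contrast:

```python
import math

def contrast_beta_t(X, y, c):
    """Contrast estimate c'beta and its t value for a two-regressor OLS fit.

    Standard GLM formulas: beta = (X'X)^-1 X'y, sigma^2 = RSS / (n - p),
    t = c'beta / sqrt(c'(X'X)^-1 c * sigma^2).  X is a list of [x0, x1]
    rows, y a list of observations, c a length-2 contrast vector.
    """
    n, p = len(X), 2
    # Build X'X and invert the 2x2 matrix in closed form.
    a = sum(r[0] * r[0] for r in X)
    b = sum(r[0] * r[1] for r in X)
    d = sum(r[1] * r[1] for r in X)
    det = a * d - b * b
    inv = [[d / det, -b / det], [-b / det, a / det]]
    xty = [sum(r[0] * yi for r, yi in zip(X, y)),
           sum(r[1] * yi for r, yi in zip(X, y))]
    beta = [inv[0][0] * xty[0] + inv[0][1] * xty[1],
            inv[1][0] * xty[0] + inv[1][1] * xty[1]]
    resid = [yi - (r[0] * beta[0] + r[1] * beta[1]) for r, yi in zip(X, y)]
    sigma2 = sum(e * e for e in resid) / (n - p)
    cbeta = c[0] * beta[0] + c[1] * beta[1]
    cvar = sum(c[i] * inv[i][j] * c[j] for i in range(2) for j in range(2))
    return cbeta, cbeta / math.sqrt(cvar * sigma2)

# Synthetic example: two conditions, three observations each;
# the contrast [1, -1] tests condition 1 minus condition 2.
X = [[1, 0], [1, 0], [1, 0], [0, 1], [0, 1], [0, 1]]
y = [2.0, 3.0, 4.0, 1.0, 1.0, 1.0]
print(contrast_beta_t(X, y, [1, -1]))
```

The same formulas generalize to any number of regressors with a proper matrix inverse; for FEAT-style level-2 outputs, combining existing contrast estimates with their variance maps is the usual alternative to refitting.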