Re: [HCP-Users] Questions regarding coordinate information of vertices

2017-03-20 Thread Irisqql0922


Thank you Glasser, but your answer is not what I'm looking for. I may not have 
expressed myself clearly. 
Here is the situation: I am trying to analyze CIFTI data using Python. I now 
have a matrix with all the thresholded vertex values. Next, I want to use that 
matrix to generate a mask for my future analysis. But what I have is just a 
matrix of vertex values, so I cannot use it to generate my ROIs and then get 
their spatial coordinates. 
So I wonder whether there is a transformation matrix that maps vertex indices 
to spatial coordinates, and if there is, where I can find it. 


Thanks again,
Qinqin lee 


On 03/20/2017 20:03, Glasser, Matthew <glass...@wustl.edu> wrote:
In individual subjects, you can use the midthickness surface coordinates as the 
3D coordinates.  


Peace,


Matt.


From: <hcp-users-boun...@humanconnectome.org> on behalf of Irisqql0922 
<irisqql0...@163.com>
Date: Monday, March 20, 2017 at 4:19 AM
To: hcp-users <hcp-users@humanconnectome.org>
Subject: [HCP-Users] Questions regarding coordinate information of vertices



Dear HCP teams,


I am trying to threshold surface data from a *.dscalar.nii file, and after that 
I don't know how to link the data remaining in the matrix to its spatial 
coordinates. Is there a matrix that contains the information linking each vertex 
index in the matrix to its spatial location? If there is such a matrix, where 
can I find it? 

Regards,
Qinqin lee 







 



[HCP-Users] Questions regarding coordinate information of vertices

2017-03-20 Thread Irisqql0922
Dear HCP teams,


I am trying to threshold surface data from a *.dscalar.nii file, and after that 
I don't know how to link the data remaining in the matrix to its spatial 
coordinates. Is there a matrix that contains the information linking each vertex 
index in the matrix to its spatial location? If there is such a matrix, where 
can I find it? 

Regards,
Qinqin lee 







Re: [HCP-Users] Questions regarding coordinate information of vertices

2017-03-20 Thread Irisqql0922
I am now trying to get the mapping matrix from the *.surf.gii file using 
nibabel. I have to say it is not easy to find that matrix; it hides deep in the 
file.
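
For the record, this is roughly what I am doing to pull the coordinate array out 
of the *.surf.gii with nibabel; the file name below is just a placeholder for my 
own data:

import nibabel as nib

surf = nib.load("Subject.L.midthickness.32k_fs_LR.surf.gii")  # placeholder path

# The vertex coordinates are the data array with POINTSET intent
# (in these surf.gii files it is darrays[0]); the triangles carry TRIANGLE intent.
pointset = nib.nifti1.intent_codes["NIFTI_INTENT_POINTSET"]
coords = next(da.data for da in surf.darrays if da.intent == pointset)
print(coords.shape)  # (n_vertices, 3): x, y, z in mm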
 
Thank you for the reminder! I am very grateful!






Qinqin lee 
On 03/20/2017 23:05, Timothy Coalson <tsc...@mst.edu> wrote:
Also note that cifti files can contain both left and right hemispheres and 
voxel data, while gifti files, including *.surf.gii, only contain one 
hemisphere per file.  There is some explanation here:

http://www.humanconnectome.org/software/workbench-command.php?function=-cifti-help


Note that cifti files generally exclude the medial wall from being represented 
in the matrix, such that there are 29k indices that represent the left cortex 
in our standard cifti space, even though the left cortex surface has 32k 
vertices.  Nibabel has some very new support for reading cifti files in python, 
which should tell you what each index in the cifti file means (specifically 
which vertex index or voxel each spatial index represents - the vertex 
coordinates are not stored in cifti files, so you need to get the coordinates 
from *.surf.gii files).  You may need to grab the latest version from github:

https://github.com/nipy/nibabel
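
A rough sketch of what this looks like in Python, assuming a recent nibabel with 
the cifti2 axes support and using placeholder file names:

import nibabel as nib

# Placeholder names: a thresholded dscalar and the matching left midthickness surface.
scalars = nib.load("my_thresholded.dscalar.nii")
surf = nib.load("Subject.L.midthickness.32k_fs_LR.surf.gii")

data = scalars.get_fdata()[0]          # first (only) map: one value per cifti index
bm_axis = scalars.header.get_axis(1)   # the brain models (grayordinate) axis

coords = surf.darrays[0].data          # (32492, 3) vertex coordinates in mm

# Which cifti indices are left cortex, and which surface vertex each one maps to.
left = bm_axis.name == "CIFTI_STRUCTURE_CORTEX_LEFT"
vertices = bm_axis.vertex[left]        # ~29k vertex indices (medial wall excluded)
values = data[left]

# Coordinates of the left-cortex indices that survived your threshold (nonzero here).
roi_coords = coords[vertices[values != 0]]
print(roi_coords.shape)

The point is that an index into the cifti matrix is not itself a vertex number; 
the brain models axis tells you which vertex (or voxel) each matrix index stands 
for, and the coordinates then come from the surface file.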


Tim




On Mon, Mar 20, 2017 at 8:39 AM, Irisqql0922 <irisqql0...@163.com> wrote:



Sorry, I thought Matt had misunderstood my words.  OK, I will check the file 
*midthickness*.surf.gii. 


Thank you Donna, and thank you Matt^_^


Qinqin Lee 


On 03/20/2017 21:25, Dierker, Donna <do...@wustl.edu> wrote:
Hi Qinqin,

As Matt said, the file named like *midthickness*.surf.gii contains the 
vertex-to-3D coordinate mapping that you seek.
This could be an individual’s surface, if the cifti data pertains to a 
particular subject, or it could be a mean midthickness (e.g., HCP500) if it is 
group data.

You can read about the GIFTI file format here:

https://www.nitrc.org/projects/gifti/

Also, have a look here and consider whether this command might help:

https://www.humanconnectome.org/documentation/workbench-command/command-all-commands-help.html
-cifti-rois-from-extrema

You might have to manipulate your ROI to narrow it down first.

Donna


> On Mar 20, 2017, at 7:23 AM, Irisqql0922 <irisqql0...@163.com> wrote:
>
>
> Thank you Glasser, but your answer is not what I'm looking for. I may not have 
> expressed myself clearly.
> Here is the situation: I am trying to analyze CIFTI data using Python. I now 
> have a matrix with all the thresholded vertex values. Next, I want to use that 
> matrix to generate a mask for my future analysis. But what I have is just a 
> matrix of vertex values, so I cannot use it to generate my ROIs and then get 
> their spatial coordinates.
> So I wonder whether there is a transformation matrix that maps vertex indices 
> to spatial coordinates, and if there is, where I can find it.
>
> Thanks again,
> Qinqin lee
>
> On 03/20/2017 20:03, Glasser, Matthew <glass...@wustl.edu> wrote:
> In individual subjects, you can use the midthickness surface coordinates as 
> the 3D coordinates.
>
> Peace,
>
> Matt.
>
> From: <hcp-users-boun...@humanconnectome.org> on behalf of Irisqql0922 
> <irisqql0...@163.com>
> Date: Monday, March 20, 2017 at 4:19 AM
> To: hcp-users <hcp-users@humanconnectome.org>
> Subject: [HCP-Users] Questions regarding coordinate information of vertices
>
> Dear HCP teams,
>
> I am trying to threshold surface data from a *.dscalar.nii file, and after 
> that I don't know how to link the data remaining in the matrix to its spatial 
> coordinates. Is there a matrix that contains the information linking each 
> vertex index in the matrix to its spatial location? If there is such a matrix, 
> where can I find it?
>
> Regards,
> Qinqin lee
>
>




[HCP-Users] info about head motion and individual brain size

2017-04-20 Thread Irisqql0922
Dear HCP teams,


I need head motion parameters (e.g., FD) for my task fMRI analyses, and data on 
subjects' brain size for analyzing the structural data (both will be used in 
regressions). 


So I wonder whether there is any file on AWS or ConnectomeDB that includes this 
information (I could not find it after searching for hours)? If not, how should 
I obtain it? 
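
(For context, if no such file exists, my fallback is to compute FD myself from 
the per-run motion parameters. A rough sketch, assuming a Movement_Regressors.txt 
whose first six columns are x/y/z translations in mm and x/y/z rotations in 
degrees; please correct me if the layout is different:)

import numpy as np

# Power-style framewise displacement from the six rigid-body motion parameters.
params = np.loadtxt("Movement_Regressors.txt")[:, :6]   # placeholder path
trans = params[:, :3]                                   # translations in mm
rot_mm = np.deg2rad(params[:, 3:6]) * 50.0              # rotations -> mm on a 50 mm sphere
motion = np.hstack([trans, rot_mm])
fd = np.abs(np.diff(motion, axis=0)).sum(axis=1)
fd = np.insert(fd, 0, 0.0)                              # FD of the first frame set to 0
print(fd.mean())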


Thanks in advance :)


Qinqin Li


[HCP-Users] Problem regarding mounting 1200-release data to NITRC-CE

2017-05-15 Thread Irisqql0922
Dear HCP teams,


I am sorry to bother you again with the same problem.


I used the default options and mounted the data successfully. But when I checked 
/s3/hcp, I found that it contains only 900 subjects. Obviously, it is not the 
latest 1200-release data. 


Since I want to analyze the latest version of the data, I tried to use s3fs 
directly. I used the commands:
: > ~/.passwd-s3fs
chmod 600 ~/.passwd-s3fs
s3fs hcp-openaccess /s3mnt -o passwd_file=~/.passwd-s3fs


It failed every time. In the syslog file, I found the error below:




I got my credential keys from ConnectomeDB, and I am quite sure I put them 
correctly in ~/.passwd-s3fs.


So I wonder: do my credential keys have access to hcp-openaccess when using 
s3fs to mount the data? If the answer is yes, do you have any suggestions for me? 


(Note: at first I thought the problem might be due to the version of s3fs, so I 
created a new instance based on the Amazon Linux AMI and then downloaded the 
latest version of s3fs. But it still failed because of 'invalid credentials'.)


Thank you very much!


Best,


Qinqin Li




[HCP-Users] using Amazon S3 on NITRC-CE

2017-05-09 Thread Irisqql0922
Dear HCP teams,


I am now trying to use Amazon S3 on NITRC-CE, and I got stuck mounting data 
from the hcp-openaccess bucket to my instance. After I filled in the blanks, the 
system told me I had mounted it successfully. But when I check the folder 
through the terminal, it shows nothing. 

I have tried it many times, and I am very confused. 


I don't know what is going on. Can anyone give me some suggestions?




Regards,
 
Qinqin Li 


Re: [HCP-Users] using Amazon S3 on NITRC-CE

2017-05-09 Thread Irisqql0922


hi Tim,


I tried to mount the data using the terminal, and I entered the command: sudo s3fs 
hcp-openaccess /s3mnt, and still I got nothing under the /s3mnt folder. 




And then I tried to use the command: sudo mount hcp-openaccess /s3mnt 




I think I input the bucket's name correctly, and I have entered the newest 
authentication credentials I got.




The content of the log is shown below:




Did I do something wrong?


Best,
Qinqin Li 


On 05/10/2017 02:02, Timothy Coalson <tsc...@mst.edu> wrote:
From what I recall, the mounting tools for S3 don't properly notify the web 
interface when they fail.  Try mounting it in the terminal like "sudo mount 
/s3mnt" and see what error you get.  You may have entered the authentication 
incorrectly, or requested a new authentication token, but then entered the old, 
expired one.


Tim




On Tue, May 9, 2017 at 2:12 AM, Irisqql0922 <irisqql0...@163.com> wrote:

Dear HCP teams,


I am now trying to use Amazon S3 on NITRC-CE, and I got stuck mounting data 
from the hcp-openaccess bucket to my instance. After I filled in the blanks, the 
system told me I had mounted it successfully. But when I check the folder 
through the terminal, it shows nothing. 

I have tried it many times, and I am very confused. 


I don't know what is going on. Can anyone give me some suggestions?




Regards,
 
Qinqin Li 



Re: [HCP-Users] Problem regarding mounting 1200-release data to NITRC-CE

2017-05-16 Thread Irisqql0922
Hi Tim,


I changed the first line in the /etc/fstab file to start with


s3fs#hcp-openaccess:/HCP_1200


and it worked!!! Thank you very much!


But it is not very convenient to do this every time I need to mount the 
1200-release data. The test you asked me to do yesterday can mount the 
1200-release data directly, right?


Best,
Qinqin Li


On 05/16/2017 03:04, Timothy B. Brown <tbbr...@wustl.edu> wrote:

Dear Qinqin Li,

Based on my checking so far, AWS credentials that give you access to the 
HCP_900 section of the S3 bucket should also give you access to the HCP_1200 
section of the bucket.

One thing I would suggest is to go back to using the mount point provided by 
the NITRC-CE-HCP environment, but edit the system file that tells the system 
what to mount at /s3/hcp.

You will need to edit the file /etc/fstab. You will need to run the editor you 
use via sudo in order to be able to edit this file.

You should find a line in the /etc/fstab file that starts with:

s3fs#hcp-openaccess:/HCP_900


Change the start of that line to:

s3fs#hcp-openaccess:/HCP_1200


Once you make this change and stop and restart your instance, then what is 
mounted at /s3/hcp should be the 1200 subjects release data.

  Tim


On 05/15/2017 10:07 AM, Timothy B. Brown wrote:


Dear Qinqin Li,

First of all, you are correct that in using the latest version of the NITRC-CE 
for HCP, the 900 subjects release is mounted at /s3/hcp. We just recently got 
the data from the 1200 subjects release fully uploaded to the S3 bucket. I am 
working with the NITRC folks to get the AMI modified to mount the 1200 subjects 
release data.

As for using s3fs yourself to mount the HCP_1200 data, it seems to me that you 
are doing the right thing by putting your access key and secret access key in 
the ~/.passwd-s3fs file. I think that the credentials you have that gave you 
access to the HCP_900 data should also give you access to the HCP_1200 data. I 
will be running a test shortly to verify that that is working as I expect. In 
the meantime, you can also do some helpful testing from your end.

Please try installing the AWS command line interface tool (see 
https://aws.amazon.com/cli). Be sure to follow the configuration instructions 
at http://docs.aws.amazon.com/cli/latest/userguide/cli-chap-welcome.html to run 
the aws configure command. This will get your AWS access key id and AWS secret 
access key into a configuration file for the AWS command line tool, similar to 
the way you've placed that information into a file for s3fs.


Then try issuing commands like the following:

$ aws s3 ls s3://hcp-openaccess/HCP_900/

$ aws s3 ls s3://hcp-openaccess/HCP_1200/


If both of these work and give you a long list of subject ID entries that look 
something like:

PRE 100206/
PRE 100307/
PRE 100408/
...


then your credentials are working for both the 900 subjects release and the 
1200 subjects release.

If the HCP_900 listing works, but the HCP_1200 listing does not, then we will 
need to arrange for you to get different credentials.

  Tim


On 05/15/2017 08:48 AM, Irisqql0922 wrote:

Dear HCP teams,


I am sorry to bother you again with the same problem.


I used the default options and mounted the data successfully. But when I checked 
/s3/hcp, I found that it contains only 900 subjects. Obviously, it is not the 
latest 1200-release data. 


Since I want to analyze the latest version of the data, I tried to use s3fs 
directly. I used the commands:
: > ~/.passwd-s3fs
chmod 600 ~/.passwd-s3fs
s3fs hcp-openaccess /s3mnt -o passwd_file=~/.passwd-s3fs


It failed every time. In the syslog file, I found the error below:




I got my credential keys from ConnectomeDB, and I am quite sure I put them 
correctly in ~/.passwd-s3fs.


So I wonder: do my credential keys have access to hcp-openaccess when using 
s3fs to mount the data? If the answer is yes, do you have any suggestions for me? 


(Note: at first I thought the problem might be due to the version of s3fs, so I 
created a new instance based on the Amazon Linux AMI and then downloaded the 
latest version of s3fs. But it still failed because of 'invalid credentials'.)


Thank you very much!


Best,


Qinqin Li






--
Timothy B. Brown
Business & Technology Application Analyst III
Pipeline Developer (Human Connectome Project)
tbbrown(at)wustl.edu


Re: [HCP-Users] Problem regarding mounting 1200-release data to NITRC-CE

2017-05-16 Thread Irisqql0922
Hi Tim,


I forgot to type echo in the mail, but I did type it in the terminal; I am sorry 
I did not put it right in the mail. I also checked the file ~/.passwd-s3fs, and 
it had the expected secret key. 


I used @Timothy B. Brown's suggestion to change the first line in /etc/fstab 
to


s3fs#hcp-openaccess:/HCP_1200


and then stopped and restarted the instance. Then I used the command 


mount /s3/hcp


It worked!!


But I still don't know why I failed when I used s3fs to mount the data myself. 


Best,


Qinqin Li  
On 05/16/2017 02:43, Timothy Coalson <tsc...@mst.edu> wrote:
On Mon, May 15, 2017 at 8:48 AM, Irisqql0922 <irisqql0...@163.com> wrote:

...
I use command:
: > ~/.passwd-s3fs


If this is really the command you used, then it wouldn't work: you need "echo" 
at the start of it, like this:


echo : > ~/.passwd-s3fs


Please look at the file's contents with something like "less ~/.passwd-s3fs" to 
double check that it contains what you expect (but don't post it to the list if 
it contains your secret key, obviously).


Tim





Re: [HCP-Users] Problem regarding mounting 1200-release data to NITRC-CE

2017-05-15 Thread Irisqql0922
Hi Tim,


I am glad to do the test, and I will let you know how it goes :)




Best,
Qinqin Li


On 05/15/2017 23:07, Timothy B. Brown <tbbr...@wustl.edu> wrote:

Dear Qinqin Li,

First of all, you are correct that in using the latest version of the NITRC-CE 
for HCP, the 900 subjects release is mounted at /s3/hcp. We just recently got 
the data from the 1200 subjects release fully uploaded to the S3 bucket. I am 
working with the NITRC folks to get the AMI modified to mount the 1200 subjects 
release data.

As for using s3fs yourself to mount the HCP_1200 data, it seems to me that you 
are doing the right thing by putting your access key and secret access key in 
the ~/.passwd-s3fs file. I think that the credentials you have that gave you 
access to the HCP_900 data should also give you access to the HCP_1200 data. I 
will be running a test shortly to verify that that is working as I expect. In 
the meantime, you can also do some helpful testing from your end.

Please try installing the AWS command line interface tool (see 
https://aws.amazon.com/cli). Be sure to follow the configuration instructions 
at http://docs.aws.amazon.com/cli/latest/userguide/cli-chap-welcome.html to run 
the aws configure command. This will get your AWS access key id and AWS secret 
access key into a configuration file for the AWS command line tool, similar to 
the way you've placed that information into a file for s3fs.


Then try issuing commands like the following:

$ aws s3 ls s3://hcp-openaccess/HCP_900/

$ aws s3 ls s3://hcp-openaccess/HCP_1200/


If both of these work and give you a long list of subject ID entries that look 
something like:

PRE 100206/
PRE 100307/
PRE 100408/
...


then your credentials are working for both the 900 subjects release and the 
1200 subjects release.

If the HCP_900 listing works, but the HCP_1200 listing does not, then we will 
need to arrange for you to get different credentials.
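
If you would rather run this check from Python, a roughly equivalent test with 
boto3 would look something like the following (just a sketch; it assumes boto3 
is installed and picks up the same credentials you configured for the AWS CLI):

import boto3

# List a few subject "folders" under each release prefix to confirm access.
s3 = boto3.client("s3")
for prefix in ("HCP_900/", "HCP_1200/"):
    resp = s3.list_objects_v2(Bucket="hcp-openaccess", Prefix=prefix,
                              Delimiter="/", MaxKeys=10)
    print(prefix, [p["Prefix"] for p in resp.get("CommonPrefixes", [])])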

  Tim


On 05/15/2017 08:48 AM, Irisqql0922 wrote:

Dear HCP teams,


I am sorry to bother you again with the same problem.


I used the default options and mounted the data successfully. But when I checked 
/s3/hcp, I found that it contains only 900 subjects. Obviously, it is not the 
latest 1200-release data. 


Since I want to analyze the latest version of the data, I tried to use s3fs 
directly. I used the commands:
: > ~/.passwd-s3fs
chmod 600 ~/.passwd-s3fs
s3fs hcp-openaccess /s3mnt -o passwd_file=~/.passwd-s3fs


It failed every time. In the syslog file, I found the error below:




I got my credential keys from ConnectomeDB, and I am quite sure I put them 
correctly in ~/.passwd-s3fs.


So I wonder: do my credential keys have access to hcp-openaccess when using 
s3fs to mount the data? If the answer is yes, do you have any suggestions for me? 


(Note: at first I thought the problem might be due to the version of s3fs, so I 
created a new instance based on the Amazon Linux AMI and then downloaded the 
latest version of s3fs. But it still failed because of 'invalid credentials'.)


Thank you very much!


Best,


Qinqin Li






--
Timothy B. Brown
Business & Technology Application Analyst III
Pipeline Developer (Human Connectome Project)
tbbrown(at)wustl.edu



[HCP-Users] Question concerning SNR info

2017-05-01 Thread Irisqql0922
Dear HCP teams,


I notice that you have run the ICA-FIX denoising process for the rfMRI data, but 
it seems there is no such process for the task fMRI. 
Now I am working on the WM data and want to find related SNR data for some 
further analyses. So I wonder whether there is any file containing SNR values 
(vertex-level or voxel-level) in your data? If not, can I run ICA-FIX on the 
task fMRI to get SNR?
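
(To clarify what I mean by vertex-level SNR: even something as simple as a 
temporal SNR computed per grayordinate from a dtseries would be useful. A rough 
sketch of what I have in mind, with a placeholder file name:)

import nibabel as nib

# Crude per-grayordinate temporal SNR (mean over std across time) from a task fMRI dtseries.
ts = nib.load("tfMRI_WM_LR_Atlas.dtseries.nii").get_fdata()  # shape: (timepoints, grayordinates)
tsnr = ts.mean(axis=0) / ts.std(axis=0)
print(tsnr.shape)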


Thanks in advance,


Qinqin Li 


-
School of Psychology
State Key Laboratory of Cognitive Neuroscience and Learning
Beijing Normal University
Beijing, China, 100875.






[HCP-Users] calculate beta and t value of new contrast

2019-01-10 Thread Irisqql0922


Dear Team,
 
I have some trouble with the AWS data: I want to use the beta value and t value 
of a face-tool contrast for the working memory task, but it seems that there is 
no such data on AWS (I understand that there are face-fix and face-avg contrasts 
on AWS, but they are not ideal for my study). 


So I wonder whether it is possible for me to calculate them using the existing 
data on AWS. And if it is possible, how should I calculate the beta value and 
t value of the face-tool contrast? 
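
(To make the question concrete: if per-condition beta (cope) maps for the faces 
and tools conditions were available as dscalar files, I could get the beta of 
the new contrast as a simple difference, roughly as sketched below with 
hypothetical file names. The t value, however, would also need the contrast's 
variance, which is the part I am not sure is available.)

import nibabel as nib

# Hypothetical per-condition beta (cope) maps; the actual HCP file names/paths may differ.
face = nib.load("cope_FACES.dscalar.nii")
tool = nib.load("cope_TOOLS.dscalar.nii")

# The beta of a face - tool contrast is the difference of the condition betas.
diff = face.get_fdata() - tool.get_fdata()

out = nib.cifti2.Cifti2Image(diff, header=face.header, nifti_header=face.nifti_header)
nib.save(out, "cope_FACE-TOOL.dscalar.nii")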


Thank you very much!




Best,
Iris Lee,


Graduate Student
Beijing Normal University
State Key Laboratory of Cognitive Neuroscience and Learning 
Irisqql0922
irisqql0...@163.com