Hi Tim,

I am glad to do the test, and I will let you know how it goes :)




Best,
Qinqin Li


On 05/15/2017 23:07, Timothy B. Brown <tbbr...@wustl.edu> wrote:

Dear Qinqin Li,

First of all, you are correct that in using the latest version of the NITRC-CE 
for HCP, the 900 subjects release is mounted at /s3/hcp. We just recently got 
the data from the 1200 subjects release fully uploaded to the S3 bucket. I am 
working with the NITRC folks to get the AMI modified to mount the 1200 subjects 
release data.

As for using s3fs yourself to mount the HCP_1200 data, it seems to me that you 
are doing the right thing by putting your access key and secret access key in 
the ~/.passwd-s3fs file. I think that the credentials you have that gave you 
access to the HCP_900 data should also give you access to the HCP_1200 data. I 
will be running a test shortly to verify that that is working as I expect. In 
the meantime, you can also do some helpful testing from your end.
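For reference, here is a minimal sketch of the s3fs setup I have in mind, using the bucket name and mount point from your message; the angle-bracket values are placeholders for your own ConnectomeDB keys:

$ echo "<ACCESS_KEY_ID>:<SECRET_ACCESS_KEY>" > ~/.passwd-s3fs
$ chmod 600 ~/.passwd-s3fs
$ mkdir -p /s3mnt
$ s3fs hcp-openaccess /s3mnt -o passwd_file=~/.passwd-s3fs
$ ls /s3mnt/HCP_1200 | head

The chmod step matters because s3fs will typically refuse to use a credentials file that is readable by other users.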

Please try installing the AWS command line interface tool (see 
https://aws.amazon.com/cli). Be sure to follow the configuration instructions 
at http://docs.aws.amazon.com/cli/latest/userguide/cli-chap-welcome.html to run 
the aws configure command. This will get your AWS access key ID and AWS secret 
access key into a configuration file for the AWS command line tool, similar to 
the way you've placed that information into a file for s3fs.
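
For example, the configuration step looks roughly like this (the key values are 
placeholders, and the region and output format answers are just reasonable 
defaults):

$ aws configure
AWS Access Key ID [None]: <YOUR_ACCESS_KEY_ID>
AWS Secret Access Key [None]: <YOUR_SECRET_ACCESS_KEY>
Default region name [None]: us-east-1
Default output format [None]: json

This writes the keys into ~/.aws/credentials, which the aws command then reads 
automatically.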


Then try issuing commands like the following:

$ aws s3 ls s3://hcp-openaccess/HCP_900/

$ aws s3 ls s3://hcp-openaccess/HCP_1200/


If both of these work and give you a long list of subject ID entries that look 
something like:

                    PRE 100206/
                    PRE 100307/
                    PRE 100408/
                    ...


then your credentials are working for both the 900 subjects release and the 
1200 subjects release.

If the HCP_900 listing works, but the HCP_1200 listing does not, then we will 
need to arrange for you to get different credentials.

  Tim


On 05/15/2017 08:48 AM, Irisqql0922 wrote:

Dear hcp teams,


I am sorry to bother you again with the same problem.


I used the default options and mounted the data successfully. But when I checked 
/s3/hcp, I found that it contains only 900 subjects. Obviously, it is not the 
latest 1200 subjects release data.




Since I want to analyse the latest version of the data, I used s3fs to achieve my 
goal.
I used these commands:
echo <ACCESS KEY ID>:<SECRET ACCESS KEY> > ~/.passwd-s3fs
chmod 600 ~/.passwd-s3fs
s3fs hcp-openaccess /s3mnt -o passwd_file=~/.passwd-s3fs


It failed every time. In the syslog file, I found the error below:




I got my credential keys from ConnectomeDB, and I am quite sure that I put them 
correctly in ~/.passwd-s3fs.


So I wonder, do my credential keys have access to hcp-openaccess when using 
s3fs to mount the data? If the answer is yes, do you have any suggestions for me?


(Note: At first, I thought the problem might be due to the version of s3fs, so I 
created a new instance based on the Amazon Linux AMI and then downloaded the 
latest version of s3fs. But it still failed because of 'invalid credentials'.)


Thank you very much!


Best,


Qinqin Li






--
Timothy B. Brown
Business & Technology Application Analyst III
Pipeline Developer (Human Connectome Project)
tbbrown(at)wustl.edu

The material in this message is private and may contain Protected Healthcare 
Information (PHI). If you are not the intended recipient, be advised that any 
unauthorized use, disclosure, copying or the taking of any action in reliance 
on the contents of this information is strictly prohibited. If you have 
received this email in error, please immediately notify the sender via 
telephone or return mail.
_______________________________________________
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users
