Dear HCP team,

I'm sorry to bother you again with the same problem.


I used the default options and mounted the data successfully. But when I checked 
/s3/hcp, I found that it contains only the 900-subject release, so it is clearly 
not the latest 1200-subject release.




Since I want to analyse the latest version of the data, I tried to mount it with 
s3fs instead.
I used these commands:
echo <ACCESS_KEY_ID>:<SECRET_ACCESS_KEY> > ~/.passwd-s3fs
chmod 600 ~/.passwd-s3fs
s3fs hcp-openaccess /s3mnt -o passwd_file=~/.passwd-s3fs
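
(If it helps with debugging, I can also re-run the mount in the foreground with 
debug logging, roughly as below, based on the s3fs man page; the dbglevel option 
may not exist in older s3fs versions.)

# run in the foreground (-f) with informational debug and curl output
s3fs hcp-openaccess /s3mnt -o passwd_file=~/.passwd-s3fs -f -o dbglevel=info -o curldbg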


It failed every time. In the syslog, I found the error below:




I got my credential keys from ConnectomeDB, and I'm quite sure I put them into 
~/.passwd-s3fs correctly.


So I wonder: do my credential keys have access to hcp-openaccess when mounting the 
data with s3fs? If so, do you have any suggestions for me? 
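
(To rule out a problem with the keys themselves, I could also test them directly 
with the AWS CLI instead of going through s3fs, roughly as below; the profile name 
"hcp" is just a label I chose.)

# store the ConnectomeDB access key and secret key under a named profile
aws configure --profile hcp
# try listing the bucket with that profile
aws s3 ls s3://hcp-openaccess/ --profile hcp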


(Note: at first I thought the problem might be due to the version of s3fs, so I 
created a new instance from an Amazon Linux AMI and downloaded the latest version 
of s3fs. It still failed with an 'invalid credentials' error. See the steps below.)
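
(For reference, this is roughly how I installed the latest s3fs on the Amazon 
Linux instance, following the s3fs-fuse README on GitHub; the exact package names 
may differ by AMI version.)

# build dependencies on Amazon Linux
sudo yum install -y automake fuse fuse-devel gcc-c++ git libcurl-devel libxml2-devel make openssl-devel
# build and install s3fs-fuse from source
git clone https://github.com/s3fs-fuse/s3fs-fuse.git
cd s3fs-fuse
./autogen.sh
./configure
make
sudo make install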


Thank you very much!


Best,


Qinqin Li


_______________________________________________
HCP-Users mailing list
[email protected]
http://lists.humanconnectome.org/mailman/listinfo/hcp-users
