[ https://issues.apache.org/jira/browse/WHIRR-74?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12900082#action_12900082 ]
Tom White commented on WHIRR-74:
--------------------------------
AWS keys are not passed by default, but you can pass them using the technique
described at
https://issues.apache.org/jira/browse/HADOOP-6681?focusedCommentId=12854098&page=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#action_12854098.
Does this solve the issue?
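For illustration only, a minimal client-side sketch of forwarding the keys from
the invoking shell's environment into the env pairs dictionary used to build the
instance user-data. The function name below is hypothetical; the actual technique
is the one described in the linked HADOOP-6681 comment:

    # Hypothetical sketch: read the AWS credentials from the client's own
    # environment and forward only the ones that are actually set.
    import os

    def client_env_pairs():
        pairs = {}
        for key in ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY"):
            value = os.environ.get(key)
            if value:
                pairs[key] = value
        return pairs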
> AWS Keys Not being Propagated to Instances via Userdata
> -------------------------------------------------------
>
> Key: WHIRR-74
> URL: https://issues.apache.org/jira/browse/WHIRR-74
> Project: Whirr
> Issue Type: Bug
> Components: contrib/python
> Environment: Ubuntu 10.04 and Mac OSX
> with ami-ed59bf84 for instances
> Reporter: Sameer Al-Sakran
>
> The %ENV% string in
> contrib/python/src/py/hadoop/cloud/data/hadoop-ec2-init-remote.sh is meant to
> export various environment variables.
> These variables exist in the invoking client shell process, but don't exist on
> the launched instance. In this specific case, the AWS keys for S3/S3n access
> were being set to empty strings. Manually adding the AWS ID/SECRET to the pairs
> dictionary passed to build_env_string in py.hadoop.cloud.service:209 fixes
> this problem, but introduces AWS-specific information to the Service class.
> Should the %ENV% string do an assignment + export instead of just an export
> here? (See the sketch below the quoted description.)
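A minimal sketch of the substitution in question, assuming build_env_string
renders shell export statements and %ENV% is a literal placeholder in
hadoop-ec2-init-remote.sh. The names and formats are assumptions for
illustration, not Whirr's actual code:

    def build_env_string(pairs):
        # "export NAME=value" both assigns and exports each variable, so it
        # is defined on the instance even though it was only ever set in the
        # client's shell.
        return "\n".join("export %s=%s" % (k, v) for k, v in sorted(pairs.items()))

    def render_init_script(template, pairs):
        # Replace the %ENV% placeholder in the init script with the generated
        # export statements before the script is sent as instance user-data.
        return template.replace("%ENV%", build_env_string(pairs))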