Sorry, my bad. I had just assigned "AWS..=" without exporting; once I
exported those variables, they work fine now.
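For anyone who hits the same AttributeError: a variable that is only
assigned stays local to the current shell and is not inherited by child
processes such as ./hadoop-ec2 (and the Python/boto process it starts);
export is what places it in the environment passed to children. A minimal
sketch of the difference, using a dummy name MY_SECRET in place of the
real AWS keys:

```shell
# Plain assignment: the variable is shell-local, so a child process
# does not see it. MY_SECRET is a dummy stand-in for
# AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY.
MY_SECRET=XXXX
sh -c 'echo "child sees: [$MY_SECRET]"'    # prints: child sees: []

# export adds it to the environment inherited by child processes.
export MY_SECRET
sh -c 'echo "child sees: [$MY_SECRET]"'    # prints: child sees: [XXXX]
```

The same applies when launching the script directly: `export VAR=value`
(or `VAR=value ./hadoop-ec2 list` for a one-off) makes the value visible
to the subprocess, while a bare assignment does not.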

-Prasen

On Tue, Feb 16, 2010 at 9:58 AM, prasenjit mukherjee
<[email protected]> wrote:
> I have the following problem: it doesn't seem to be picking up the
> env var, although echo $AWS_SECRET_ACCESS_KEY prints it correctly. Any
> pointers on how I can debug this?
>
> pmukher...@ubuntu:~/apps/cloudera-for-hadoop-on-ec2-py-0.3.0-beta$
> ./hadoop-ec2 list
> Traceback (most recent call last):
>  File "./hadoop-ec2", line 151, in <module>
>    list_all()
>  File 
> "/home/pmukherjee/apps/cloudera-for-hadoop-on-ec2-py-0.3.0-beta/hadoop/ec2/commands.py",
> line 46, in list_all
>    clusters = get_clusters_with_role(MASTER)
>  File 
> "/home/pmukherjee/apps/cloudera-for-hadoop-on-ec2-py-0.3.0-beta/hadoop/ec2/cluster.py",
> line 29, in get_clusters_with_role
>    all = EC2Connection().get_all_instances()
>  File "/usr/lib/python2.5/site-packages/boto/ec2/connection.py", line
> 69, in __init__
>    self.region.endpoint, debug, https_connection_factory, path)
>  File "/usr/lib/python2.5/site-packages/boto/connection.py", line
> 446, in __init__
>    debug,  https_connection_factory, path)
>  File "/usr/lib/python2.5/site-packages/boto/connection.py", line
> 169, in __init__
>    self.hmac = hmac.new(self.aws_secret_access_key, digestmod=sha)
> AttributeError: EC2Connection instance has no attribute 
> 'aws_secret_access_key'
>
> pmukher...@ubuntu:~/apps/cloudera-for-hadoop-on-ec2-py-0.3.0-beta$
> echo $AWS_SECRET_ACCESS_KEY
> XXXXXXXXXXXXXXXXXXXXXX
>
>
>
> On Tue, Feb 16, 2010 at 8:29 AM, Zaki Rahaman <[email protected]> wrote:
>> I've been using the latest scripts with no problems at all. Plus, if you use
>> CDH2 you get Hive installed on your cluster (alternatively, you could use the
>> package manager to download and install Hive).
>>
>> Sent from my iPhone
>>
>> On Feb 15, 2010, at 9:43 PM, prasenjit mukherjee <[email protected]>
>> wrote:
>>
>>> I am actually using the hadoop-ec2 script (the earlier version from
>>> Cloudera, I believe) which comes along with Hadoop (under
>>> src/contrib/ec2/... ), and it works fine for me; I am a bit reluctant
>>> to move over to Cloudera's latest version. Hence I was trying to figure
>>> out a way to use Hive (with minimal changes to my existing setup).
>>>
>>> -Prasen
>>>
>>> On Tue, Feb 16, 2010 at 12:59 AM, Carl Steinbach <[email protected]>
>>> wrote:
>>>>
>>>> Hi Prasenjit,
>>>>
>>>> Can you be more specific about which additional startup scripts you are
>>>> trying to avoid? Are you talking about Cloudera's version of the
>>>> hadoop-ec2 script?
>>>>
>>>> Thanks.
>>>>
>>>> Carl
>>>>
>>>>
>>
>