Not sure about the error; I've never tried sourcing a Hive table directly from S3
data. We use EBS with EC2 for better data locality.
See if this trick works for you: use distcp to copy the data from the s3n
location into HDFS, and then create the table pointing at the copied location.
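Roughly something like this (untested on my side; the HDFS path below is just a
placeholder, and you'll need your S3 credentials set via fs.s3n.awsAccessKeyId /
fs.s3n.awsSecretAccessKey in your config or on the URL):

  # copy the data from s3n into the cluster's HDFS
  hadoop distcp s3n://data.s3ndemo.hive/kv hdfs:///user/hive/kv

  hive> create external table kv (key int, values string)
      > location '/user/hive/kv';

Since the table location is now on the default (hdfs) filesystem, you shouldn't
hit the "Wrong FS" check anymore.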

On Fri, Aug 14, 2009 at 12:07 PM, sumit khanna <[email protected]> wrote:

> Hi,
> I launched a cluster using the hadoop-ec2 scripts and then sshed into the
> master node.
> After logging into Hive, I tried creating a table:
> hive> create external table kv (key int, values string)  location
> 's3n://data.s3ndemo.hive/kv';
> FAILED: Error in metadata: java.lang.IllegalArgumentException: Wrong FS:
> s3n://data.s3ndemo.hive/kv, expected: hdfs://
> ec2-174-129-108-91.compute-1.amazonaws.com:50001
> FAILED: Execution Error, return code 1 from
> org.apache.hadoop.hive.ql.exec.DDLTask
> Time taken: 0.606 seconds
>
>
> Why do I get this error? Any help would be appreciated.
> Regards
> Sumit
>
