The hadoop fs -du command will show you the size of the files. What do you mean by
physical?
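
For example, assuming the default warehouse path (adjust for your setup), something like:

    hadoop fs -du /user/hive/warehouse
    hadoop fs -dus /user/hive/warehouse

The first lists the size in bytes of each table's directory, the second gives the total.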

Sent from my iPhone

On Jul 26, 2010, at 6:43 AM, "vaibhav negi" <[email protected]> wrote:

> Hi,
> 
> The hadoop dfs command shows the logical path /user/hive/warehouse. How can I see 
> where this directory exists physically?
> 
>  
> 
> Vaibhav Negi
> 
> 
> On Mon, Jul 26, 2010 at 2:45 PM, Amogh Vasekar <[email protected]> wrote:
> Hi,
> The default HWI (hive web interface) provides some basic metadata, but I don't 
> think file sizes are included. In any case, you can query using the common 
> hadoop dfs commands. The default warehouse directory is set in your hive 
> conf xml.
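> 
> For example (a sketch, assuming the default hive.metastore.warehouse.dir of 
> /user/hive/warehouse; check your hive-site.xml for the actual value):
> 
>     # one subdirectory per table under the warehouse
>     hadoop dfs -ls /user/hive/warehouse
>     # disk space used by a particular table's files (TABLENAME is a placeholder)
>     hadoop dfs -du /user/hive/warehouse/TABLENAME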
> 
> Amogh
> 
> 
> 
> On 7/26/10 2:30 PM, "vaibhav negi" <[email protected]> wrote:
> 
> Hi,
> 
> Thanks amogh.
> How can I browse the actual physical location of hive tables, just like I can 
> see mysql tables in the mysql data directory? I want to check the actual disk 
> space consumed by hive tables.
> 
> 
> 
> Vaibhav Negi
> 
> 
> On Mon, Jul 26, 2010 at 1:55 PM, Amogh Vasekar <[email protected]> wrote:
> Hi,
> You can create an external table pointing to data already on hdfs and 
> specify the delimiters:
> CREATE EXTERNAL TABLE page_view_stg(viewTime INT, userid BIGINT,
>                     page_url STRING, referrer_url STRING,
>                     ip STRING COMMENT 'IP Address of the User',
>                     country STRING COMMENT 'country of origination')
>     COMMENT 'This is the staging page view table'
>     ROW FORMAT DELIMITED FIELDS TERMINATED BY '44' LINES TERMINATED BY '12'
>     STORED AS TEXTFILE
>     LOCATION '/user/data/staging/page_view';
> 
> See http://wiki.apache.org/hadoop/Hive/Tutorial#Creating_Tables for more.
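> 
> For example, to stage a local csv at that location first (paths below are just 
> placeholders):
> 
>     hadoop fs -mkdir /user/data/staging/page_view
>     hadoop fs -put /local/path/page_view.csv /user/data/staging/page_view/
> 
> For a managed table, LOAD DATA LOCAL INPATH is another option.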
> 
> HTH,
> Amogh
> 
> 
> 
> On 7/26/10 1:02 PM, "vaibhav negi" <[email protected]> wrote:
> 
> Hi,
> 
> Is there some way to load a csv file into hive?
> 
> Vaibhav Negi
> 
> 
> 
> 
