Thanks all. *Sriram*, can you explain the shell script that updates the metastore directly?
The data volume here is large and there are many tables, so moving the data and re-creating each table in a different database is not feasible; some tables also have partitions.

Regards,
Vikas Srivastava

On Tue, Jan 31, 2012 at 7:22 AM, Matt Tucker <matthewt...@gmail.com> wrote:

> Someone recently posted a command to convert a managed table to an
> external table. If you also use that command, there's no expensive copy
> operation.
>
> You'd probably want to move the data into a different HDFS directory
> though, to keep your namespace as consistent as possible.
>
> Matt
>
> On Jan 30, 2012, at 8:48 PM, Sriram Krishnan <skrish...@netflix.com> wrote:
>
> You could do that if you don't mind moving the data (which may be
> expensive depending on how much data you have). Also, if your table is
> partitioned, you may have to do an additional step (to recover the
> partitions).
>
> Sriram
>
> From: Aniket Mokashi <aniket...@gmail.com>
> Reply-To: <user@hive.apache.org>
> Date: Mon, 30 Jan 2012 16:32:03 -0800
> To: <user@hive.apache.org>
> Subject: Re: move tables into different database
>
> If you are on HDFS, how about:
>
> use db1;
> create table table1 like db2.table1;
>
> and then move the data?
>
> Thanks,
> Aniket
>
> On Mon, Jan 30, 2012 at 8:09 AM, Sriram Krishnan <skrish...@netflix.com> wrote:
>
>> AFAIK there is no way in HiveQL to do this. We had a similar
>> requirement in the past, and we wrote a shell script to update the MySQL
>> metastore directly (obviously not the cleanest or recommended way to go).
>>
>> Cheers,
>> Sriram
>>
>> From: hadoop hive <hadooph...@gmail.com>
>> Reply-To: <user@hive.apache.org>
>> Date: Mon, 30 Jan 2012 20:05:13 +0530
>> To: <user@hive.apache.org>
>> Subject: move tables into different database
>>
>> Hey folks,
>>
>> Is it possible to move tables into a different Hive database?
>>
>> Regards,
>> Vikas Srivastava
>
> --
> "...:::Aniket:::... Quetzalco@tl"
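
For reference, the managed-to-external conversion Matt mentions is presumably the TBLPROPERTIES toggle below (the database and table names are placeholders; on many Hive versions the property value must be the upper-case string 'TRUE'):

    -- Mark the table external so Hive no longer owns (and deletes)
    -- its data files when the table is dropped.
    USE db2;
    ALTER TABLE table1 SET TBLPROPERTIES ('EXTERNAL'='TRUE');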
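
Aniket's approach, sketched end to end from the Hive CLI; the warehouse paths are illustrative defaults, so adjust them to the actual table locations:

    -- Create an empty table in db1 with the same schema as db2.table1.
    USE db1;
    CREATE TABLE table1 LIKE db2.table1;

    -- Move the data files under the new table's directory.
    dfs -mv /user/hive/warehouse/db2.db/table1/* /user/hive/warehouse/db1.db/table1/;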
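
For the additional step Sriram mentions on partitioned tables, the partition metadata has to be rebuilt after the files are moved, for example:

    -- Re-register partitions by scanning the table's directory layout.
    MSCK REPAIR TABLE table1;

On Amazon EMR's Hive the equivalent is ALTER TABLE table1 RECOVER PARTITIONS; alternatively, each partition can be re-added explicitly with ALTER TABLE ... ADD PARTITION.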
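
As for the shell script itself: it hasn't been posted in this thread, but a script like that would boil down to SQL against the metastore database, roughly as below. This is a hypothetical sketch, not Sriram's actual script; the table and column names (DBS, TBLS, DB_ID, TBL_NAME, NAME) match the stock MySQL metastore schema but vary across Hive versions, so back up the metastore and verify the schema before trying anything like this.

    -- HYPOTHETICAL sketch of re-parenting a table from db2 to db1 by
    -- editing the MySQL metastore directly (not Sriram's actual script).
    UPDATE TBLS
    SET DB_ID = (SELECT DB_ID FROM DBS WHERE NAME = 'db1')
    WHERE TBL_NAME = 'table1'
      AND DB_ID = (SELECT DB_ID FROM DBS WHERE NAME = 'db2');

Note that this leaves the table's storage LOCATION pointing at the old HDFS path, which still resolves but no longer matches the db1 warehouse layout; the partition metadata comes along unchanged, which is why this avoids the copy-and-recover steps above.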