Nitin,

Thanks a lot for these answers.

Based on this, I should be able to call the dfs command and the HDFS APIs
from a Java application, is that correct? (I bet this sounds naive.) A
minimal sketch of the API route appears at the end of this mail.

I know about the edit log file. I have to do more investigation on this,
but are you aware of any standard method for doing this auditing? I have
looked at the edit log file and its layout is not very straightforward;
do you know of any documentation I could look at?

Thanks a lot.

On 7 October 2013 11:13, Nitin Pawar <nitinpawar...@gmail.com> wrote:

> Answers as per my understanding, and I may be wrong, so wait for others
> to correct me as well.
>
> 1. What files in HDFS constitute a Hive table.
> Unless you specifically run an ALTER TABLE command to map the table to a
> single file, all the files inside the directory where the table is
> created are mapped to the table.
>
> 2. What is the size of each of these files.
> This you can easily get from the hadoop dfs command.
>
> 3. The timestamp of the creation/last update of each of these files.
> Tricky; Hive does not record when a file was created or updated. What
> you may want to do is audit your edit log to see all the changes
> happening to files. The last modification time of all the files is
> available to you via the HDFS APIs.
>
> Except for anything specific to Hive, whatever you mentioned can be done
> via the HDFS APIs as well as the hadoop command-line tool.
>
>
> On Mon, Oct 7, 2013 at 11:31 PM, demian rosas <demia...@gmail.com> wrote:
>
>> Hi all,
>>
>> I want to track the changes made to the files of a Hive table.
>>
>> I wonder whether there is any API I can use to find out the
>> following:
>>
>> 1. What files in HDFS constitute a Hive table.
>> 2. What is the size of each of these files.
>> 3. The timestamp of the creation/last update of each of these files.
>>
>> Also, in a wider view, is there any API that can do the above for
>> HDFS files in general (not only Hive-specific)?
>>
>> Thanks a lot in advance.
>>
>> Cheers.
>>
>
>
> --
> Nitin Pawar
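
P.S. For anyone following the thread, here is a minimal, untested sketch of
the HDFS API route for questions 2 and 3 (file sizes and last modification
times). The table directory below is a hypothetical example; the real
location of a table is reported by DESCRIBE FORMATTED <table> in Hive.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class TableFileAudit {
        public static void main(String[] args) throws Exception {
            // Reads core-site.xml/hdfs-site.xml from the classpath.
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);

            // Hypothetical table directory; substitute the location
            // shown by DESCRIBE FORMATTED for your table.
            Path tableDir = new Path("/user/hive/warehouse/mytable");

            // One FileStatus per file: path, length in bytes, and
            // last modification time in milliseconds since the epoch.
            for (FileStatus f : fs.listStatus(tableDir)) {
                System.out.printf("%s size=%d modified=%tc%n",
                        f.getPath(), f.getLen(), f.getModificationTime());
            }
        }
    }

On the edit log itself: its binary layout changes between Hadoop versions,
so rather than parsing it by hand it may be worth checking whether your
Hadoop version ships the Offline Edits Viewer (hdfs oev), which can dump
the edit log to a readable XML form.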