Silly question... then what is meant by the "native libraries" when you talk about compression?
On Jun 3, 2013, at 5:27 AM, Harsh J <[email protected]> wrote:

> Hi Xu,
>
> HDFS is data agnostic. It does not currently care about what form the
> data in the files is in - whether it is compressed, encrypted,
> serialized in format-x, etc.
>
> There are hadoop-common APIs that support decompression with the
> supported codecs, but there are no C/C++ level implementations of
> these (though you may use JNI). You will have to write or use your
> own decompress/compress code for files.
>
> On Mon, Jun 3, 2013 at 12:33 PM, Xu Haiti <[email protected]> wrote:
>>
>> I found that, around 2010, somebody said libhdfs does not support
>> reading or writing gzip files.
>>
>> I downloaded the newest hadoop-2.0.4 and read hdfs.h. There are
>> still no compression arguments.
>>
>> Now I am wondering whether it supports reading compressed files.
>>
>> If it does not, how can I make a patch for libhdfs so that it works?
>>
>> Thanks in advance.
>>
>> Best Regards
>> Haiti
>
>
> --
> Harsh J
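
For anyone finding this thread later: since libhdfs only hands back raw bytes, one way to follow Harsh's suggestion is to read the file with hdfsRead() and run the bytes through zlib yourself. Below is a rough, untested sketch of that idea; the namenode ("default"/0), the path /tmp/sample.gz, and the buffer sizes are placeholders, not anything defined by libhdfs.

/* Sketch: read a gzip file from HDFS via libhdfs and inflate it with zlib.
 * Compile along the lines of: gcc gzread.c -lhdfs -lz (plus JVM/libhdfs
 * include and library paths for your installation). */
#include <stdio.h>
#include <fcntl.h>
#include <zlib.h>
#include "hdfs.h"

int main(void) {
    /* "default"/0 picks up fs.defaultFS from the loaded Hadoop config;
     * replace with your namenode host and port if needed. */
    hdfsFS fs = hdfsConnect("default", 0);
    if (!fs) { fprintf(stderr, "hdfsConnect failed\n"); return 1; }

    hdfsFile in = hdfsOpenFile(fs, "/tmp/sample.gz", O_RDONLY, 0, 0, 0);
    if (!in) { fprintf(stderr, "hdfsOpenFile failed\n"); return 1; }

    z_stream strm = {0};
    /* 16 + MAX_WBITS tells zlib to expect a gzip header. */
    if (inflateInit2(&strm, 16 + MAX_WBITS) != Z_OK) return 1;

    unsigned char inbuf[65536], outbuf[65536];
    tSize n;
    int zret = Z_OK;
    /* Read compressed chunks from HDFS and inflate them to stdout. */
    while (zret != Z_STREAM_END &&
           (n = hdfsRead(fs, in, inbuf, sizeof(inbuf))) > 0) {
        strm.next_in = inbuf;
        strm.avail_in = (uInt) n;
        do {
            strm.next_out = outbuf;
            strm.avail_out = sizeof(outbuf);
            zret = inflate(&strm, Z_NO_FLUSH);
            if (zret != Z_OK && zret != Z_STREAM_END) {
                fprintf(stderr, "inflate error: %d\n", zret);
                return 1;
            }
            fwrite(outbuf, 1, sizeof(outbuf) - strm.avail_out, stdout);
        } while (strm.avail_out == 0);
    }

    inflateEnd(&strm);
    hdfsCloseFile(fs, in);
    hdfsDisconnect(fs);
    return 0;
}

The same pattern works in the other direction (deflate locally, then hdfsWrite() the compressed bytes), which is what "write/use your own decompress/compress code" amounts to in practice.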
