Hi Ravi,

I got the code working here:
https://github.com/StreamBright/orcdemo/blob/master/src/main/java/org/streambright/orcdemo/App.java

It seems OrcFile.createWriter takes a path on the local filesystem, so there is no need for FileSystem.getLocal.

Regards,
Istvan

On Mon, Jan 11, 2016 at 10:46 AM, Ravi Tatapudi <[email protected]> wrote:
> Yes. I think including the example code below in the ORC documentation
> would be useful (for test purposes, etc.).
>
> Regards,
> Ravi
>
>
> From: Lefty Leverenz <[email protected]>
> To: [email protected]
> Date: 01/11/2016 03:08 PM
> Subject: Re: Writing ORC files without HDFS
>
> Should this be included in the ORC documentation?
>
> -- Lefty
>
> On Fri, Jan 8, 2016 at 2:33 PM, István <[email protected]> wrote:
> Hi Ravi,
>
> Excellent response, thank you very much, this is exactly what I was
> looking for!
>
> Best regards,
> Istvan
>
> On Fri, Jan 8, 2016 at 8:38 AM, Ravi Tatapudi <[email protected]> wrote:
> Hello,
>
> You can write ORC files on the local filesystem by getting the local
> "FileSystem" object with FileSystem.getLocal(conf). Please find the
> simple example below and see if it works for your requirement.
>
> =================================================
> import java.io.IOException;
>
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.fs.FileSystem;
> import org.apache.hadoop.fs.Path;
> import org.apache.hadoop.hive.ql.io.orc.CompressionKind;
> import org.apache.hadoop.hive.ql.io.orc.OrcFile;
> import org.apache.hadoop.hive.ql.io.orc.Writer;
> import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
> import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorFactory;
>
> public class orcw {
>
>     private static Configuration conf = new Configuration();
>     public static Writer writer;
>
>     // Plain Java class describing one row; the ObjectInspector below is
>     // derived from it by reflection.
>     public static class OrcRow {
>         int col1;
>         String col2;
>         String col3;
>
>         OrcRow(int a, String b, String c) {
>             this.col1 = a;
>             this.col2 = b;
>             this.col3 = c;
>         }
>     }
>
>     public static void main(String[] args) throws IOException {
>
>         String path = "/tmp/orcfile1";
>
>         try {
>             conf = new Configuration();
>             // Handle to the local filesystem (no HDFS involved).
>             FileSystem fs = FileSystem.getLocal(conf);
>
>             ObjectInspector inspector =
>                 ObjectInspectorFactory.getReflectionObjectInspector(
>                     OrcRow.class,
>                     ObjectInspectorFactory.ObjectInspectorOptions.JAVA);
>
>             writer = OrcFile.createWriter(new Path(path),
>                 OrcFile.writerOptions(conf)
>                     .inspector(inspector)
>                     .stripeSize(100000)
>                     .bufferSize(10000)
>                     .compress(CompressionKind.ZLIB)
>                     .version(OrcFile.Version.V_0_12));
>
>             writer.addRow(new OrcRow(1, "hello", "orcFile"));
>             writer.addRow(new OrcRow(2, "hello2", "orcFile2"));
>
>             writer.close();
>         } catch (Exception e) {
>             e.printStackTrace();
>         }
>     }
> }
> =================================================
>
> Thanks,
> Ravi
>
>
> From: István <[email protected]>
> To: [email protected]
> Date: 01/08/2016 01:53 AM
> Subject: Writing ORC files without HDFS
>
> Hi all,
>
> I am working on a project that requires me to write ORC files locally,
> in a non-HDFS location. I was wondering if there is any project doing
> something similar, but after spending some time on Google I guess there
> is none.
>
> I think what needs to be done is to re-implement the ORC Writer, similar
> to the following but leaving out Hadoop:
>
> https://github.com/apache/hive/blob/master/orc/src/java/org/apache/orc/impl/WriterImpl.java
>
> Am I on the right track implementing this?
>
> Let me know if you have any suggestions or links on the subject.
>
> Thank you very much,
> Istvan
>
> --
> the sun shines for all
>
> --
> the sun shines for all

--
the sun shines for all
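
For reference, here is a minimal sketch of the approach Istvan describes at the top of the thread: passing a plain local path to OrcFile.createWriter, with no FileSystem.getLocal. It is not the contents of the linked App.java; it assumes the split-out org.apache.orc writer API (orc-core plus hadoop-common on the classpath), and the class name LocalOrcWrite and the path /tmp/orcfile2 are made up for illustration.

=================================================
import java.nio.charset.StandardCharsets;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hive.ql.exec.vector.BytesColumnVector;
import org.apache.hadoop.hive.ql.exec.vector.LongColumnVector;
import org.apache.hadoop.hive.ql.exec.vector.VectorizedRowBatch;
import org.apache.orc.CompressionKind;
import org.apache.orc.OrcFile;
import org.apache.orc.TypeDescription;
import org.apache.orc.Writer;

public class LocalOrcWrite {

    public static void main(String[] args) throws Exception {
        // Same three columns as the OrcRow example earlier in the thread.
        TypeDescription schema = TypeDescription.fromString(
            "struct<col1:int,col2:string,col3:string>");

        Configuration conf = new Configuration();

        // A plain path; with an empty Configuration the default filesystem
        // is file:///, so this writes to the local disk. A file:///tmp/...
        // URI would pin the local filesystem explicitly.
        Writer writer = OrcFile.createWriter(new Path("/tmp/orcfile2"),
            OrcFile.writerOptions(conf)
                .setSchema(schema)
                .compress(CompressionKind.ZLIB));

        // Rows are written through a VectorizedRowBatch rather than addRow().
        VectorizedRowBatch batch = schema.createRowBatch();
        LongColumnVector col1 = (LongColumnVector) batch.cols[0];
        BytesColumnVector col2 = (BytesColumnVector) batch.cols[1];
        BytesColumnVector col3 = (BytesColumnVector) batch.cols[2];

        int row = batch.size++;
        col1.vector[row] = 1;
        col2.setVal(row, "hello".getBytes(StandardCharsets.UTF_8));
        col3.setVal(row, "orcFile".getBytes(StandardCharsets.UTF_8));

        writer.addRowBatch(batch);
        writer.close();
    }
}
=================================================

The column-batch API is the interface the standalone ORC project exposes in place of the older Hive Writer.addRow/ObjectInspector style shown in Ravi's example; both end up producing an ordinary ORC file at the given path.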
