Works like a charm. Thanks a lot for the hint, and of course for all the hard work you do on the elephant-bird project.
Best regards,
Torben

2011/3/14 Dmitriy Ryaboy <[email protected]>

> This is caused by the google-collect jar not being on your classpath:
>
> Caused by: java.lang.ClassNotFoundException: com.google.common.collect.Maps
>
> You are going to want to register the dependencies:
> google-collect-1.0.jar, json-simple-1.1.jar
>
> D
>
>
> On Mon, Mar 14, 2011 at 9:02 AM, Torben Brodt <[email protected]> wrote:
>
>> Hey folks,
>> I am using the pig-0.8 branch from the dvryaboy repository and Pig
>> (0.8.0+5-1~maverick-cdh3b4) from the Cloudera distribution.
>> I just want to run the example "json_word_count.pig".
>> Well, it works; in fact there is no map/reduce job ;) But when I add a
>> DUMP to the example, it no longer works.
>>
>> I tried the following code:
>> # register elephant-bird-2.0-SNAPSHOT.jar;
>> # raw_data = load '/tmp/flume/2011-03-14/1600/' using
>> #     com.twitter.elephantbird.pig8.load.LzoJsonLoader()
>> #     as (
>> #         json: map[]
>> #     );
>> # DUMP raw_data;
>>
>> That's my error:
>> # ERROR 2997: Unable to recreate exception from backed error:
>> # java.lang.RuntimeException: could not instantiate
>> # 'com.twitter.elephantbird.pig8.load.LzoJsonLoader' with arguments 'null'
>>
>> LZO should work; that's what the logs say:
>> # com.hadoop.compression.lzo.LzoCodec - Successfully loaded & initialized
>> # native-lzo library [hadoop-lzo rev fatal: Not a git repository (or any
>> # of the parent directories): .git]
>>
>> I hope you have some ideas. See the full stack trace attached.
>>
>> Thanks a lot,
>> Torben
>>
>>
>> Backend error message
>> ---------------------
>> java.lang.RuntimeException: could not instantiate
>> 'com.twitter.elephantbird.pig8.load.LzoJsonLoader' with arguments 'null'
>>     at org.apache.pig.impl.PigContext.instantiateFuncFromSpec(PigContext.java:502)
>>     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getLoadFunc(PigInputFormat.java:153)
>>     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.createRecordReader(PigInputFormat.java:105)
>>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:613)
>>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:322)
>>     at org.apache.hadoop.mapred.Child$4.run(Child.java:240)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at javax.security.auth.Subject.doAs(Subject.java:396)
>>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1115)
>>     at org.apache.hadoop.mapred.Child.main(Child.java:234)
>> Caused by: java.lang.NoClassDefFoundError: com/google/common/collect/Maps
>>     at com.twitter.elephantbird.pig8.util.PigCounterHelper.<init>(Unknown Source)
>>     at com.twitter.elephantbird.pig8.load.LzoBaseLoadFunc.<init>(Unknown Source)
>>     at com.twitter.elephantbird.pig8.load.LzoJsonLoader.<init>(Unknown Source)
>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>>     at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>>     at java.lang.Class.newInstance0(Class.java:355)
>>     at java.lang.Class.newInstance(Class.java:308)
>>     at org.apache.pig.impl.PigContext.instantiateFuncFromSpec(PigContext.java:472)
>>     ... 9 more
>> Caused by: java.lang.ClassNotFoundException: com.google.common.collect.Maps
>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
>>     ... 19 more
>>
>> Pig Stack Trace
>> ---------------
>> ERROR 2997: Unable to recreate exception from backed error:
>> java.lang.RuntimeException: could not instantiate
>> 'com.twitter.elephantbird.pig8.load.LzoJsonLoader' with arguments 'null'
>>
>> org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to
>> open iterator for alias raw_data. Backend error : Unable to recreate
>> exception from backed error: java.lang.RuntimeException: could not
>> instantiate 'com.twitter.elephantbird.pig8.load.LzoJsonLoader' with
>> arguments 'null'
>>     at org.apache.pig.PigServer.openIterator(PigServer.java:742)
>>     at org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:612)
>>     at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:303)
>>     at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:165)
>>     at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:141)
>>     at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:90)
>>     at org.apache.pig.Main.run(Main.java:406)
>>     at org.apache.pig.Main.main(Main.java:107)
>> Caused by: org.apache.pig.backend.executionengine.ExecException: ERROR 2997:
>> Unable to recreate exception from backed error: java.lang.RuntimeException:
>> could not instantiate 'com.twitter.elephantbird.pig8.load.LzoJsonLoader'
>> with arguments 'null'
>>     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.Launcher.getErrorMessages(Launcher.java:221)
>>     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.Launcher.getStats(Launcher.java:151)
>>     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.launchPig(MapReduceLauncher.java:337)
>>     at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.execute(HExecutionEngine.java:378)
>>     at org.apache.pig.PigServer.executeCompiledLogicalPlan(PigServer.java:1198)
>>     at org.apache.pig.PigServer.storeEx(PigServer.java:874)
>>     at org.apache.pig.PigServer.store(PigServer.java:816)
>>     at org.apache.pig.PigServer.openIterator(PigServer.java:728)
>>     ... 7 more
>>
>> ================================================================================
>>
>
>
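[Editor's note] For anyone hitting the same `NoClassDefFoundError: com/google/common/collect/Maps`, the fix discussed in this thread amounts to registering elephant-bird's dependency jars alongside elephant-bird itself, so they are shipped to the backend map/reduce tasks. A minimal sketch of the corrected script, assuming the jars sit in the directory Pig is launched from and the input path is just the example from the thread:

```pig
-- Register elephant-bird plus the jars it depends on; without these,
-- the loader's constructor fails on the backend with
-- NoClassDefFoundError: com/google/common/collect/Maps.
register elephant-bird-2.0-SNAPSHOT.jar;
register google-collect-1.0.jar;
register json-simple-1.1.jar;

-- Load LZO-compressed JSON records, one Pig map per line.
raw_data = load '/tmp/flume/2011-03-14/1600/' using
    com.twitter.elephantbird.pig8.load.LzoJsonLoader()
    as (json: map[]);

DUMP raw_data;
```

A plain `LOAD` with no `DUMP`/`STORE` never launches a job, which is why the original script "worked" until the `DUMP` was added: the missing class is only needed once backend tasks actually instantiate the loader.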
