[ https://issues.apache.org/jira/browse/GIRAPH-659?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13650379#comment-13650379 ]

Nitay Joffe commented on GIRAPH-659:
------------------------------------

Maja, after GIRAPH-655 goes in, the remaining failures here are because 
WrappedVertexOutputFormat's committer does not override commitJob() for some 
profiles, e.g. the hadoop profile. Because of this, the default version gets 
called, which just calls cleanupJob(). HiveIO's commitJob() never gets called, 
so the data doesn't get registered with Hive.
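
For reference, here is a minimal sketch of what the fix could look like: a 
wrapping committer that explicitly forwards commitJob() (and the other 
lifecycle calls) to the wrapped committer instead of inheriting 
OutputCommitter's default, which only invokes cleanupJob(). Class and field 
names below are illustrative, not the actual WrappedVertexOutputFormat code.

import java.io.IOException;

import org.apache.hadoop.mapreduce.JobContext;
import org.apache.hadoop.mapreduce.JobStatus;
import org.apache.hadoop.mapreduce.OutputCommitter;
import org.apache.hadoop.mapreduce.TaskAttemptContext;

// Hypothetical delegating committer; the real Giraph code will differ.
public class DelegatingOutputCommitter extends OutputCommitter {
  // The underlying committer, e.g. HiveIO's, which registers output with Hive.
  private final OutputCommitter wrapped;

  public DelegatingOutputCommitter(OutputCommitter wrapped) {
    this.wrapped = wrapped;
  }

  @Override
  public void setupJob(JobContext context) throws IOException {
    wrapped.setupJob(context);
  }

  @Override
  public void commitJob(JobContext context) throws IOException {
    // Without this override, OutputCommitter.commitJob() falls back to
    // cleanupJob() and the wrapped commitJob() never runs, so nothing is
    // registered with Hive.
    wrapped.commitJob(context);
  }

  @Override
  public void abortJob(JobContext context, JobStatus.State state)
      throws IOException {
    wrapped.abortJob(context, state);
  }

  @Override
  public void setupTask(TaskAttemptContext context) throws IOException {
    wrapped.setupTask(context);
  }

  @Override
  public boolean needsTaskCommit(TaskAttemptContext context)
      throws IOException {
    return wrapped.needsTaskCommit(context);
  }

  @Override
  public void commitTask(TaskAttemptContext context) throws IOException {
    wrapped.commitTask(context);
  }

  @Override
  public void abortTask(TaskAttemptContext context) throws IOException {
    wrapped.abortTask(context);
  }
}

The key point is simply that commitJob() must be forwarded explicitly rather 
than relying on the base class default, which only calls the deprecated 
cleanupJob().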
                
> giraph-hive tests all fail
> --------------------------
>
>                 Key: GIRAPH-659
>                 URL: https://issues.apache.org/jira/browse/GIRAPH-659
>             Project: Giraph
>          Issue Type: Bug
>            Reporter: Avery Ching
>            Assignee: Maja Kabiljo
>
> aching@localhost:~/git/giraph_git$ java -version
> java version "1.6.0_41"
> Java(TM) SE Runtime Environment (build 1.6.0_41-b02-445-11M4107)
> Java HotSpot(TM) 64-Bit Server VM (build 20.14-b01-445, mixed mode)
> mvn clean install
> -------------------------------------------------------
>  T E S T S
> -------------------------------------------------------
> Running org.apache.giraph.hive.input.HiveVertexInputTest
> Tests run: 3, Failures: 0, Errors: 3, Skipped: 0, Time elapsed: 0.167 sec <<< FAILURE!
> Running org.apache.giraph.hive.input.HiveEdgeInputTest
> Tests run: 3, Failures: 0, Errors: 3, Skipped: 0, Time elapsed: 0.013 sec <<< FAILURE!
> Running org.apache.giraph.hive.output.HiveOutputTest
> Tests run: 2, Failures: 0, Errors: 2, Skipped: 0, Time elapsed: 0.01 sec <<< FAILURE!
> Results :
> Tests in error: 
>   testVertexInput(org.apache.giraph.hive.input.HiveVertexInputTest)
>   testVertexInputWithPartitions(org.apache.giraph.hive.input.HiveVertexInputTest)
>   testValues(org.apache.giraph.hive.input.HiveVertexInputTest)
>   testEdgeInput(org.apache.giraph.hive.input.HiveEdgeInputTest)
>   testEdgeInputWithPartitions(org.apache.giraph.hive.input.HiveEdgeInputTest)
>   testEdgeInputWithValues(org.apache.giraph.hive.input.HiveEdgeInputTest)
>   testHiveOutput(org.apache.giraph.hive.output.HiveOutputTest)
>   testHiveOutputWithPartitions(org.apache.giraph.hive.output.HiveOutputTest)
> Tests run: 8, Failures: 0, Errors: 8, Skipped: 0
> [INFO] 
> ------------------------------------------------------------------------
> [INFO] Reactor Summary:
> [INFO] 
> [INFO] Apache Giraph Parent .............................. SUCCESS [0.916s]
> [INFO] Apache Giraph Core ................................ SUCCESS [1:57.752s]
> [INFO] Apache Giraph Hive I/O ............................ FAILURE [16.228s]
> [INFO] Apache Giraph Examples ............................ SKIPPED
> [INFO] Apache Giraph Accumulo I/O ........................ SKIPPED
> [INFO] Apache Giraph HBase I/O ........................... SKIPPED
> [INFO] Apache Giraph HCatalog I/O ........................ SKIPPED
> The errors seem to all be the same: (for example)
>   <testcase time="0.001" classname="org.apache.giraph.hive.output.HiveOutputTest" name="testHiveOutputWithPartitions">
>     <error message="failed to load Hadoop native library" type="java.lang.RuntimeException">java.lang.RuntimeException: failed to load Hadoop native library
>         at com.facebook.hiveio.common.HadoopNative.requireHadoopNative(HadoopNative.java:50)
>         at com.facebook.hiveio.testing.LocalHiveServer.<init>(LocalHiveServer.java:44)
>         at org.apache.giraph.hive.output.HiveOutputTest.<init>(HiveOutputTest.java:54)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>         at org.junit.runners.BlockJUnit4ClassRunner.createTest(BlockJUnit4ClassRunner.java:202)
>         at org.junit.runners.BlockJUnit4ClassRunner$1.runReflectiveCall(BlockJUnit4ClassRunner.java:251)
>         at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
>         at org.junit.runners.BlockJUnit4ClassRunner.methodBlock(BlockJUnit4ClassRunner.java:248)
>         at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:76)
>         at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>         at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
>         at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
>         at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
>         at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
>         at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
>         at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
>         at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
>         at org.apache.maven.surefire.junit4.JUnit4TestSet.execute(JUnit4TestSet.java:59)
>         at org.apache.maven.surefire.suite.AbstractDirectoryTestSuite.executeTestSet(AbstractDirectoryTestSuite.java:120)
>         at org.apache.maven.surefire.suite.AbstractDirectoryTestSuite.execute(AbstractDirectoryTestSuite.java:103)
>         at org.apache.maven.surefire.Surefire.run(Surefire.java:169)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.maven.surefire.booter.SurefireBooter.runSuitesInProcess(SurefireBooter.java:350)
>         at org.apache.maven.surefire.booter.SurefireBooter.main(SurefireBooter.java:1021)
> Caused by: java.lang.IllegalArgumentException: resource /nativelib/Mac_OS_X-x86_64/libhadoop.jnilib relative to com.facebook.hiveio.common.HadoopNative not found.
>         at com.google.common.base.Preconditions.checkArgument(Preconditions.java:119)
>         at com.google.common.io.Resources.getResource(Resources.java:169)
>         at com.facebook.hiveio.common.NativeCodeHelper.loadLibrary(NativeCodeHelper.java:31)
>         at com.facebook.hiveio.common.HadoopNative.requireHadoopNative(HadoopNative.java:53)
>         at com.facebook.hiveio.testing.LocalHiveServer.<init>(LocalHiveServer.java:44)
>         at org.apache.giraph.hive.input.HiveVertexInputTest.<init>(HiveVertexInputTest.java:46)

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators.
For more information on JIRA, see: http://www.atlassian.com/software/jira
