-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/47239/#review132663
-----------------------------------------------------------


Ship it!




Ship It!

- Andrew Onischuk


On May 11, 2016, 3:25 p.m., Vitalyi Brodetskyi wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/47239/
> -----------------------------------------------------------
> 
> (Updated May 11, 2016, 3:25 p.m.)
> 
> 
> Review request for Ambari, Andrew Onischuk, Dmitro Lisnichenko, and Dmytro Sen.
> 
> 
> Bugs: AMBARI-16453
>     https://issues.apache.org/jira/browse/AMBARI-16453
> 
> 
> Repository: ambari
> 
> 
> Description
> -------
> 
> {code}
> 16/05/06 07:06:23 ERROR lzo.GPLNativeCodeLoader: Could not load native gpl library
> java.lang.UnsatisfiedLinkError: no gplcompression in java.library.path
>       at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1867)
>       at java.lang.Runtime.loadLibrary0(Runtime.java:870)
>       at java.lang.System.loadLibrary(System.java:1122)
>       at com.hadoop.compression.lzo.GPLNativeCodeLoader.<clinit>(GPLNativeCodeLoader.java:32)
>       at com.hadoop.compression.lzo.LzoCodec.<clinit>(LzoCodec.java:71)
>       at java.lang.Class.forName0(Native Method)
>       at java.lang.Class.forName(Class.java:348)
>       at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:2147)
>       at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2112)
>       at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:132)
>       at org.apache.hadoop.io.compress.CompressionCodecFactory.<init>(CompressionCodecFactory.java:179)
>       at org.apache.hadoop.mapreduce.lib.input.TextInputFormat.isSplitable(TextInputFormat.java:58)
>       at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:399)
>       at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:301)
>       at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:318)
>       at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:196)
>       at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
>       at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:422)
>       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1719)
>       at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
>       at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
>       at org.apache.hadoop.examples.WordCount.main(WordCount.java:87)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:498)
>       at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:71)
>       at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:144)
>       at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:74)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:498)
>       at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
>       at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
> 16/05/06 07:06:23 ERROR lzo.LzoCodec: Cannot load native-lzo without native-hadoop
> 16/05/06 07:06:23 INFO mapreduce.JobSubmitter: number of splits:1
> 16/05/06 07:06:23 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1462518135627_0001
> 16/05/06 07:06:24 INFO impl.YarnClientImpl: Submitted application application_1462518135627_0001
> 16/05/06 07:06:24 INFO mapreduce.Job: The url to track the job: http://ambarirmps-5.openstacklocal:8088/proxy/application_1462518135627_0001/
> 16/05/06 07:06:24 INFO mapreduce.Job: Running job: job_1462518135627_0001
> 16/05/06 07:06:35 INFO mapreduce.Job: Job job_1462518135627_0001 running in uber mode : false
> 16/05/06 07:06:35 INFO mapreduce.Job:  map 0% reduce 0%
> 16/05/06 07:06:42 INFO mapreduce.Job:  map 100% reduce 0%
> 
> {code}
> 
> 
> Diffs
> -----
> 
>   ambari-server/src/main/resources/stacks/HDP/2.2/services/HDFS/metainfo.xml 40e05ff 
>   ambari-server/src/main/resources/stacks/HDP/2.3/services/HDFS/metainfo.xml 0ea7da9 
> 
> Diff: https://reviews.apache.org/r/47239/diff/
> 
> 
> Testing
> -------
> 
> mvn clean test
> 
> 
> Thanks,
> 
> Vitalyi Brodetskyi
> 
>
