-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/64419/#review193165
-----------------------------------------------------------
Ship it!

Ship It!

- Nate Cole


On Dec. 7, 2017, 3:53 p.m., Jonathan Hurley wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/64419/
> -----------------------------------------------------------
> 
> (Updated Dec. 7, 2017, 3:53 p.m.)
> 
> 
> Review request for Ambari, Dmitro Lisnichenko and Nate Cole.
> 
> 
> Bugs: AMBARI-22613
>     https://issues.apache.org/jira/browse/AMBARI-22613
> 
> 
> Repository: ambari
> 
> 
> Description
> -------
> 
> Many queries, similar to the {{InsertOverwrite}} below, are failing after performing an HDP->HDP upgrade. The cause is that Hive does not localize the Tez TAR file, but instead uses the MapReduce2 property {{mapreduce.admin.user.env}}.
> 
> ```
> >>> analyze table studenttab10k compute statistics for columns age ;
> INFO  : Session is already open
> INFO  : Dag name: analyze table studenttab10k compute st...age(Stage-0)
> INFO  : Status: Running (Executing on YARN cluster with App id application_1512568339263_0003)
> 
> --------------------------------------------------------------------------------
>         VERTICES      STATUS  TOTAL  COMPLETED  RUNNING  PENDING  FAILED  KILLED
> --------------------------------------------------------------------------------
> Map 1              RUNNING      1          0        1        0       2       0
> Reducer 2           INITED      1          0        0        1       0       0
> --------------------------------------------------------------------------------
> VERTICES: 00/02  [>>--------------------------] 0%   ELAPSED TIME: 9.38 s
> --------------------------------------------------------------------------------
> --------------------------------------------------------------------------------
>         VERTICES      STATUS  TOTAL  COMPLETED  RUNNING  PENDING  FAILED  KILLED
> --------------------------------------------------------------------------------
> Map 1              RUNNING      1          0        0        1       4       0
> Reducer 2           INITED      1          0        0        1       0       0
> --------------------------------------------------------------------------------
> VERTICES: 00/02  [>>--------------------------] 0%   ELAPSED TIME: 14.37 s
> --------------------------------------------------------------------------------
> ERROR : Status: Failed
> ERROR : Vertex failed, vertexName=Map 1, vertexId=vertex_1512568339263_0003_2_00, diagnostics=[Task failed, taskId=task_1512568339263_0003_2_00_000000, diagnostics=[TaskAttempt 0 failed, info=[Error: Failure while running task:java.lang.RuntimeException: java.io.IOException: Unable to get CompressorType for codec (org.apache.hadoop.io.compress.SnappyCodec). This is most likely due to missing native libraries for the codec.
> Caused by: java.lang.RuntimeException: native snappy library not available: this version of libhadoop was built without snappy support.
> 	at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:65)
> 	at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:134)
> 	at org.apache.tez.runtime.library.common.sort.impl.ExternalSorter.<init>(ExternalSorter.java:208)
> 	... 18 more
> ], TaskAttempt 1 failed, info=[Error: Failure while running task:java.lang.RuntimeException: java.io.IOException: Unable to get CompressorType for codec (org.apache.hadoop.io.compress.SnappyCodec). This is most likely due to missing native libraries for the codec.
> ```
> 
> 
> Diffs
> -----
> 
>   ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/params_linux.py 18e297865e 
>   ambari-server/src/main/resources/stacks/HDP/2.6/upgrades/config-upgrade.xml 6e1c81968d 
> 
> 
> Diff: https://reviews.apache.org/r/64419/diff/2/
> 
> 
> Testing
> -------
> 
> Manual upgrade testing.
> 
> 
> Thanks,
> 
> Jonathan Hurley
> 
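A note on the failure mode described above: {{mapreduce.admin.user.env}} is the property whose value (on HDP, typically an {{LD_LIBRARY_PATH}} entry pointing at the Hadoop native libraries) tells task JVMs where to find libhadoop with Snappy support. A minimal Python sketch of how that property value is structured and why a missing or stale entry produces the "native snappy library not available" error; the helper name and values here are hypothetical illustrations, not code from the actual params_linux.py diff:

```python
# Hypothetical example of an mapreduce.admin.user.env value on HDP.
# The property is a comma-separated list of KEY=VALUE pairs; the version
# segment in the path ("2.6.0.3-8") is an illustrative placeholder.
ADMIN_USER_ENV = "LD_LIBRARY_PATH=/usr/hdp/2.6.0.3-8/hadoop/lib/native"

def parse_user_env(value):
    """Split a KEY=VALUE[,KEY=VALUE...] property value into a dict."""
    env = {}
    for pair in value.split(","):
        if "=" in pair:
            key, _, val = pair.partition("=")
            env[key.strip()] = val.strip()
    return env

env = parse_user_env(ADMIN_USER_ENV)
# If LD_LIBRARY_PATH is absent, or still points at the pre-upgrade stack
# directory, the task JVM loads a libhadoop without Snappy support and
# fails exactly as in the log above.
print("LD_LIBRARY_PATH" in env)
```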