-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/41449/#review110674
-----------------------------------------------------------

Ship it!

Ship It!

- Vitalyi Brodetskyi


On Dec. 16, 2015, 4:39 p.m., Andrew Onischuk wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/41449/
> -----------------------------------------------------------
> 
> (Updated Dec. 16, 2015, 4:39 p.m.)
> 
> 
> Review request for Ambari and Vitalyi Brodetskyi.
> 
> 
> Bugs: AMBARI-14400
>     https://issues.apache.org/jira/browse/AMBARI-14400
> 
> 
> Repository: ambari
> 
> 
> Description
> -------
> 
> While creating a cluster on SUSE, the Spark service fails to start. Restarting the
> same service does not work either.
> 
>     15/12/16 08:54:38 INFO SparkEnv: Registering OutputCommitCoordinator
>     15/12/16 08:54:38 INFO Server: jetty-8.y.z-SNAPSHOT
>     15/12/16 08:54:38 INFO AbstractConnector: Started [email protected]:4040
>     15/12/16 08:54:38 INFO Utils: Successfully started service 'SparkUI' on port 4040.
>     15/12/16 08:54:38 INFO SparkUI: Started SparkUI at http://172.22.107.190:4040
>     15/12/16 08:54:38 INFO Executor: Starting executor ID driver on host localhost
>     15/12/16 08:54:38 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 42892.
>     15/12/16 08:54:38 INFO NettyBlockTransferService: Server created on 42892
>     15/12/16 08:54:38 INFO BlockManagerMaster: Trying to register BlockManager
>     15/12/16 08:54:38 INFO BlockManagerMasterEndpoint: Registering block manager localhost:42892 with 265.1 MB RAM, BlockManagerId(driver, localhost, 42892)
>     15/12/16 08:54:38 INFO BlockManagerMaster: Registered BlockManager
>     15/12/16 08:54:40 INFO HiveContext: Initializing execution hive, version 0.13.1
>     Exception in thread "main" java.lang.RuntimeException: java.lang.NumberFormatException: For input string: "5s"
>         at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:346)
>         at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:117)
>         at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:165)
>         at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:163)
>         at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:170)
>         at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:53)
>         at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:75)
>         at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:497)
>         at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
>         at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>     Caused by: 
>     java.lang.NumberFormatException: For input string: "5s"
>         at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
>         at java.lang.Integer.parseInt(Integer.java:580)
>         at java.lang.Integer.parseInt(Integer.java:615)
>         at org.apache.hadoop.conf.Configuration.getInt(Configuration.java:1258)
>         at org.apache.hadoop.hive.conf.HiveConf.getIntVar(HiveConf.java:1211)
>         at org.apache.hadoop.hive.conf.HiveConf.getIntVar(HiveConf.java:1220)
>         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:58)
>         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
>         at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2453)
>         at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2465)
>         at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:340)
>         ... 16 more
>     15/12/16 08:54:40 INFO SparkContext: Invoking stop() from shutdown hook
>     15/12/16 08:54:40 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/metrics/json,null}
> 
> 
> Diffs
> -----
> 
>   ambari-server/src/main/resources/common-services/SPARK/1.2.0.2.2/package/scripts/params.py c4bbdc1 
>   ambari-server/src/main/resources/stacks/HDP/2.3/services/SPARK/configuration/spark-hive-site-override.xml 54df516 
> 
> Diff: https://reviews.apache.org/r/41449/diff/
> 
> 
> Testing
> -------
> 
> mvn clean test
> 
> 
> Thanks,
> 
> Andrew Onischuk
> 
>
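For context on the failure: the `NumberFormatException` occurs because Spark's bundled Hive 0.13.1 reads certain metastore retry settings with `Integer.parseInt`, while newer stack defaults write time-suffixed values such as `"5s"` (valid in later Hive versions). A minimal sketch of the kind of normalization a stack script could apply before generating the Spark-side hive-site override; the helper name and suffix list below are illustrative, not taken from the actual patch:

```python
import re

# Time-unit suffixes accepted by newer Hive versions for duration properties.
# Hive 0.13.1 expects a bare integer, so the suffix must be stripped
# (illustrative list; the real patch may handle this differently).
_TIME_SUFFIXES = {"ns", "us", "ms", "s", "m", "min", "h", "d"}

def strip_time_suffix(value):
    """Return the numeric part of a duration property, e.g. '5s' -> '5'."""
    match = re.match(r"^\s*(\d+)\s*([a-zA-Z]*)\s*$", str(value))
    if match and (not match.group(2) or match.group(2).lower() in _TIME_SUFFIXES):
        return match.group(1)
    return value  # leave unrecognized values untouched

print(strip_time_suffix("5s"))   # -> 5
print(strip_time_suffix("30"))   # -> 30
```

With this normalization, a value like `"5s"` becomes `"5"`, which `Integer.parseInt` can handle, while non-duration values pass through unchanged.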
