This looks like a Kafka connection issue.
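
In the stack trace the failure comes out of KafkaCluster.checkErrors / KafkaUtils.getFromOffsets, which is the point where the direct stream first contacts the brokers to fetch topic metadata, so a java.nio.channels.ClosedChannelException there usually means the broker address configured for the Kafka data connector is not reachable from the driver. A quick way to rule that out is a plain TCP check against the broker; this is only a sketch, and the host and port below are placeholders rather than values taken from your config:

import java.net.{InetSocketAddress, Socket}

object BrokerCheck {
  def main(args: Array[String]): Unit = {
    // Placeholder broker address -- substitute the host:port configured for
    // the Kafka connector in env.json.
    val brokerHost = "sandbox"
    val brokerPort = 9092

    val socket = new Socket()
    try {
      // Only verifies TCP reachability; it does not speak the Kafka protocol.
      socket.connect(new InetSocketAddress(brokerHost, brokerPort), 5000)
      println(s"Broker $brokerHost:$brokerPort is reachable")
    } catch {
      case e: Exception =>
        println(s"Cannot reach broker $brokerHost:$brokerPort: ${e.getMessage}")
    } finally {
      socket.close()
    }
  }
}

If that check fails from the node running the driver, the problem is on the Kafka/network side rather than in Griffin.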

@Liu, Lionel <[email protected]>

Could you take a look at this?

Thanks,
William

________________________________
From: Jin Liu <[email protected]>
Sent: Wednesday, October 25, 2017 9:46 AM
To: [email protected]
Subject: startup failure

Do you know what the problem is?

root@sandbox:~/jar# spark-submit --class org.apache.griffin.measure.Application \
  --master yarn-client --queue default --verbose \
  griffin-measure.jar env.json config.json local,local

17/10/25 01:27:18 WARN metastore.ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
17/10/25 01:27:18 WARN metastore.ObjectStore: Failed to get database default, returning NoSuchObjectException
[1508894846730] streaming-accu-sample start
17/10/25 01:27:27 ERROR measure.Application$: app error: java.nio.channels.ClosedChannelException
Exception in thread "main" org.apache.spark.SparkException: java.nio.channels.ClosedChannelException
        at org.apache.spark.streaming.kafka.KafkaCluster$$anonfun$checkErrors$1.apply(KafkaCluster.scala:366)
        at org.apache.spark.streaming.kafka.KafkaCluster$$anonfun$checkErrors$1.apply(KafkaCluster.scala:366)
        at scala.util.Either.fold(Either.scala:97)
        at org.apache.spark.streaming.kafka.KafkaCluster$.checkErrors(KafkaCluster.scala:365)
        at org.apache.spark.streaming.kafka.KafkaUtils$.getFromOffsets(KafkaUtils.scala:222)
        at org.apache.spark.streaming.kafka.KafkaUtils$.createDirectStream(KafkaUtils.scala:484)
        at org.apache.griffin.measure.connector.DataConnectorFactory$$anon$1.createDStream(DataConnectorFactory.scala:120)
        at org.apache.griffin.measure.connector.streaming.KafkaStreamingDataConnector$$anonfun$stream$1.apply(KafkaStreamingDataConnector.scala:54)
        at org.apache.griffin.measure.connector.streaming.KafkaStreamingDataConnector$$anonfun$stream$1.apply(KafkaStreamingDataConnector.scala:52)
        at scala.util.Try$.apply(Try.scala:161)
        at org.apache.griffin.measure.connector.streaming.KafkaStreamingDataConnector.stream(KafkaStreamingDataConnector.scala:52)
        at org.apache.griffin.measure.connector.direct.StreamingCacheDirectDataConnector$class.init(StreamingCacheDirectDataConnector.scala:41)
        at org.apache.griffin.measure.connector.direct.KafkaCacheDirectDataConnector.init(KafkaCacheDirectDataConnector.scala:33)
        at org.apache.griffin.measure.algo.streaming.StreamingAccuracyAlgo$$anonfun$run$1.apply$mcV$sp(StreamingAccuracyAlgo.scala:125)
        at org.apache.griffin.measure.algo.streaming.StreamingAccuracyAlgo$$anonfun$run$1.apply(StreamingAccuracyAlgo.scala:50)
        at org.apache.griffin.measure.algo.streaming.StreamingAccuracyAlgo$$anonfun$run$1.apply(StreamingAccuracyAlgo.scala:50)
        at scala.util.Try$.apply(Try.scala:161)
        at org.apache.griffin.measure.algo.streaming.StreamingAccuracyAlgo.run(StreamingAccuracyAlgo.scala:50)
        at org.apache.griffin.measure.Application$.main(Application.scala:98)
        at org.apache.griffin.measure.Application.main(Application.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:483)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
