[ https://issues.apache.org/jira/browse/SPARK-6203?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14350149#comment-14350149 ]

q79969786 commented on SPARK-6203:
----------------------------------

final String checkpointDirectory = "hdfs://dev:9000/spark/RealTimeCountPremium";

When I remove the checkpointDirectory directory, the application can start up.
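That matches how `JavaStreamingContext.getOrCreate` behaves: the factory is only invoked when no checkpoint exists; otherwise the context, including the old serialized SparkConf, is rebuilt from the checkpoint files. A self-contained sketch of that decision (plain Java with no Spark dependency; the `conf.properties` file name and the stand-in `Properties` conf are illustrative, not Spark's actual on-disk format):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Properties;

public class GetOrCreateSketch {
    // Mimics JavaStreamingContext.getOrCreate: recover saved settings from the
    // checkpoint directory if it exists, otherwise build fresh ones and persist.
    static Properties getOrCreate(Path checkpointDir) throws IOException {
        Path saved = checkpointDir.resolve("conf.properties");
        Properties conf = new Properties();
        if (Files.exists(saved)) {
            // Second run: the old conf (e.g. spark.executor.extraClassPath)
            // comes back from the checkpoint, whether or not it is still valid.
            try (var in = Files.newInputStream(saved)) {
                conf.load(in);
            }
        } else {
            // First run: the "factory" path builds a fresh conf and checkpoints it.
            conf.setProperty("spark.executor.extraClassPath", ":");
            Files.createDirectories(checkpointDir);
            try (var out = Files.newOutputStream(saved)) {
                conf.store(out, "checkpointed conf");
            }
        }
        return conf;
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("ckpt").resolve("RealTimeCountPremium");
        Properties first = getOrCreate(dir);   // creates and checkpoints the conf
        Properties second = getOrCreate(dir);  // recovers the stale setting
        System.out.println("recovered: "
                + second.getProperty("spark.executor.extraClassPath"));
    }
}
```

Deleting the checkpoint directory forces the factory path again, which is why the application then starts, at the cost of discarding the recovery data.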

> Configured checkpointing causes application startup error
> ---------------------------------------------------------
>
>                 Key: SPARK-6203
>                 URL: https://issues.apache.org/jira/browse/SPARK-6203
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 1.2.0
>            Reporter: q79969786
>
> I use Spark Streaming with checkpointing configured as below:
> final String checkpointDirectory = "hdfs://dev:9000/spark/RealTimeCountPremium";
> JavaStreamingContextFactory contextFactory = new JavaStreamingContextFactory() {
>     @Override
>     public JavaStreamingContext create() {
>         JavaStreamingContext jssc = new JavaStreamingContext(sparkConf,
>                 new Duration(Integer.parseInt(duration)));
>         jssc.checkpoint(checkpointDirectory); // set checkpoint directory
>         return jssc;
>     }
> };
> The first run works, but the second run fails, like this:
> /appcom/spark/bin/spark-submit --class com.patest.hbase.RealTimeCountPremium7 \
>     --driver-class-path .:/appcom/spark/lib/ --master local[6] \
>     /home/hduser0401/wangyuming206/java/etl2.jar 2 6000
> Spark assembly has been built with Hive, including Datanucleus jars on 
> classpath
> 15/03/06 16:46:50 INFO streaming.CheckpointReader: Checkpoint files found: 
> hdfs://dev-l002781.app.paic.com.cn:9000/apps-data/hduser0401/spark/RealTimeCountPremium/checkpoint-1425630534000,hdfs://dev-l002781.app.paic.com.cn:9000/apps-data/hduser0401/spark/RealTimeCountPremium/checkpoint-1425630534000.bk,hdfs://dev-l002781.app.paic.com.cn:9000/apps-data/hduser0401/spark/RealTimeCountPremium/checkpoint-1425630528000,hdfs://dev-l002781.app.paic.com.cn:9000/apps-data/hduser0401/spark/RealTimeCountPremium/checkpoint-1425630528000.bk,hdfs://dev-l002781.app.paic.com.cn:9000/apps-data/hduser0401/spark/RealTimeCountPremium/checkpoint-1425630522000,hdfs://dev-l002781.app.paic.com.cn:9000/apps-data/hduser0401/spark/RealTimeCountPremium/checkpoint-1425630522000.bk,hdfs://dev-l002781.app.paic.com.cn:9000/apps-data/hduser0401/spark/RealTimeCountPremium/checkpoint-1425630516000,hdfs://dev-l002781.app.paic.com.cn:9000/apps-data/hduser0401/spark/RealTimeCountPremium/checkpoint-1425630516000.bk,hdfs://dev-l002781.app.paic.com.cn:9000/apps-data/hduser0401/spark/RealTimeCountPremium/checkpoint-1425630510000,hdfs://dev-l002781.app.paic.com.cn:9000/apps-data/hduser0401/spark/RealTimeCountPremium/checkpoint-1425630510000.bk
> 15/03/06 16:46:50 INFO streaming.CheckpointReader: Attempting to load 
> checkpoint from file 
> hdfs://dev-l002781.app.paic.com.cn:9000/apps-data/hduser0401/spark/RealTimeCountPremium/checkpoint-1425630534000
> 15/03/06 16:46:50 INFO streaming.Checkpoint: Checkpoint for time 
> 1425630534000 ms validated
> 15/03/06 16:46:50 INFO streaming.CheckpointReader: Checkpoint successfully 
> loaded from file 
> hdfs://dev-l002781.app.paic.com.cn:9000/apps-data/hduser0401/spark/RealTimeCountPremium/checkpoint-1425630534000
> 15/03/06 16:46:50 INFO streaming.CheckpointReader: Checkpoint was generated 
> at time 1425630534000 ms
> 15/03/06 16:46:50 WARN spark.SparkConf: In Spark 1.0 and later 
> spark.local.dir will be overridden by the value set by the cluster manager 
> (via SPARK_LOCAL_DIRS in mesos/standalone and LOCAL_DIRS in YARN).
> 15/03/06 16:46:51 WARN spark.SparkConf: 
> SPARK_CLASSPATH was detected (set to ':').
> This is deprecated in Spark 1.0+.
> Please instead use:
>  - ./spark-submit with --driver-class-path to augment the driver classpath
>  - spark.executor.extraClassPath to augment the executor classpath
>         
> Exception in thread "main" org.apache.spark.SparkException: Found both 
> spark.executor.extraClassPath and SPARK_CLASSPATH. Use only the former.
>         at 
> org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$7.apply(SparkConf.scala:334)
>         at 
> org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$7.apply(SparkConf.scala:332)
>         at scala.collection.immutable.List.foreach(List.scala:318)
>         at 
> org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:332)
>         at 
> org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:320)
>         at scala.Option.foreach(Option.scala:236)
>         at org.apache.spark.SparkConf.validateSettings(SparkConf.scala:320)
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:176)
>         at 
> org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:118)
>         at 
> org.apache.spark.streaming.StreamingContext$$anonfun$getOrCreate$1.apply(StreamingContext.scala:561)
>         at 
> org.apache.spark.streaming.StreamingContext$$anonfun$getOrCreate$1.apply(StreamingContext.scala:561)
>         at scala.Option.map(Option.scala:145)
>         at 
> org.apache.spark.streaming.StreamingContext$.getOrCreate(StreamingContext.scala:561)
>         at 
> org.apache.spark.streaming.api.java.JavaStreamingContext$.getOrCreate(JavaStreamingContext.scala:566)
>         at 
> org.apache.spark.streaming.api.java.JavaStreamingContext.getOrCreate(JavaStreamingContext.scala)
>         at 
> com.patest.hbase.RealTimeCountPremium7.main(RealTimeCountPremium7.java:119)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
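The exception comes from SparkConf.validateSettings: the conf recovered from the checkpoint already carries spark.executor.extraClassPath, and the deprecated SPARK_CLASSPATH env var (here detected as ':') is still set, so validation fails fast rather than guessing which classpath to use. A simplified, self-contained sketch of that check (not Spark's actual code); unsetting SPARK_CLASSPATH before the restart is the workaround the preceding deprecation warning suggests:

```java
import java.util.HashMap;
import java.util.Map;

public class ClasspathConflictSketch {
    // Simplified stand-in for the SparkConf.validateSettings check (Spark 1.2):
    // if both the deprecated SPARK_CLASSPATH env var and the
    // spark.executor.extraClassPath conf key are present, fail fast.
    static void validate(Map<String, String> conf, String sparkClasspathEnv) {
        boolean envSet = sparkClasspathEnv != null && !sparkClasspathEnv.isEmpty();
        if (envSet && conf.containsKey("spark.executor.extraClassPath")) {
            throw new IllegalStateException(
                    "Found both spark.executor.extraClassPath and SPARK_CLASSPATH. "
                    + "Use only the former.");
        }
    }

    public static void main(String[] args) {
        Map<String, String> recovered = new HashMap<>();
        // The conf rebuilt from the checkpoint carries the old classpath setting.
        recovered.put("spark.executor.extraClassPath", ":");

        try {
            validate(recovered, ":"); // SPARK_CLASSPATH was detected (set to ':')
        } catch (IllegalStateException e) {
            System.out.println("conflict: " + e.getMessage());
        }

        // Workaround: clear the env var (unset SPARK_CLASSPATH) before restart.
        validate(recovered, null);
        System.out.println("starts after unsetting SPARK_CLASSPATH");
    }
}
```

This also explains why deleting the checkpoint directory "fixes" it: a fresh conf simply never hits the conflict, because only one of the two settings is present.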



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
