[ https://issues.apache.org/jira/browse/SPARK-9441?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

nirav patel updated SPARK-9441:
-------------------------------
    Description: 
I recently migrated my Spark-based REST service from Spark 1.0.2 to 1.3.1 and it now fails at startup:


15/07/29 10:31:12 INFO spark.SparkContext: Running Spark version 1.3.1
15/07/29 10:31:12 INFO spark.SecurityManager: Changing view acls to: npatel
15/07/29 10:31:12 INFO spark.SecurityManager: Changing modify acls to: npatel
15/07/29 10:31:12 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(npatel); users with modify permissions: Set(npatel)

Exception in thread "main" java.lang.NoSuchMethodError: com.typesafe.config.Config.getDuration(Ljava/lang/String;Ljava/util/concurrent/TimeUnit;)J
    at akka.util.Helpers$ConfigOps$.akka$util$Helpers$ConfigOps$$getDuration$extension(Helpers.scala:125)
    at akka.util.Helpers$ConfigOps$.getMillisDuration$extension(Helpers.scala:120)
    at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:171)
    at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:504)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:141)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:118)
    at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:122)
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:55)
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
    at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1837)
    at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
    at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1828)
    at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:57)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:223)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:163)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:269)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:272)


Various blog posts suggest workarounds such as reordering the classpath so the correct version of the library (or the Scala libs) is found first, but those are fragile hacks, not fixes. The real problem appears to be that spark-core pulls in an incompatible version of the com.typesafe:config library, one that predates the Config.getDuration(String, TimeUnit) method Akka calls. The following change to my Maven build resolves it for me, but spark-core itself should be fixed so that it ships a compatible version.

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <exclusions>
        <exclusion>
            <groupId>com.typesafe</groupId>
            <artifactId>config</artifactId>
        </exclusion>
    </exclusions>
</dependency>

<dependency>
    <groupId>com.typesafe</groupId>
    <artifactId>config</artifactId>
    <version>1.2.1</version>
</dependency>
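To verify which jar actually supplies com.typesafe.config.Config at runtime, and whether it has the getDuration overload that Akka calls, a small diagnostic along these lines can help. The class and method names come from the stack trace above; the class name ConfigCheck and the rest of the scaffolding are illustrative:

```java
import java.util.concurrent.TimeUnit;

public class ConfigCheck {
    public static void main(String[] args) {
        try {
            Class<?> cfg = Class.forName("com.typesafe.config.Config");
            // Shows which jar on the classpath the class was loaded from --
            // this points at the offending artifact.
            System.out.println(cfg.getProtectionDomain().getCodeSource().getLocation());
            // This overload exists in config 1.2.x; on an older config jar
            // the lookup throws NoSuchMethodException, which matches the
            // NoSuchMethodError seen at SparkContext startup.
            cfg.getMethod("getDuration", String.class, TimeUnit.class);
            System.out.println("getDuration(String, TimeUnit) is present");
        } catch (ClassNotFoundException | NoSuchMethodException e) {
            System.out.println("classpath problem: " + e);
        }
    }
}
```

Running `mvn dependency:tree -Dincludes=com.typesafe:config` shows the same information at build time, i.e. which version of the config artifact Maven resolves and through which transitive path.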


> NoSuchMethodError: Com.typesafe.config.Config.getDuration
> ---------------------------------------------------------
>
>                 Key: SPARK-9441
>                 URL: https://issues.apache.org/jira/browse/SPARK-9441
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy
>    Affects Versions: 1.3.1
>            Reporter: nirav patel
>



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
