The NullPointerException surfaced inside Spring's bean initialization (the init method of 'MyAnalyzer' failed).

Which version of Spring are you using?
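
If you are not sure, a quick sketch like the one below will print both versions at startup (SpringVersion lives in spring-core, SpringBootVersion in spring-boot; both return null if the jar's manifest carries no version entry):

    import org.springframework.boot.SpringBootVersion;
    import org.springframework.core.SpringVersion;

    public class VersionCheck {
        public static void main(String[] args) {
            // Each call reads the implementation version from the jar manifest
            // and prints null if that entry is missing.
            System.out.println("Spring Framework: " + SpringVersion.getVersion());
            System.out.println("Spring Boot:      " + SpringBootVersion.getVersion());
        }
    }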

Thanks 

> On Mar 22, 2016, at 6:08 AM, Hafsa Asif <[email protected]> wrote:
> 
> Yes, I know it is because of a NullPointerException, but I could not
> understand why.
> The complete stack trace is:
> [2016-03-22 13:40:14.894] boot - 10493  WARN [main] ---
> AnnotationConfigApplicationContext: Exception encountered during context
> initialization - cancelling refresh attempt:
> org.springframework.beans.factory.BeanCreationException: Error creating bean
> with name 'MyAnalyzer': Invocation of init method failed; nested exception
> is java.lang.NullPointerException
> [2016-03-22 13:40:14.983] boot - 10493  INFO [main] --- ContextHandler:
> stopped o.s.j.s.ServletContextHandler{/metrics/json,null}
> [2016-03-22 13:40:14.986] boot - 10493  INFO [main] --- ContextHandler:
> stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
> [2016-03-22 13:40:14.989] boot - 10493  INFO [main] --- ContextHandler:
> stopped o.s.j.s.ServletContextHandler{/api,null}
> [2016-03-22 13:40:14.994] boot - 10493  INFO [main] --- ContextHandler:
> stopped o.s.j.s.ServletContextHandler{/,null}
> [2016-03-22 13:40:15.001] boot - 10493  INFO [main] --- ContextHandler:
> stopped o.s.j.s.ServletContextHandler{/static,null}
> [2016-03-22 13:40:15.002] boot - 10493  INFO [main] --- ContextHandler:
> stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
> [2016-03-22 13:40:15.002] boot - 10493  INFO [main] --- ContextHandler:
> stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
> [2016-03-22 13:40:15.003] boot - 10493  INFO [main] --- ContextHandler:
> stopped o.s.j.s.ServletContextHandler{/executors/json,null}
> [2016-03-22 13:40:15.015] boot - 10493  INFO [main] --- ContextHandler:
> stopped o.s.j.s.ServletContextHandler{/executors,null}
> [2016-03-22 13:40:15.018] boot - 10493  INFO [main] --- ContextHandler:
> stopped o.s.j.s.ServletContextHandler{/environment/json,null}
> [2016-03-22 13:40:15.019] boot - 10493  INFO [main] --- ContextHandler:
> stopped o.s.j.s.ServletContextHandler{/environment,null}
> [2016-03-22 13:40:15.024] boot - 10493  INFO [main] --- ContextHandler:
> stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
> [2016-03-22 13:40:15.024] boot - 10493  INFO [main] --- ContextHandler:
> stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
> [2016-03-22 13:40:15.024] boot - 10493  INFO [main] --- ContextHandler:
> stopped o.s.j.s.ServletContextHandler{/storage/json,null}
> [2016-03-22 13:40:15.024] boot - 10493  INFO [main] --- ContextHandler:
> stopped o.s.j.s.ServletContextHandler{/storage,null}
> [2016-03-22 13:40:15.024] boot - 10493  INFO [main] --- ContextHandler:
> stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
> [2016-03-22 13:40:15.025] boot - 10493  INFO [main] --- ContextHandler:
> stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
> [2016-03-22 13:40:15.025] boot - 10493  INFO [main] --- ContextHandler:
> stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
> [2016-03-22 13:40:15.025] boot - 10493  INFO [main] --- ContextHandler:
> stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
> [2016-03-22 13:40:15.025] boot - 10493  INFO [main] --- ContextHandler:
> stopped o.s.j.s.ServletContextHandler{/stages/json,null}
> [2016-03-22 13:40:15.025] boot - 10493  INFO [main] --- ContextHandler:
> stopped o.s.j.s.ServletContextHandler{/stages,null}
> [2016-03-22 13:40:15.025] boot - 10493  INFO [main] --- ContextHandler:
> stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
> [2016-03-22 13:40:15.025] boot - 10493  INFO [main] --- ContextHandler:
> stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
> [2016-03-22 13:40:15.025] boot - 10493  INFO [main] --- ContextHandler:
> stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
> [2016-03-22 13:40:15.025] boot - 10493  INFO [main] --- ContextHandler:
> stopped o.s.j.s.ServletContextHandler{/jobs,null}
> [2016-03-22 13:40:15.077] boot - 10493  INFO [main] --- SparkUI: Stopped
> Spark web UI at http://192.168.116.155:4040
> [2016-03-22 13:40:15.081] boot - 10493  INFO [main] --- DAGScheduler:
> Stopping DAGScheduler
> 
> [2016-03-22 13:40:15.413] boot - 10493 ERROR [main] --- SpringApplication:
> Application startup failed
> org.springframework.beans.factory.BeanCreationException: Error creating bean
> with name 'MyAnalyzer': Invocation of init method failed; nested exception
> is java.lang.NullPointerException
>    at
> org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor.postProcessBeforeInitialization(InitDestroyAnnotationBeanPostProcessor.java:136)
>    at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInitialization(AbstractAutowireCapableBeanFactory.java:408)
>    at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1564)
>    at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:539)
>    at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:476)
>    at
> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:303)
>    at
> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:230)
>    at
> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:299)
>    at
> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:194)
>    at
> org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:755)
>    at
> org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:762)
>    at
> org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:480)
>    at
> org.springframework.boot.SpringApplication.refresh(SpringApplication.java:690)
>    at
> org.springframework.boot.SpringApplication.run(SpringApplication.java:322)
>    at
> org.springframework.boot.SpringApplication.run(SpringApplication.java:970)
>    at
> org.springframework.boot.SpringApplication.run(SpringApplication.java:959)
>    at
> com.matchinguu.analytics.AnalyticalEngineApplication.main(AnalyticalEngineApplication.java:14)
> Caused by: java.lang.NullPointerException
>    at
> org.apache.spark.sql.catalyst.expressions.AttributeReference.hashCode(namedExpressions.scala:194)
>    at scala.runtime.ScalaRunTime$.hash(ScalaRunTime.scala:206)
>    at scala.util.hashing.MurmurHash3.productHash(MurmurHash3.scala:64)
>    at scala.util.hashing.MurmurHash3$.productHash(MurmurHash3.scala:211)
>    at scala.runtime.ScalaRunTime$._hashCode(ScalaRunTime.scala:168)
>    at scala.Tuple2.hashCode(Tuple2.scala:20)
>    at
> scala.collection.mutable.FlatHashTable$class.findElemImpl(FlatHashTable.scala:126)
>    at
> scala.collection.mutable.FlatHashTable$class.containsElem(FlatHashTable.scala:121)
>    at scala.collection.mutable.HashSet.containsElem(HashSet.scala:40)
>    at scala.collection.mutable.HashSet.contains(HashSet.scala:57)
>    at scala.collection.GenSetLike$class.apply(GenSetLike.scala:44)
>    at scala.collection.mutable.AbstractSet.apply(Set.scala:46)
>    at scala.collection.SeqLike$$anonfun$distinct$1.apply(SeqLike.scala:506)
>    at scala.collection.immutable.List.foreach(List.scala:381)
>    at scala.collection.SeqLike$class.distinct(SeqLike.scala:505)
>    at scala.collection.AbstractSeq.distinct(Seq.scala:41)
>    at
> org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolve(LogicalPlan.scala:251)
> Exception in thread "main"
> org.springframework.beans.factory.BeanCreationException: Error creating bean
> with name 'MyAnalyzer': Invocation of init method failed; nested exception
> is java.lang.NullPointerException
>    at
> org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor.postProcessBeforeInitialization(InitDestroyAnnotationBeanPostProcessor.java:136)
>    at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInitialization(AbstractAutowireCapableBeanFactory.java:408)
>    at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1564)
>    at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:539)
>    at
> org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:476)
>    at
> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:303)
>    at
> org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:230)
>    at
> org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:299)
>    at
> org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:194)
>    at
> org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:755)
>    at
> org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:762)
>    at
> org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:480)
>    at
> org.springframework.boot.SpringApplication.refresh(SpringApplication.java:690)
>    at
> org.springframework.boot.SpringApplication.run(SpringApplication.java:322)
>    at
> org.springframework.boot.SpringApplication.run(SpringApplication.java:970)
>    at
> org.springframework.boot.SpringApplication.run(SpringApplication.java:959)
>    at
> com.matchinguu.analytics.AnalyticalEngineApplication.main(AnalyticalEngineApplication.java:14)
> Caused by: java.lang.NullPointerException
>    at
> org.apache.spark.sql.catalyst.expressions.AttributeReference.hashCode(namedExpressions.scala:194)
>    at scala.runtime.ScalaRunTime$.hash(ScalaRunTime.scala:206)
>    at scala.util.hashing.MurmurHash3.productHash(MurmurHash3.scala:64)
>    at scala.util.hashing.MurmurHash3$.productHash(MurmurHash3.scala:211)
>    at scala.runtime.ScalaRunTime$._hashCode(ScalaRunTime.scala:168)
>    at scala.Tuple2.hashCode(Tuple2.scala:20)
>    at
> scala.collection.mutable.FlatHashTable$class.findElemImpl(FlatHashTable.scala:126)
>    at
> scala.collection.mutable.FlatHashTable$class.containsElem(FlatHashTable.scala:121)
>    at scala.collection.mutable.HashSet.containsElem(HashSet.scala:40)
>    at scala.collection.mutable.HashSet.contains(HashSet.scala:57)
>    at scala.collection.GenSetLike$class.apply(GenSetLike.scala:44)
>    at scala.collection.mutable.AbstractSet.apply(Set.scala:46)
>    at scala.collection.SeqLike$$anonfun$distinct$1.apply(SeqLike.scala:506)
>    at scala.collection.immutable.List.foreach(List.scala:381)
>    at scala.collection.SeqLike$class.distinct(SeqLike.scala:505)
>    at scala.collection.AbstractSeq.distinct(Seq.scala:41)
>    at
> org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolve(LogicalPlan.scala:251)
>    at
> org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveChildren(LogicalPlan.scala:116)
>    at
> org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$$anonfun$apply$7$$anonfun$applyOrElse$4$$anonfun$16.apply(Analyzer.scala:350)
>    at
> org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$$anonfun$apply$7$$anonfun$applyOrElse$4$$anonfun$16.apply(Analyzer.scala:350)
>    at
> org.apache.spark.sql.catalyst.analysis.package$.withPosition(package.scala:48)
>    at
> org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$$anonfun$apply$7$$anonfun$applyOrElse$4.applyOrElse(Analyzer.scala:350)
>    at
> org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$$anonfun$apply$7$$anonfun$applyOrElse$4.applyOrElse(Analyzer.scala:341)
>    at
> org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:286)
>    at
> org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:286)
>    at
> org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:51)
>    at
> org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:285)
>    at
> org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$5.apply(TreeNode.scala:299)
>    at scala.collection.Iterator$$anon$11.next(Iterator.scala:370)
>    at scala.collection.Iterator$class.foreach(Iterator.scala:742)
>    at scala.collection.AbstractIterator.foreach(Iterator.scala:1194)
>    at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
>    at
> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
>    at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
>    at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:308)
>    at scala.collection.AbstractIterator.to(Iterator.scala:1194)
>    at
> scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:300)
>    at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1194)
>    at
> scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:287)
>    at scala.collection.AbstractIterator.toArray(Iterator.scala:1194)
>    at
> org.apache.spark.sql.catalyst.trees.TreeNode.transformChildrenUp(TreeNode.scala:329)
>    at
> org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:283)
>    at
> org.apache.spark.sql.catalyst.plans.QueryPlan.org$apache$spark$sql$catalyst$plans$QueryPlan$$transformExpressionUp$1(QueryPlan.scala:108)
>    at
> org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$2.apply(QueryPlan.scala:118)
>    at scala.collection.Iterator$$anon$11.next(Iterator.scala:370)
>    at scala.collection.Iterator$class.foreach(Iterator.scala:742)
>    at scala.collection.AbstractIterator.foreach(Iterator.scala:1194)
>    at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
>    at
> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
>    at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
>    at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:308)
>    at scala.collection.AbstractIterator.to(Iterator.scala:1194)
>    at
> scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:300)
>    at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1194)
>    at
> scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:287)
>    at scala.collection.AbstractIterator.toArray(Iterator.scala:1194)
>    at
> org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpressionsUp(QueryPlan.scala:127)
>    at
> org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$$anonfun$apply$7.applyOrElse(Analyzer.scala:341)
>    at
> org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$$anonfun$apply$7.applyOrElse(Analyzer.scala:243)
>    at
> org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:286)
>    at
> org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:286)
>    at
> org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveChildren(LogicalPlan.scala:116)
>    at
> org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$$anonfun$apply$7$$anonfun$applyOrElse$4$$anonfun$16.apply(Analyzer.scala:350)
>    at
> org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$$anonfun$apply$7$$anonfun$applyOrElse$4$$anonfun$16.apply(Analyzer.scala:350)
>    at
> org.apache.spark.sql.catalyst.analysis.package$.withPosition(package.scala:48)
>    at
> org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$$anonfun$apply$7$$anonfun$applyOrElse$4.applyOrElse(Analyzer.scala:350)
>    at
> org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$$anonfun$apply$7$$anonfun$applyOrElse$4.applyOrElse(Analyzer.scala:341)
>    at
> org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:286)
>    at
> org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:286)
>    at
> org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:51)
>    at
> org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:285)
>    at
> org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$5.apply(TreeNode.scala:299)
>    at scala.collection.Iterator$$anon$11.next(Iterator.scala:370)
>    at scala.collection.Iterator$class.foreach(Iterator.scala:742)
>    at scala.collection.AbstractIterator.foreach(Iterator.scala:1194)
>    at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
>    at
> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
>    at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
>    at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:308)
>    at scala.collection.AbstractIterator.to(Iterator.scala:1194)
>    at
> scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:300)
>    at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1194)
>    at
> scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:287)
>    at scala.collection.AbstractIterator.toArray(Iterator.scala:1194)
>    at
> org.apache.spark.sql.catalyst.trees.TreeNode.transformChildrenUp(TreeNode.scala:329)
>    at
> org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:283)
>    at
> org.apache.spark.sql.catalyst.plans.QueryPlan.org$apache$spark$sql$catalyst$plans$QueryPlan$$transformExpressionUp$1(QueryPlan.scala:108)
>    at
> org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$2.apply(QueryPlan.scala:118)
>    at scala.collection.Iterator$$anon$11.next(Iterator.scala:370)
>    at scala.collection.Iterator$class.foreach(Iterator.scala:742)
>    at scala.collection.AbstractIterator.foreach(Iterator.scala:1194)
>    at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
>    at
> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
>    at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
>    at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:308)
>    at scala.collection.AbstractIterator.to(Iterator.scala:1194)
>    at
> scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:300)
>    at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1194)
>    at
> scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:287)
>    at scala.collection.AbstractIterator.toArray(Iterator.scala:1194)
>    at
> org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpressionsUp(QueryPlan.scala:127)
>    at
> org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$$anonfun$apply$7.applyOrElse(Analyzer.scala:341)
>    at
> org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$$anonfun$apply$7.applyOrElse(Analyzer.scala:243)
>    at
> org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:286)
>    at
> org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:286)
>    at
> org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:51)
>    at
> org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:285)
>    at
> org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$5.apply(TreeNode.scala:299)
>    at scala.collection.Iterator$$anon$11.next(Iterator.scala:370)
>    at scala.collection.Iterator$class.foreach(Iterator.scala:742)
>    at scala.collection.AbstractIterator.foreach(Iterator.scala:1194)
>    at
> org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:51)
>    at
> org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:285)
>    at
> org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$5.apply(TreeNode.scala:299)
>    at scala.collection.Iterator$$anon$11.next(Iterator.scala:370)
>    at scala.collection.Iterator$class.foreach(Iterator.scala:742)
>    at scala.collection.AbstractIterator.foreach(Iterator.scala:1194)
>    at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
>    at
> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
>    at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
>    at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:308)
>    at scala.collection.AbstractIterator.to(Iterator.scala:1194)
>    at
> scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:300)
>    at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1194)
>    at
> scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:287)
>    at scala.collection.AbstractIterator.toArray(Iterator.scala:1194)
>    at
> org.apache.spark.sql.catalyst.trees.TreeNode.transformChildrenUp(TreeNode.scala:329)
>    at
> org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:283)
>    at
> org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$.apply(Analyzer.scala:243)
>    at
> org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$.apply(Analyzer.scala:242)
>    at
> org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1$$anonfun$apply$1.apply(RuleExecutor.scala:61)
>    at
> org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1$$anonfun$apply$1.apply(RuleExecutor.scala:59)
>    at
> scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:124)
>    at scala.collection.immutable.List.foldLeft(List.scala:84)
>    at
> org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1.apply(RuleExecutor.scala:59)
>    at
> org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1.apply(RuleExecutor.scala:51)
>    at scala.collection.immutable.List.foreach(List.scala:381)
>    at
> org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:51)
>    at
> org.apache.spark.sql.SQLContext$QueryExecution.analyzed$lzycompute(SQLContext.scala:933)
>    at
> org.apache.spark.sql.SQLContext$QueryExecution.analyzed(SQLContext.scala:933)
>    at
> org.apache.spark.sql.SQLContext$QueryExecution.assertAnalyzed(SQLContext.scala:931)
>    at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:131)
>    at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)
>    at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:755)
>    at
> com.stratio.deep.core.context.DeepSparkContext.sql(DeepSparkContext.java:220)
>    at
> com.matchinguu.analytics.core.service.PushAnalysisService.getActivePush(PushAnalysisService.java:99)
>    at
> com.matchinguu.analytics.weather.service.WeatherPushContractsGenerator.computePushContracts(WeatherPushContractsGenerator.java:193)
>    at
> com.matchinguu.analytics.weather.service.WeatherAnalyzer.finalize(WeatherAnalyzer.java:49)
>    at
> com.matchinguu.analytics.core.service.AnalyzeService.run(AnalyzeService.java:33)
>    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>    at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>    at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>    at java.lang.reflect.Method.invoke(Method.java:497)
>    at
> org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleElement.invoke(InitDestroyAnnotationBeanPostProcessor.java:349)
>    at
> org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleMetadata.invokeInitMethods(InitDestroyAnnotationBeanPostProcessor.java:300)
>    at
> org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor.postProcessBeforeInitialization(InitDestroyAnnotationBeanPostProcessor.java:133)
>    ... 16 more
>    at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
>    at
> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
>    at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
>    at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:308)
>    at scala.collection.AbstractIterator.to(Iterator.scala:1194)
>    at
> scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:300)
>    at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1194)
>    at
> scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:287)
>    at scala.collection.AbstractIterator.toArray(Iterator.scala:1194)
>    at
> org.apache.spark.sql.catalyst.trees.TreeNode.transformChildrenUp(TreeNode.scala:329)
>    at
> org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:283)
>    at
> org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$.apply(Analyzer.scala:243)
>    at
> org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$.apply(Analyzer.scala:242)
>    at
> org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1$$anonfun$apply$1.apply(RuleExecutor.scala:61)
>    at
> org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1$$anonfun$apply$1.apply(RuleExecutor.scala:59)
>    at
> scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:124)
>    at scala.collection.immutable.List.foldLeft(List.scala:84)
>    at
> org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1.apply(RuleExecutor.scala:59)
>    at
> org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1.apply(RuleExecutor.scala:51)
>    at scala.collection.immutable.List.foreach(List.scala:381)
>    at
> org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:51)
>    at
> org.apache.spark.sql.SQLContext$QueryExecution.analyzed$lzycompute(SQLContext.scala:933)
>    at
> org.apache.spark.sql.SQLContext$QueryExecution.analyzed(SQLContext.scala:933)
>    at
> org.apache.spark.sql.SQLContext$QueryExecution.assertAnalyzed(SQLContext.scala:931)
>    at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:131)
>    at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)
>    at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:755)
>    at
> com.stratio.deep.core.context.DeepSparkContext.sql(DeepSparkContext.java:220)
>    at
> com.matchinguu.analytics.core.service.PushAnalysisService.getActivePush(PushAnalysisService.java:99)
>    at
> com.matchinguu.analytics.weather.service.WeatherPushContractsGenerator.computePushContracts(WeatherPushContractsGenerator.java:193)
>    at
> com.matchinguu.analytics.weather.service.WeatherAnalyzer.finalize(WeatherAnalyzer.java:49)
>    at
> com.matchinguu.analytics.core.service.AnalyzeService.run(AnalyzeService.java:33)
>    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>    at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>    at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>    at java.lang.reflect.Method.invoke(Method.java:497)
>    at
> org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleElement.invoke(InitDestroyAnnotationBeanPostProcessor.java:349)
>    at
> org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleMetadata.invokeInitMethods(InitDestroyAnnotationBeanPostProcessor.java:300)
>    at
> org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor.postProcessBeforeInitialization(InitDestroyAnnotationBeanPostProcessor.java:133)
>    ... 16 more
> [2016-03-22 13:40:15.613] boot - 10493  INFO [Thread-3] --- Utils: Shutdown
> hook called
> [2016-03-22 13:40:15.614] boot - 10493  INFO [Thread-3] --- Utils: Deleting
> directory /tmp/spark-4c3f436a-dd1f-423f-b353-80ae8f34fdde
> 
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/Issue-wihle-applying-filters-conditions-in-DataFrame-in-Spark-tp26560p26561.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
> 

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
