Then why don't you use nscala-time_2.10-1.8.0.jar instead of
nscala-time_2.11-1.8.0.jar?
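If it helps, the Scala 2.10 build of the jar can be put on the shell's classpath at startup. A minimal sketch, assuming the jar sits in the current directory (the path is illustrative, not from the thread):

```shell
# Launch spark-shell with the Scala 2.10 build of nscala-time on the classpath.
# In Spark 1.x, --jars adds the jar for both the driver and the executors.
spark-shell --jars nscala-time_2.10-1.8.0.jar
```

Inside the shell, `import com.github.nscala_time.time.Imports._` should then resolve, provided the jar's Scala version matches the one Spark was built with.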
On Tue Feb 17 2015 at 5:55:50 PM Hammam CHAMSI wrote:
I can use nscala-time with plain Scala, but my issue is that I can't use it within
the spark-shell console! It gives me the error below:
Great, or you can just use the nscala-time build for Scala 2.10!
On Tue Feb 17 2015 at 5:41:53 PM Hammam CHAMSI wrote:
Thanks Kevin for your reply,
I downloaded the pre-built version and, as you said, the default Spark Scala
version is 2.10. I'm now building Spark
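For reference, building Spark against Scala 2.11 in this era went roughly as follows. This is a sketch assuming a Maven build from the Spark source tree; the script and profile names changed across releases, so the "Building Spark" page for your version is the authority:

```shell
# Rewrite the POMs for Scala 2.11 (script name used around Spark 1.2/1.3).
dev/change-version-to-2.11.sh
# Build with the Scala 2.11 profile, skipping tests to save time.
mvn -Dscala-2.11 -DskipTests clean package
```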
Re: Use of nscala-time within spark-shell
To: hscha...@hotmail.com; user@spark.apache.org
Which Scala version was used to build your Spark?
It seems your nscala-time library is built for Scala 2.11,
while the default Spark build uses Scala 2.10.
On Tue Feb 17 2015 at 1:51:47 AM Hammam CHAMSI wrote:
Hi All
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Your help is very appreciated,
Regards,
Hammam