Hmmmm... I don't follow. The 2.11.x series is supposed to be binary compatible
with user code. Anyway, I was building Spark against 2.11.2 and still saw
the problems with the REPL. I've created a bug report:

https://issues.apache.org/jira/browse/SPARK-6989

I hope this helps.

Cheers,

Michael

> On Apr 17, 2015, at 1:41 AM, Sean Owen <so...@cloudera.com> wrote:
> 
> Doesn't this reduce to "Scala isn't compatible with itself across
> maintenance releases"? Meaning, if this were "fixed" then Scala
> 2.11.{x < 6} would have similar failures. It's not that it's not ready;
> it's just not the Scala 2.11.6 REPL. Still, sure, I'd favor breaking the
> unofficial support to at least make the latest Scala 2.11 release the
> unbroken one.
> 
> On Fri, Apr 17, 2015 at 7:58 AM, Michael Allman <mich...@videoamp.com> wrote:
>> FWIW, this is an essential feature for our use of Spark, and I'm surprised
>> it's not advertised clearly as a limitation in the documentation. All I've
>> found about running Spark 1.3 on 2.11 is here:
>> 
>> http://spark.apache.org/docs/latest/building-spark.html#building-for-scala-211
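>> 
>> For anyone reproducing, the build sequence from those docs is roughly the
>> following (the -Pyarn/-Phadoop-2.4 profiles are my assumption; use whatever
>> matches your cluster):
>> 
>>   dev/change-version-to-2.11.sh
>>   mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package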
>> 
>> Also, I'm experiencing some serious stability problems simply trying to run
>> the Spark 1.3 Scala 2.11 REPL. Most of the time it fails to load and spews a
>> torrent of compiler assertion failures, etc. See attached.
>> 
>> Unfortunately, it appears the Spark 1.3 Scala 2.11 REPL is simply not ready
>> for production use. I was going to file a bug, but it seems clear that the
>> current implementation is going to need to be forward-ported to Scala 2.11.6
>> anyway. We already have an issue for that:
>> 
>> https://issues.apache.org/jira/browse/SPARK-6155
>> 
>> Michael
>> 
>> 
>> On Apr 9, 2015, at 10:29 PM, Prashant Sharma <scrapco...@gmail.com> wrote:
>> 
>> You will have to go to commit ID
>> 191d7cf2a655d032f160b9fa181730364681d0e7 in Apache Spark. [1] Once you are
>> at that commit, you need to review the changes made to the REPL code, look
>> for the corresponding occurrences in the Scala 2.11 REPL source, and make
>> it all work there.
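>> 
>> Concretely, something like this should surface the relevant diff (assuming
>> a plain clone of the Apache Spark repo):
>> 
>>   git clone https://github.com/apache/spark.git
>>   cd spark
>>   git show 191d7cf2a655d032f160b9fa181730364681d0e7 -- repl/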
>> 
>> 
>> Thanks,
>> 
>> 1. http://githowto.com/getting_old_versions
>> 
>> Prashant Sharma
>> 
>> 
>> 
>> On Thu, Apr 9, 2015 at 4:40 PM, Alex Nakos <ana...@gmail.com> wrote:
>>> 
>>> Ok, what do i need to do in order to migrate the patch?
>>> 
>>> Thanks
>>> Alex
>>> 
>>> On Thu, Apr 9, 2015 at 11:54 AM, Prashant Sharma <scrapco...@gmail.com>
>>> wrote:
>>>> 
>>>> This is the JIRA I referred to:
>>>> https://issues.apache.org/jira/browse/SPARK-3256. Another reason for not
>>>> working on it yet was weighing the priority between upgrading to Scala
>>>> 2.11.5 (non-trivial, I suppose, because the REPL has changed a bit) and
>>>> migrating that patch, which is much simpler.
>>>> 
>>>> Prashant Sharma
>>>> 
>>>> 
>>>> 
>>>> On Thu, Apr 9, 2015 at 4:16 PM, Alex Nakos <ana...@gmail.com> wrote:
>>>>> 
>>>>> Hi-
>>>>> 
>>>>> Was this the JIRA issue?
>>>>> https://issues.apache.org/jira/browse/SPARK-2988
>>>>> 
>>>>> Any help in getting this working would be much appreciated!
>>>>> 
>>>>> Thanks
>>>>> Alex
>>>>> 
>>>>> On Thu, Apr 9, 2015 at 11:32 AM, Prashant Sharma <scrapco...@gmail.com>
>>>>> wrote:
>>>>>> 
>>>>>> You are right, this needs to be done. I can work on it soon; I was not
>>>>>> sure anyone was even using the Scala 2.11 Spark REPL. There is also a
>>>>>> patch in the Scala 2.10 shell to support adding jars (I've lost the
>>>>>> JIRA ID) which has to be ported to Scala 2.11 too. If, however, you (or
>>>>>> anyone else) are planning to work on it, I can help.
>>>>>> 
>>>>>> Prashant Sharma
>>>>>> 
>>>>>> 
>>>>>> 
>>>>>> On Thu, Apr 9, 2015 at 3:08 PM, anakos <ana...@gmail.com> wrote:
>>>>>>> 
>>>>>>> Hi-
>>>>>>> 
>>>>>>> I am having difficulty getting the 1.3.0 Spark shell to find an
>>>>>>> external jar. I have built Spark locally for Scala 2.11 and I am
>>>>>>> starting the REPL as follows:
>>>>>>> 
>>>>>>> bin/spark-shell --master yarn --jars data-api-es-data-export-4.0.0.jar
>>>>>>> 
>>>>>>> I see the following line in the console output:
>>>>>>> 
>>>>>>> 15/04/09 09:52:15 INFO spark.SparkContext: Added JAR
>>>>>>> 
>>>>>>> file:/opt/spark/spark-1.3.0_2.11-hadoop2.3/data-api-es-data-export-4.0.0.jar
>>>>>>> at http://192.168.115.31:54421/jars/data-api-es-data-export-4.0.0.jar
>>>>>>> with
>>>>>>> timestamp 1428569535904
>>>>>>> 
>>>>>>> but when I try to import anything from this jar, it's simply not
>>>>>>> available.
>>>>>>> When I try to add the jar manually using the command
>>>>>>> 
>>>>>>> :cp /path/to/jar
>>>>>>> 
>>>>>>> the classes in the jar are still unavailable. I understand that 2.11
>>>>>>> is not
>>>>>>> officially supported, but has anyone been able to get an external jar
>>>>>>> loaded
>>>>>>> in the 1.3.0 release?  Is this a known issue? I have tried searching
>>>>>>> around
>>>>>>> for answers but the only thing I've found that may be related is this:
>>>>>>> 
>>>>>>> https://issues.apache.org/jira/browse/SPARK-3257
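>>>>>>> 
>>>>>>> For concreteness, the failure looks like this in the shell, where
>>>>>>> com.example.export.Foo is a hypothetical stand-in for a class that
>>>>>>> really is in the jar:
>>>>>>> 
>>>>>>>   scala> import com.example.export.Foo   // hypothetical class; fails to resolve
>>>>>>>   scala> Class.forName("com.example.export.Foo")   // I'd expect ClassNotFoundException too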
>>>>>>> 
>>>>>>> Any/all help is much appreciated.
>>>>>>> Thanks
>>>>>>> Alex
>>>>>>> 
>>>>>> 
>>>>> 
>>>> 
>>> 
>> 
>> 
>> 
