Sorry, my mistake in the last email. Only SparkR before 2.3.0 is supported.

https://github.com/apache/zeppelin/blob/master/spark/interpreter/src/main/java/org/apache/zeppelin/spark/SparkVersion.java#L88
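
To make that rule concrete, here is a minimal, illustrative Scala sketch of a
version guard in the same spirit as the linked check (the names are made up
here; this is not the actual SparkVersion.java code):

  object SparkRSupport {
    // Parse "2.3.1" or "2.4.0-SNAPSHOT" into (major, minor).
    private def majorMinor(version: String): (Int, Int) = {
      val parts = version.split("[.-]")
      (parts(0).toInt, parts(1).toInt)
    }

    // Rule as stated above: SparkR through Livy 0.5 only works for
    // Spark versions before 2.3.0.
    def sparkRSupported(sparkVersion: String): Boolean = {
      val (major, minor) = majorMinor(sparkVersion)
      major < 2 || (major == 2 && minor < 3)
    }
  }

  // SparkRSupport.sparkRSupported("2.2.1")  => true
  // SparkRSupport.sparkRSupported("2.4.0")  => false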


andrew shved <andrewshved.w...@gmail.com> wrote on Monday, December 24, 2018 at 9:30 AM:

> Actually I get the same error even when I do something dead simple like
> below.  I ran the same commands in sparkR directly and it worked.  Does
> Livy just not work with SparkR? This is with 2.3.1. It is a bit concerning
> that nothing really works via Livy while it works directly via sparkR;
> wouldn't that point to a Livy issue?
>
> %sparkr
> df <- createDataFrame(sqlContext, faithful)
> head(df)
>
> On Sun, Dec 23, 2018 at 6:59 PM Jeff Zhang <zjf...@gmail.com> wrote:
>
>> This is because Livy 0.5 doesn't support Spark 2.4: Spark 2.4 changed
>> its SparkR-related method signatures. I am afraid you have to
>> downgrade to Spark 2.3.x.
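>>
>> To make the failure mode concrete, here is a rough, self-contained Scala
>> sketch of that kind of mismatch (class and method names are illustrative,
>> not the actual Livy or Spark code):
>>
>>   // Older Spark: the SparkR backend's init returns just a port (Int).
>>   // Newer Spark: the same call returns a (port, authHelper) pair.
>>   object NewStyleBackend {
>>     def init(): (Int, String) = (12345, "auth-secret")  // a Tuple2
>>   }
>>
>>   object CallerBuiltForOldApi {
>>     def main(args: Array[String]): Unit = {
>>       // A caller written against the old signature still expects an Int
>>       // and unboxes the result, e.g. after a reflective call:
>>       val result: AnyRef = NewStyleBackend.init()
>>       val port = result.asInstanceOf[java.lang.Integer]
>>       // throws java.lang.ClassCastException:
>>       //   scala.Tuple2 cannot be cast to java.lang.Integer
>>       println(port)
>>     }
>>   }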
>>
>>
>> andrew shved <andrewshved.w...@gmail.com> wrote on Monday, December 24, 2018 at 7:48 AM:
>>
>>> Spark 2.4.0, sorry.
>>> Zeppelin 0.8.0
>>> Livy 0.5
>>>
>>> Regular livy.sparkr commands like
>>> 1+1 work; the issue is when Spark comes into play.
>>>
>>> On Sun, Dec 23, 2018 at 6:44 PM andrew shved <andrewshved.w...@gmail.com>
>>> wrote:
>>>
>>>> 0.5 with spark 2.4.9 on AWS EMR
>>>>
>>>> On Sun., Dec. 23, 2018, 6:41 p.m. Jeff Zhang <zjf...@gmail.com> wrote:
>>>>
>>>>> Which version of livy do you use ?
>>>>>
>>>>> andrew shved <andrewshved.w...@gmail.com> wrote on Sunday, December 23, 2018 at 11:49 PM:
>>>>>
>>>>>>
>>>>>> I have been struggling with Zeppelin + Livy + SparkR integration for days.
>>>>>> I got livy.pyspark and livy.spark to work with no issues.  With livy.sparkr I get:
>>>>>>
>>>>>> 18/12/23 15:05:24 INFO SparkEntries: Created Spark session (with Hive
>>>>>> support).
>>>>>> Exception in thread "SparkR backend" java.lang.ClassCastException:
>>>>>> scala.Tuple2 cannot be cast to java.lang.Integer
>>>>>>     at scala.runtime.BoxesRunTime.unboxToInt(BoxesRunTime.java:101)
>>>>>>     at
>>>>>> org.apache.livy.repl.SparkRInterpreter$$anon$1.run(SparkRInterpreter.scala:83)
>>>>>> all the time :disappointed:, and I'm running out of things to try.
>>>>>> Simple spark.R works.
>>>>>>
>>>>>> Any ideas or advice would be appreciated. Thank you!
>>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> Best Regards
>>>>>
>>>>> Jeff Zhang
>>>>>
>>>>
>>
>> --
>> Best Regards
>>
>> Jeff Zhang
>>
>

-- 
Best Regards

Jeff Zhang
