Re: livy with sparkR issue

2018-12-23 Thread Jeff Zhang
Sorry, my mistake in the last email. Only SparkR before 2.3.0 is supported.

https://github.com/apache/zeppelin/blob/master/spark/interpreter/src/main/java/org/apache/zeppelin/spark/SparkVersion.java#L88
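
If it helps to see the shape of that check, here is a minimal Python sketch of such a version gate (the helper names are mine, not Zeppelin's; it only mirrors the "before 2.3.0" statement above, comparing versions as (major, minor) tuples):

```python
def parse_version(v: str) -> tuple:
    """'x.y.z' -> comparable (major, minor) tuple; the patch level doesn't matter here."""
    major, minor = v.split(".")[:2]
    return (int(major), int(minor))

def sparkr_supported(spark_version: str) -> bool:
    """Mirror the statement above: only SparkR before 2.3.0 is supported."""
    return parse_version(spark_version) < (2, 3)

# sparkr_supported("2.2.1") -> True; "2.3.1" and "2.4.0" -> False
```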


andrew shved wrote on Mon, Dec 24, 2018 at 9:30 AM:

> Actually I get the same error even when I do something dead simple like
> below.  I ran the same commands in sparkR directly and it worked.  Does livy
> just not work with sparkR? This is with 2.3.1. It is a bit concerning
> that nothing really works via livy while it works directly via sparkR;
> would that point to a livy issue?
>
> %sparkr
> df <- createDataFrame(sqlContext, faithful)
> head(df)
>
> On Sun, Dec 23, 2018 at 6:59 PM Jeff Zhang  wrote:
>
>> This is because livy 0.5 doesn't support spark 2.4: spark 2.4 changed
>> its SparkR-related method signatures. I am afraid you have to
>> downgrade to spark 2.3.x.
>>
>>
>> andrew shved wrote on Mon, Dec 24, 2018 at 7:48 AM:
>>
>>> Spark 2.4.0 Sorry
>>> Zeppelin 0.8.0
>>> Livy 0.5
>>>
>>> regular livy.sparkr commands like
>>> 1+1 work; the issue is when spark comes into play
>>>
>>> On Sun, Dec 23, 2018 at 6:44 PM andrew shved 
>>> wrote:
>>>
 0.5 with spark 2.4.9 on AWS EMR

 On Sun., Dec. 23, 2018, 6:41 p.m. Jeff Zhang wrote:
> Which version of livy do you use?
>
> andrew shved wrote on Sun, Dec 23, 2018 at 11:49 PM:
>
>>
>> I have been struggling with zeppelin + livy + sparkR integration for days.
>> I got livy.pyspark and livy.spark to work with no issues.  With livy.sparkr I get
>>
>> 18/12/23 15:05:24 INFO SparkEntries: Created Spark session (with Hive
>> support).
>> Exception in thread "SparkR backend" java.lang.ClassCastException:
>> scala.Tuple2 cannot be cast to java.lang.Integer
>> at scala.runtime.BoxesRunTime.unboxToInt(BoxesRunTime.java:101)
>> at
>> org.apache.livy.repl.SparkRInterpreter$$anon$1.run(SparkRInterpreter.scala:83)
>> all the time :disappointed: and am running out of things to try.
>> A simple spark.R works.
>>
>> Any ideas or advice would be appreciated. Thank you!
>>
>
>
> --
> Best Regards
>
> Jeff Zhang
>

>>
>> --
>> Best Regards
>>
>> Jeff Zhang
>>
>

-- 
Best Regards

Jeff Zhang


Re: livy with sparkR issue

2018-12-23 Thread andrew shved
Actually I get the same error even when I do something dead simple like
below.  I ran the same commands in sparkR directly and it worked.  Does livy
just not work with sparkR? This is with 2.3.1. It is a bit concerning
that nothing really works via livy while it works directly via sparkR;
would that point to a livy issue?

%sparkr
df <- createDataFrame(sqlContext, faithful)
head(df)

On Sun, Dec 23, 2018 at 6:59 PM Jeff Zhang  wrote:

> This is because livy 0.5 doesn't support spark 2.4: spark 2.4 changed
> its SparkR-related method signatures. I am afraid you have to
> downgrade to spark 2.3.x.
>
>
> andrew shved wrote on Mon, Dec 24, 2018 at 7:48 AM:
>
>> Spark 2.4.0 Sorry
>> Zeppelin 0.8.0
>> Livy 0.5
>>
>> regular livy.sparkr commands like
>> 1+1 work; the issue is when spark comes into play
>>
>> On Sun, Dec 23, 2018 at 6:44 PM andrew shved 
>> wrote:
>>
>>> 0.5 with spark 2.4.9 on AWS EMR
>>>
>>> On Sun., Dec. 23, 2018, 6:41 p.m. Jeff Zhang wrote:
 Which version of livy do you use?

 andrew shved wrote on Sun, Dec 23, 2018 at 11:49 PM:

>
> I have been struggling with zeppelin + livy + sparkR integration for days.  I
> got livy.pyspark and livy.spark to work with no issues.  With livy.sparkr I get
>
> 18/12/23 15:05:24 INFO SparkEntries: Created Spark session (with Hive
> support).
> Exception in thread "SparkR backend" java.lang.ClassCastException:
> scala.Tuple2 cannot be cast to java.lang.Integer
> at scala.runtime.BoxesRunTime.unboxToInt(BoxesRunTime.java:101)
> at
> org.apache.livy.repl.SparkRInterpreter$$anon$1.run(SparkRInterpreter.scala:83)
> all the time :disappointed: and am running out of things to try.
> A simple spark.R works.
>
> Any ideas or advice would be appreciated. Thank you!
>


 --
 Best Regards

 Jeff Zhang

>>>
>
> --
> Best Regards
>
> Jeff Zhang
>


Re: livy with sparkR issue

2018-12-23 Thread andrew shved
I switched to Spark 2.3.1
and still get the same error message in the logs
when running
%sparkr
sql("select * from a")
See the full log.  How would I change that to work and get the result back ...

18/12/24 01:10:55 INFO RSCDriver: Connecting to:
ip-172-31-29-242.ca-central-1.compute.internal:10001
18/12/24 01:10:55 INFO RSCDriver: Starting RPC server...
18/12/24 01:10:55 INFO RpcServer: Connected to the port 10003
18/12/24 01:10:55 WARN RSCConf: Your hostname,
ip-172-31-29-242.ca-central-1.compute.internal, resolves to a loopback
address, but we couldn't find any external IP address!
18/12/24 01:10:55 WARN RSCConf: Set livy.rsc.rpc.server.address if you
need to bind to another address.
18/12/24 01:10:55 INFO RSCDriver: Received job request
f2d02219-7386-4cd7-8cb3-c5de4254d405
18/12/24 01:10:55 INFO RSCDriver: SparkContext not yet up, queueing job request.
18/12/24 01:10:58 INFO SparkEntries: Starting Spark context...
18/12/24 01:10:58 INFO SparkContext: Running Spark version 2.3.1
18/12/24 01:10:58 INFO SparkContext: Submitted application: livy-session-0
18/12/24 01:10:58 INFO SecurityManager: Changing view acls to: livy
18/12/24 01:10:58 INFO SecurityManager: Changing modify acls to: livy
18/12/24 01:10:58 INFO SecurityManager: Changing view acls groups to:
18/12/24 01:10:58 INFO SecurityManager: Changing modify acls groups to:
18/12/24 01:10:58 INFO SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users  with view
permissions: Set(livy); groups with view permissions: Set(); users
with modify permissions: Set(livy); groups with modify permissions:
Set()
18/12/24 01:10:58 INFO Utils: Successfully started service
'sparkDriver' on port 35045.
18/12/24 01:10:58 INFO SparkEnv: Registering MapOutputTracker
18/12/24 01:10:58 INFO SparkEnv: Registering BlockManagerMaster
18/12/24 01:10:58 INFO BlockManagerMasterEndpoint: Using
org.apache.spark.storage.DefaultTopologyMapper for getting topology
information
18/12/24 01:10:58 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
18/12/24 01:10:58 INFO DiskBlockManager: Created local directory at
/mnt/tmp/blockmgr-ba12ad73-dcd8-4216-8ee3-649db6d5002d
18/12/24 01:10:58 INFO MemoryStore: MemoryStore started with capacity 413.9 MB
18/12/24 01:10:58 INFO SparkEnv: Registering OutputCommitCoordinator
18/12/24 01:10:58 INFO Utils: Successfully started service 'SparkUI'
on port 4040.
18/12/24 01:10:58 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started
at http://ip-172-31-29-242.ca-central-1.compute.internal:4040
18/12/24 01:10:58 INFO SparkContext: Added JAR
file:/usr/lib/livy/rsc-jars/netty-all-4.0.37.Final.jar at
spark://ip-172-31-29-242.ca-central-1.compute.internal:35045/jars/netty-all-4.0.37.Final.jar
with timestamp 1545613858546
18/12/24 01:10:58 INFO SparkContext: Added JAR
file:/usr/lib/livy/rsc-jars/livy-rsc-0.5.0-incubating.jar at
spark://ip-172-31-29-242.ca-central-1.compute.internal:35045/jars/livy-rsc-0.5.0-incubating.jar
with timestamp 1545613858547
18/12/24 01:10:58 INFO SparkContext: Added JAR
file:/usr/lib/livy/rsc-jars/livy-api-0.5.0-incubating.jar at
spark://ip-172-31-29-242.ca-central-1.compute.internal:35045/jars/livy-api-0.5.0-incubating.jar
with timestamp 1545613858547
18/12/24 01:10:58 INFO SparkContext: Added JAR
file:/usr/lib/livy/repl_2.11-jars/livy-core_2.11-0.5.0-incubating.jar
at 
spark://ip-172-31-29-242.ca-central-1.compute.internal:35045/jars/livy-core_2.11-0.5.0-incubating.jar
with timestamp 1545613858547
18/12/24 01:10:58 INFO SparkContext: Added JAR
file:/usr/lib/livy/repl_2.11-jars/livy-repl_2.11-0.5.0-incubating.jar
at 
spark://ip-172-31-29-242.ca-central-1.compute.internal:35045/jars/livy-repl_2.11-0.5.0-incubating.jar
with timestamp 1545613858547
18/12/24 01:10:58 INFO SparkContext: Added JAR
file:/usr/lib/livy/repl_2.11-jars/commons-codec-1.9.jar at
spark://ip-172-31-29-242.ca-central-1.compute.internal:35045/jars/commons-codec-1.9.jar
with timestamp 1545613858547
18/12/24 01:10:58 INFO Utils: Using initial executors = 0, max of
spark.dynamicAllocation.initialExecutors,
spark.dynamicAllocation.minExecutors and spark.executor.instances
18/12/24 01:10:59 INFO RMProxy: Connecting to ResourceManager at
ip-172-31-29-242.ca-central-1.compute.internal/172.31.29.242:8032
18/12/24 01:10:59 INFO Client: Requesting a new application from
cluster with 1 NodeManagers
18/12/24 01:10:59 INFO Client: Verifying our application has not
requested more than the maximum memory capability of the cluster
(57344 MB per container)
18/12/24 01:10:59 INFO Client: Will allocate AM container, with 896 MB
memory including 384 MB overhead
18/12/24 01:10:59 INFO Client: Setting up container launch context for our AM
18/12/24 01:10:59 INFO Client: Setting up the launch environment for
our AM container
18/12/24 01:10:59 INFO Client: Preparing resources for our AM container
18/12/24 01:10:59 WARN Client: Neither spark.yarn.jars nor
spark.yarn.archive is set, falling back to uploading libraries under
SPARK_HOME.
18/12/24 01:11:01 
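
One way to take Zeppelin out of the picture and reproduce this is to submit the same statement straight to Livy's REST API (POST /sessions/{id}/statements, with kind "sparkr"). A minimal Python sketch that only builds the request; the host, port, and session id are assumptions:

```python
import json

LIVY_URL = "http://localhost:8998"  # assumption: default Livy host/port
SESSION_ID = 0                      # assumption: an existing sparkr session

def sparkr_statement(code: str) -> tuple:
    """Build (url, json_body) for Livy's POST /sessions/{id}/statements."""
    url = f"{LIVY_URL}/sessions/{SESSION_ID}/statements"
    body = json.dumps({"code": code, "kind": "sparkr"})
    return url, body

url, body = sparkr_statement('sql("select * from a")')
# then e.g.: requests.post(url, data=body, headers={"Content-Type": "application/json"})
```

If the same ClassCastException appears in the Livy logs from this path too, the problem is in Livy's SparkR backend rather than in Zeppelin.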

Re: livy with sparkR issue

2018-12-23 Thread Jiang Jacky
You can't directly cast a Tuple2 to an Integer. You should get the integer
out of the Tuple2 first, then cast that in your workflow.
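
As a plain-Python analogy of that extract-then-cast point (only an illustration: the actual cast happens inside Livy's SparkRInterpreter, not in user code, so the practical fix is still the Spark downgrade discussed above):

```python
pair = (8080, "backend-port")  # stands in for a scala.Tuple2

try:
    value = int(pair)   # fails, just like casting Tuple2 to Integer on the JVM
except TypeError:
    value = pair[0]     # extract the element first, then use it as an int

print(value)  # 8080
```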



From: Jeff Zhang 
Sent: Sunday, December 23, 2018 6:59 PM
To: user
Subject: Re: livy with sparkR issue

This is because livy 0.5 doesn't support spark 2.4: spark 2.4 changed its
SparkR-related method signatures. I am afraid you have to downgrade to spark
2.3.x.


andrew shved <andrewshved.w...@gmail.com>
wrote on Mon, Dec 24, 2018 at 7:48 AM:
Spark 2.4.0 Sorry
Zeppelin 0.8.0
Livy 0.5

regular livy.sparkr commands like
1+1 work; the issue is when spark comes into play

On Sun, Dec 23, 2018 at 6:44 PM andrew shved
<andrewshved.w...@gmail.com> wrote:
0.5 with spark 2.4.9 on AWS EMR

On Sun., Dec. 23, 2018, 6:41 p.m. Jeff Zhang
<zjf...@gmail.com> wrote:
Which version of livy do you use ?

andrew shved <andrewshved.w...@gmail.com>
wrote on Sun, Dec 23, 2018 at 11:49 PM:

I have been struggling with zeppelin + livy + sparkR integration for days.  I got
livy.pyspark and livy.spark to work with no issues.  With livy.sparkr I get

18/12/23 15:05:24 INFO SparkEntries: Created Spark session (with Hive support).
Exception in thread "SparkR backend" java.lang.ClassCastException: scala.Tuple2 
cannot be cast to java.lang.Integer
at scala.runtime.BoxesRunTime.unboxToInt(BoxesRunTime.java:101)
at 
org.apache.livy.repl.SparkRInterpreter$$anon$1.run(SparkRInterpreter.scala:83)
all the time :disappointed: and am running out of things to try.
A simple spark.R works.

Any ideas or advice would be appreciated. Thank you!


--
Best Regards

Jeff Zhang


--
Best Regards

Jeff Zhang


Re: livy with sparkR issue

2018-12-23 Thread Jeff Zhang
This is because livy 0.5 doesn't support spark 2.4: spark 2.4 changed
its SparkR-related method signatures. I am afraid you have to
downgrade to spark 2.3.x.


andrew shved wrote on Mon, Dec 24, 2018 at 7:48 AM:

> Spark 2.4.0 Sorry
> Zeppelin 0.8.0
> Livy 0.5
>
> regular livy.sparkr commands like
> 1+1 work; the issue is when spark comes into play
>
> On Sun, Dec 23, 2018 at 6:44 PM andrew shved 
> wrote:
>
>> 0.5 with spark 2.4.9 on AWS EMR
>>
>> On Sun., Dec. 23, 2018, 6:41 p.m. Jeff Zhang wrote:
>>> Which version of livy do you use?
>>>
>>> andrew shved wrote on Sun, Dec 23, 2018 at 11:49 PM:
>>>

 I have been struggling with zeppelin + livy + sparkR integration for days.  I
 got livy.pyspark and livy.spark to work with no issues.  With livy.sparkr I get

 18/12/23 15:05:24 INFO SparkEntries: Created Spark session (with Hive
 support).
 Exception in thread "SparkR backend" java.lang.ClassCastException:
 scala.Tuple2 cannot be cast to java.lang.Integer
 at scala.runtime.BoxesRunTime.unboxToInt(BoxesRunTime.java:101)
 at
 org.apache.livy.repl.SparkRInterpreter$$anon$1.run(SparkRInterpreter.scala:83)
 all the time :disappointed: and am running out of things to try.
 A simple spark.R works.

 Any ideas or advice would be appreciated. Thank you!

>>>
>>>
>>> --
>>> Best Regards
>>>
>>> Jeff Zhang
>>>
>>

-- 
Best Regards

Jeff Zhang


Re: livy with sparkR issue

2018-12-23 Thread andrew shved
0.5 with spark 2.4.9 on AWS EMR

On Sun., Dec. 23, 2018, 6:41 p.m. Jeff Zhang wrote:
> Which version of livy do you use?
>
> andrew shved wrote on Sun, Dec 23, 2018 at 11:49 PM:
>
>>
>> I have been struggling with zeppelin + livy + sparkR integration for days.
>> I got livy.pyspark and livy.spark to work with no issues.  With livy.sparkr I get
>>
>> 18/12/23 15:05:24 INFO SparkEntries: Created Spark session (with Hive
>> support).
>> Exception in thread "SparkR backend" java.lang.ClassCastException:
>> scala.Tuple2 cannot be cast to java.lang.Integer
>> at scala.runtime.BoxesRunTime.unboxToInt(BoxesRunTime.java:101)
>> at
>> org.apache.livy.repl.SparkRInterpreter$$anon$1.run(SparkRInterpreter.scala:83)
>> all the time :disappointed: and am running out of things to try.
>> A simple spark.R works.
>>
>> Any ideas or advice would be appreciated. Thank you!
>>
>
>
> --
> Best Regards
>
> Jeff Zhang
>


Re: livy with sparkR issue

2018-12-23 Thread Jeff Zhang
Which version of livy do you use?

andrew shved wrote on Sun, Dec 23, 2018 at 11:49 PM:

>
> I have been struggling with zeppelin + livy + sparkR integration for days.  I got
> livy.pyspark and livy.spark to work with no issues.  With livy.sparkr I get
>
> 18/12/23 15:05:24 INFO SparkEntries: Created Spark session (with Hive
> support).
> Exception in thread "SparkR backend" java.lang.ClassCastException:
> scala.Tuple2 cannot be cast to java.lang.Integer
> at scala.runtime.BoxesRunTime.unboxToInt(BoxesRunTime.java:101)
> at
> org.apache.livy.repl.SparkRInterpreter$$anon$1.run(SparkRInterpreter.scala:83)
> all the time :disappointed: and am running out of things to try.
> A simple spark.R works.
>
> Any ideas or advice would be appreciated. Thank you!
>


-- 
Best Regards

Jeff Zhang