Re: JMXSink for YARN deployment

2016-01-12 Thread Kyle Lin
Hello there, I run both the driver and master on the same node, so I got a "Port already in use" exception. Is there any way to set a different port for each component? Kyle 2015-12-05 5:54 GMT+08:00 spearson23 : > Run "spark-submit --help" to see all available options. >

Re: JMXSink for YARN deployment

2016-01-12 Thread Kyle Lin
Hello guys, I found a solution: setting -Dcom.sun.management.jmxremote.port=0 lets the system assign an unused port. Kyle 2016-01-12 16:54 GMT+08:00 Kyle Lin <kylelin2...@gmail.com>: > Hello there > > > I run both driver and master on the same node, so I got "Port alre
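For reference, the fix above can be sketched as a configuration fragment (file names and option layout follow the standard Spark metrics setup; verify against your version's conf/metrics.properties.template):

```properties
# conf/metrics.properties -- enable the JMX sink for all instances
*.sink.jmx.class=org.apache.spark.metrics.sink.JmxSink

# spark-defaults.conf -- let the JVM pick a free JMX port per component,
# so the driver and master on the same node no longer collide
spark.driver.extraJavaOptions    -Dcom.sun.management.jmxremote -Dcom.sun.management.jmxremote.port=0 -Dcom.sun.management.jmxremote.authenticate=false -Dcom.sun.management.jmxremote.ssl=false
spark.executor.extraJavaOptions  -Dcom.sun.management.jmxremote -Dcom.sun.management.jmxremote.port=0 -Dcom.sun.management.jmxremote.authenticate=false -Dcom.sun.management.jmxremote.ssl=false
```

The trade-off of port=0 is that the assigned port is random, so you have to discover it (e.g. by attaching JConsole to the local process) rather than connecting to a fixed address.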

Re: Yarn application ID for Spark job on Yarn

2015-12-18 Thread Kyle Lin
Hello there, I have the same requirement. I submit a streaming job in yarn-cluster mode. If I want to shut down this endless YARN application, I have to find the application id myself and run "yarn application -kill " to kill it. Therefore, if I can get the returned application
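When the id is not returned to the submitter, one workaround is to scrape it out of the text printed by `yarn application -list` before killing it. A minimal sketch (the sample listing below is hypothetical; the exact format varies across Hadoop versions):

```python
import re

def parse_application_ids(listing: str):
    """Extract YARN application ids (application_<timestamp>_<seq>)
    from the text printed by `yarn application -list`."""
    return re.findall(r"application_\d+_\d+", listing)

# Hypothetical sample of `yarn application -list` output:
sample = """
Total number of applications (application-types: [] and states: [RUNNING]):1
                Application-Id     Application-Name   Application-Type
application_1450000000000_0001     MyStreamingJob             SPARK
"""
print(parse_application_ids(sample))  # -> ['application_1450000000000_0001']
```

From inside the job itself, newer Spark versions (around 1.5+) also expose the id directly as `sc.applicationId`, which avoids the scraping entirely.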

Re: Dynamic Allocation & Spark Streaming

2015-11-06 Thread Kyle Lin
Hey there, I run Spark Streaming 1.5.1 on YARN with dynamic allocation, and use the direct stream API to read data from Kafka. The Spark job can dynamically request an executor when reaching spark.dynamicAllocation.schedulerBacklogTimeout. However, it won't dynamically remove executors when there is no
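For reference, a hedged configuration sketch of the settings involved (option names are the standard dynamic-allocation keys; the values are illustrative). One common reason streaming executors are never released is that a task runs every batch interval, so no executor stays idle for executorIdleTimeout; executors holding cached blocks are additionally only removed after the separate cachedExecutorIdleTimeout, which defaults to infinity:

```properties
# spark-defaults.conf -- dynamic allocation on YARN (values illustrative)
spark.dynamicAllocation.enabled                    true
spark.shuffle.service.enabled                      true   # required on YARN
spark.dynamicAllocation.minExecutors               1
spark.dynamicAllocation.maxExecutors               10
spark.dynamicAllocation.schedulerBacklogTimeout    5s
spark.dynamicAllocation.executorIdleTimeout        60s
spark.dynamicAllocation.cachedExecutorIdleTimeout  120s   # default: infinity
```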

Re: spark.files.userClassPathFirst=true Return Error - Please help

2015-08-14 Thread Kyle Lin
Hi all, I had a similar usage and also hit the same problem. I guess Spark uses some classes from my user jars when it should actually use the ones in spark-assembly-xxx.jar, but I don't know how to fix it. Kyle 2015-07-22 23:03 GMT+08:00 Ashish Soni asoni.le...@gmail.com: Hi All , I am getting

Re: spark.files.userClassPathFirst=true Return Error - Please help

2015-08-14 Thread Kyle Lin
configuration? Thanks Best Regards On Fri, Aug 14, 2015 at 2:05 PM, Kyle Lin kylelin2...@gmail.com wrote: Hi all, I had a similar usage and also hit the same problem. I guess Spark uses some classes from my user jars when it should actually use the ones in spark-assembly-xxx.jar, but I don't know how
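For reference, in Spark 1.3+ the experimental spark.files.userClassPathFirst flag was superseded by per-component options; a hedged configuration sketch (the usual alternative, shading the conflicting dependency in your own build, avoids the flag entirely):

```properties
# spark-defaults.conf -- prefer user jars over Spark's classpath (Spark 1.3+)
spark.driver.userClassPathFirst    true
spark.executor.userClassPathFirst  true
```

Because these options are still marked experimental, shading or relocating the conflicting classes (e.g. with the maven-shade-plugin) is generally the more robust fix for assembly-jar conflicts like the one described here.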

The Processing loading of Spark streaming on YARN is not in balance

2015-04-30 Thread Kyle Lin
Hi all, my environment info: Hadoop release version: HDP 2.1; Kafka: 0.8.1.2.1.4.0; Spark: 1.1.0. My question: I ran a Spark Streaming program on YARN. It reads data from Kafka and does some processing. However, I found there is always only ONE executor doing the processing. As

Re: The Processing loading of Spark streaming on YARN is not in balance

2015-04-30 Thread Kyle Lin
? and what about the number of topic partitions from Kafka? Best regards, Lin Hao XU IBM Research China Email: xulin...@cn.ibm.com My Flickr: http://www.flickr.com/photos/xulinhao/sets Kyle Lin

Re: The Processing loading of Spark streaming on YARN is not in balance

2015-04-30 Thread Kyle Lin
You have one receiver with a storage level of two copies, so mostly your tasks are located on two executors. You could use repartition to redistribute the data more evenly across the executors. Adding more receivers is another solution. 2015-04-30 14:38 GMT+08:00 Kyle Lin kylelin2...@gmail.com: Hi all
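The suggestion above can be sketched as follows (Scala-style pseudocode for receiver-based Kafka input; names like process and the parallelism of 8 are illustrative, not from the thread):

```scala
// Receiver-based Kafka input lands its blocks on the receiver's executor(s);
// repartition shuffles them across the cluster before the heavy work.
val stream = KafkaUtils.createStream(ssc, zkQuorum, groupId, topicMap)
stream.repartition(8)  // roughly match your total executor/core count
      .foreachRDD(rdd => rdd.foreach(process))
```

The repartition adds a shuffle per batch, so it trades some overhead for even load; with enough Kafka partitions, multiple receivers (or the direct stream API) sidestep the problem at the source.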

Re: Unable to start Spark 1.3 after building:java.lang. NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2014-12-17 Thread Kyle Lin
version of Hadoop that Spark supports is 2.4, not 2.6. Nick On Wed Dec 17 2014 at 2:09:56 AM Kyle Lin kylelin2...@gmail.com wrote: I also got the same problem.. 2014-12-09 22:58 GMT+08:00 Daniel Haviv danielru...@gmail.com: Hi, I've built spark 1.3 with hadoop 2.6

Re: Running Spark Job on Yarn from Java Code

2014-12-16 Thread Kyle Lin
Hi there, I also got an exception when running the Pi example on YARN. Spark version: spark-1.1.1-bin-hadoop2.4. My environment: Hortonworks HDP 2.2. My command: ./bin/spark-submit --master yarn-cluster --class org.apache.spark.examples.SparkPi lib/spark-examples*.jar 10 Output logs: 14/12/17 14:06:32
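On HDP, yarn-cluster jobs commonly fail because the ${hdp.version} placeholder in the cluster's classpath settings is never substituted; a commonly cited workaround is to pin it explicitly (the version string below is a placeholder for your installation's value, e.g. as reported by hdp-select; option names per the Spark-on-YARN configuration docs):

```properties
# spark-defaults.conf -- pin hdp.version so YARN classpath entries resolve
spark.driver.extraJavaOptions   -Dhdp.version=2.2.x.x-xxxx
spark.yarn.am.extraJavaOptions  -Dhdp.version=2.2.x.x-xxxx
```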

Re: Unable to start Spark 1.3 after building:java.lang. NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2014-12-16 Thread Kyle Lin
I also got the same problem.. 2014-12-09 22:58 GMT+08:00 Daniel Haviv danielru...@gmail.com: Hi, I've built spark 1.3 with hadoop 2.6 but when I start up the spark-shell I get the following exception: 14/12/09 06:54:24 INFO server.AbstractConnector: Started
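If rebuilding as suggested upthread, a typical Spark 1.x build line against a supported Hadoop profile looks like this (profile and version are illustrative; check the building-spark docs for your exact release):

```shell
# Build Spark against the Hadoop 2.4 profile instead of unsupported 2.6
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package
```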