Hi,

Remove

.setMaster("spark://spark-437-1-5963003:7077").set("spark.driver.host", "11.104.29.106")

and start over.
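
The driver cannot bind to 11.104.29.106 because that is a worker node's address, not an address on the machine the driver (your IDE) runs on, hence the 'sparkDriver' BindException. A minimal sketch of the conf after removing both calls, leaving the master URL to be supplied at launch time instead (job logic elided):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object RatingsCounter {
  def main(args: Array[String]): Unit = {
    // No setMaster and no spark.driver.host here; pass the master URL
    // at launch time (e.g. spark-submit --master ...) instead.
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    // ... job logic ...
    sc.stop()
  }
}
```

If you do need to run the driver from your Mac against the remote cluster, spark.driver.host (if set at all) must be an address of the Mac itself that the workers can reach.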

Can you also run the following command to check out Spark Standalone:

run-example --master spark://spark-437-1-5963003:7077 SparkPi
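
If SparkPi completes, submitting your own jar the same way should work too. A sketch, assuming the jar path and main class from your code below (both are placeholders for whatever your build produces):

```shell
spark-submit \
  --master spark://spark-437-1-5963003:7077 \
  --class RatingsCounter \
  target/ratings-counter.jar
```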

Best regards,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Mon, Sep 26, 2016 at 5:34 PM, vr spark <vrspark...@gmail.com> wrote:
> Hi,
> I use the Scala IDE for Eclipse. I usually run jobs against my local Spark
> installation on my Mac, then export the jars, copy them to my company's
> Spark cluster, and run spark-submit there.
> This works fine.
>
> But I want to run the jobs from the Scala IDE directly against my company's
> Spark cluster.
> The Spark master URL of the company cluster is
> spark://spark-437-1-5963003:7077.
> One of the worker nodes of that cluster is 11.104.29.106.
>
> I tried the option below, but I am getting an error:
>
>   val conf = new SparkConf().setAppName("Simple Application")
>     .setMaster("spark://spark-437-1-5963003:7077")
>     .set("spark.driver.host", "11.104.29.106")
>
> Please let me know what I am doing wrong.
>
> 16/09/25 08:51:51 INFO SparkContext: Running Spark version 2.0.0
>
> 16/09/25 08:51:51 WARN NativeCodeLoader: Unable to load native-hadoop
> library for your platform... using builtin-java classes where applicable
>
> 16/09/25 08:51:52 INFO SecurityManager: Changing view acls to: vr
>
> 16/09/25 08:51:52 INFO SecurityManager: Changing modify acls to: vr
>
> 16/09/25 08:51:52 INFO SecurityManager: Changing view acls groups to:
>
> 16/09/25 08:51:52 INFO SecurityManager: Changing modify acls groups to:
>
> 16/09/25 08:51:52 INFO SecurityManager: SecurityManager: authentication
> disabled; ui acls disabled; users  with view permissions: Set(vr); groups
> with view permissions: Set(); users  with modify permissions: Set(vr);
> groups with modify permissions: Set()
>
> 16/09/25 08:51:52 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
>
> (the warning above is repeated 16 times in the log)
>
> 16/09/25 08:51:52 ERROR SparkContext: Error initializing SparkContext.
>
> java.net.BindException: Can't assign requested address: Service
> 'sparkDriver' failed after 16 retries! Consider explicitly setting the
> appropriate port for the service 'sparkDriver' (for example spark.ui.port
> for SparkUI) to an available port or increasing spark.port.maxRetries.
>
> at sun.nio.ch.Net.bind0(Native Method)
>
> at sun.nio.ch.Net.bind(Net.java:433)
>
>
> full class code
>
> object RatingsCounter {
>
>   /** Our main function where the action happens */
>   def main(args: Array[String]) {
>
>     Logger.getLogger("org").setLevel(Level.INFO)
>
>     val conf = new SparkConf().setAppName("Simple Application")
>       .setMaster("spark://spark-437-1-5963003:7077")
>       .set("spark.driver.host", "11.104.29.106")
>
>     val sc = new SparkContext(conf)
>
>     val lines = sc.textFile("u.data")
>     val ratings = lines.map(x => x.toString().split("\t")(2))
>     val results = ratings.countByValue()
>     val sortedResults = results.toSeq.sortBy(_._1)
>     sortedResults.foreach(println)
>   }
> }
>
>

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscr...@spark.apache.org
