r run --net=host
>..."
>3. use yarn-cluster mode (see SPARK-5162
><https://issues.apache.org/jira/browse/SPARK-5162>)
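>
> A rough sketch of the two workarounds above; the image name and
> application script are placeholders, not from this thread:
>
> ```shell
> # Option 1 (hypothetical image name "my-pyspark-image"): run the container
> # with host networking so YARN's AM/executors can reach the driver's
> # dynamically bound ports on the host directly.
> docker run --net=host -it my-pyspark-image pyspark
>
> # Option 3: submit the application in yarn-cluster mode (see SPARK-5162),
> # so the driver runs inside the YARN cluster instead of in the container.
> # "my_app.py" is a placeholder for your application script.
> spark-submit --master yarn-cluster my_app.py
> ```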
>
>
> Hope this helps,
> Eron
>
>
> --
> Date: Wed, 10 Jun 2015 13:43:04 -0700
> Subject: Problem with pyspark on Docker talking to YARN cluster
> From: ashwinshanka...@gmail.com
> To: dev@spark.apache.org; u...@spark.apache.org
All,
I was wondering if any of you have solved this problem:
I have pyspark (IPython mode) running in Docker, talking to
a YARN cluster (the AM/executors are NOT running in Docker).
When I start pyspark in the Docker container, it binds to port *49460*.
Once the app is submitted to YARN, the app (AM) o