Is it just a typo in the email, or are you missing a space after your
--master argument?
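To illustrate the point about the missing space: spark-submit expects the flag and its value as two separate tokens. A sketch of both forms (host, port, and script name are made up for illustration):

```shell
# Correct: "--master" and its URL are separate, space-delimited tokens
spark-submit --master spark://master-host:7077 my_app.py

# Broken: the shell passes "--masterspark://master-host:7077" as a single
# token, which spark-submit does not recognize as the --master option
spark-submit --masterspark://master-host:7077 my_app.py
```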
The logs here actually don't say much beyond "something went wrong". It seems
fairly low-level, like the gateway process failed or didn't start, rather
than a problem with the program itself. It's hard to say more. Can you show
your pi.py code, and what is the exception message?
>
>
> -- Original Message --
> *From:* "Tobi Bosede";<ani.to...@gmail.com>;
> *Sent:* Sunday, October 16, 2016, 8:04 AM
> *To:* "user"<user@spark.apache.org>;
> *Subject:* Spark-submit Problems
>
> Hi everyone,
> I am having problems submitting an app through spark-submit when the master
> is not "local". However the pi.py example which comes with Spark works with
> any master. I believe my script has the same structure as pi.py, but for
> some reason my script is not as flexible. Specifically, the
I'm using Spark 1.5.0 with the standalone scheduler, and for the life of me I
can't figure out why this isn't working. I have an application that works fine
with --deploy-mode client that I'm trying to get to run in cluster mode so I
can use --supervise. I ran into a few issues with my
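For context, a standalone cluster-mode submission with --supervise generally looks like the sketch below; the master host, REST port, class name, and jar path are illustrative, not taken from the thread. A common stumbling block is that in cluster mode the driver runs on a worker node, so the application jar path must be reachable from every node (or be a shared/HTTP path), not just from the machine running spark-submit.

```shell
# Cluster deploy mode on a standalone master; --supervise asks the master
# to restart the driver automatically if it exits with a non-zero status.
spark-submit \
  --master spark://master-host:6066 \
  --deploy-mode cluster \
  --supervise \
  --class com.example.MyApp \
  /path/visible/to/all/workers/my-app.jar
```

Note also that standalone mode did not support cluster deploy mode for Python applications in this era of Spark, which is worth ruling out before debugging further.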