Github user fanyunbojerry commented on the issue:
https://github.com/apache/spark/pull/10881
Hi @zsxwing, is there any way to reproduce this issue?
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/10881
@1236897 The Spark job's status change from RUNNING to COMPLETE happens before
`SparkContext.stop`.
---
Github user 1236897 commented on the issue:
https://github.com/apache/spark/pull/10881
Because I watch the monitoring page: my last step is saving as Parquet, but the
monitoring page shows the stage as completed while the job keeps running; after a
few minutes, the job completes and throws the exception.
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/10881
> the issue "RpcEnv already stopped" wastes a lot of time to disconnect
Did you measure how long it takes to stop SparkContext? I don't think it will
take several minutes.
---
Github user 1236897 commented on the issue:
https://github.com/apache/spark/pull/10881
@zsxwing If I just ignore it, will that avoid my issue? My issue is that I
need to complete my job within 5 minutes, but the "RpcEnv already stopped" issue
wastes a lot of time during disconnect and makes my job…
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/10881
@1236897 If you don't want to build Spark, it's fine to just catch this
special exception thrown from `SparkContext.stop` and ignore it.
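A minimal sketch of the workaround suggested above. It assumes the error surfaces as an `IllegalStateException` whose message contains "RpcEnv already stopped"; `stopContext()` below is a hypothetical stand-in that reproduces that failure mode so the snippet is self-contained. In a real job you would call `sc.stop()` inside the try block instead.

```java
// Sketch: swallow only the known "RpcEnv already stopped" shutdown error.
public class IgnoreRpcEnvStopped {
    // Hypothetical stand-in for SparkContext.stop() on an affected 1.6.x build.
    static void stopContext() {
        throw new IllegalStateException("RpcEnv already stopped.");
    }

    public static void main(String[] args) {
        try {
            stopContext();
        } catch (IllegalStateException e) {
            // Ignore only the known, harmless shutdown race; rethrow anything else.
            if (e.getMessage() != null
                    && e.getMessage().contains("RpcEnv already stopped")) {
                System.out.println("Ignored shutdown race: " + e.getMessage());
            } else {
                throw e;
            }
        }
        System.out.println("Job shutdown continued normally");
    }
}
```

Matching on the message keeps the catch narrow, so unrelated `IllegalStateException`s still propagate.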
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/10881
> compile the code through Eclipse.
No. Take a look at this page about how to build Spark:
http://spark.apache.org/docs/1.6.2/building-spark.html
> Lastly, add the output jar to mav…
---
Github user 1236897 commented on the issue:
https://github.com/apache/spark/pull/10881
@zsxwing Could you give me the link to Spark 1.6.2 on GitHub?
---
Github user 1236897 commented on the issue:
https://github.com/apache/spark/pull/10881
@zsxwing Thank you for your reply, and sorry to disturb you. Because this
project is so important to me, let me describe what I need to do. Firstly, check
out Spark 1.6 from GitHub. Secondly, use the git…
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/10881
@1236897 You can check out the Spark 1.6.2 tag and apply this patch. Then build
Spark and use that build to submit your Spark application. You can still use
the Spark Maven artifact to build your application.
---
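The steps above can be sketched as shell commands. This is a procedure sketch, not a tested recipe: the local branch name is arbitrary, the cherry-pick assumes the PR's change is at the tip of its head ref, and the build profiles are example options from the building-spark page linked earlier.

```shell
# Check out the 1.6.2 release tag and cherry-pick this PR's change (#10881).
git clone https://github.com/apache/spark.git
cd spark
git checkout v1.6.2
# GitHub exposes each PR's head as a fetchable ref:
git fetch origin pull/10881/head:pr-10881
git cherry-pick pr-10881   # picks the branch tip; resolve conflicts if any
# Build a runnable distribution (profile flags are examples; see building-spark.html):
./make-distribution.sh --tgz -Phadoop-2.6 -Pyarn
```

You would then submit your job with `spark-submit` from this patched build while keeping the stock 1.6.2 Maven artifact in your application's POM, as described above.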
Github user 1236897 commented on the issue:
https://github.com/apache/spark/pull/10881
Actually, I face the "RpcEnv already stopped" issue when I call
`SparkContext.stop` at the end of the program. I added Spark 1.6 through the
Maven POM. Could I know how to fix this issue?
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/10881
@1236897 Just try `git cherry-pick`-ing this PR.
---
Github user 1236897 commented on the issue:
https://github.com/apache/spark/pull/10881
Could I know how to merge the updated code into my project to avoid this
error?
---
Github user JerryLead commented on the issue:
https://github.com/apache/spark/pull/10881
This bug still exists in the latest Spark 1.6.2. How about merging it into
branch-1.6? @nishkamravi2 @zsxwing
---