Github user GraceH commented on the issue:
https://github.com/apache/spark/pull/7927
@sprite311 According to my understanding, this patch tries to catch
certain exceptions when the user enables dynamic allocation. One quick
workaround is to disable dynamic allocation, if that is possible for you.
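If disabling it is an option, the SparkConf settings below are a minimal
sketch of that workaround (the app name and the fixed executor count of 8
are arbitrary placeholders, not values from this patch):

    import org.apache.spark.{SparkConf, SparkContext}

    // Turn off dynamic allocation and pin a fixed executor count instead.
    val conf = new SparkConf()
      .setAppName("example")                              // placeholder name
      .set("spark.dynamicAllocation.enabled", "false")    // disable dynamic allocation
      .set("spark.executor.instances", "8")               // placeholder fixed count
    val sc = new SparkContext(conf)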
Github user sprite311 commented on the issue:
https://github.com/apache/spark/pull/7927
I have this problem in Spark 1.3.0. Are there any other solutions? I can't
upgrade Spark to 1.6.