Github user kmadhugit commented on the pull request:

    https://github.com/apache/spark/pull/7461#issuecomment-136928037
  
    I've modified the title.
    
    I'm facing a problem during unit testing. When I call 
context.getExecutorStorageStatus.length, it returns 1 instead of numExecutors + 
1 (for the driver). This happens because no prior job has been submitted to the 
executors by the DAG scheduler, so the only registered block manager is the 
driver's. We may need to find an alternative way to determine the number of 
executors (under the assumption that all executors will store some blocks of 
the RDD). Without that change this fix won't work.
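    The behavior described above can be sketched as follows. This is a 
hypothetical snippet, not code from the PR: the executor count, the 
local-cluster master URL, and the idea of running a trivial job to force 
executor registration are all assumptions for illustration.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object ExecutorCountSketch {
  def main(args: Array[String]): Unit = {
    val numExecutors = 2 // assumed value, for illustration only
    val sc = new SparkContext(
      new SparkConf()
        .setMaster(s"local-cluster[$numExecutors,1,1024]")
        .setAppName("executor-count-sketch"))
    try {
      // Before any job runs, only the driver's block manager is registered,
      // so this typically reports 1 rather than numExecutors + 1.
      println(sc.getExecutorStorageStatus.length)

      // Running a trivial job forces the executors to start and register
      // their block managers with the driver.
      sc.parallelize(1 to 100, numExecutors).count()

      // The count should now approach numExecutors + 1 (executors + driver);
      // registration is asynchronous, so a wait/retry may still be needed.
      println(sc.getExecutorStorageStatus.length)
    } finally {
      sc.stop()
    }
  }
}
```

    Because block-manager registration is asynchronous, even this pattern 
can be flaky in tests without a retry loop, which is consistent with the 
concern raised in the comment.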

