Github user jerryshao commented on the issue:

    https://github.com/apache/spark/pull/20078
  
    Originally in Spark dynamic allocation, "spark.executor.instances" and 
the dynamic allocation confs could not coexist: if "spark.executor.instances" 
was set, dynamic allocation would not be enabled. But this behavior changed 
after 2.0.
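
    For illustration, a minimal sketch (not code from this PR; the config 
keys are real Spark confs, the values are made up) showing the two confs set 
together:

        import org.apache.spark.SparkConf

        // Both confs set at once. Before Spark 2.0, the presence of
        // "spark.executor.instances" would disable dynamic allocation;
        // since 2.0, dynamic allocation stays enabled and the instance
        // count is treated as the initial executor target.
        val conf = new SparkConf()
          .set("spark.executor.instances", "4")
          .set("spark.dynamicAllocation.enabled", "true")
          // The external shuffle service is required for dynamic
          // allocation on YARN in this era of Spark.
          .set("spark.shuffle.service.enabled", "true")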
    
    I think for streaming dynamic allocation here, we'd better keep the 
behavior consistent with Spark dynamic allocation.

