rickchengx commented on PR #11860:
URL: 
https://github.com/apache/dolphinscheduler/pull/11860#issuecomment-1241415952

   Hi @fuchanghai, thanks for the reply. I agree with your point. 
   1. I think the current `spark version` option on the UI is **misleading**. Users may worry that DS does not support `spark 3.x.x`. I have seen people ask whether DS supports Spark 3, even though this issue is addressed in the DS [documentation](https://dolphinscheduler.apache.org/en-us/docs/dev/user_doc/guide/installation/kubernetes.html) (though users will not easily find that description), as shown below:
   
   <img width="811" alt="Screenshot 2022-09-09 09 55 13" src="https://user-images.githubusercontent.com/38122586/189256318-8c9cbd85-a6db-4662-981b-57df74774136.png">
   
   
   2. Even if DS needs to support multiple Spark installations, I don't think hard-coding `SPARK_HOME1` and `SPARK_HOME2` is an elegant way to do it. What if a user needs to support 3 or more Spark versions?
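   For context, the current scheme in the environment file looks roughly like this (a sketch only; the paths and comments are illustrative placeholders, not taken from this PR):
   
   ```shell
   # Illustrative excerpt in the style of bin/env/dolphinscheduler_env.sh:
   # one numbered variable per supported Spark installation.
   export SPARK_HOME1=/opt/soft/spark1   # e.g. a Spark 2.x installation
   export SPARK_HOME2=/opt/soft/spark2   # e.g. a Spark 3.x installation
   # Supporting a third Spark would require inventing SPARK_HOME3,
   # plus matching changes in the UI dropdown and the task runner.
   ```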
   
   3. I think using a single `SPARK_HOME` is simple and efficient, and avoids this misunderstanding and other potential problems.
   
   If `DS` needs to support multiple Spark installations, perhaps a better way is to let the user set the value of `SPARK_HOME` on the task itself, so that different tasks can use different `SPARK_HOME` values.
   
   <img width="583" alt="Screenshot 2022-09-09 10 04 51" src="https://user-images.githubusercontent.com/38122586/189257001-7d45f8dc-37c9-4763-bec5-3795f21f0bd3.png">
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
