Xu Ling 09:42: Is it possible to add a spark-2.4 profile to the spark-plugin to
specify the Hadoop version and the shared dependency?



Because this shared dependency doesn't seem to be of much use for Hadoop 3 and the
normally supported versions.



My Spark here is 3.2, not 2.4.



fly-cutter 09:55: This is OK. If users repackage the Spark code themselves to support
2.4.3 and Hadoop 3 compatibility, they can use this profile to adapt.




Xu Ling: This way, the original normal workflow should not be broken.





fly-cutter: Yes.
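For context, a Maven profile like the one discussed is opt-in, so the default build path is untouched unless a user activates it explicitly. Below is a minimal sketch of what such a spark-2.4 profile could look like; the property names, the spark-plugin-shared artifact id, and the version numbers are illustrative assumptions, not the project's actual build configuration.

    <!-- Hypothetical sketch only: artifact ids, property names and versions are assumptions. -->
    <!-- Would go under the <profiles> element of the spark-plugin pom.xml. -->
    <profile>
      <id>spark-2.4</id>
      <properties>
        <!-- Pin the Spark/Hadoop combination this profile targets. -->
        <spark.version>2.4.3</spark.version>
        <hadoop.version>2.7.4</hadoop.version>
      </properties>
      <dependencies>
        <!-- Only this profile pulls in the shared dependency; the default build is unchanged. -->
        <dependency>
          <groupId>org.example</groupId>
          <artifactId>spark-plugin-shared</artifactId>
          <version>${project.version}</version>
        </dependency>
      </dependencies>
    </profile>

Because the profile is only applied when requested (for example, mvn clean package -Pspark-2.4), the normal build flow for Spark 3.2 with Hadoop 3 stays as it is today.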



