From Spark version 3.1.0 onwards, the Spark client artifacts are built with Hadoop 3 and published to the Maven repository. Unfortunately, we currently use Hadoop 2.7.7 in our infrastructure.

1) Does Spark plan to publish the Spark client dependencies built for Hadoop 2.x?
2) Are the new Spark clients capable of connecting to a Hadoop 2.x cluster? (In a simple test, the Spark 3.2.1 client had no problem with a Hadoop 2.7 cluster, but we would like to know whether Spark offers any guarantee here; a sketch of the kind of test we ran is below.)

Thank you very much in advance
Amin Borjian
