Hi everyone, I have a question: is it possible to run Spark 2.0.0 on top of an HDP cluster that ships Spark 1.6.1, e.g. by passing the Spark 2.0.0 libs with --jars? The idea behind it is to avoid depending on the default HDP installation and be able to deploy new versions of Spark quickly.
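To make it concrete, what I have in mind is roughly the following (all paths and HDFS locations here are made-up placeholders, not a tested setup; spark.yarn.jars is the Spark 2.0 config for pointing YARN at a specific set of Spark jars, which seems closer to the goal than --jars):

```shell
# Sketch only: use a standalone Spark 2.0.0 distribution's spark-submit
# against the HDP cluster's YARN, and point spark.yarn.jars at Spark
# 2.0.0 jars pre-uploaded to HDFS, so HDP's 1.6.1 jars are never used.
export SPARK_HOME=/opt/spark-2.0.0-bin-hadoop2.7   # assumed unpack location
export HADOOP_CONF_DIR=/etc/hadoop/conf            # HDP's Hadoop client configs

"$SPARK_HOME"/bin/spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.yarn.jars='hdfs:///apps/spark-2.0.0/jars/*.jar' \
  --class org.apache.spark.examples.SparkPi \
  "$SPARK_HOME"/examples/jars/spark-examples_2.11-2.0.0.jar 100
```

Would something like this work alongside the stock HDP Spark, or are there gotchas?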
Thanks,
Jorge Machado