GitHub user vanzin commented on the issue:

    https://github.com/apache/spark/pull/21588
  
    I'm talking about the `VersionsSuite` changes. I think there needs to be a
more conscious decision about what happens here.
    
    If, when built with Hadoop 3, Spark will not support older versions of
Hive, that needs to be reflected in the code, not just in the tests.
    
    If, on the other hand, we want to support those versions, there may be ways
to do it, e.g. by forcing `sharesHadoopClasses` to false when using them with
Hadoop 3 (rough sketch below).
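
    A very rough, hypothetical sketch of what I mean (the names here are
placeholders, not the actual `IsolatedClientLoader` API; only the Hadoop
`VersionInfo` call is real):

    ```scala
    // Hypothetical sketch: decide whether the isolated Hive client may share
    // Spark's Hadoop classes, based on the Hadoop version Spark was built with.
    import org.apache.hadoop.util.VersionInfo

    object HiveClientIsolation {

      // Hive versions assumed (for this sketch) not to work against Hadoop 3 classes.
      private val versionsNeedingIsolation = Set("0.12", "0.13", "0.14", "1.0", "1.1", "1.2")

      private def hadoopMajorVersion: Int =
        VersionInfo.getVersion.split("\\.").head.toInt

      /** Returning false here would force isolation for old Hive on Hadoop 3. */
      def sharesHadoopClasses(hiveVersion: String): Boolean = {
        val shortVersion = hiveVersion.split("\\.").take(2).mkString(".")
        !(hadoopMajorVersion >= 3 && versionsNeedingIsolation.contains(shortVersion))
      }
    }
    ```

    That way the client loader would download and load its own Hadoop classes
for those versions instead of sharing Spark's, rather than just skipping the
tests.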
    
    But I think just disabling the tests is the wrong approach.
    
    And I'm also referring to the "org.spark-project" fork of Hive. That hasn't 
been updated, right? Which means that if you run this PR here, it will still 
fail?

