I want to configure my Hive installation to use Spark 2 as its execution
engine. According to Hive's instructions, Spark should be built *without*
Hadoop and without Hive. I could build it myself, but for various reasons
I would prefer to use an official binary build.

So my question is: does the official Spark binary build labeled "with
user-provided Hadoop" also imply "user-provided Hive", i.e. does it ship
without the Hive jars as well?
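One way to answer this empirically, rather than from the download page alone, is to unpack the tarball and look for Hive jars under its jars/ directory. This is just a sketch under assumptions: the helper name is mine, and it assumes the usual layout of Spark 2.x binary distributions where bundled Hive support shows up as hive-*.jar files.

```shell
# Hedged sketch: returns success (0) if the given Spark distribution
# directory appears to bundle Hive, by checking for hive-*.jar files
# under its jars/ subdirectory. The function name and the jar-naming
# convention are assumptions, not official Spark tooling.
spark_bundles_hive() {
  ls "$1/jars" 2>/dev/null | grep -qi '^hive-'
}
```

For example, after extracting an official build you could run `spark_bundles_hive /opt/spark-2.3.0-bin-without-hadoop && echo "Hive is bundled"` to see whether the "without Hadoop" build still carries Hive classes.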


David S.
