----- Original Message -----
> dev/run-tests fails two tests (1 Hive, 1 Kafka Streaming) for me
> locally on 1.1.0-rc3. Does anyone else see that? It may be my env.
> Although I still see the Hive failure on Debian too:
>
> [info] - SET commands semantics for a HiveContext *** FAILED ***
> [info] Expected Array("spark.sql.key.usedfortestonly=test.val.0",
> "spark.sql.key.usedfortestonlyspark.sql.key.usedfortestonly=test.val.0test.val.0"),
> but got
> Array("spark.sql.key.usedfortestonlyspark.sql.key.usedfortestonly=test.val.0test.val.0",
> "spark.sql.key.usedfortestonly=test.val.0") (HiveQuerySuite.scala:541)
I've seen this error before. (In particular, I've seen it on my OS X machine
using Oracle JDK 8 but not on Fedora using OpenJDK.) I've also seen similar
errors in topic branches (but not on master); they all suggest that these tests
depend on key-value pairs coming back from Hive in a particular order, which
isn't a safe assumption.
I just submitted a (trivial) PR to fix this spurious failure:
https://github.com/apache/spark/pull/2220
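
For illustration only (this is a sketch of the general idea, not the actual
change in the PR, and the object/variable names below are made up for the
example): comparing the SET output as a set of strings rather than as an
ordered array removes the dependency on the order Hive happens to return.

    object OrderInsensitiveCheck {
      def main(args: Array[String]): Unit = {
        // The same two SET output rows, as they might come back from Hive
        // in either order on different runs or platforms.
        val runA = Array(
          "spark.sql.key.usedfortestonly=test.val.0",
          "spark.sql.key.usedfortestonlyspark.sql.key.usedfortestonly=test.val.0test.val.0")
        val runB = runA.reverse

        // Element-wise array comparison is order-sensitive, so one of the
        // two orderings fails...
        println(runA.sameElements(runB))  // false

        // ...while comparing as sets ignores ordering, which is all the
        // test really needs to verify.
        println(runA.toSet == runB.toSet) // true
      }
    }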
best,
wb