Sean Owen created SPARK-6873:
--------------------------------

             Summary: Some Hive-Catalyst comparison tests fail due to unimportant order of some printed elements
                 Key: SPARK-6873
                 URL: https://issues.apache.org/jira/browse/SPARK-6873
             Project: Spark
          Issue Type: Bug
          Components: SQL, Tests
    Affects Versions: 1.3.1
            Reporter: Sean Owen
            Priority: Minor
As I mentioned, I've been seeing 4 test failures in the Hive tests for a while, and they still affect master. I think it's a superficial problem that only turns up when running on Java 8, but it would probably be an easy fix and good to fix. Specifically, here are the four tests and the part of each comparison that fails, below. I tried to diagnose this but had trouble even finding where some of this output is generated, like the list of synonyms?

{code}
- show_tblproperties *** FAILED ***
  Results do not match for show_tblproperties:
  ...
  !== HIVE - 2 row(s) ==    == CATALYST - 2 row(s) ==
  !tmp true                 bar bar value
  !bar bar value            tmp true
  (HiveComparisonTest.scala:391)
{code}

{code}
- show_create_table_serde *** FAILED ***
  Results do not match for show_create_table_serde:
  ...
   WITH SERDEPROPERTIES (            WITH SERDEPROPERTIES (
  ! 'serialization.format'='$',       'field.delim'=',',
  ! 'field.delim'=',')                'serialization.format'='$')
{code}

{code}
- udf_std *** FAILED ***
  Results do not match for udf_std:
  ...
  !== HIVE - 2 row(s) ==                                          == CATALYST - 2 row(s) ==
   std(x) - Returns the standard deviation of a set of numbers    std(x) - Returns the standard deviation of a set of numbers
  !Synonyms: stddev_pop, stddev                                   Synonyms: stddev, stddev_pop
  (HiveComparisonTest.scala:391)
{code}

{code}
- udf_stddev *** FAILED ***
  Results do not match for udf_stddev:
  ...
  !== HIVE - 2 row(s) ==                                              == CATALYST - 2 row(s) ==
   stddev(x) - Returns the standard deviation of a set of numbers     stddev(x) - Returns the standard deviation of a set of numbers
  !Synonyms: stddev_pop, std                                          Synonyms: std, stddev_pop
  (HiveComparisonTest.scala:391)
{code}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
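All four failures share the same shape: the two sides print the same set of items (table properties, serde properties, synonyms) but in different iteration orders, which can vary between Java 7 and Java 8 because hash-based collections changed their iteration order. A minimal sketch of one possible fix (the object and method names here are hypothetical, not Spark's actual code): sort the items before rendering so the printed order is deterministic on any JVM.

{code}
// Hypothetical sketch: render an unordered set of items in a
// deterministic order, so HIVE/CATALYST comparisons don't depend on
// hash-map iteration order (which differs between Java 7 and Java 8).
object DeterministicOutput {
  // e.g. "Synonyms: std, stddev_pop" regardless of input order
  def formatSynonyms(synonyms: Seq[String]): String =
    "Synonyms: " + synonyms.sorted.mkString(", ")

  // e.g. table properties sorted by key before printing
  def formatProperties(props: Map[String, String]): Seq[String] =
    props.toSeq.sortBy(_._1).map { case (k, v) => s"$k\t$v" }

  def main(args: Array[String]): Unit = {
    // Both input orders produce identical output:
    println(formatSynonyms(Seq("stddev_pop", "std")))
    println(formatSynonyms(Seq("std", "stddev_pop")))
    formatProperties(Map("tmp" -> "true", "bar" -> "bar value")).foreach(println)
  }
}
{code}

Alternatively, the comparison in HiveComparisonTest could be made order-insensitive for these cases, but sorting at the point where the output is produced also fixes what users see.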