spark git commit: [SPARK-19525][CORE] Add RDD checkpoint compression support

2017-04-28 Thread zsxwing
Repository: spark Updated Branches: refs/heads/master ebff519c5 -> 77bcd77ed [SPARK-19525][CORE] Add RDD checkpoint compression support ## What changes were proposed in this pull request? This PR adds RDD checkpoint compression support and adds a new config `spark.checkpoint.compress` to

spark git commit: [SPARK-19525][CORE] Add RDD checkpoint compression support

2017-04-28 Thread zsxwing
Repository: spark Updated Branches: refs/heads/branch-2.2 554700266 -> 140586238 [SPARK-19525][CORE] Add RDD checkpoint compression support ## What changes were proposed in this pull request? This PR adds RDD checkpoint compression support and adds a new config `spark.checkpoint.compress` to
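
For readers following this change, here is a minimal sketch of how the new flag would be used from application code. The config key `spark.checkpoint.compress` comes from the commit above; the checkpoint directory and RDD contents are illustrative only, and the note about the codec assumes the usual `spark.io.compression.codec` setting (lz4 by default in the 2.x line).

```scala
import org.apache.spark.sql.SparkSession

// Illustrative sketch: enabling the new flag introduced by SPARK-19525.
// Checkpoint data should then be written through the compression codec
// configured via spark.io.compression.codec.
val spark = SparkSession.builder()
  .appName("checkpoint-compression-sketch")
  .master("local[*]")
  .config("spark.checkpoint.compress", "true")   // new config; default is false
  .getOrCreate()

val sc = spark.sparkContext
sc.setCheckpointDir("/tmp/checkpoints")          // placeholder directory

val rdd = sc.parallelize(1 to 1000000)
rdd.checkpoint()                                 // mark for (now compressed) checkpointing
rdd.count()                                      // action materializes the RDD and the checkpoint

spark.stop()
```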

spark git commit: [SPARK-20471] Remove AggregateBenchmark testsuite warning: Two level hashmap is disabled but vectorized hashmap is enabled

2017-04-28 Thread lixiao
Repository: spark Updated Branches: refs/heads/master 5d71f3db8 -> ebff519c5 [SPARK-20471] Remove AggregateBenchmark testsuite warning: Two level hashmap is disabled but vectorized hashmap is enabled What changes were proposed in this pull request? remove AggregateBenchmark testsuite

spark git commit: [SPARK-20471] Remove AggregateBenchmark testsuite warning: Two level hashmap is disabled but vectorized hashmap is enabled

2017-04-28 Thread lixiao
Repository: spark Updated Branches: refs/heads/branch-2.2 f66aabd7c -> 554700266 [SPARK-20471] Remove AggregateBenchmark testsuite warning: Two level hashmap is disabled but vectorized hashmap is enabled What changes were proposed in this pull request? remove AggregateBenchmark testsuite
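
The warning in question comes from the interaction of two internal codegen flags, presumably logged by the hash aggregate operator when the vectorized map is on but the two-level map is off. A minimal sketch of the combination is below; the config keys `spark.sql.codegen.aggregate.map.twolevel.enable` and `spark.sql.codegen.aggregate.map.vectorized.enable` are assumed to be the internal names on the 2.2 line and may differ in later releases.

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("agg-hashmap-flags-sketch")
  .master("local[*]")
  .getOrCreate()

// Assumed internal config names: enabling the vectorized hash map without the
// two-level map is the combination that triggered the benchmark warning;
// enabling both (or neither) avoids it.
spark.conf.set("spark.sql.codegen.aggregate.map.twolevel.enable", "true")
spark.conf.set("spark.sql.codegen.aggregate.map.vectorized.enable", "true")

// A simple hash aggregation to exercise the code path.
spark.range(1000000).selectExpr("id % 10 AS k")
  .groupBy("k").count()
  .collect()

spark.stop()
```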

spark git commit: [SPARK-20514][CORE] Upgrade Jetty to 9.3.11.v20160721

2017-04-28 Thread vanzin
Repository: spark Updated Branches: refs/heads/master 733b81b83 -> 5d71f3db8 [SPARK-20514][CORE] Upgrade Jetty to 9.3.11.v20160721 Upgrade Jetty so it can work with Hadoop 3 (alpha 2 release, in particular). Without this change, because of incompatibility between Jetty versions, Spark fails to

spark git commit: [SPARK-20514][CORE] Upgrade Jetty to 9.3.11.v20160721

2017-04-28 Thread vanzin
Repository: spark Updated Branches: refs/heads/branch-2.2 ec712d751 -> f66aabd7c [SPARK-20514][CORE] Upgrade Jetty to 9.3.11.v20160721 Upgrade Jetty so it can work with Hadoop 3 (alpha 2 release, in particular). Without this change, because of incompatibility between Jetty versions, Spark fails

spark git commit: [SPARK-20496][SS] Bug in KafkaWriter Looks at Unanalyzed Plans

2017-04-28 Thread brkyvz
Repository: spark Updated Branches: refs/heads/branch-2.2 ea5b11446 -> ec712d751 [SPARK-20496][SS] Bug in KafkaWriter Looks at Unanalyzed Plans ## What changes were proposed in this pull request? We didn't enforce analyzed plans in Spark 2.1 when writing out to Kafka. ## How was this patch

spark git commit: [SPARK-20496][SS] Bug in KafkaWriter Looks at Unanalyzed Plans

2017-04-28 Thread brkyvz
Repository: spark Updated Branches: refs/heads/branch-2.1 6696ad0e8 -> 5131b0a96 [SPARK-20496][SS] Bug in KafkaWriter Looks at Unanalyzed Plans ## What changes were proposed in this pull request? We didn't enforce analyzed plans in Spark 2.1 when writing out to Kafka. ## How was this patch

spark git commit: [SPARK-20496][SS] Bug in KafkaWriter Looks at Unanalyzed Plans

2017-04-28 Thread brkyvz
Repository: spark Updated Branches: refs/heads/master 8c911adac -> 733b81b83 [SPARK-20496][SS] Bug in KafkaWriter Looks at Unanalyzed Plans ## What changes were proposed in this pull request? We didn't enforce analyzed plans in Spark 2.1 when writing out to Kafka. ## How was this patch
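
For context, the affected path is the Kafka sink of Structured Streaming, which validates the query's plan before writing. A minimal sketch of a query that goes through that sink is below; the broker address, topic, and checkpoint location are placeholders, and running it requires the external spark-sql-kafka-0-10 artifact on the classpath.

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("kafka-sink-sketch")
  .master("local[*]")
  .getOrCreate()

// A streaming DataFrame with the columns the Kafka sink expects:
// an optional "key" and a required "value" (string or binary).
val input = spark.readStream
  .format("rate")                     // built-in test source
  .load()
  .selectExpr("CAST(value AS STRING) AS key", "CAST(value AS STRING) AS value")

// The writer validates this plan; the fix above concerns validating the
// analyzed plan rather than the unanalyzed one.
val query = input.writeStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "localhost:9092")   // placeholder broker
  .option("topic", "events")                             // placeholder topic
  .option("checkpointLocation", "/tmp/kafka-sink-ckpt")  // placeholder path
  .start()

query.awaitTermination()
```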

spark git commit: [SPARK-20465][CORE] Throws a proper exception when any temp directory could not be got

2017-04-28 Thread srowen
Repository: spark Updated Branches: refs/heads/branch-2.2 af3a1411a -> ea5b11446 [SPARK-20465][CORE] Throws a proper exception when any temp directory could not be got ## What changes were proposed in this pull request? This PR proposes to throw an exception with a better message rather than

spark git commit: [SPARK-20465][CORE] Throws a proper exception when any temp directory could not be got

2017-04-28 Thread srowen
Repository: spark Updated Branches: refs/heads/master 59e3a5644 -> 8c911adac [SPARK-20465][CORE] Throws a proper exception when any temp directory could not be got ## What changes were proposed in this pull request? This PR proposes to throw an exception with a better message rather than
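
The directories in question are the scratch-space locations Spark resolves from `spark.local.dir` (or the cluster-manager equivalents). A minimal sketch of configuring them explicitly is below; the paths are placeholders, and the claim about the clearer failure simply restates the intent of the commit above.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Illustrative sketch: pointing Spark's scratch space at explicit directories.
// If none of the configured locations is usable, the change above is meant to
// surface a clear exception naming them instead of an opaque failure.
val conf = new SparkConf()
  .setAppName("local-dir-sketch")
  .setMaster("local[*]")
  .set("spark.local.dir", "/mnt/disk1/spark-tmp,/mnt/disk2/spark-tmp")  // placeholder paths

val sc = new SparkContext(conf)
try {
  println(sc.parallelize(1 to 10).sum())
} finally {
  sc.stop()
}
```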

spark git commit: [SPARK-14471][SQL] Aliases in SELECT could be used in GROUP BY

2017-04-28 Thread wenchen
Repository: spark Updated Branches: refs/heads/branch-2.2 26a9e2948 -> af3a1411a [SPARK-14471][SQL] Aliases in SELECT could be used in GROUP BY ## What changes were proposed in this pull request? This PR added a new rule in `Analyzer` to resolve aliases in `GROUP BY`. The current master

spark git commit: [SPARK-14471][SQL] Aliases in SELECT could be used in GROUP BY

2017-04-28 Thread wenchen
Repository: spark Updated Branches: refs/heads/master e3c816043 -> 59e3a5644 [SPARK-14471][SQL] Aliases in SELECT could be used in GROUP BY ## What changes were proposed in this pull request? This PR added a new rule in `Analyzer` to resolve aliases in `GROUP BY`. The current master throws an
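
A minimal sketch of the query shape the new analyzer rule accepts; the table and column names are made up for illustration.

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("groupby-alias-sketch")
  .master("local[*]")
  .getOrCreate()
import spark.implicits._

Seq(("a", 1), ("a", 2), ("b", 3)).toDF("dept", "salary").createOrReplaceTempView("emp")

// The alias `k` defined in the SELECT list can now be referenced in GROUP BY,
// where previously analysis failed with an unresolved-attribute error.
spark.sql("SELECT upper(dept) AS k, sum(salary) AS total FROM emp GROUP BY k").show()

spark.stop()
```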

spark git commit: [SPARK-20476][SQL] Block users to create a table that use commas in the column names

2017-04-28 Thread wenchen
Repository: spark Updated Branches: refs/heads/branch-2.2 f60ed0c2c -> 26a9e2948 [SPARK-20476][SQL] Block users to create a table that use commas in the column names ### What changes were proposed in this pull request? ```SQL hive> create table t1(`a,` string); OK Time taken: 1.399 seconds

spark git commit: [SPARK-20476][SQL] Block users to create a table that use commas in the column names

2017-04-28 Thread wenchen
Repository: spark Updated Branches: refs/heads/master 7fe824979 -> e3c816043 [SPARK-20476][SQL] Block users to create a table that use commas in the column names ### What changes were proposed in this pull request? ```SQL hive> create table t1(`a,` string); OK Time taken: 1.399 seconds
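
A minimal sketch of the kind of statement that is now rejected when issued through Spark SQL with the Hive catalog; the table name is illustrative, and the assumption that the rejection surfaces as an `AnalysisException` is mine rather than taken from the truncated description above.

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("comma-column-sketch")
  .master("local[*]")
  .enableHiveSupport()   // the problematic tables come from the Hive metastore path
  .getOrCreate()

// After this change, a column name containing a comma is blocked up front
// instead of producing a table whose schema cannot be read back correctly.
try {
  spark.sql("CREATE TABLE t1 (`a,` STRING)")
} catch {
  case e: org.apache.spark.sql.AnalysisException =>
    println(s"Rejected as expected: ${e.getMessage}")
}

spark.stop()
```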