[GitHub] spark pull request #15174: [SPARK-17502] [17609] [SQL] [Backport] [2.0] Fix ...

2016-09-22 Thread gatorsmile
Github user gatorsmile closed the pull request at:

https://github.com/apache/spark/pull/15174





[GitHub] spark pull request #15174: [SPARK-17502] [17609] [SQL] [Backport] [2.0] Fix ...

2016-09-22 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/15174#discussion_r80088798
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/CheckAnalysis.scala ---
@@ -342,6 +342,7 @@ trait CheckAnalysis extends PredicateHelper {

      case InsertIntoTable(t, _, _, _, _)
        if !t.isInstanceOf[LeafNode] ||
--- End diff --

This PR is to backport the fix from the master branch to the Spark 2.0 branch. I think your comment is valid. You can submit a separate PR to improve the code style. Thanks!





[GitHub] spark pull request #15174: [SPARK-17502] [17609] [SQL] [Backport] [2.0] Fix ...

2016-09-22 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/15174#discussion_r80088127
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/createDataSourceTables.scala ---
@@ -200,8 +200,9 @@ case class CreateDataSourceTableAsSelectCommand(
       // TODO: Check that options from the resolved relation match the relation that we are
       // inserting into (i.e. using the same compression).

-      EliminateSubqueryAliases(
-        sessionState.catalog.lookupRelation(tableIdentWithDB)) match {
+      // Pass a table identifier with database part, so that `tableExists` won't check temp
+      // views unexpectedly.
+      EliminateSubqueryAliases(sessionState.catalog.lookupRelation(tableIdentWithDB)) match {
--- End diff --

Done. : )
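
[Editor's note] As context for the comment added in the diff above, here is a minimal sketch, assuming the Spark 2.0 `TableIdentifier` API, of why a database-qualified identifier cannot accidentally resolve to a temporary view of the same name. The object name `TableIdentSketch` and the table/database names are made up for illustration:

```scala
import org.apache.spark.sql.catalyst.TableIdentifier

// Sketch only: an identifier without a database part may be resolved against
// temporary views first, while a database-qualified identifier always refers
// to a catalog table. All names below are hypothetical.
object TableIdentSketch {
  def main(args: Array[String]): Unit = {
    val unqualified = TableIdentifier("events")                  // could match a temp view named `events`
    val qualified   = TableIdentifier("events", Some("default")) // always refers to `default`.`events`

    println(unqualified.quotedString) // `events`
    println(qualified.quotedString)   // `default`.`events`
  }
}
```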





[GitHub] spark pull request #15174: [SPARK-17502] [17609] [SQL] [Backport] [2.0] Fix ...

2016-09-22 Thread Mironor
Github user Mironor commented on a diff in the pull request:

https://github.com/apache/spark/pull/15174#discussion_r80003353
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/CheckAnalysis.scala ---
@@ -342,6 +342,7 @@ trait CheckAnalysis extends PredicateHelper {

      case InsertIntoTable(t, _, _, _, _)
        if !t.isInstanceOf[LeafNode] ||
--- End diff --

I was wondering why not replace these one-letter values with something more explicit (like `table` in the following `case`)? I understand that this is not the purpose of the PR; I'm just curious whether it is normal to raise these "style" points in a PR discussion or not.





[GitHub] spark pull request #15174: [SPARK-17502] [17609] [SQL] [Backport] [2.0] Fix ...

2016-09-22 Thread cloud-fan
Github user cloud-fan commented on a diff in the pull request:

https://github.com/apache/spark/pull/15174#discussion_r79997604
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/createDataSourceTables.scala ---
@@ -200,8 +200,9 @@ case class CreateDataSourceTableAsSelectCommand(
       // TODO: Check that options from the resolved relation match the relation that we are
       // inserting into (i.e. using the same compression).

-      EliminateSubqueryAliases(
-        sessionState.catalog.lookupRelation(tableIdentWithDB)) match {
+      // Pass a table identifier with database part, so that `tableExists` won't check temp
+      // views unexpectedly.
+      EliminateSubqueryAliases(sessionState.catalog.lookupRelation(tableIdentWithDB)) match {
--- End diff --

We don't need `tableIdentWithDB` in `CreateDataSourceTableCommand`, right?

