spark git commit: [SPARK-11474][SQL] change fetchSize to fetchsize

2015-11-05 Thread rxin
Repository: spark
Updated Branches:
  refs/heads/branch-1.5 9522dd23d -> 8e1bd6089


[SPARK-11474][SQL] change fetchSize to fetchsize

In DefaultDataSource.scala, createRelation is declared as
override def createRelation(
    sqlContext: SQLContext,
    parameters: Map[String, String]): BaseRelation
The parameters argument is a CaseInsensitiveMap, which lower-cases its keys.
So after this line
parameters.foreach(kv => properties.setProperty(kv._1, kv._2))
properties holds only lower-cased keys, and fetchSize becomes fetchsize.
However, the compute method in JDBCRDD reads
val fetchSize = properties.getProperty("fetchSize", "0").toInt
so fetchSize is always 0 and the user-supplied value never takes effect.
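
A minimal, self-contained sketch of the mismatch (illustrative only: CaseInsensitiveMap
is private[sql], so its key lower-casing is simulated with toLowerCase, and the
"fetchSize" -> "100" entry is a made-up user option):

import java.util.Properties

object FetchSizeLookupDemo {
  def main(args: Array[String]): Unit = {
    // What the data source effectively receives: keys arrive already lower-cased.
    val parameters: Map[String, String] =
      Map("fetchSize" -> "100").map { case (k, v) => (k.toLowerCase, v) }

    // Same copy loop as in createRelation.
    val properties = new Properties()
    parameters.foreach(kv => properties.setProperty(kv._1, kv._2))

    // Pre-patch lookup misses the stored key and falls back to the default.
    println(properties.getProperty("fetchSize", "0")) // prints 0
    // Post-patch lookup matches the key actually stored.
    println(properties.getProperty("fetchsize", "0")) // prints 100
  }
}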

Author: Huaxin Gao 

Closes #9473 from huaxingao/spark-11474.

(cherry picked from commit b072ff4d1d05fc212cd7036d1897a032a395f0b3)
Signed-off-by: Reynold Xin 


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/8e1bd608
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/8e1bd608
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/8e1bd608

Branch: refs/heads/branch-1.5
Commit: 8e1bd60899838111462227c3ca981f13e60225d8
Parents: 9522dd2
Author: Huaxin Gao 
Authored: Thu Nov 5 09:41:14 2015 -0800
Committer: Reynold Xin 
Committed: Thu Nov 5 09:41:30 2015 -0800

--
 .../org/apache/spark/sql/execution/datasources/jdbc/JDBCRDD.scala | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/8e1bd608/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JDBCRDD.scala
--
diff --git a/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JDBCRDD.scala b/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JDBCRDD.scala
index 730d88b..018a009 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JDBCRDD.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JDBCRDD.scala
@@ -347,6 +347,7 @@ private[sql] class JDBCRDD(
 
   /**
    * Runs the SQL query against the JDBC driver.
+   *
    */
   override def compute(thePart: Partition, context: TaskContext): Iterator[InternalRow] =
     new Iterator[InternalRow] {
@@ -368,7 +369,7 @@ private[sql] class JDBCRDD(
       val sqlText = s"SELECT $columnList FROM $fqTable $myWhereClause"
       val stmt = conn.prepareStatement(sqlText,
           ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY)
-      val fetchSize = properties.getProperty("fetchSize", "0").toInt
+      val fetchSize = properties.getProperty("fetchsize", "0").toInt
       stmt.setFetchSize(fetchSize)
       val rs = stmt.executeQuery()
 




spark git commit: [SPARK-11474][SQL] change fetchSize to fetchsize

2015-11-05 Thread rxin
Repository: spark
Updated Branches:
  refs/heads/master a4b5cefcf -> b072ff4d1


[SPARK-11474][SQL] change fetchSize to fetchsize

In DefaultDataSource.scala, createRelation is declared as
override def createRelation(
    sqlContext: SQLContext,
    parameters: Map[String, String]): BaseRelation
The parameters argument is a CaseInsensitiveMap, which lower-cases its keys.
So after this line
parameters.foreach(kv => properties.setProperty(kv._1, kv._2))
properties holds only lower-cased keys, and fetchSize becomes fetchsize.
However, the compute method in JDBCRDD reads
val fetchSize = properties.getProperty("fetchSize", "0").toInt
so fetchSize is always 0 and the user-supplied value never takes effect.
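
For context, a sketch of how the option reaches these parameters through the 1.5-era
reader API; the connection URL and table name below are placeholders, not taken from
the patch:

// Assumes an existing SQLContext named sqlContext.
val df = sqlContext.read
  .format("jdbc")
  .option("url", "jdbc:postgresql://localhost/testdb") // placeholder URL
  .option("dbtable", "my_table")                       // placeholder table
  .option("fetchsize", "1000") // forwarded to stmt.setFetchSize in JDBCRDD.compute
  .load()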

Author: Huaxin Gao 

Closes #9473 from huaxingao/spark-11474.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/b072ff4d
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/b072ff4d
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/b072ff4d

Branch: refs/heads/master
Commit: b072ff4d1d05fc212cd7036d1897a032a395f0b3
Parents: a4b5cef
Author: Huaxin Gao 
Authored: Thu Nov 5 09:41:14 2015 -0800
Committer: Reynold Xin 
Committed: Thu Nov 5 09:41:14 2015 -0800

--
 .../org/apache/spark/sql/execution/datasources/jdbc/JDBCRDD.scala | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/b072ff4d/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JDBCRDD.scala
--
diff --git a/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JDBCRDD.scala b/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JDBCRDD.scala
index 730d88b..018a009 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JDBCRDD.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JDBCRDD.scala
@@ -347,6 +347,7 @@ private[sql] class JDBCRDD(
 
   /**
    * Runs the SQL query against the JDBC driver.
+   *
    */
   override def compute(thePart: Partition, context: TaskContext): Iterator[InternalRow] =
     new Iterator[InternalRow] {
@@ -368,7 +369,7 @@ private[sql] class JDBCRDD(
       val sqlText = s"SELECT $columnList FROM $fqTable $myWhereClause"
       val stmt = conn.prepareStatement(sqlText,
           ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY)
-      val fetchSize = properties.getProperty("fetchSize", "0").toInt
+      val fetchSize = properties.getProperty("fetchsize", "0").toInt
       stmt.setFetchSize(fetchSize)
       val rs = stmt.executeQuery()
 

