sarutak commented on code in PR #52756:
URL: https://github.com/apache/spark/pull/52756#discussion_r2469654376
##########
sql/connect/client/jdbc/src/main/scala/org/apache/spark/sql/connect/client/jdbc/SparkConnectResultSet.scala:
##########
@@ -253,13 +263,25 @@ class SparkConnectResultSet(
override def getBigDecimal(columnLabel: String): java.math.BigDecimal =
throw new SQLFeatureNotSupportedException
- override def isBeforeFirst: Boolean = throw new SQLFeatureNotSupportedException
+ override def isBeforeFirst: Boolean = {
+ checkOpen()
+ cursor < 1
+ }
- override def isAfterLast: Boolean = throw new SQLFeatureNotSupportedException
+ override def isFirst: Boolean = {
+ checkOpen()
+ cursor == 1
+ }
- override def isFirst: Boolean = throw new SQLFeatureNotSupportedException
+ override def isLast: Boolean = {
+ checkOpen()
+ cursor > 0 && cursor == sparkResult.length
+ }
- override def isLast: Boolean = throw new SQLFeatureNotSupportedException
+ override def isAfterLast: Boolean = {
+ checkOpen()
+ cursor > 0 && cursor > sparkResult.length
Review Comment:
I know `isAfterLast` in lots of existing JDBC drivers returns `true` after
calling `next()` even if the result set contains zero rows (e.g. PostgreSQL),
but I'm OK with complying with the JDBC specification here.
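
For context, here is a minimal sketch of the cursor bookkeeping these overrides rely on. The `CursorSketch`, `rowCount`, and `next()` names are hypothetical stand-ins, not the PR's actual internals; the point is just the `rowCount == 0` case, where spec-compliant behavior keeps `isAfterLast` at `false` while drivers like PostgreSQL report `true` after the first `next()`.

```scala
// Sketch only: a hypothetical stand-in for the driver's cursor bookkeeping,
// not the actual SparkConnectResultSet implementation.
final class CursorSketch(rowCount: Int) {
  // cursor == 0: before the first row; cursor == k (1..rowCount): on row k;
  // cursor == rowCount + 1: after the last row.
  private var cursor: Int = 0

  def next(): Boolean = {
    if (cursor < rowCount) { cursor += 1; true }
    else {
      // Only a non-empty result set can reach the "after last" position,
      // so isAfterLast stays false when rowCount == 0 (JDBC spec behavior).
      if (rowCount > 0) cursor = rowCount + 1
      false
    }
  }

  def isBeforeFirst: Boolean = cursor < 1
  def isFirst: Boolean = cursor == 1
  def isLast: Boolean = cursor > 0 && cursor == rowCount
  def isAfterLast: Boolean = cursor > 0 && cursor > rowCount
}

// With rowCount == 0: next() returns false, the cursor never advances, and
// isAfterLast stays false, unlike the PostgreSQL-style behavior described above.
```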