cbalci commented on code in PR #10209:
URL: https://github.com/apache/pinot/pull/10209#discussion_r1094087225


##########
pinot-connectors/pinot-spark-connector/src/main/scala/org/apache/pinot/connector/spark/connector/PinotGrpcServerDataFetcher.scala:
##########
@@ -25,14 +25,15 @@ import org.apache.pinot.common.proto.Server.ServerRequest
 import org.apache.pinot.connector.spark.utils.Logging
 import org.apache.pinot.spi.config.table.TableType
 
+import java.io.Closeable
 import scala.collection.JavaConverters._
 
 /**
  * Data fetcher from Pinot Grpc server with specific segments.
  * Eg: offline-server1: segment1, segment2, segment3
  */
 private[pinot] class PinotGrpcServerDataFetcher(pinotSplit: PinotSplit)
-  extends Logging {
+  extends Logging with Closeable {

Review Comment:
   Thanks for the review @walterddr!
   
   Do you mean closing the channel inside the `fetchData` method? If so, that 
may not be possible. We need to return a `Closeable` iterator to Spark, which 
will do the closing once the stream is fully consumed.
   
   I guess we could return an anonymous function that handles closing the 
channel instead, but I thought implementing `Closeable` makes the contract very 
obvious to the user.
   
   Let me know what you think.
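
   For illustration, here is a minimal, self-contained sketch of the pattern 
described above (the names `CloseableIterator` and `onClose` are hypothetical, 
not the actual PR code): an iterator that also implements `Closeable`, so the 
consumer (e.g. Spark) can release the underlying resource, such as a gRPC 
channel, once the stream is fully consumed.

   ```scala
   import java.io.Closeable

   // Hypothetical sketch: wraps an iterator and exposes a close() hook so the
   // consumer can release the underlying resource when it is done reading.
   class CloseableIterator[T](underlying: Iterator[T], onClose: () => Unit)
     extends Iterator[T] with Closeable {

     override def hasNext: Boolean = underlying.hasNext
     override def next(): T = underlying.next()

     // Invoked by the consumer once the stream is fully consumed.
     override def close(): Unit = onClose()
   }

   object CloseableIteratorDemo {
     def main(args: Array[String]): Unit = {
       var closed = false
       // In the real connector, onClose would shut down the gRPC channel.
       val it = new CloseableIterator(Iterator(1, 2, 3), () => closed = true)
       val sum = it.sum // consume the stream
       it.close()       // consumer closes after consumption
       println(s"sum=$sum closed=$closed")
     }
   }
   ```

   The advantage over returning a separate cleanup function is that the 
`Closeable` contract is visible in the type signature, so callers know they are 
responsible for closing.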



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
