This is an automated email from the ASF dual-hosted git repository.

srowen pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


    from 23a49af  [SPARK-30329][ML] add iterator/foreach methods for Vectors
     add 319ccd5  [SPARK-30336][SQL][SS] Move Kafka consumer-related classes to its own package

No new revisions were added by this update.

Summary of changes:
 .../sql/kafka010/KafkaBatchPartitionReader.scala      |  1 +
 .../spark/sql/kafka010/KafkaContinuousStream.scala    |  1 +
 .../apache/spark/sql/kafka010/KafkaSourceRDD.scala    |  7 ++-----
 .../sql/kafka010/{ => consumer}/FetchedDataPool.scala | 13 +++++++------
 .../{ => consumer}/InternalKafkaConsumerPool.scala    | 13 ++++++-------
 .../kafka010/{ => consumer}/KafkaDataConsumer.scala   | 19 +++++++++++--------
 .../{ => consumer}/FetchedDataPoolSuite.scala         |  5 +++--
 .../InternalKafkaConsumerPoolSuite.scala              |  5 +++--
 .../{ => consumer}/KafkaDataConsumerSuite.scala       |  5 +++--
 9 files changed, 37 insertions(+), 32 deletions(-)
 rename external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/{ => consumer}/FetchedDataPool.scala (92%)
 rename external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/{ => consumer}/InternalKafkaConsumerPool.scala (96%)
 rename external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/{ => consumer}/KafkaDataConsumer.scala (98%)
 rename external/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/{ => consumer}/FetchedDataPoolSuite.scala (97%)
 rename external/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/{ => consumer}/InternalKafkaConsumerPoolSuite.scala (97%)
 rename external/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/{ => consumer}/KafkaDataConsumerSuite.scala (98%)


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
