[GitHub] [spark] gaborgsomogyi commented on a change in pull request #27989: [SPARK-31228][DSTREAMS] Add version information to the configuration of Kafka

2020-03-25 Thread GitBox
gaborgsomogyi commented on a change in pull request #27989: [SPARK-31228][DSTREAMS] Add version information to the configuration of Kafka
URL: https://github.com/apache/spark/pull/27989#discussion_r397807801
 
 

 ##
 File path: docs/structured-streaming-kafka-integration.md
 ##
 @@ -571,16 +575,18 @@ Note that it doesn't leverage Apache Commons Pool due to the difference of characteristics
 The following properties are available to configure the fetched data pool:
 
 
-Property NameDefaultMeaning
+Property NameDefaultMeaningSince Version
 
   spark.kafka.consumer.fetchedData.cache.timeout
 
 Review comment:
   Same here.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] gaborgsomogyi commented on a change in pull request #27989: [SPARK-31228][DSTREAMS] Add version information to the configuration of Kafka

2020-03-25 Thread GitBox
gaborgsomogyi commented on a change in pull request #27989: [SPARK-31228][DSTREAMS] Add version information to the configuration of Kafka
URL: https://github.com/apache/spark/pull/27989#discussion_r397807226
 
 

 ##
 File path: docs/structured-streaming-kafka-integration.md
 ##
 @@ -525,28 +525,32 @@ The caching key is built up from the following information:
 The following properties are available to configure the consumer pool:
 
 
-Property NameDefaultMeaning
+Property NameDefaultMeaningSince Version
 
   spark.kafka.consumer.cache.capacity
  The maximum number of consumers cached. Please note that it's a soft limit.
   64
+  3.0.0
 
 
   spark.kafka.consumer.cache.timeout
  The minimum amount of time a consumer may sit idle in the pool before it is eligible for eviction by the evictor.
   5m (5 minutes)
+  3.0.0
 
 Review comment:
  Hmm, here I can't see the `.version(...)` in the code area, but when I checked out the PR it's there. The PR is fine, I just wanted to mention this...
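For context, the `.version(...)` the reviewer refers to is the version tag this PR attaches to each config entry in Spark's internal `ConfigBuilder` chain, which is what the new "Since Version" column in the docs tables reflects. Below is a self-contained Scala sketch that models that builder pattern for illustration only; `ConfigEntry`, `ConfigBuilder`, `TypedBuilder`, and `KafkaPoolConfigs` are simplified stand-ins, not Spark's actual `org.apache.spark.internal.config` implementation.

```scala
// Simplified stand-in for a versioned config entry (illustrative, not Spark's real class).
final case class ConfigEntry[T](key: String, doc: String, version: String, default: T)

// Minimal model of the fluent builder: each call returns the builder so the
// .doc / .version / .intConf / .createWithDefault chain reads like Spark's.
final class ConfigBuilder(key: String) {
  private var docText: String = ""
  private var versionText: String = ""
  def doc(d: String): ConfigBuilder = { docText = d; this }
  def version(v: String): ConfigBuilder = { versionText = v; this }
  def intConf: TypedBuilder[Int] = new TypedBuilder[Int](key, docText, versionText)
}

final class TypedBuilder[T](key: String, doc: String, version: String) {
  def createWithDefault(value: T): ConfigEntry[T] = ConfigEntry(key, doc, version, value)
}

object KafkaPoolConfigs {
  // Mirrors the docs table row: spark.kafka.consumer.cache.capacity,
  // default 64, tagged as available since 3.0.0.
  val CONSUMER_CACHE_CAPACITY: ConfigEntry[Int] =
    new ConfigBuilder("spark.kafka.consumer.cache.capacity")
      .doc("The maximum number of consumers cached. Please note that it's a soft limit.")
      .version("3.0.0")
      .intConf
      .createWithDefault(64)
}
```

Spark's real builder additionally handles type conversion, alternative keys, and entry registration; this sketch only shows where the version metadata sits in the chain.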

