Github user tdas commented on a diff in the pull request:

    https://github.com/apache/spark/pull/6034#discussion_r30390985
  
    --- Diff: external/twitter/src/main/scala/org/apache/spark/streaming/twitter/TwitterUtils.scala ---
    @@ -40,7 +40,7 @@ object TwitterUtils {
           twitterAuth: Option[Authorization],
           filters: Seq[String] = Nil,
           storageLevel: StorageLevel = StorageLevel.MEMORY_AND_DISK_SER_2
    -    ): ReceiverInputDStream[Status] = {
    +    ): ReceiverInputDStream[Status] = ssc.withScope {
    --- End diff ---
    
    Well, you are sort of duplicating code/processing by adding `withScope` as well as `customScopeName`. Instead, just add `withScope("kafka stream")` in one place and remove the extra `customScopeName` from everywhere. And even though there are many versions of KafkaUtils.createStream, only one or two of them actually call `new KafkaInputDStream`, so using `withScope("kafka stream")` in only those is sufficient. That also better matches the other ways of setting scope, where the scope is defined by the caller method rather than by the entity itself. In fact, conceptually, an entity cannot really define the idea of a scope around itself; only the caller/user of the entity can define the scope around the entity.
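
    For illustration, here is a minimal sketch of the suggested pattern, assuming a name-accepting `ssc.withScope("kafka stream")` helper as proposed above (the actual helper in the Spark codebase may be named differently) and using simplified, assumed `createStream` and `KafkaInputDStream` signatures. The point is that only the overload that constructs `new KafkaInputDStream` wraps its body in the named scope, while the other overloads just delegate:

    ```scala
    package org.apache.spark.streaming.kafka

    import kafka.serializer.StringDecoder

    import org.apache.spark.storage.StorageLevel
    import org.apache.spark.streaming.StreamingContext
    import org.apache.spark.streaming.dstream.ReceiverInputDStream

    object KafkaUtilsScopeSketch {

      // The overload that actually instantiates the receiver DStream is the
      // only place that wraps its body in the named scope.
      def createStream(
          ssc: StreamingContext,
          kafkaParams: Map[String, String],
          topics: Map[String, Int],
          storageLevel: StorageLevel
        ): ReceiverInputDStream[(String, String)] = ssc.withScope("kafka stream") {
        new KafkaInputDStream[String, String, StringDecoder, StringDecoder](
          ssc, kafkaParams, topics, useReliableReceiver = false, storageLevel)
      }

      // Convenience overloads simply delegate, so they need no scope handling
      // of their own; the callee above already names the scope.
      def createStream(
          ssc: StreamingContext,
          zkQuorum: String,
          groupId: String,
          topics: Map[String, Int],
          storageLevel: StorageLevel = StorageLevel.MEMORY_AND_DISK_SER_2
        ): ReceiverInputDStream[(String, String)] = {
        createStream(
          ssc,
          Map("zookeeper.connect" -> zkQuorum, "group.id" -> groupId),
          topics,
          storageLevel)
      }
    }
    ```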


