Github user pwendell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2464#discussion_r17826891
  
    --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
    @@ -859,18 +864,27 @@ private[spark] object Utils extends Logging {
         }
       }
     
    -  /**
     -   * A regular expression to match classes of the "core" Spark API that we want to skip when
    -   * finding the call site of a method.
    -   */
     -  private val SPARK_CLASS_REGEX = """^org\.apache\.spark(\.api\.java)?(\.util)?(\.rdd)?\.[A-Z]""".r
     +  /** Default filtering function for finding call sites using `getCallSite`. */
    +  private def defaultCallSiteFilterFunc(className: String): Boolean = {
     +    // A regular expression to match classes of the "core" Spark API that we want to skip when
    +    // finding the call site of a method.
     +    val SPARK_CORE_CLASS_REGEX = """^org\.apache\.spark(\.api\.java)?(\.util)?(\.rdd)?\.[A-Z]""".r
    +    val SCALA_CLASS_REGEX = """^scala""".r
     +    val isSparkClass = SPARK_CORE_CLASS_REGEX.findFirstIn(className).isDefined
    +    val isScalaClass = SCALA_CLASS_REGEX.findFirstIn(className).isDefined
     +    // If the class neither belongs to Spark nor is a simple Scala class, then it is a
    +    // user-defined class
    +    !isSparkClass && !isScalaClass
    +  }
     
       /**
        * When called inside a class in the spark package, returns the name of the user code class
        * (outside the spark package) that called into Spark, as well as which Spark method they called.
        * This is used, for example, to tell users where in their code each RDD got created.
    +   *
     +   * @param classFilterFunc Function that returns true if the given class belongs to user code
        */
    -  def getCallSite: CallSite = {
     +  def getCallSite(classFilterFunc: String => Boolean = defaultCallSiteFilterFunc): CallSite = {
    --- End diff --
    
    I would find it more natural to pass a function that excludes lines from Spark code (it just inverts the logic). Something like `internalExclusionFunc`. Then you could have `sparkCoreExclusionFunc` and `sparkStreamingExclusionFunc`. @andrewor14, what do you find more intuitive?
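
    For concreteness, a rough sketch of the inverted, exclusion-style API suggested above. This is a hypothetical illustration, not the actual patch: the names `sparkCoreExclusionFunc` and `sparkStreamingExclusionFunc` come from the comment, while the streaming regex and the `firstUserClass` helper are assumptions added for the example.

```scala
object CallSiteFilters {
  // Regexes mirror the ones in the diff above.
  private val SPARK_CORE_CLASS_REGEX =
    """^org\.apache\.spark(\.api\.java)?(\.util)?(\.rdd)?\.[A-Z]""".r
  private val SCALA_CLASS_REGEX = """^scala""".r

  // Returns true if the class is internal (Spark core or Scala) and should be
  // skipped when looking for the call site.
  def sparkCoreExclusionFunc(className: String): Boolean =
    SPARK_CORE_CLASS_REGEX.findFirstIn(className).isDefined ||
      SCALA_CLASS_REGEX.findFirstIn(className).isDefined

  // A streaming variant would layer extra exclusions on top of the core ones
  // (the streaming regex here is illustrative, not from the patch).
  def sparkStreamingExclusionFunc(className: String): Boolean =
    sparkCoreExclusionFunc(className) ||
      """^org\.apache\.spark\.streaming""".r.findFirstIn(className).isDefined

  // Hypothetical helper: the first frame on the stack that is user code,
  // i.e. the first class the exclusion function does NOT match.
  def firstUserClass(
      stack: Seq[String],
      exclusionFunc: String => Boolean = sparkCoreExclusionFunc): Option[String] =
    stack.find(cls => !exclusionFunc(cls))
}
```

    With this shape, callers pass a predicate saying what to *skip* rather than what to *keep*, so adding a new subproject only means supplying another exclusion function.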

