Github user sun-rui commented on a diff in the pull request:

    https://github.com/apache/spark/pull/9192#discussion_r42602160
  
    --- Diff: R/pkg/R/SQLContext.R ---
    @@ -17,6 +17,34 @@
     
     # SQLcontext.R: SQLContext-driven functions
     
    +#' Temporary function to reroute old S3 Method call to new
    +#' We need to check the class of x to ensure it is SQLContext before dispatching
    +dispatchFunc <- function(newFuncSig, x, ...) {
    +  funcName <- as.character(sys.call(sys.parent())[[1]])
    +  f <- get0(paste0(funcName, ".default"))
    +  # Strip sqlContext from list of parameters and then pass the rest along.
    +  # In the following, if '&' is used instead of '&&', it warns about
    +  # "the condition has length > 1 and only the first element will be used"
    +  if (class(x) == "jobj" &&
    +      grepl("org.apache.spark.sql.SQLContext", capture.output(show(x)))) {
    +    .Deprecated(newFuncSig, old = paste0(funcName, "(sqlContext...)"))
    +    f(...)
    +  } else {
    +    f(x, ...)
    +  }
    +}
    --- End diff --
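
For context, my reading is that each rerouted generic becomes a thin wrapper like the sketch below (the generic name and signature string here are illustrative, not taken from this diff):

    # Illustrative wiring only: both the old form tableNames(sqlContext, db)
    # and the new form tableNames(db) land on the generic, and dispatchFunc()
    # drops the deprecated sqlContext before calling tableNames.default.
    tableNames <- function(x, ...) {
      dispatchFunc("tableNames(databaseName = NULL)", x, ...)
    }

    tableNames.default <- function(databaseName = NULL) {
      # implementation elided; would call into the JVM-backed SQLContext
      character(0)
    }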
    
Thought: dispatchFunc strips sqlContext and calls into the .default methods. Would it be better for dispatchFunc to add a sqlContext when one is not passed in, and have the .default methods accept sqlContext as their first argument? That would support the use case where a user passes in different contexts, say a sqlContext and a hiveContext. It seems that a sqlContext and a hiveContext can co-exist?
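
A rough sketch of that alternative (getSqlContext() is a placeholder for however the default context would be looked up; it is not part of this PR):

    # Hypothetical variant: instead of stripping the context, inject a
    # default one when none is supplied, so the .default methods keep
    # sqlContext as their first argument and callers may pass any
    # compatible context.
    dispatchFunc2 <- function(x, ...) {
      funcName <- as.character(sys.call(sys.parent())[[1]])
      f <- get0(paste0(funcName, ".default"))
      # Match HiveContext as well as SQLContext; both class names live
      # under org.apache.spark.sql in Spark 1.x.
      isContext <- class(x) == "jobj" &&
        any(grepl("org.apache.spark.sql", capture.output(show(x))))
      if (isContext) {
        # An explicit context was passed in; forward it unchanged so that
        # a sqlContext and a hiveContext can co-exist.
        f(x, ...)
      } else {
        # No context supplied; fall back to a default.
        # getSqlContext() is hypothetical, not part of this PR.
        f(getSqlContext(), x, ...)
      }
    }

With that shape, tableNames(sqlContext, "db") and tableNames(hiveContext, "db") would both work, and tableNames("db") would fall back to the default context.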

