GitHub user vijoshi commented on the issue:

    https://github.com/apache/spark/pull/17731
  
    Thanks, I tried this out - looks like doing an `rm(".sparkRsession", envir = SparkR:::.sparkREnv)` is a way to prevent the infinite-loop situation. If I need to set up an active binding for `.sparkRsession` etc., I can achieve that too, but I need to do the `rm()` twice: first to ensure we don't get into infinite recursion, and again to restore the active binding once SparkR has created and assigned a new session. Something like this:
    
    ```
    sparkSessionFn <- local({
      function(v) {
        if (missing(v)) {
          # get SparkSession
          if (!exists("cachedSession", envir = .GlobalEnv)) {
            # rm first to ensure no infinite recursion
            rm(".sparkRsession", envir = SparkR:::.sparkREnv)

            cachedSession <<- sparkR.session(...)

            # rm again, then restore the active binding
            rm(".sparkRsession", envir = SparkR:::.sparkREnv)
            makeActiveBinding(".sparkRsession", sparkSessionFn, SparkR:::.sparkREnv)
          }
          # check and update runtime config on the session if needed
          get("cachedSession", envir = .GlobalEnv)
        } else {
          # set SparkSession
          cachedSession <<- v
        }
      }
    })

    # install the active binding (after sparkSessionFn is defined)
    makeActiveBinding(".sparkRsession", sparkSessionFn, SparkR:::.sparkREnv)
    ```
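
    For reference, here's a minimal, Spark-free sketch of the same double-`rm()` pattern against a plain environment, so it can be run without a cluster. The `initValue()` helper is hypothetical: it just stands in for `sparkR.session()`, which (like `initValue()` here) writes a plain binding for the session into `SparkR:::.sparkREnv` as a side effect - that side-effect assignment is exactly why the second `rm()` is needed before the active binding can be re-installed.

    ```
    e <- new.env()

    # Hypothetical stand-in for sparkR.session(): as a side effect it
    # writes a plain binding for "value" into `e`, the same way
    # sparkR.session() assigns .sparkRsession into SparkR:::.sparkREnv.
    initValue <- function() {
      assign("value", Sys.time(), envir = e)
      get("value", envir = e)
    }

    binding <- function(v) {
      if (missing(v)) {
        if (!exists("cached", envir = .GlobalEnv, inherits = FALSE)) {
          # rm #1: detach the active binding so initValue()'s assign()
          # does not re-enter this function (the infinite recursion case)
          rm("value", envir = e)
          cached <<- initValue()
          # rm #2: drop the plain binding initValue() left behind, then
          # re-install the active binding for future reads
          rm("value", envir = e)
          makeActiveBinding("value", binding, e)
        }
        get("cached", envir = .GlobalEnv)
      } else {
        # explicit assignment path, e.g. e$value <- something
        cached <<- v
      }
    }

    makeActiveBinding("value", binding, e)

    e$value  # first read: runs initValue() and caches the result
    e$value  # second read: served from the cache
    ```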


