Github user vijoshi commented on the issue:
https://github.com/apache/spark/pull/17731
Thanks, I will close this PR.
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/17731
great - if you have a workaround, our preference would be not to change
Spark for it.
`.sparkREnv` and its variables are not a public API that we support
---
Github user vijoshi commented on the issue:
https://github.com/apache/spark/pull/17731
Thanks, I tried this out - looks like doing
`rm(".sparkRsession", envir = SparkR:::.sparkREnv)` is a way to prevent the
infinite loop situation. If I need to set up an active binding for
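The `rm()` workaround described above can be sketched in plain R (this is a hypothetical stand-in, not SparkR itself): removing a delayed binding before anything forces it means its lazy initializer never runs, and a fresh value can be installed in its place.

```r
# Sketch, assuming plain-R stand-ins for SparkR's internals.
env <- new.env()
initialized <- FALSE
delayedAssign(".sparkRsession",
              { initialized <<- TRUE; "lazy session" },  # stand-in for sparkR.session()
              assign.env = env)
rm(".sparkRsession", envir = env)          # drop the unforced promise
assign(".sparkRsession", "explicit session", envir = env)
stopifnot(!initialized)                    # the lazy initializer never ran
```

Because the promise was removed before any `get` forced it, no re-entrant evaluation (and hence no infinite loop) can occur.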
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/17731
so essentially it's still evaluating the first `get` when the 2nd `get` is
hit from the delayed binding (as a way to prevent going into an infinite
loop, really)
what if you have this
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/17731
so both `sparkSession` and `sparkRjsc` are valid even after the call to
`get` failed?
---
Github user vijoshi commented on the issue:
https://github.com/apache/spark/pull/17731
"I understand these 2 cases, can you explain how your change connects to
these two?"
Say, I do this:
```
delayedAssign(".sparkRsession", { sparkR.session(..) },
```
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/17731
I understand these 2 cases, can you explain how your change connects to
these two?
if you delay-bind to `".sparkRjsc", envir = .sparkREnv`, doesn't it just
work?
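The suggestion above can be sketched with plain `delayedAssign` into a stand-in environment (names here mirror SparkR's internals but nothing below is SparkR code): the first `get` forces the promise once, and later reads see the cached value.

```r
# Sketch, assuming a local environment standing in for SparkR:::.sparkREnv.
.sparkREnv <- new.env()
counter <- new.env(); counter$calls <- 0
delayedAssign(".sparkRjsc",
              { counter$calls <- counter$calls + 1; "java spark context" },
              assign.env = .sparkREnv)
jsc1 <- get(".sparkRjsc", envir = .sparkREnv)  # first access forces the promise
jsc2 <- get(".sparkRjsc", envir = .sparkREnv)  # cached; initializer not re-run
stopifnot(counter$calls == 1, identical(jsc1, jsc2))
```

This is the "doesn't it just work?" case: a single lazy initialization with no explicit setup call.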
---
Github user vijoshi commented on the issue:
https://github.com/apache/spark/pull/17731
@felixcheung yes. We need to support these two types of possibilities:
```
# do not call sparkR.session() - followed by implicit reference to sparkSession
a <- createDataFrame(iris)
```
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/17731
also, what if a user wants to explicitly create a spark session with
specific parameters? the delayed-binding model doesn't seem to support that
properly?
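The concern above can be made concrete with a plain-R sketch (all names and the master URL are hypothetical stand-ins, not the SparkR implementation): under a delayed-binding model, an explicit call with specific parameters has to discard the lazy default before installing its own session.

```r
# Sketch, assuming a stand-in environment and a stand-in for sparkR.session().
env <- new.env()
delayedAssign(".sparkRsession",
              list(master = "local[*]"),   # lazy default, not yet forced
              assign.env = env)
explicitSession <- function(master) {
  # an explicit call must replace the lazy default, not force it
  if (exists(".sparkRsession", envir = env, inherits = FALSE)) {
    rm(".sparkRsession", envir = env)      # discard the unforced promise
  }
  assign(".sparkRsession", list(master = master), envir = env)
}
explicitSession("spark://host:7077")       # hypothetical master URL
stopifnot(identical(get(".sparkRsession", envir = env)$master,
                    "spark://host:7077"))
```

If the explicit call instead touched the binding first (e.g. via `get`), it would force the default promise, which is the gap being discussed.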
---
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/17731
Merged build finished. Test PASSed.
---
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/17731
**[Test build #76074 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/76074/testReport)** for PR 17731 at commit
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/17731
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/76074/
Test PASSed.
---
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/17731
this **might** be reasonable, but `sparkR.sparkContext` is only called when
`sparkR.session()` is called, and so I'm not sure I follow how if someone is
doing this in a brand new R session:
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/17731
**[Test build #76074 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/76074/testReport)** for PR 17731 at commit
Github user vijoshi commented on the issue:
https://github.com/apache/spark/pull/17731
@felixcheung
---
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/17731
Merged build finished. Test FAILed.
---
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/17731
**[Test build #76073 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/76073/testReport)** for PR 17731 at commit
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/17731
Test FAILed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/76073/
Test FAILed.
---
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/17731
**[Test build #76073 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/76073/testReport)** for PR 17731 at commit