Hi Jack,
My use case is a bit different: I created a subprocess instead of a thread, and I
can't pass the args to the subprocess.
Jack Goodson wrote on Mon, Dec 12, 2022 at 8:03 PM:
> apologies, the code should read as below
>
> from threading import Thread
>
> context =
>
> This is definitely not something that we test or
> support, especially in the scenario you described.
>
> If you want to achieve concurrent execution, multithreading is normally
> more than sufficient and avoids problems with the context.
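The multithreading suggestion above can be sketched with a stand-in for the driver-side context (`FakeContext` here is hypothetical; a real PySpark program would share the single `SparkSession`/`SparkContext` across threads the same way):

```python
import threading

class FakeContext:
    """Stand-in for a SparkSession/SparkContext living in the driver process."""
    def __init__(self):
        self.jobs = []
        self._lock = threading.Lock()

    def run_job(self, name):
        # Serialize bookkeeping; real Spark schedules concurrent jobs itself.
        with self._lock:
            self.jobs.append(name)

context = FakeContext()  # created once in the driver

def worker(n):
    # Every thread sees the *same* context object -- no pickling and no
    # re-creation, unlike a subprocess, which gets a fresh address space.
    context.run_job(f"job-{n}")

threads = [threading.Thread(target=worker, args=(i,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(sorted(context.jobs))
```

All four threads submit work against the one shared context, which is the property the reply above relies on.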
>
>
>
> On 12/13/22 00:40, Kevin Su wrote:
> > I r[...]d connect to that kernel.
>
> But in the end, this is like Spark Connect :)
>
>
> On Mon, Dec 12, 2022 at 2:55 PM Kevin Su wrote:
>
>> Also, is there any way to work around this issue without using Spark
>> Connect?
>>
>> Kevin Su wrote on Mon, Dec 12, 2022
Also, is there any way to work around this issue without using Spark Connect?
Kevin Su wrote on Mon, Dec 12, 2022 at 2:52 PM:
> nvm, I found the ticket.
> Also, is there any way to work around this issue without using Spark
> Connect?
>
> Kevin Su wrote on Mon, Dec 12, 2022 at 2:42 PM:
>
Hey there, how can I get the same Spark context in two different Python
processes?
Let's say I create a context in Process A, and then I want to use Python
subprocess B to get the Spark context created by Process A. How can I
achieve that?
I've tried
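The failure mode behind the question above can be illustrated with stdlib pickling: a SparkContext holds process-local state (sockets, a Py4J JVM gateway) that, much like a thread lock, cannot be serialized and shipped to a child process. A minimal sketch with a hypothetical `FakeContext`:

```python
import pickle
import threading

class FakeContext:
    """Stand-in for a SparkContext: holds process-local, unpicklable state."""
    def __init__(self):
        # A real SparkContext holds sockets and a JVM gateway; a bare
        # threading.Lock reproduces the same pickling failure.
        self._lock = threading.Lock()

ctx = FakeContext()
try:
    # This is effectively what multiprocessing (spawn) would do to hand
    # ctx to a child process.
    pickle.dumps(ctx)
except TypeError as exc:
    print(f"cannot ship context to a subprocess: {exc}")
```

This is why the replies in the thread steer toward threads in the driver process, or a client/server design like Spark Connect, rather than passing the context itself to a subprocess.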
Hi all,
I want to run the Spark benchmarks on a standalone cluster, and I have changed
the DataSourceReadBenchmark.scala settings (removed "spark.master"):
--- a/sql/core/src/test/scala/org/apache/spark/sql/execution/benchmark/DataSourceReadBenchmark.scala
+++ b/sql/core/src/test
Hi all,
I tried to run a benchmark test in a GitHub Action on my fork, and I hit the
error below:
https://github.com/pingsutw/spark/runs/2867617238?check_suite_focus=true
java.lang.AssertionError: assertion failed: spark.test.home is not set!