You can run Spark in local mode, which does not require a standalone master
or worker.
Are you sure you're not using local mode? Are you sure the daemons aren't
running?
What Spark master are you passing?
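
If pyspark was started without a --master option (and none is set in
spark-defaults.conf), the shell defaults to local[*], which runs the driver
and executors inside a single local JVM, so no daemons are needed. As a quick
check (a sketch; the exact value depends on your configuration), you can
inspect the effective master from the shell:

>>> sc.master
'local[*]'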

On Wed, Mar 9, 2022 at 7:35 PM <capitnfrak...@free.fr> wrote:

> What I tried to say is that I didn't start the Spark master/worker at all
> for a standalone deployment.
>
> But I can still log in to pyspark and run jobs. I don't know why.
>
> $ ps -efw|grep spark
> $ netstat -ntlp
>
> Neither command above shows any Spark-related output.
> This machine is managed by me, and I know how to start Spark correctly.
> But I haven't started the daemons yet, and I can still log in to pyspark
> and run jobs. For example:
>
> >>> df = sc.parallelize([("t1",1),("t2",2)]).toDF(["name","number"])
> >>> df.show()
> +----+------+
> |name|number|
> +----+------+
> |  t1|     1|
> |  t2|     2|
> +----+------+
>
>
> Do you know why?
> Thank you.
> frakass.
>
