Re: [pyspark] Starting workers in a virtualenv

2015-05-21 Thread Karlson

That works, thank you!

On 2015-05-22 03:15, Davies Liu wrote:

> Could you try specifying PYSPARK_PYTHON as the path to the Python
> binary in your virtualenv? For example:
>
> PYSPARK_PYTHON=/path/to/env/bin/python bin/spark-submit xx.py
>
> On Mon, Apr 20, 2015 at 12:51 AM, Karlson  wrote:
>> Hi all,
>>
>> I am running the Python process that communicates with Spark in a
>> virtualenv. Is there any way I can make sure that the Python processes
>> of the workers are also started in a virtualenv? Currently I am getting
>> ImportErrors when the worker tries to unpickle stuff that is not
>> installed system-wide. For now both the worker and the driver run on
>> the same machine in local mode.
>>
>> Thanks in advance!

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: [pyspark] Starting workers in a virtualenv

2015-05-21 Thread Davies Liu
Could you try specifying PYSPARK_PYTHON as the path to the Python
binary in your virtualenv? For example:

PYSPARK_PYTHON=/path/to/env/bin/python bin/spark-submit xx.py
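
One way to confirm that the workers actually pick up the virtualenv
interpreter is to have each task report its Python binary (a minimal
sketch, assuming a local master; the app name is arbitrary):

    from pyspark import SparkContext

    sc = SparkContext("local[2]", "which-python")

    def interpreter(_):
        # Runs on a worker: report the Python binary it was started with.
        import sys
        return sys.executable

    # With PYSPARK_PYTHON set, this should print only the virtualenv's python.
    print(sc.parallelize(range(4)).map(interpreter).distinct().collect())
    sc.stop()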

On Mon, Apr 20, 2015 at 12:51 AM, Karlson  wrote:
> Hi all,
>
> I am running the Python process that communicates with Spark in a
> virtualenv. Is there any way I can make sure that the Python processes of
> the workers are also started in a virtualenv? Currently I am getting
> ImportErrors when the worker tries to unpickle stuff that is not installed
> system-wide. For now both the worker and the driver run on the same machine
> in local mode.
>
> Thanks in advance!
>
> -
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



[pyspark] Starting workers in a virtualenv

2015-04-20 Thread Karlson

Hi all,

I am running the Python process that communicates with Spark in a 
virtualenv. Is there any way I can make sure that the Python processes 
of the workers are also started in a virtualenv? Currently I am getting 
ImportErrors when the worker tries to unpickle stuff that is not 
installed system-wide. For now both the worker and the driver run on the 
same machine in local mode.
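
For reference, the failure reproduces with something like the sketch
below ('somepkg' is a stand-in for any package installed only inside
the virtualenv, not a real module):

    from pyspark import SparkContext

    sc = SparkContext("local[2]", "import-repro")

    def uses_venv_package(x):
        # Raises ImportError on the worker when its Python is the system
        # interpreter, because somepkg exists only in the virtualenv.
        import somepkg  # hypothetical package name
        return somepkg.__name__

    sc.parallelize([1, 2, 3]).map(uses_venv_package).collect()
    sc.stop()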


Thanks in advance!

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org