Which must be passed through by the driver, I guess. I checked the
> spark-env.sh on each node and the appropriate SPARK_HOME is set
> correctly….
>
>
> From: Sun Rui [mailto:sunrise_...@163.com]
> Sent: 17 May 2016 11:32
> To: Mike Lewis
> Cc: user@spark.apache.org
> Subject: Re: SparkR query
Lewis,
1. Could you check the values of “SPARK_HOME” environment on all of your worker
nodes?
2. How did you start your SparkR shell?
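For the two checks above, a minimal sketch follows. The hostnames, master URL, and paths are placeholders (assumptions, not from this thread); adapt them to your cluster.

```shell
#!/bin/sh
# Sketch: verify SPARK_HOME is set consistently, then launch SparkR.
# All hostnames and the master URL below are hypothetical examples.

# 1. On each worker node, print the effective SPARK_HOME so the
#    values can be compared across the cluster:
echo "SPARK_HOME=${SPARK_HOME:-<unset>}"

# 2. Typical way to start a SparkR shell against a standalone master
#    (commented out; "master-host" is a placeholder):
# "$SPARK_HOME/bin/sparkR" --master spark://master-host:7077
```

Running the first command on every worker (e.g. via ssh) quickly reveals a node whose spark-env.sh sets a different SPARK_HOME.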
> On May 17, 2016, at 18:07, Mike Lewis wrote:
>
> Hi,
>
> I have a SparkR driver process that connects to a master running on Linux,
> I’ve tried to do a simp