Re: Spark setup on local Windows machine

2014-12-02 Thread Sunita Arvind
Thanks Sameer and Akhil for your help.

I tried both your suggestions; however, I still face the same issue. There
was indeed a space in the installation path for Scala and sbt, since I had
kept the defaults and hence the path was C:\Program Files. I reinstalled
Scala and sbt in C:\, as well as Spark.
The Spark binaries I am using were built internally from source by our team,
and the same binaries work for the rest of the team.

Here is what my spark-env.cmd looks like:
set SCALA_HOME=C:\scala
set SPARK_CLASSPATH=C:\spark\bin\..\conf;C:\spark\bin\..\lib\spark-assembly-1.1.0-hadoop2.3.0.jar;C:\spark\bin\..\lib\datanucleus-api-jdo-3.2.1.jar;C:\spark\bin\..\lib\datanucleus-core-3.2.2.jar;C:\spark\bin\..\lib\datanucleus-rdbms-3.2.1.jar;C:\scala\lib\scala-library.jar;C:\scala\lib\scala-compiler.jar;
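
As a quick sanity check (plain cmd, nothing Spark-specific), this is how I
confirm which of these variables are actually visible in the shell that
launches spark-shell:

C:\spark>echo %SCALA_HOME%
C:\spark>echo %SPARK_CLASSPATH%
C:\spark>echo %CLASSPATH%

If echo prints a %NAME% back unexpanded, that variable is not set in the
current shell.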



Even though compute-classpath.cmd yields the correct result, I still get the
same error:

C:\spark>bin\spark-shell
Exception in thread "main" java.util.NoSuchElementException: key not found: CLASSPATH
        at scala.collection.MapLike$class.default(MapLike.scala:228)
        at scala.collection.AbstractMap.default(Map.scala:58)
        at scala.collection.MapLike$class.apply(MapLike.scala:141)
        at scala.collection.AbstractMap.apply(Map.scala:58)
        at org.apache.spark.deploy.SparkSubmitDriverBootstrapper$.main(SparkSubmitDriverBootstrapper.scala:49)
        at org.apache.spark.deploy.SparkSubmitDriverBootstrapper.main(SparkSubmitDriverBootstrapper.scala)
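
From the trace, what fails is a Scala Map lookup on a key literally named
CLASSPATH (scala.collection.MapLike throws "key not found" when apply misses),
so SparkSubmitDriverBootstrapper at line 49 apparently expects an environment
variable named CLASSPATH, which is distinct from the SPARK_CLASSPATH I set
above. On that assumption, one workaround worth trying is to export CLASSPATH
explicitly from compute-classpath.cmd before launching. A sketch from an
interactive prompt (a .cmd script would need %%i instead of %i, and this
assumes compute-classpath.cmd prints a single line):

C:\spark>for /f "delims=" %i in ('bin\compute-classpath.cmd') do @set CLASSPATH=%i
C:\spark>bin\spark-shell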

regards
Sunita

On Tue, Nov 25, 2014 at 11:47 PM, Sameer Farooqui wrote:

> Hi Sunita,
>
> This gitbook may also be useful for you to get Spark running in local mode
> on your Windows machine:
> http://blueplastic.gitbooks.io/how-to-light-your-spark-on-a-stick/content/
>
> On Tue, Nov 25, 2014 at 11:09 PM, Akhil Das wrote:
>
>> You could try following these guidelines:
>> http://docs.sigmoidanalytics.com/index.php/How_to_build_SPARK_on_Windows
>>
>> Thanks
>> Best Regards
>>
>> On Wed, Nov 26, 2014 at 12:24 PM, Sunita Arvind wrote:
>>
>>> Hi All,
>>>
>>> I just installed Spark on my laptop and am trying to get spark-shell to
>>> work. Here is the error I see:
>>>
>>> C:\spark\bin>spark-shell
>>> Exception in thread "main" java.util.NoSuchElementException: key not found: CLASSPATH
>>>         at scala.collection.MapLike$class.default(MapLike.scala:228)
>>>         at scala.collection.AbstractMap.default(Map.scala:58)
>>>         at scala.collection.MapLike$class.apply(MapLike.scala:141)
>>>         at scala.collection.AbstractMap.apply(Map.scala:58)
>>>         at org.apache.spark.deploy.SparkSubmitDriverBootstrapper$.main(SparkSubmitDriverBootstrapper.scala:49)
>>>         at org.apache.spark.deploy.SparkSubmitDriverBootstrapper.main(SparkSubmitDriverBootstrapper.scala)
>>>
>>>
>>> The classpath seems to be right:
>>>
>>> C:\spark\bin>compute-classpath.cmd
>>> ;;C:\spark\bin\..\conf;C:\spark\bin\..\lib\spark-assembly-1.1.0-hadoop2.3.0.jar;;C:\spark\bin\..\lib\datanucleus-api-jdo-3.2.1.jar;C:\spark\bin\..\lib\datanucleus-core-3.2.2.jar;C:\spark\bin\..\lib\datanucleus-rdbms-3.2.1.jar
>>>
>>> Manually exporting the classpath to include the assembly jar doesn't help
>>> either.
>>>
>>> What could be wrong with this installation? Scala and sbt are installed,
>>> on the PATH, and working fine.
>>>
>>> Appreciate your help.
>>> regards
>>> Sunita


Re: Spark setup on local Windows machine

2014-11-25 Thread Sameer Farooqui
Hi Sunita,

This gitbook may also be useful for you to get Spark running in local mode
on your Windows machine:
http://blueplastic.gitbooks.io/how-to-light-your-spark-on-a-stick/content/

On Tue, Nov 25, 2014 at 11:09 PM, Akhil Das wrote:

> You could try following these guidelines:
> http://docs.sigmoidanalytics.com/index.php/How_to_build_SPARK_on_Windows
>
> Thanks
> Best Regards
>
> On Wed, Nov 26, 2014 at 12:24 PM, Sunita Arvind wrote:
>
>> Hi All,
>>
>> I just installed Spark on my laptop and am trying to get spark-shell to
>> work. Here is the error I see:
>>
>> C:\spark\bin>spark-shell
>> Exception in thread "main" java.util.NoSuchElementException: key not found: CLASSPATH
>>         at scala.collection.MapLike$class.default(MapLike.scala:228)
>>         at scala.collection.AbstractMap.default(Map.scala:58)
>>         at scala.collection.MapLike$class.apply(MapLike.scala:141)
>>         at scala.collection.AbstractMap.apply(Map.scala:58)
>>         at org.apache.spark.deploy.SparkSubmitDriverBootstrapper$.main(SparkSubmitDriverBootstrapper.scala:49)
>>         at org.apache.spark.deploy.SparkSubmitDriverBootstrapper.main(SparkSubmitDriverBootstrapper.scala)
>>
>>
>> The classpath seems to be right:
>>
>> C:\spark\bin>compute-classpath.cmd
>> ;;C:\spark\bin\..\conf;C:\spark\bin\..\lib\spark-assembly-1.1.0-hadoop2.3.0.jar;;C:\spark\bin\..\lib\datanucleus-api-jdo-3.2.1.jar;C:\spark\bin\..\lib\datanucleus-core-3.2.2.jar;C:\spark\bin\..\lib\datanucleus-rdbms-3.2.1.jar
>>
>> Manually exporting the classpath to include the assembly jar doesn't help
>> either.
>>
>> What could be wrong with this installation? Scala and sbt are installed,
>> on the PATH, and working fine.
>>
>> Appreciate your help.
>> regards
>> Sunita


Re: Spark setup on local Windows machine

2014-11-25 Thread Akhil Das
You could try following these guidelines:
http://docs.sigmoidanalytics.com/index.php/How_to_build_SPARK_on_Windows

Thanks
Best Regards

On Wed, Nov 26, 2014 at 12:24 PM, Sunita Arvind wrote:

> Hi All,
>
> I just installed Spark on my laptop and am trying to get spark-shell to
> work. Here is the error I see:
>
> C:\spark\bin>spark-shell
> Exception in thread "main" java.util.NoSuchElementException: key not found: CLASSPATH
>         at scala.collection.MapLike$class.default(MapLike.scala:228)
>         at scala.collection.AbstractMap.default(Map.scala:58)
>         at scala.collection.MapLike$class.apply(MapLike.scala:141)
>         at scala.collection.AbstractMap.apply(Map.scala:58)
>         at org.apache.spark.deploy.SparkSubmitDriverBootstrapper$.main(SparkSubmitDriverBootstrapper.scala:49)
>         at org.apache.spark.deploy.SparkSubmitDriverBootstrapper.main(SparkSubmitDriverBootstrapper.scala)
>
>
> The classpath seems to be right:
>
> C:\spark\bin>compute-classpath.cmd
> ;;C:\spark\bin\..\conf;C:\spark\bin\..\lib\spark-assembly-1.1.0-hadoop2.3.0.jar;;C:\spark\bin\..\lib\datanucleus-api-jdo-3.2.1.jar;C:\spark\bin\..\lib\datanucleus-core-3.2.2.jar;C:\spark\bin\..\lib\datanucleus-rdbms-3.2.1.jar
>
> Manually exporting the classpath to include the assembly jar doesn't help
> either.
>
> What could be wrong with this installation? Scala and sbt are installed,
> on the PATH, and working fine.
>
> Appreciate your help.
> regards
> Sunita