Re: Does anyone have a stand alone spark instance running on Windows

2014-08-20 Thread Steve Lewis
I have made a little progress: by downloading a prebuilt version of Spark,
I can call spark-shell.cmd and bring up a Spark shell.
In the shell, things run.
Next I go to my development environment and try to run JavaWordCount with
-Dspark.master=spark://local[*]:55519,
-Dspark.master=spark://Asterix:7707 (Asterix is my machine),
and many other combinations.

I can hit a web page
 http://asterix:4040/environment/
and see many details about a presumably running Spark master, but the
incantation that lets a simple job like JavaWordCount run is escaping me.

Oh yes - I am running on Windows 8

Any help would be appreciated, starting with: how do I know a Spark master is
running, and what port is it on?
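[For reference, the prebuilt distribution also ships a spark-class script that can start the master and worker processes by hand. A minimal sketch, assuming a prebuilt Spark 1.x unpacked at C:\spark (the paths and host name are assumptions, not tested commands):]

```shell
REM Minimal sketch, assuming a prebuilt Spark 1.x unpacked at C:\spark.
cd C:\spark

REM Start a standalone master by hand; it listens on port 7077 by default
REM and serves a status page on http://localhost:8080/ that shows its
REM spark://host:port URL.
bin\spark-class.cmd org.apache.spark.deploy.master.Master

REM In a second console, start a worker and register it with that URL.
bin\spark-class.cmd org.apache.spark.deploy.worker.Worker spark://localhost:7077
```

The spark:// URL printed on the master's 8080 status page is the value to pass as -Dspark.master when running a job.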



On Sat, Aug 16, 2014 at 7:33 PM, Manu Suryavansh  wrote:

> Hi,
>
> I have built spark-1.0.0 on Windows using Java 7/8 and I have been able to
> run several examples - here are my notes -
> http://ml-nlp-ir.blogspot.com/2014/04/building-spark-on-windows-and-cloudera.html
> on how to build from source and run examples in spark shell.
>
>
> Regards,
> Manu
>
>
>
>
> --
> Manu Suryavansh
>



-- 
Steven M. Lewis PhD
4221 105th Ave NE
Kirkland, WA 98033
206-384-1340 (cell)
Skype lordjoe_com


Re: Does anyone have a stand alone spark instance running on Windows

2014-08-18 Thread Steve Lewis
OK, I tried your build.
First you need to put sbt in C:\sbt.
Then you get
Microsoft Windows [Version 6.2.9200]
(c) 2012 Microsoft Corporation. All rights reserved.

e:\>which java
/cygdrive/c/Program Files/Java/jdk1.6.0_25/bin/java

e:\>java -version
java version "1.6.0_25"
Java(TM) SE Runtime Environment (build 1.6.0_25-b06)


e:\spark>sbt_opt.bat

e:\spark>set SCRIPT_DIR=C:\sbt\

e:\spark>java -Xms512m -Xmx2g -Xss1M -XX:+CMSClassUnloadingEnabled
-XX:MaxPermSize=128m -jar "C:\sbt\sbt-launch.jar"
[ERROR] Terminal initialization failed; falling back to unsupported
java.lang.IncompatibleClassChangeError: Found class jline.Terminal, but
interface was expected
at jline.TerminalFactory.create(TerminalFactory.java:101)
at jline.TerminalFactory.get(TerminalFactory.java:159)
at sbt.ConsoleLogger$.ansiSupported(ConsoleLogger.scala:86)
at sbt.ConsoleLogger$.(ConsoleLogger.scala:80)
at sbt.ConsoleLogger$.(ConsoleLogger.scala)
at sbt.GlobalLogging$.initial(GlobalLogging.scala:40)
at sbt.StandardMain$.initialGlobalLogging(Main.scala:64)
at sbt.StandardMain$.initialState(Main.scala:73)
at sbt.xMain.run(Main.scala:29)
at xsbt.boot.Launch$.run(Launch.scala:55)
at xsbt.boot.Launch$$anonfun$explicit$1.apply(Launch.scala:45)
at xsbt.boot.Launch$.launch(Launch.scala:69)
at xsbt.boot.Launch$.apply(Launch.scala:16)
at xsbt.boot.Boot$.runImpl(Boot.scala:31)
at xsbt.boot.Boot$.main(Boot.scala:20)
at xsbt.boot.Boot.main(Boot.scala)

java.lang.IncompatibleClassChangeError: JLine incompatibility detected.
 Check that the sbt launcher is version 0.13.x or later.
at sbt.ConsoleLogger$.ansiSupported(ConsoleLogger.scala:97)
at sbt.ConsoleLogger$.(ConsoleLogger.scala:80)
at sbt.ConsoleLogger$.(ConsoleLogger.scala)
at sbt.GlobalLogging$.initial(GlobalLogging.scala:40)
at sbt.StandardMain$.initialGlobalLogging(Main.scala:64)
at sbt.StandardMain$.initialState(Main.scala:73)
at sbt.xMain.run(Main.scala:29)
at xsbt.boot.Launch$.run(Launch.scala:55)
at xsbt.boot.Launch$$anonfun$explicit$1.apply(Launch.scala:45)
at xsbt.boot.Launch$.launch(Launch.scala:69)
at xsbt.boot.Launch$.apply(Launch.scala:16)
at xsbt.boot.Boot$.runImpl(Boot.scala:31)
at xsbt.boot.Boot$.main(Boot.scala:20)
at xsbt.boot.Boot.main(Boot.scala)
Error during sbt execution: java.lang.IncompatibleClassChangeError: JLine
incompatibility detected.  Check that the sbt launcher is version 0.13.x or
later.

I believe my version of sbt is 0.13.

Finally, even if I could build Spark, I still don't see how to launch a server.


On Sat, Aug 16, 2014 at 7:33 PM, Manu Suryavansh  wrote:

> Hi,
>
> I have built spark-1.0.0 on Windows using Java 7/8 and I have been able to
> run several examples - here are my notes -
> http://ml-nlp-ir.blogspot.com/2014/04/building-spark-on-windows-and-cloudera.html
> on how to build from source and run examples in spark shell.
>
>
> Regards,
> Manu
>
>
>
>
> --
> Manu Suryavansh
>



-- 
Steven M. Lewis PhD
4221 105th Ave NE
Kirkland, WA 98033
206-384-1340 (cell)
Skype lordjoe_com


Re: Does anyone have a stand alone spark instance running on Windows

2014-08-16 Thread Tushar Khairnar
I am also trying to run on Windows and will post once I am able to launch.

My guess is that "by hand" probably means manually forming the java command,
i.e. the classpath and java options, and then appending the right class name
for the worker or master.

The Spark scripts follow a hierarchy: start-master or start-workers calls
start-daemon, which calls start-class, each one building up or appending to
the command line and environment variables. I saw equivalent Windows scripts
in the latest build while I was running it on Linux.
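[Stripped of the script layers, the hand-formed command is roughly a plain java invocation with the assembly jar on the classpath. A rough sketch; the assembly jar name and install path below are assumptions for illustration:]

```shell
REM Rough sketch of the command the launch scripts ultimately build;
REM the assembly jar name and install path are assumptions for illustration.
set SPARK_HOME=C:\spark
set ASSEMBLY=%SPARK_HOME%\lib\spark-assembly-1.0.0-hadoop2.2.0.jar

REM Master process (defaults: port 7077, web UI on 8080)
java -cp %ASSEMBLY% org.apache.spark.deploy.master.Master

REM Worker process, registering with the master (run in a second console)
java -cp %ASSEMBLY% org.apache.spark.deploy.worker.Worker spark://localhost:7077
```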

Regards,
Tushar
On Aug 17, 2014 8:03 AM, "Manu Suryavansh" 
wrote:

> Hi,
>
> I have built spark-1.0.0 on Windows using Java 7/8 and I have been able to
> run several examples - here are my notes -
> http://ml-nlp-ir.blogspot.com/2014/04/building-spark-on-windows-and-cloudera.html
> on how to build from source and run examples in spark shell.
>
>
> Regards,
> Manu
>
>
>
>
> --
> Manu Suryavansh
>


Re: Does anyone have a stand alone spark instance running on Windows

2014-08-16 Thread Manu Suryavansh
Hi,

I have built spark-1.0.0 on Windows using Java 7/8 and I have been able to
run several examples - here are my notes -
http://ml-nlp-ir.blogspot.com/2014/04/building-spark-on-windows-and-cloudera.html
on how to build from source and run examples in spark shell.


Regards,
Manu




-- 
Manu Suryavansh


Does anyone have a stand alone spark instance running on Windows

2014-08-16 Thread Steve Lewis
I want to look at porting a Hadoop problem to Spark - eventually I want to
run on a Hadoop 2.0 cluster, but while I am learning and porting I want to
run small problems on my Windows box.
I installed Scala and sbt.
I downloaded Spark, and in the Spark directory I can say
mvn -Phadoop-0.23 -Dhadoop.version=0.23.7 -DskipTests clean package
which succeeds.
I tried
sbt/sbt assembly
which fails with errors.

The documentation says:

*Note:* The launch scripts do not currently support Windows. To run a Spark
cluster on Windows, start the master and workers by hand.

with no indication of how to do this.

I can build and run samples (say JavaWordCount) up to the point where they
fail because a master cannot be found (none is running).
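[One way to sidestep the missing master entirely is the in-process "local" master, which needs no standalone daemons at all. A sketch using spark-submit from a prebuilt distribution; the paths, jar name, and input file are assumptions:]

```shell
REM Sketch: run the bundled JavaWordCount example against the in-process
REM local master, so no standalone master or worker needs to be running.
REM The install path, jar name, and input file are assumptions.
cd C:\spark
bin\spark-submit.cmd --class org.apache.spark.examples.JavaWordCount ^
  --master local[*] lib\spark-examples-1.0.0-hadoop2.2.0.jar README.md
```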

I want to know how to get a Spark master and a slave or two running on my
Windows box so I can look at the samples and start playing with Spark.

Does anyone have a Windows instance running?
Please DON'T SAY I SHOULD RUN LINUX! If it is supposed to work on Windows,
someone should have tested it and be willing to state how.