Re: how to correctly run scala script using spark-shell through stdin (spark v1.0.0)

2016-01-29 Thread Iulian Dragoș
> …here a way of doing this? Feels like a simple/naive question but really couldn't find an answer. — quoting Fernandez, Andres (Tuesday, January 26, 2016 2:53 PM; To: 'Ewan Leith', Iulian Dragoș; Cc: user)

Re: how to correctly run scala script using spark-shell through stdin (spark v1.0.0)

2016-01-29 Thread Iulian Dragoș
> True thank you. Is there a way of having the shell not closed (how to avoid the :quit statement)? — quoting Fernandez, Andres (January 26, 2016 2:53 PM; To: 'Ewan Leith', Iulian Dragoș; Cc: user; Subject: RE: how to correctly run scala script using spark-shell through stdin (spark v1.0.0))

RE: how to correctly run scala script using spark-shell through stdin (spark v1.0.0)

2016-01-27 Thread Andres.Fernandez
True thank you. Is there a way of having the shell not closed (how to avoid the :quit statement)? Thank you both. Andres

RE: how to correctly run scala script using spark-shell through stdin (spark v1.0.0)

2016-01-26 Thread fernandrez1987
spark-shell -i file.scala is not working for me in Spark 1.6.0. Was this removed, or what do I have to take into account? The script does not get run at all. What can be happening?
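The approach in the thread title — feeding the script to spark-shell through stdin rather than `-i` — can be sketched as follows. The file name and contents are illustrative assumptions, and the actual spark-shell invocation is shown as a comment since it requires a Spark installation on the PATH:

```shell
# Hypothetical test.scala for illustration; the name and body are assumptions.
cat > test.scala <<'EOF'
println("hello world")
System.exit(0)  // stop the REPL once the script finishes
EOF

# Feed it to spark-shell on stdin (needs Spark installed, so left commented):
#   spark-shell < test.scala
# Show what would be piped in:
cat test.scala
```

Because the REPL reads stdin until EOF, this sidesteps the `-i` flag entirely; the explicit `System.exit(0)` keeps the interpreter from waiting for more input.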

Re: how to correctly run scala script using spark-shell through stdin (spark v1.0.0)

2016-01-26 Thread Iulian Dragoș
I don’t see -i in the output of spark-shell --help. Moreover, in master I get an error: $ bin/spark-shell -i test.scala bad option: '-i' — iulian

Re: how to correctly run scala script using spark-shell through stdin (spark v1.0.0)

2016-01-26 Thread Iulian Dragoș
…arguments as the Scala interpreter. I'll look into it. — iulian (quoting Fernandez, Andres: "Thank you very much for your time.")

RE: how to correctly run scala script using spark-shell through stdin (spark v1.0.0)

2016-01-26 Thread Ewan Leith
> I don’t see -i in the output of spark-shell --help. Moreover, in master I get an error: $ bin/spark-shell -i … — quoting Iulian Dragoș (To: fernandrez1987 <andres.fernan...@wellsfargo.com>; Cc: user <user@spark.apache.org>)

RE: how to correctly run scala script using spark-shell through stdin (spark v1.0.0)

2016-01-26 Thread Andres.Fernandez
> From: …dra...@typesafe.com] Sent: 26 January 2016 15:00 To: fernandrez1987 <andres.fernan...@wellsfargo.com> Cc: user <user@spark.apache.org> Subject: Re: how to correctly run scala script using spark-shell through stdin (spark v1.0.0)

RE: how to correctly run scala script using spark-shell through stdin (spark v1.0.0)

2014-08-27 Thread Henry Hung
Update: I use a shell script to execute spark-shell; inside my-script.sh: $SPARK_HOME/bin/spark-shell < $HOME/test.scala > $HOME/test.log 2>&1. Although it correctly finishes the println("hallo world"), the strange thing is that my-script.sh finishes before spark-shell even finishes executing…
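The archive stripped the redirection characters from Henry's command, so the `<`, `>`, and `2>&1` below are reconstructed assumptions. A sketch of the wrapper, syntax-checked only since running it requires a Spark installation:

```shell
# my-script.sh, reconstructed: spark-shell reads the script on stdin and
# all output (stdout and stderr) is captured in the log file.
cat > my-script.sh <<'EOF'
#!/bin/sh
"$SPARK_HOME/bin/spark-shell" < "$HOME/test.scala" > "$HOME/test.log" 2>&1
EOF
chmod +x my-script.sh
# Syntax-check only; actually running it needs $SPARK_HOME set to a Spark install:
sh -n my-script.sh && echo "wrapper OK"
```

Note that spark-shell runs in the foreground here, so the wrapper itself should block until the REPL exits; if the wrapper returns early, the script being piped in likely never triggers an exit (see the System.exit(0) advice elsewhere in the thread).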

RE: how to correctly run scala script using spark-shell through stdin (spark v1.0.0)

2014-08-27 Thread Matei Zaharia
You can use spark-shell -i file.scala to run that. However, that keeps the interpreter open at the end, so you need to make your file end with System.exit(0) (or, more robustly, do the work in a try {} and put the exit in a finally {}). In general it would be better to compile apps and run them…
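Matei's try/finally pattern can be sketched as the contents of file.scala (the job body is a placeholder, not code from the thread):

```shell
# Write file.scala following the suggested pattern: do the work inside
# try {}, and exit the interpreter in finally {} so it runs even on error.
cat > file.scala <<'EOF'
try {
  // ... your Spark job here (placeholder) ...
  println("job done")
} finally {
  // Without this, spark-shell -i leaves the interpreter open afterwards.
  System.exit(0)
}
EOF
echo "wrote file.scala"
```

Putting the exit in finally {} guarantees the REPL shuts down even if the job throws, which a bare trailing System.exit(0) would not.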