Re: execute native system commands in Spark

2015-11-02 Thread Adrian Tanase
Have you seen .pipe()?
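
For reference, RDD.pipe feeds each element of a partition to an external
command's stdin (one line per element) and returns the command's stdout
lines as a new RDD. A minimal sketch, assuming a running spark-shell
(`sc` already defined) and `tr` available on every executor:

```scala
// Sketch: pipe an RDD's elements through an external command.
// Each element is written to tr's stdin, one per line; each stdout
// line becomes an element of the resulting RDD.
val upper = sc.parallelize(Seq("foo", "bar", "baz"))
  .pipe(Seq("tr", "a-z", "A-Z"))
upper.collect()   // elements uppercased by the external process
```

Note the Seq[String] overload avoids relying on whitespace tokenization
of a single command string.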




On 11/2/15, 5:36 PM, "patcharee" <patcharee.thong...@uni.no> wrote:

>Hi,
>
>Is it possible to execute native system commands (in parallel) in Spark,
>like scala.sys.process?
>
>Best,
>Patcharee
>
>-
>To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>For additional commands, e-mail: user-h...@spark.apache.org
>




execute native system commands in Spark

2015-11-02 Thread patcharee

Hi,

Is it possible to execute native system commands (in parallel) in Spark, 
like scala.sys.process?


Best,
Patcharee
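
As context for the question: scala.sys.process runs commands on the local
machine, so calling it inside a Spark task simply runs the command on
whichever executor hosts that task. A minimal local sketch of the API
itself (object and output are illustrative):

```scala
import scala.sys.process._

object ProcessDemo {
  def main(args: Array[String]): Unit = {
    // Build a process from a Seq[String] and capture its stdout with !!.
    val out = Seq("echo", "hello").!!.trim
    println(out)
  }
}
```

Inside a Spark closure the same `!!` call would execute on the executor,
not the driver.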




Re: execute native system commands in Spark

2015-11-02 Thread Deenar Toraskar
You can do the following; make sure the number of executors requested
equals the number of executors on your cluster.

import scala.sys.process._
import org.apache.hadoop.security.UserGroupInformation

// Run `hostname` in each task and record which OS user the task runs as,
// then collect the distinct (hostname, user) pairs on the driver.
sc.parallelize(0 to 10).map { _ =>
  (("hostname".!!).trim, UserGroupInformation.getCurrentUser.toString)
}.collect.distinct

Regards
Deenar
*Think Reactive Ltd*
deenar.toras...@thinkreactive.co.uk
07714140812




On 2 November 2015 at 15:38, Adrian Tanase <atan...@adobe.com> wrote:

> Have you seen .pipe()?
>
>
>
>
> On 11/2/15, 5:36 PM, "patcharee" <patcharee.thong...@uni.no> wrote:
>
> >Hi,
> >
> >Is it possible to execute native system commands (in parallel) in Spark,
> >like scala.sys.process?
> >
> >Best,
> >Patcharee
> >
> >
>
>
>