From: Vivek Suvarna [mailto:vikk...@gmail.com]
Sent: Tuesday, 25 July 2017 1:19
To: user@livy.incubator.apache.org
Subject: Re: Input file as an argument of a Spark code
I had a similar requirement.
I used WebHDFS to first copy the file across to HDFS before starting the Spark
job via Livy.
Sent from my iPhone
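A minimal sketch of that approach: uploading a local CSV to HDFS through WebHDFS with curl before submitting the Spark job. The NameNode host, port, and paths below are hypothetical placeholders; WebHDFS's two-step CREATE flow (redirect to a DataNode, then upload) is assumed.

```shell
# Placeholders -- replace with your cluster's values
NAMENODE="namenode.example.com"
PORT=9870
LOCAL_FILE="data.csv"
HDFS_PATH="/user/joaquin/data.csv"

# Step 1: the CREATE operation; the NameNode answers with a 307 redirect
# to a DataNode (no file data is sent on this request)
CREATE_URL="http://${NAMENODE}:${PORT}/webhdfs/v1${HDFS_PATH}?op=CREATE&overwrite=true"

# Step 2: follow the redirect (-L) and upload the file body (-T)
# curl -i -X PUT -L -T "${LOCAL_FILE}" "${CREATE_URL}"
echo "${CREATE_URL}"
```

Once the file is on HDFS, the Spark job started through Livy can reference it by its `hdfs://` path.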
> On 25 Jul 2017, at 9:39 AM, Saisai Shao wrote:
>
> I think you have to make this csv file accessible from the Spark cluster;
> putting it on HDFS is one possible solution.
On Tue, Jul 25, 2017 at 1:26 AM, Joaquín Silva wrote:
> Hello,
>
> I'm building a Bash program (using curl) that should run Spark code
> remotely using
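For context, submitting a Spark batch from Bash through Livy's REST API typically looks like the sketch below. The Livy host, jar path, class name, and CSV argument are hypothetical; the request shape follows Livy's `POST /batches` endpoint, with the input file path passed through `args`.

```shell
# Placeholder Livy endpoint
LIVY_URL="http://livy.example.com:8998"

# Batch payload: the application jar, its main class, and the HDFS path
# of the CSV passed as a program argument
PAYLOAD='{"file": "hdfs:///user/joaquin/job.jar", "className": "com.example.Main", "args": ["hdfs:///user/joaquin/data.csv"]}'

# POST to /batches starts the job; Livy returns a JSON batch descriptor
# curl -s -X POST -H "Content-Type: application/json" -d "${PAYLOAD}" "${LIVY_URL}/batches"
echo "${PAYLOAD}"
```

This pairs with the WebHDFS upload: copy the CSV to HDFS first, then reference its `hdfs://` path in the `args` array here.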