Hello everyone,
I tried to run the example data-manipulation.R, but I can't get it to read
the flights.csv file that is stored on my local filesystem. I don't want to
store big files in HDFS, so reading from the local filesystem (a Lustre FS)
is the desired behavior for me.
I tried the following (path as given later in the thread):

flightsDF <- read.df(sqlContext, "/home/myuser/test_data/sparkR/flights.csv",
                     source = "com.databricks.spark.csv", header = "true")
> From: Boyu Zhang <boyuzhan...@gmail.com>
> Sent: Tuesday, December 8, 2015 8:47 AM
> Subject: SparkR read.df failed to read file from local directory
> To: <user@spark.apache.org>
To: <user@spark.apache.org>
Subject: Re: SparkR read.df failed to read file from local directory
Thanks for the comment, Felix. I tried giving
"/home/myuser/test_data/sparkR/flights.csv", but it still tried to resolve the
path in HDFS, and gave errors:
15/12/08 12:47:10 ERROR r.RBackendHandl
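For anyone who finds this thread later: when Spark is configured with HDFS as its default filesystem, a bare path like /home/... is resolved against HDFS, not the local disk. A common fix is to pass an explicit file:// URI so the path is resolved on the local filesystem. A sketch, assuming the SparkR 1.x read.df API and the spark-csv package used earlier in this thread:

```r
# Assumes sqlContext already exists and the spark-csv package is on the
# classpath, as in the earlier snippet. The file:// scheme forces Spark to
# read from the local filesystem instead of the configured default (HDFS).
flightsDF <- read.df(sqlContext,
                     "file:///home/myuser/test_data/sparkR/flights.csv",
                     source = "com.databricks.spark.csv",
                     header = "true")
```

Note that when running on a cluster, every executor must be able to see that path; a shared Lustre mount like the one described above satisfies that requirement.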