Hi Dan,
If you use Spark <= 1.6, you can also do
$SPARK_HOME/bin/spark-shell --packages com.databricks:spark-csv_2.10:1.5.0
to quickly link the spark-csv jars into the Spark shell. Otherwise, as Holden
suggested, you can link it in your Maven/sbt dependencies. The Spark folks assume
that their users have a good working knowledge of those build tools.
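If you go the build-tool route, a minimal sbt sketch looks like this (the coordinates are taken from the --packages string above; the Maven equivalent uses the same groupId, artifactId, and version in a <dependency> element):

```
// build.sbt -- minimal sketch, assuming Scala 2.10 to match spark-csv_2.10
libraryDependencies += "com.databricks" % "spark-csv_2.10" % "1.5.0"
```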
So the good news is that the CSV library has been integrated into Spark 2.0, so
there you don't need that package at all. On the other hand, if you're on an
older version, you can include it using the standard sbt or Maven package
configuration.
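To make the difference concrete, here is a sketch of both styles (assumptions: a local SparkSession/SQLContext and a hypothetical file "people.csv"; this is illustrative, not a drop-in script):

```scala
import org.apache.spark.sql.SparkSession

// Spark 2.0+: CSV support is built in, no extra package needed.
val spark = SparkSession.builder()
  .appName("csv-example")
  .master("local[*]")
  .getOrCreate()

val df = spark.read
  .option("header", "true")   // treat the first line as column names
  .csv("people.csv")          // hypothetical path

// Spark 1.x with the spark-csv package instead:
// val df = sqlContext.read
//   .format("com.databricks.spark.csv")
//   .option("header", "true")
//   .load("people.csv")
```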
On Friday, September 23, 2016, Dan Bikle wrote:
hello world-of-spark,
I am learning spark today.
I want to understand the spark code in this repo:
https://github.com/databricks/spark-csv
In the README.md I see this info:
Linking
You can link against this library in your program at the following
coordinates:
Scala 2.10
groupId: com.databricks
artifactId: spark-csv_2.10
version: 1.5.0