On Fri, Jul 29, 2016 at 1:13 PM, Nicholas Chammas
<nicholas.cham...@gmail.com> wrote:
> The Hadoop jars packaged with Spark just allow Spark to interact with
> Hadoop, or allow it to use the Hadoop API for interacting with systems
> like S3, right? If you want HDFS, MapReduce, etc. you're obviously not
> getting that in those Spark packages.
Correct. They're the Hadoop client jars needed to talk to Hadoop
services (and also because Spark exposes some Hadoop APIs in its own
public API).

-- 
Marcelo
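To illustrate that second point, here is a minimal sketch of how Hadoop
types surface directly in Spark's public Scala API (the input path and
the s3a setting below are just placeholders, not anything from the
thread):

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.io.{LongWritable, Text}
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat
import org.apache.spark.{SparkConf, SparkContext}

object HadoopTypesInSparkApi {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("hadoop-api-demo").setMaster("local[*]"))

    // SparkContext.hadoopConfiguration returns a Hadoop Configuration
    // object directly, so the Hadoop client jars must be on the
    // classpath even when no Hadoop cluster is involved.
    val conf: Configuration = sc.hadoopConfiguration
    conf.set("fs.s3a.connection.maximum", "100") // placeholder s3a knob

    // newAPIHadoopFile takes Hadoop InputFormat and Writable classes as
    // arguments: more Hadoop types in Spark's own method signatures.
    val records = sc.newAPIHadoopFile(
      "input.txt", // placeholder path
      classOf[TextInputFormat],
      classOf[LongWritable],
      classOf[Text])
    println(records.map(_._2.toString).count())

    sc.stop()
  }
}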