Yes, that somehow seems logical. But where/how do I pass the InputFormat definition (.jar/.java/.class) to Spark? I mean, with Hadoop I need to call something like 'hadoop jar <myInputFormat.jar> -inFormat <myFormat> other stuff' to register the file format definition file.
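For reference, the Spark equivalent is to ship the jar with the application rather than via a 'hadoop jar' invocation. A rough sketch (the jar name, class names, and path below are placeholders, not from this thread): pass the jar with the '--jars' option of spark-submit or pyspark, then reference the InputFormat class by its fully qualified name from Python via SparkContext.newAPIHadoopFile:

```
# Ship the InputFormat jar to the driver and executors (jar name is hypothetical)
pyspark --jars myInputFormat.jar

# Then, inside the PySpark shell, reference the class by name:
#   rdd = sc.newAPIHadoopFile(
#       "hdfs:///path/to/input",                        # hypothetical path
#       inputFormatClass="com.example.MyInputFormat",   # hypothetical class
#       keyClass="org.apache.hadoop.io.Text",
#       valueClass="org.apache.hadoop.io.Text")
```

This avoids any separate registration step: Spark loads the class from the jar placed on the classpath by '--jars'.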
-- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Using-Hadoop-InputFormat-in-Python-tp12067p12069.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.