The version of commons-io included in the Spark assembly is an old one, which
doesn't have the closeQuietly overload that takes a java.io.Closeable:
$ javap -cp /root/spark/assembly/target/scala-2.9.3/spark-assembly_2.9.3-0.8.1-incubating-hadoop2.0.0-mr1-cdh4.2.0.jar \
    org.apache.commons.io.IOUtils
Compiled from "IOUtils.java"
public class org.apache.commons.io.IOUtils {
...
public org.apache.commons.io.IOUtils();
public static void closeQuietly(java.io.Reader);
public static void closeQuietly(java.io.Writer);
public static void closeQuietly(java.io.InputStream);
public static void closeQuietly(java.io.OutputStream);
public static byte[] toByteArray(java.io.InputStream) throws java.io.IOException;
...
It looks to me like org.apache.hadoop.hdfs.DFSInputStream depends on
commons-io 2.4, while Spark 0.8.1 depends on commons-io 2.1.
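In the meantime, the missing overload is easy to reproduce locally and avoids depending on commons-io 2.4 at all. A minimal sketch (the class name QuietCloser is my own, not part of commons-io or Spark):

```java
import java.io.Closeable;
import java.io.IOException;
import java.io.StringReader;

// Hypothetical local stand-in for IOUtils.closeQuietly(Closeable),
// for code you control that hits the NoSuchMethodError.
public class QuietCloser {
    public static void closeQuietly(Closeable c) {
        if (c == null) {
            return; // match IOUtils behavior: null is a no-op
        }
        try {
            c.close();
        } catch (IOException ignored) {
            // swallow the exception, as closeQuietly does
        }
    }

    public static void main(String[] args) {
        closeQuietly(null);                     // no NullPointerException
        closeQuietly(new StringReader("test")); // closes without throwing
        System.out.println("closed quietly");
    }
}
```

This only helps for your own call sites, of course; it can't patch the call inside DFSInputStream.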
Luckily, Spark 0.9 depends on commons-io 2.4, so the next release should fix
this issue.
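Until then, one possible workaround, assuming a Maven-built application, is to declare commons-io 2.4 explicitly in your own pom so the newer version is on your classpath (whether it actually wins over the copy baked into the Spark assembly jar depends on classpath ordering, so this is only a sketch, not a guaranteed fix):

```xml
<!-- hypothetical pom fragment: pull in commons-io 2.4, which has
     closeQuietly(java.io.Closeable) -->
<dependency>
  <groupId>commons-io</groupId>
  <artifactId>commons-io</artifactId>
  <version>2.4</version>
</dependency>
```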
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-org-apache-commons-io-IOUtils-closeQuietly-with-cdh4-binary-tp204p971.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.