This isn't a Spark 0.8.0-specific problem. I googled for "sbt error file name too long" and found a couple of links suggesting that this error can crop up for Linux users with encrypted filesystems or home directories:
http://stackoverflow.com/questions/8404815/how-do-i-build-a-project-that-uses-sbt-as-its-build-system
https://github.com/sbt/sbt-assembly/issues/69

Try one of the workarounds from those links, such as storing the build target directory on an unencrypted volume.

On Fri, Dec 6, 2013 at 11:25 AM, Gustavo Enrique Salazar Torres <
[email protected]> wrote:

> Hi there:
>
> I've been trying to compile using sbt/sbt assembly and mvn clean package (with
> memory adjustments as suggested here
> http://spark.incubator.apache.org/docs/latest/building-with-maven.html).
> Unfortunately, compiling fails for both of them with the following error
> (here it is with Maven,
> but with SBT the error happens at the same class):
>
> [INFO] Using incremental compilation
> [INFO] 'compiler-interface' not yet compiled for Scala 2.9.3. Compiling...
> [INFO] Compilation completed in 17.554 s
> [INFO] Compiling 258 Scala sources and 16 Java sources to
> /home/gustavo/tools/spark-0.8.0-incubating/core/target/scala-2.9.3/classes...
> [WARNING]
> /home/gustavo/tools/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/SparkHadoopWriter.scala:129:
> method cleanupJob in class OutputCommitter is deprecated: see corresponding
> Javadoc for more information.
> [WARNING] getOutputCommitter().cleanupJob(getJobContext())
> [WARNING] ^
> [WARNING]
> /home/gustavo/tools/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/rdd/PairRDDFunctions.scala:592:
> method cleanupJob in class OutputCommitter is deprecated: see corresponding
> Javadoc for more information.
> [WARNING] jobCommitter.cleanupJob(jobTaskContext)
> [WARNING] ^
> [ERROR] File name too long
> [WARNING] two warnings found
> [ERROR] one error found
>
> Is 0.8.0 ready for production? Is 0.7.0 more stable?
> I'm running on Java 6.
>
> Cheers
> Gustavo
>
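One way to apply the "unencrypted volume" workaround without reconfiguring the build is to replace the project's target directory with a symlink pointing somewhere outside the encrypted home directory. This is just a sketch; the paths (/tmp is assumed to be unencrypted here) and directory names are illustrative:

```shell
# Sketch of the workaround: keep build output off the encrypted volume.
# Run from the Spark source root; paths below are assumptions.
BUILD_DIR=/tmp/spark-build-target   # hypothetical unencrypted location
mkdir -p "$BUILD_DIR"

rm -rf target                       # drop any existing build output
ln -s "$BUILD_DIR" target           # sbt/Maven now write class files to /tmp
```

Class files with very long mangled Scala names then land on a filesystem without eCryptfs-style filename length limits, which is what trips the "File name too long" error.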
