Re: Spark 1.0: slf4j version conflicts with pig

2014-05-28 Thread Ryan Compton
Remark: just including the jar built by sbt will produce the same error, i.e. this pig script will fail: REGISTER /usr/share/osi1/spark-1.0.0/assembly/target/scala-2.10/spark-assembly-1.0.0-SNAPSHOT-hadoop0.20.2-cdh3u4.jar; edgeList0 = LOAD
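(As a sketch only: a minimal self-contained script of that shape might look like the lines below. The input path, schema, and relation name are made up for illustration; the DUMP is there just to force Pig to launch a map-reduce job, which is where the conflict surfaces.)

    REGISTER /usr/share/osi1/spark-1.0.0/assembly/target/scala-2.10/spark-assembly-1.0.0-SNAPSHOT-hadoop0.20.2-cdh3u4.jar;
    -- hypothetical input and schema, just enough to trigger job launch
    edgeList0 = LOAD 'input/edges.tsv' USING PigStorage('\t') AS (src:chararray, dst:chararray);
    DUMP edgeList0;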

Re: Spark 1.0: slf4j version conflicts with pig

2014-05-28 Thread Ryan Compton
Posted a JIRA: https://issues.apache.org/jira/browse/SPARK-1952 On Wed, May 28, 2014 at 1:14 PM, Ryan Compton compton.r...@gmail.com wrote: Remark: just including the jar built by sbt will produce the same error, i.e. this pig script will fail: REGISTER

Spark 1.0: slf4j version conflicts with pig

2014-05-27 Thread Ryan Compton
I use both Pig and Spark. All my code is built with Maven into a single giant *-jar-with-dependencies.jar. I recently upgraded to Spark 1.0, and now all my Pig scripts fail with: Caused by: java.lang.RuntimeException: Could not resolve error that occured when launching map reduce job:
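(One quick, non-authoritative way to confirm the fat jar is what drags in a conflicting slf4j; the artifact name and cluster paths below are hypothetical, assuming a standard Maven layout and a CDH-style install:)

    # list the slf4j classes bundled into the jar-with-dependencies
    unzip -l target/myjob-1.0-jar-with-dependencies.jar | grep -i slf4j
    # compare against the slf4j jars the cluster's Pig/Hadoop install ships
    ls /usr/lib/pig /usr/lib/hadoop/lib | grep -i slf4j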

Re: Spark 1.0: slf4j version conflicts with pig

2014-05-27 Thread Sean Owen
Spark uses slf4j 1.7.5, and you should probably see 1.7.{4,5} in use through Hadoop. But those are compatible. That method appears to have been around since 1.3. What version does Pig want? I usually do mvn -Dverbose dependency:tree to see both what the final dependencies are, and what got
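(Concretely, that check might look like the following; the grep filter is just one convenient way to narrow the output to the slf4j entries:)

    # -Dverbose also prints versions Maven omitted due to conflicts, so you
    # can see which slf4j version wins resolution and what pulled it in
    mvn -Dverbose dependency:tree | grep -i -B2 slf4j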