CCing the bigtop dev list in case... this might affect our Spark 1.2 packaging when we bump versions.
RJ, maybe keep us posted once you guys get to the bottom of this...

> On Jan 14, 2015, at 2:59 PM, RJ Nowling <[email protected]> wrote:
>
> Thanks, Sean.
>
> Yes, Spark is incorrectly copying the spark assembly jar to
> com/google/guava in the maven repository. This is for the 1.2.0 release,
> just to clarify.
>
> I reverted the patches that shade Guava and removed the parts disabling
> the install plugin, and that seemed to fix the issue.
>
> It seems that the Spark poms are inheriting something from Guava.
>
> RJ
>
>> On Wed, Jan 14, 2015 at 2:33 PM, Sean Owen <[email protected]> wrote:
>>
>> Guava is shaded, although one class is left in its original package.
>> This shouldn't have anything to do with Spark's package or namespace
>> though. What are you saying is in com/google/guava?
>>
>> You can un-skip the install plugin with -Dmaven.install.skip=false
>>
>>> On Wed, Jan 14, 2015 at 7:26 PM, RJ Nowling <[email protected]> wrote:
>>>
>>> Hi all,
>>>
>>> I'm trying to upgrade some Spark RPMs from 1.1.0 to 1.2.0. As part of
>>> the RPM process, we build Spark with Maven. With Spark 1.2.0, though,
>>> the artifacts are placed in com/google/guava and there is no
>>> org/apache/spark.
>>>
>>> I saw that the pom.xml files had been modified to prevent the install
>>> command and that the guava dependency was modified. Could someone who
>>> is more familiar with the Spark maven files comment on what might be
>>> causing this oddity?
>>>
>>> Thanks,
>>> RJ
>>>
>>> We build Spark like so:
>>>
>>>     $ mvn -Phadoop-2.4 -Dmesos.version=0.20.0 -DskipTests clean package
>>>
>>> Then build a local Maven repo:
>>>
>>>     mvn -Phadoop-2.4 \
>>>         -Dmesos.version=0.20.0 \
>>>         -DskipTests install:install-file \
>>>         -Dfile=assembly/target/scala-2.10/spark-assembly-1.2.0-hadoop2.4.0.jar \
>>>         -DcreateChecksum=true \
>>>         -DgeneratePom=true \
>>>         -DartifactId=spark-assembly_2.1.0 \
>>>         -DlocalRepositoryPath=../maven2
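A quick way to test RJ's theory that the poms are inheriting coordinates from Guava is to ask Maven what groupId the modules actually resolve to. This is a diagnostic sketch, not part of the thread above; run it from the Spark source root:

    # Print the resolved groupId of the root project and of the assembly
    # module. If pom inheritance is broken by the shading patches, this
    # may print com.google.guava instead of org.apache.spark.
    mvn help:evaluate -Dexpression=project.groupId
    mvn -pl assembly help:evaluate -Dexpression=project.groupId

    # For the full picture, dump the effective pom of the assembly module:
    mvn -pl assembly help:effective-pom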
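The flag Sean mentions can be added to a normal build to install every module's artifacts into the local repository, instead of hand-installing the assembly jar afterwards. A sketch, reusing the same profile and flags as RJ's build (whether this alone yields correct coordinates depends on the pom issue he describes):

    mvn -Phadoop-2.4 -Dmesos.version=0.20.0 -DskipTests \
        -Dmaven.install.skip=false \
        clean install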
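Also, the install:install-file command quoted above omits the full coordinates, so Maven will fall back to whatever pom metadata it can find, which may be where the com/google/guava path comes from. A sketch with explicit coordinates, assuming groupId org.apache.spark and version 1.2.0 from the jar name, and assuming the quoted artifactId spark-assembly_2.1.0 was meant to be spark-assembly_2.10 (the Scala 2.10 suffix):

    mvn install:install-file \
        -Dfile=assembly/target/scala-2.10/spark-assembly-1.2.0-hadoop2.4.0.jar \
        -DgroupId=org.apache.spark \
        -DartifactId=spark-assembly_2.10 \
        -Dversion=1.2.0 \
        -Dpackaging=jar \
        -DgeneratePom=true \
        -DcreateChecksum=true \
        -DlocalRepositoryPath=../maven2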
