Thanks, Marcelo!

I'll look into "install" vs "install-file".
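
My rough understanding of the difference so far (please correct me if
this is off):

    # "install" builds each module and installs it under the
    # coordinates from its own pom.xml
    mvn install

    # "install-file" installs a single file; coordinates come from the
    # flags, or, if not given, from a POM found inside the jar itself
    # (some.jar is just a placeholder here)
    mvn install:install-file -Dfile=some.jar ...

If that's right, I can see how shaded Guava metadata inside the
assembly jar could end up supplying the wrong coordinates.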

What is the difference between pom and jar packaging?

One of the challenges is that I have to satisfy Fedora / Red Hat packaging
guidelines, which makes life a little more interesting. :)  (e.g., RPMs
should resolve against other RPMs instead of external repositories.)

On Wed, Jan 14, 2015 at 4:16 PM, Marcelo Vanzin <van...@cloudera.com> wrote:

> Hi RJ,
>
> I think I remember noticing in the past that some Guava metadata ends
> up overwriting maven-generated metadata in the assembly's manifest.
> That's probably something we should fix if that still affects the
> build.
>
> That being said, this is probably happening because you're using
> "install-file" instead of "install". If you want a workaround that
> doesn't require unshading things, you can change assembly.pom to (i)
> not skip the install plugin and (ii) have "jar" as the packaging,
> instead of pom.
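>
> A rough sketch of what I mean (element placement is from memory, so
> double-check against the actual assembly/pom.xml):
>
>   <packaging>jar</packaging>  <!-- instead of pom -->
>   ...
>   <plugin>
>     <groupId>org.apache.maven.plugins</groupId>
>     <artifactId>maven-install-plugin</artifactId>
>     <configuration>
>       <skip>false</skip>  <!-- don't skip installing the assembly -->
>     </configuration>
>   </plugin>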
>
>
>
> On Wed, Jan 14, 2015 at 1:08 PM, RJ Nowling <rnowl...@gmail.com> wrote:
> > Hi Sean,
> >
> > I confirmed that if I take the Spark 1.2.0 release (a428c446), undo the
> > guava PR [1], and use -Dmaven.install.skip=false with the workflow above,
> > the problem is fixed.
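> >
> > Roughly what I did (commands from memory; the revert hash is the
> > guava commit from [1]):
> >
> >     git checkout a428c446
> >     git revert c9f74395
> >     mvn -Phadoop-2.4 -Dmesos.version=0.20.0 -DskipTests \
> >         -Dmaven.install.skip=false clean package
> >     # then the install:install-file workflow as before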
> >
> > RJ
> >
> >
> > [1]
> > https://github.com/apache/spark/commit/c9f743957fa963bc1dbed7a44a346ffce1a45cf2#diff-6382f8428b13fa6082fa688178f3dbcc
> >
> > On Wed, Jan 14, 2015 at 2:59 PM, RJ Nowling <rnowl...@gmail.com> wrote:
> >
> >> Thanks, Sean.
> >>
> >> Yes, Spark is incorrectly copying the spark assembly jar to
> >> com/google/guava in the maven repository.  This is for the 1.2.0
> >> release, just to clarify.
> >>
> >> I reverted the patches that shade Guava and removed the parts disabling
> >> the install plugin and it seemed to fix the issue.
> >>
> >> It seems that Spark poms are inheriting something from Guava.
> >>
> >> RJ
> >>
> >> On Wed, Jan 14, 2015 at 2:33 PM, Sean Owen <so...@cloudera.com> wrote:
> >>
> >>> Guava is shaded, although one class is left in its original package.
> >>> This shouldn't have anything to do with Spark's package or namespace
> >>> though. What are you saying is in com/google/guava?
> >>>
> >>> You can un-skip the install plugin with -Dmaven.install.skip=false
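> >>>
> >>> For example (profiles and flags as in your build, plus the un-skip
> >>> property):
> >>>
> >>>     mvn -Phadoop-2.4 -Dmesos.version=0.20.0 -DskipTests \
> >>>         -Dmaven.install.skip=false install
> >>>
> >>> That installs each Spark module under its own org.apache.spark
> >>> coordinates instead of going through install:install-file.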
> >>>
> >>> On Wed, Jan 14, 2015 at 7:26 PM, RJ Nowling <rnowl...@gmail.com> wrote:
> >>> > Hi all,
> >>> >
> >>> > I'm trying to upgrade some Spark RPMs from 1.1.0 to 1.2.0.  As part
> >>> > of the RPM process, we build Spark with Maven.  With Spark 1.2.0,
> >>> > though, the artifacts are placed in com/google/guava and there is no
> >>> > org/apache/spark.
> >>> >
> >>> > I saw that the pom.xml files had been modified to prevent the install
> >>> > command and that the guava dependency was modified.  Could someone
> >>> > who is more familiar with the Spark maven files comment on what might
> >>> > be causing this oddity?
> >>> >
> >>> > Thanks,
> >>> > RJ
> >>> >
> >>> > We build Spark like so:
> >>> > $ mvn -Phadoop-2.4 -Dmesos.version=0.20.0 -DskipTests clean package
> >>> >
> >>> > Then build a local Maven repo:
> >>> >
> >>> > mvn -Phadoop-2.4 \
> >>> >     -Dmesos.version=0.20.0 \
> >>> >     -DskipTests install:install-file \
> >>> >     -Dfile=assembly/target/scala-2.10/spark-assembly-1.2.0-hadoop2.4.0.jar \
> >>> >     -DcreateChecksum=true \
> >>> >     -DgeneratePom=true \
> >>> >     -DartifactId=spark-assembly_2.10 \
> >>> >     -DlocalRepositoryPath=../maven2
> >>>
> >>
> >>
>
>
>
> --
> Marcelo
>
