Guava is shaded in the Spark build, although one class
(com.google.common.base.Optional, which is part of Spark's public Java
API) is left in its original package. That shouldn't have anything to do
with Spark's own package or namespace, though. What are you saying is in
com/google/guava?
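
You can verify what is left unshaded by listing the assembly jar's
contents; if the shading worked, only Optional (and its inner classes)
should remain under com/google/common:

$ jar tf assembly/target/scala-2.10/spark-assembly-1.2.0-hadoop2.4.0.jar \
    | grep '^com/google/common'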

You can un-skip the install plugin with -Dmaven.install.skip=false
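
For example (untested, but reusing the flags from your package build):

$ mvn -Phadoop-2.4 -Dmesos.version=0.20.0 -DskipTests \
    -Dmaven.install.skip=false clean install

That should install the org/apache/spark artifacts into your local
repository; add -Dmaven.repo.local=../maven2 if you want them under that
directory instead (note that downloaded dependencies will land there too).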

On Wed, Jan 14, 2015 at 7:26 PM, RJ Nowling <rnowl...@gmail.com> wrote:
> Hi all,
>
> I'm trying to upgrade some Spark RPMs from 1.1.0 to 1.2.0.  As part of the
> RPM process, we build Spark with Maven. With Spark 1.2.0, though, the
> artifacts are placed in com/google/guava and there is no org/apache/spark.
>
> I saw that the pom.xml files had been modified to skip the install
> step and that the Guava dependency was modified.  Could someone who is
> more familiar with Spark's Maven build comment on what might be causing
> this oddity?
>
> Thanks,
> RJ
>
> We build Spark like so:
> $ mvn -Phadoop-2.4 -Dmesos.version=0.20.0 -DskipTests clean package
>
> Then build a local Maven repo:
>
> mvn -Phadoop-2.4 \
>     -Dmesos.version=0.20.0 \
>     -DskipTests install:install-file  \
>     -Dfile=assembly/target/scala-2.10/spark-assembly-1.2.0-hadoop2.4.0.jar \
>     -DcreateChecksum=true \
>     -DgeneratePom=true \
>     -DartifactId=spark-assembly_2.1.0 \
>     -DlocalRepositoryPath=../maven2
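
Incidentally, that install-file invocation looks incomplete. Without
-DgroupId and -Dversion, recent versions of install:install-file will try
to take the coordinates from a pom.properties embedded in the jar, and the
first one found inside a shaded assembly jar may well be Guava's -- which
would explain artifacts landing under com/google/guava. The artifactId
also looks like a typo (spark-assembly_2.1.0 vs. spark-assembly_2.10).
Something like this (untested) should be closer:

$ mvn install:install-file \
    -Dfile=assembly/target/scala-2.10/spark-assembly-1.2.0-hadoop2.4.0.jar \
    -DgroupId=org.apache.spark \
    -DartifactId=spark-assembly_2.10 \
    -Dversion=1.2.0 \
    -Dpackaging=jar \
    -DcreateChecksum=true \
    -DgeneratePom=true \
    -DlocalRepositoryPath=../maven2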

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org
