Hi,

Maven uses its own local repository, and so does sbt. To cross the
repository boundary, add the following:

resolvers += Resolver.mavenLocal

in your build.sbt or any other build definition as described in
http://www.scala-sbt.org/0.13/tutorial/Library-Dependencies.html#Resolvers.
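For reference, a minimal build.sbt along those lines might look as
follows (the organization, artifact name, version and Scala version are
placeholders -- substitute whatever coordinates your mvn install
published):

// build.sbt -- minimal sketch with placeholder coordinates
name := "my-spark-app"

scalaVersion := "2.10.6"

// let sbt resolve artifacts from the local Maven repository (~/.m2/repository)
resolvers += Resolver.mavenLocal

// plain Java artifact published locally with mvn install (placeholder coordinates)
libraryDependencies += "com.example" % "my-custom-module" % "1.0.0-SNAPSHOT"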

You've already done that, so let's try the other options.

Can you show the exact location of the jar you want your Spark app to
depend on (using `ls`) and how you defined the dependency in
build.sbt?
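When you paste it, double-check the operator between the organization
and the artifact name: for a plain Java artifact installed with mvn
install, the declaration typically uses a single %, e.g. (placeholder
coordinates):

libraryDependencies += "com.example" % "my-lib" % "1.0.0"

whereas %% appends the Scala binary version to the artifact name and
won't match a jar that was installed without that suffix.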

Pozdrawiam,
Jacek

--
Jacek Laskowski | https://medium.com/@jaceklaskowski/ |
http://blog.jaceklaskowski.pl
Mastering Spark https://jaceklaskowski.gitbooks.io/mastering-apache-spark/
Follow me at https://twitter.com/jaceklaskowski
Upvote at http://stackoverflow.com/users/1305344/jacek-laskowski


On Fri, Nov 27, 2015 at 9:03 AM, lihu <lihu...@gmail.com> wrote:
> Hi, All:
>
>      I modified the Spark code and am trying to use some extra jars in
> Spark; the extra jars are published to my local Maven repository using
> mvn install.
>      However, sbt cannot find these jar files, even though I can find
> them under /home/myname/.m2/repository.
>     I can guarantee that the local m2 repository is added to the
> resolvers, because I get the following resolvers using the show
> resolvers command.
>
>
> List(central: https://repo1.maven.org/maven2, apache-repo:
> https://repository.apache.org/content/repositories/releases, jboss-repo:
> https://repository.jboss.org/nexus/content/repositories/releases, mqtt-repo:
> https://repo.eclipse.org/content/repositories/paho-releases, cloudera-repo:
> https://repository.cloudera.com/artifactory/cloudera-repos,
> spark-hive-staging:
> https://oss.sonatype.org/content/repositories/orgspark-project-1113,
> mapr-repo: http://repository.mapr.com/maven/, spring-releases:
> https://repo.spring.io/libs-release, twttr-repo: http://maven.twttr.com,
> apache.snapshots: http://repository.apache.org/snapshots, cache:Maven2
> Local: /home/myname/.m2/repository)
>
>
>     Does anyone know how to deal with this? In fact, a few days ago
> this worked, but after updating my custom jar file and installing it
> again recently, it no longer works.
>
>     Environment: Spark 1.5, sbt 0.13.7/0.13.9

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
