On Mon, Nov 30, 2015 at 1:53 PM, Josh Rosen <joshro...@databricks.com>
wrote:

> The JDBC drivers are currently being pulled in as test-scope dependencies
> of the `sql/core` module:
> https://github.com/apache/spark/blob/f2fbfa444f6e8d27953ec2d1c0b3abd603c963f9/sql/core/pom.xml#L91
>
> In SBT, these wind up on the Docker JDBC tests' classpath as a transitive
> dependency of the `spark-sql` test JAR. However, what we *should* be
> doing is adding them as explicit test dependencies of the
> `docker-integration-tests` subproject, since Maven handles transitive test
> JAR dependencies differently than SBT (see
> https://github.com/apache/spark/pull/9876#issuecomment-158593498 for some
> discussion). If you choose to make that fix as part of your PR, be sure to
> move the version handling to the root POM's <dependencyManagement> section
> so that the versions in both modules stay in sync. We might also be able to
> just simply move the JDBC driver dependencies to docker-integration-tests'
> POM if it turns out that they're not used anywhere else (that's my hunch).
>
>
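For reference, the root-POM change Josh describes could look roughly like the
sketch below; the coordinates and versions here are illustrative placeholders,
not the exact ones Spark currently pins:

```xml
<!-- Sketch only: manage JDBC driver versions once in the root POM so that
     sql/core and docker-integration-tests stay in sync. Versions and
     coordinates below are placeholders. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>mysql</groupId>
      <artifactId>mysql-connector-java</artifactId>
      <version>5.1.38</version>
    </dependency>
    <dependency>
      <groupId>org.postgresql</groupId>
      <artifactId>postgresql</artifactId>
      <version>9.4-1201-jdbc41</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

Each module would then declare the dependency with `<scope>test</scope>` and no
version, picking the version up from the root POM.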

So, the issue I am having now is that the DB2 JDBC driver is not available in
any public Maven repository, so the plan I am going with is:

- Before running the DB2 Docker tests, the client machine needs to download
the JDBC driver and install it into its local Maven repository (or the sbt
equivalent) (instructions to be provided in either a README or the POM file)

- We would need help installing the DB2 JDBC driver on the Jenkins slave
machines

- We could also create a new profile for the DB2 Docker tests, so that these
tests run only when the profile is enabled.
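And for the third step, a hypothetical profile in the docker-integration-tests
POM might look like this (the profile id, property name, and driver coordinates
are all placeholders):

```xml
<!-- Hypothetical profile: pulls in the locally installed DB2 driver and is
     only active when explicitly enabled, e.g. via -Pdb2-docker-tests. -->
<profile>
  <id>db2-docker-tests</id>
  <dependencies>
    <dependency>
      <groupId>com.ibm.db2</groupId>
      <artifactId>db2jcc4</artifactId>
      <version>10.5</version>
      <scope>test</scope>
    </dependency>
  </dependencies>
</profile>
```

Machines without the driver installed (including Jenkins slaves, until they are
set up) would simply build without the profile and skip the DB2 tests.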
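For the first step, the local install could be a one-liner along these lines
(the jar path, groupId, and version are placeholders to be filled in from the
driver IBM ships):

```shell
# One-time setup: install the manually downloaded DB2 driver jar into the
# local Maven repository. Path, groupId, and version are placeholders.
mvn install:install-file \
  -Dfile=/path/to/db2jcc4.jar \
  -DgroupId=com.ibm.db2 \
  -DartifactId=db2jcc4 \
  -Dversion=10.5 \
  -Dpackaging=jar
```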

I could probably think of other options, but they would all be a lot hackier.

Thoughts? Any suggestions?

-- 
Luciano Resende
http://people.apache.org/~lresende
http://twitter.com/lresende1975
http://lresende.blogspot.com/
