Hey Josh,

Thanks for helping bring this up. I have just pushed a WIP PR to get the
DB2 tests running on Docker, and I have a question about how the JDBC
drivers are actually set up for the other datasources (MySQL and
PostgreSQL): are these set up directly on the Jenkins slaves? I didn't see
the jars or anything specific in the pom or other files...


Thanks

On Wed, Oct 21, 2015 at 1:26 PM, Josh Rosen <rosenvi...@gmail.com> wrote:

> Hey Luciano,
>
> This sounds like a reasonable plan to me. One of my colleagues has written
> some Dockerized MySQL testing utilities, so I'll take a peek at those to
> see if there are any specifics of their solution that we should adapt for
> Spark.
>
> On Wed, Oct 21, 2015 at 1:16 PM, Luciano Resende <luckbr1...@gmail.com>
> wrote:
>
>> I have started looking into PR-8101 [1] and what is required to merge it
>> into trunk, which will also unblock me on SPARK-10521 [2].
>>
>> So here is the minimal plan I was thinking about:
>>
>> - pin the Docker image versions so we make sure we are using the
>> same images all the time
>> - pull the required images on the Jenkins executors so tests are not
>> delayed/timed out while waiting for Docker images to download
>> - create a profile to run the JDBC tests
>> - create daily jobs for running the JDBC tests
>>
>>
>> In parallel, I learned that Alan Chin from my team is working with the
>> AmpLab team to expand the build capacity for Spark, so I will use some of
>> the nodes he is preparing to test/run these builds for now.
>>
>> Please let me know if there is anything else needed around this.
>>
>>
>> [1] https://github.com/apache/spark/pull/8101
>> [2] https://issues.apache.org/jira/browse/SPARK-10521
>>
>> --
>> Luciano Resende
>> http://people.apache.org/~lresende
>> http://twitter.com/lresende1975
>> http://lresende.blogspot.com/
>>
>
>


-- 
Luciano Resende
http://people.apache.org/~lresende
http://twitter.com/lresende1975
http://lresende.blogspot.com/