Re: Spark build failure with com.oracle:ojdbc6:jar:11.2.0.1.0

2016-05-09 Thread Luciano Resende
The manual download is no longer required on the latest trunk code.

On Monday, May 9, 2016, Andrew Lee <alee...@hotmail.com> wrote:

> In fact, it does require the ojdbc JAR from Oracle, which also requires a
> username and password to download. This was added as part of the testing
> scope for Oracle's Docker image.
>
>
> I notice this PR and commit in branch-2.0 according to
> https://issues.apache.org/jira/browse/SPARK-12941.
>
> In the comment, I'm not sure what it means by installing the JAR
> locally during the Spark QA test run. If this is the case,
>
> it means someone downloaded the JAR from Oracle and manually added it to
> the local build machine that builds Spark branch-2.0, or to an internal
> Maven repository that serves this ojdbc JAR.
>
>
> 
>
> commit 8afe49141d9b6a603eb3907f32dce802a3d05172
>
> Author: thomastechs <thomas.sebast...@tcs.com>
>
> Date:   Thu Feb 25 22:52:25 2016 -0800
>
>
> [SPARK-12941][SQL][MASTER] Spark-SQL JDBC Oracle dialect fails to map
> string datatypes to Oracle VARCHAR datatype
>
>
>
> ## What changes were proposed in this pull request?
>
>
>
> This pull request fixes SPARK-12941 by creating a data type mapping to
> Oracle for the corresponding data type "StringType" from DataFrame. This
> PR is for the master branch fix, whereas another PR has already been
> tested against branch 1.4.
>
>
>
> ## How was this patch tested?
>
>
>
> (Please explain how this patch was tested. E.g. unit tests,
> integration tests, manual tests)
>
> This patch was tested using the Oracle Docker image. A new integration
> suite was created for it. The oracle.jdbc JAR was supposed to be
> downloaded from the Maven repository; since no JDBC JAR was available
> there, the JAR was downloaded manually from the Oracle site and installed
> in the local repository, and the tests were run against that. So, for the
> SparkQA test run, the ojdbc JAR might be manually placed in the local
> Maven repository (com/oracle/ojdbc6/11.2.0.2.0) while the Spark QA tests
> run.
>
>
>
> Author: thomastechs <thomas.sebast...@tcs.com>
>
>
>
> Closes #11306 from thomastechs/master.
> 
>
>
> Meanwhile, I also notice that the ojdbc groupID provided by Oracle
> (official website
> https://blogs.oracle.com/dev2dev/entry/how_to_get_oracle_jdbc)  is
> different.
>
>
> 
>
> <dependency>
>   <groupId>com.oracle.jdbc</groupId>
>   <artifactId>ojdbc6</artifactId>
>   <version>11.2.0.4</version>
>   <scope>test</scope>
> </dependency>
>
> as opposed to the one in Spark branch-2.0
>
> external/docker-integration-tests/pom.xml
>
>
> <dependency>
>   <groupId>com.oracle</groupId>
>   <artifactId>ojdbc6</artifactId>
>   <version>11.2.0.1.0</version>
>   <scope>test</scope>
> </dependency>
>
>
> The version is out of date and not available from the Oracle Maven repo.
> The PR was created a while back, so it may simply predate Oracle's Maven
> release blog post.
>
>
> This is just my inference based on what I see from git and JIRA; however,
> I do see a fix required to patch pom.xml to apply the correct groupId and
> version for the ojdbc6 driver.
>
>
> Thoughts?
>
>
>
>
>
>
>
>
>
>
> --
> *From:* Mich Talebzadeh <mich.talebza...@gmail.com>
> *Sent:* Tuesday, May 3, 2016 1:04 AM
> *To:* Luciano Resende
> *Cc:* Hien Luu; ☼ R Nair (रविशंकर नायर); user
> *Subject:* Re: Spark build failure with com.oracle:ojdbc6:jar:11.2.0.1.0
>
> which version of Spark are you using?
>
> Dr Mich Talebzadeh
>
>
>
> LinkedIn:
> https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>
>
>
> http://talebzadehmich.wordpress.com
>
>
>
> On 3 May 2016 at 02:13, Luciano Resende <luckbr1...@gmail.com> wrote:
>
>> You might have a settings.xml that is forcing your internal Maven
>> repository to be the mirror of external repositories and thus not finding
>> the dependency.
>>
>> On Mon, May 2, 2016 at 6:11 PM, Hien Luu <hien...@gmail.com>

Re: Spark build failure with com.oracle:ojdbc6:jar:11.2.0.1.0

2016-05-09 Thread Andrew Lee
In fact, it does require the ojdbc JAR from Oracle, which also requires a
username and password to download. This was added as part of the testing scope
for Oracle's Docker image.


I notice this PR and commit in branch-2.0 according to 
https://issues.apache.org/jira/browse/SPARK-12941.

In the comment, I'm not sure what it means by installing the JAR locally
during the Spark QA test run. If this is the case,

it means someone downloaded the JAR from Oracle and manually added it to the
local build machine that builds Spark branch-2.0, or to an internal Maven
repository that serves this ojdbc JAR.




commit 8afe49141d9b6a603eb3907f32dce802a3d05172

Author: thomastechs <thomas.sebast...@tcs.com>

Date:   Thu Feb 25 22:52:25 2016 -0800


[SPARK-12941][SQL][MASTER] Spark-SQL JDBC Oracle dialect fails to map 
string datatypes to Oracle VARCHAR datatype



## What changes were proposed in this pull request?



This pull request fixes SPARK-12941 by creating a data type mapping to Oracle
for the corresponding data type "StringType" from DataFrame. This PR is for
the master branch fix, whereas another PR has already been tested against
branch 1.4.



## How was this patch tested?



(Please explain how this patch was tested. E.g. unit tests, integration 
tests, manual tests)

This patch was tested using the Oracle Docker image. A new integration suite
was created for it. The oracle.jdbc JAR was supposed to be downloaded from the
Maven repository; since no JDBC JAR was available there, the JAR was
downloaded manually from the Oracle site and installed in the local
repository, and the tests were run against that. So, for the SparkQA test run,
the ojdbc JAR might be manually placed in the local Maven repository
(com/oracle/ojdbc6/11.2.0.2.0) while the Spark QA tests run.
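For anyone reproducing this locally, the manual install step described above can be sketched roughly as follows; the JAR path is a placeholder for wherever you saved the driver downloaded (with an Oracle account) from Oracle's site:

```shell
# Hypothetical local path to the manually downloaded driver;
# adjust -Dfile to your environment.
mvn install:install-file \
  -Dfile=/path/to/ojdbc6.jar \
  -DgroupId=com.oracle \
  -DartifactId=ojdbc6 \
  -Dversion=11.2.0.1.0 \
  -Dpackaging=jar
```

This places the artifact under ~/.m2/repository/com/oracle/ojdbc6/11.2.0.1.0/, matching the coordinates in external/docker-integration-tests/pom.xml.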



Author: thomastechs <thomas.sebast...@tcs.com>



Closes #11306 from thomastechs/master.




Meanwhile, I also notice that the ojdbc groupId provided by Oracle (official
website https://blogs.oracle.com/dev2dev/entry/how_to_get_oracle_jdbc) is
different.




<dependency>
  <groupId>com.oracle.jdbc</groupId>
  <artifactId>ojdbc6</artifactId>
  <version>11.2.0.4</version>
  <scope>test</scope>
</dependency>


as opposed to the one in Spark branch-2.0

external/docker-integration-tests/pom.xml




<dependency>
  <groupId>com.oracle</groupId>
  <artifactId>ojdbc6</artifactId>
  <version>11.2.0.1.0</version>
  <scope>test</scope>
</dependency>



The version is out of date and not available from the Oracle Maven repo. The
PR was created a while back, so it may simply predate Oracle's Maven release
blog post.


This is just my inference based on what I see from git and JIRA; however, I do
see a fix required to patch pom.xml to apply the correct groupId and version
for the ojdbc6 driver.
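If pom.xml were patched along those lines, the change would presumably look like the following sketch; the coordinates are taken from Oracle's blog post above, and I have not verified that 11.2.0.4 works with the integration suite:

```xml
<!-- external/docker-integration-tests/pom.xml (sketch, unverified) -->
<dependency>
  <!-- groupId per Oracle's Maven repository announcement -->
  <groupId>com.oracle.jdbc</groupId>
  <artifactId>ojdbc6</artifactId>
  <version>11.2.0.4</version>
  <scope>test</scope>
</dependency>
```

Note that Oracle's Maven repository also requires registration and credentials configured in settings.xml, per the blog post.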


Thoughts?












From: Mich Talebzadeh <mich.talebza...@gmail.com>
Sent: Tuesday, May 3, 2016 1:04 AM
To: Luciano Resende
Cc: Hien Luu; ☼ R Nair (रविशंकर नायर); user
Subject: Re: Spark build failure with com.oracle:ojdbc6:jar:11.2.0.1.0

which version of Spark are you using?


Dr Mich Talebzadeh



LinkedIn:
https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com



On 3 May 2016 at 02:13, Luciano Resende <luckbr1...@gmail.com> wrote:
You might have a settings.xml that is forcing your internal Maven repository to 
be the mirror of external repositories and thus not finding the dependency.

On Mon, May 2, 2016 at 6:11 PM, Hien Luu <hien...@gmail.com> wrote:
No, I am not. I am considering downloading it manually and placing it in my
local repository.

On Mon, May 2, 2016 at 5:54 PM, ☼ R Nair (रविशंकर नायर)
<ravishankar.n...@gmail.com> wrote:

The Oracle JDBC driver is not part of the Maven Central repository; are you
keeping a downloaded file in your local repo?

Best, RS

On May 2, 2016 8:51 PM, "Hien Luu" <hien...@gmail.com> wrote:
Hi all,

I am running into a build problem with com.oracle:ojdbc6:jar:11.2.0.1.0. The
build keeps failing with "Operation timed out" while building the Spark
Project Docker Integration Tests module (see the error below).

Has anyone run into this problem before? If so, how did you resolve it?

[INFO] Reactor Summary:

[INFO]

[INFO] Spark Project Parent POM ... SUCCESS [  2.423 s]

[INFO] Spark Project Test Tags  SUCCESS [  0.712 s]

[INFO] Spark Project Sketch ... SUCCESS [  0.498 s]

[INFO] Spark Project Networking ... SUCCESS [  1.743 s]

[INFO] Spark Project Shuffle Streaming Service 

Re: Spark build failure with com.oracle:ojdbc6:jar:11.2.0.1.0

2016-05-03 Thread Mich Talebzadeh
which version of Spark are you using?

Dr Mich Talebzadeh



LinkedIn:
https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com



On 3 May 2016 at 02:13, Luciano Resende  wrote:

> You might have a settings.xml that is forcing your internal Maven
> repository to be the mirror of external repositories and thus not finding
> the dependency.
>
> On Mon, May 2, 2016 at 6:11 PM, Hien Luu  wrote:
>
> No, I am not. I am considering downloading it manually and placing it in
> my local repository.
>>
>> On Mon, May 2, 2016 at 5:54 PM, ☼ R Nair (रविशंकर नायर) <
>> ravishankar.n...@gmail.com> wrote:
>>
>>> The Oracle JDBC driver is not part of the Maven Central repository; are
>>> you keeping a downloaded file in your local repo?
>>>
>>> Best, RS
>>> On May 2, 2016 8:51 PM, "Hien Luu"  wrote:
>>>
 Hi all,

 I am running into a build problem with com.oracle:ojdbc6:jar:11.2.0.1.0.
 The build keeps failing with "Operation timed out" while building the Spark
 Project Docker Integration Tests module (see the error below).

 Has anyone run into this problem before? If so, how did you resolve it?

 [INFO] Reactor Summary:

 [INFO]

 [INFO] Spark Project Parent POM ... SUCCESS [
 2.423 s]

 [INFO] Spark Project Test Tags  SUCCESS [
 0.712 s]

 [INFO] Spark Project Sketch ... SUCCESS [
 0.498 s]

 [INFO] Spark Project Networking ... SUCCESS [
 1.743 s]

 [INFO] Spark Project Shuffle Streaming Service  SUCCESS [
 0.587 s]

 [INFO] Spark Project Unsafe ... SUCCESS [
 0.503 s]

 [INFO] Spark Project Launcher . SUCCESS [
 4.894 s]

 [INFO] Spark Project Core . SUCCESS [
 17.953 s]

 [INFO] Spark Project GraphX ... SUCCESS [
 3.480 s]

 [INFO] Spark Project Streaming  SUCCESS [
 6.022 s]

 [INFO] Spark Project Catalyst . SUCCESS [
 8.664 s]

 [INFO] Spark Project SQL .. SUCCESS [
 12.440 s]

 [INFO] Spark Project ML Local Library . SUCCESS [
 0.498 s]

 [INFO] Spark Project ML Library ... SUCCESS [
 8.594 s]

 [INFO] Spark Project Tools  SUCCESS [
 0.162 s]

 [INFO] Spark Project Hive . SUCCESS [
 9.834 s]

 [INFO] Spark Project HiveContext Compatibility  SUCCESS [
 1.428 s]

 [INFO] Spark Project Docker Integration Tests . FAILURE
 [02:32 min]

 [INFO] Spark Project REPL . SKIPPED

 [INFO] Spark Project Assembly . SKIPPED

 [INFO] Spark Project External Flume Sink .. SKIPPED

 [INFO] Spark Project External Flume ... SKIPPED

 [INFO] Spark Project External Flume Assembly .. SKIPPED

 [INFO] Spark Project External Kafka ... SKIPPED

 [INFO] Spark Project Examples . SKIPPED

 [INFO] Spark Project External Kafka Assembly .. SKIPPED

 [INFO] Spark Project Java 8 Tests . SKIPPED

 [INFO]
 

 [INFO] BUILD FAILURE

 [INFO]
 

 [INFO] Total time: 03:53 min

 [INFO] Finished at: 2016-05-02T17:44:57-07:00

 [INFO] Final Memory: 80M/1525M

 [INFO]
 

 [ERROR] Failed to execute goal on project
 spark-docker-integration-tests_2.11: Could not resolve dependencies for
 project
 org.apache.spark:spark-docker-integration-tests_2.11:jar:2.0.0-SNAPSHOT:
 Failed to collect dependencies at com.oracle:ojdbc6:jar:11.2.0.1.0: Failed
 to read artifact descriptor for com.oracle:ojdbc6:jar:11.2.0.1.0: Could not
 transfer artifact com.oracle:ojdbc6:pom:11.2.0.1.0 from/to uber-artifactory
 (http://artifactory.uber.internal:4587/artifactory/repo/): Connect to
 artifactory.uber.internal:4587 [artifactory.uber.internal/10.162.11.61]
 failed: Operation timed out -> [Help 1]


>>
>>
>> --
>> Regards,
>>
>
>
>
> --
> Luciano Resende
> http://twitter.com/lresende1975
> 

Re: Spark build failure with com.oracle:ojdbc6:jar:11.2.0.1.0

2016-05-02 Thread Luciano Resende
You might have a settings.xml that is forcing your internal Maven
repository to be the mirror of external repositories and thus not finding
the dependency.
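For reference, a settings.xml entry of this shape is the usual culprit; a mirrorOf of * routes every repository request, including ones the internal mirror cannot serve, to the internal server (the URL below is a placeholder):

```xml
<!-- ~/.m2/settings.xml (illustrative; id and URL are placeholders) -->
<settings>
  <mirrors>
    <mirror>
      <id>internal-mirror</id>
      <!-- "*" intercepts requests for ALL repositories, so artifacts the
           mirror does not proxy simply fail to resolve -->
      <mirrorOf>*</mirrorOf>
      <url>http://repo.example.internal/artifactory/repo/</url>
    </mirror>
  </mirrors>
</settings>
```

A common workaround is to narrow mirrorOf (e.g. "central") or to exclude specific repositories with a pattern such as "*,!some-repo-id".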

On Mon, May 2, 2016 at 6:11 PM, Hien Luu  wrote:

> No, I am not. I am considering downloading it manually and placing it in my
> local repository.
>
> On Mon, May 2, 2016 at 5:54 PM, ☼ R Nair (रविशंकर नायर) <
> ravishankar.n...@gmail.com> wrote:
>
>> The Oracle JDBC driver is not part of the Maven Central repository; are
>> you keeping a downloaded file in your local repo?
>>
>> Best, RS
>> On May 2, 2016 8:51 PM, "Hien Luu"  wrote:
>>
>>> Hi all,
>>>
>>> I am running into a build problem with com.oracle:ojdbc6:jar:11.2.0.1.0.
>>> The build keeps failing with "Operation timed out" while building the
>>> Spark Project Docker Integration Tests module (see the error below).
>>>
>>> Has anyone run into this problem before? If so, how did you resolve it?
>>>
>>> [INFO] Reactor Summary:
>>>
>>> [INFO]
>>>
>>> [INFO] Spark Project Parent POM ... SUCCESS [
>>> 2.423 s]
>>>
>>> [INFO] Spark Project Test Tags  SUCCESS [
>>> 0.712 s]
>>>
>>> [INFO] Spark Project Sketch ... SUCCESS [
>>> 0.498 s]
>>>
>>> [INFO] Spark Project Networking ... SUCCESS [
>>> 1.743 s]
>>>
>>> [INFO] Spark Project Shuffle Streaming Service  SUCCESS [
>>> 0.587 s]
>>>
>>> [INFO] Spark Project Unsafe ... SUCCESS [
>>> 0.503 s]
>>>
>>> [INFO] Spark Project Launcher . SUCCESS [
>>> 4.894 s]
>>>
>>> [INFO] Spark Project Core . SUCCESS [
>>> 17.953 s]
>>>
>>> [INFO] Spark Project GraphX ... SUCCESS [
>>> 3.480 s]
>>>
>>> [INFO] Spark Project Streaming  SUCCESS [
>>> 6.022 s]
>>>
>>> [INFO] Spark Project Catalyst . SUCCESS [
>>> 8.664 s]
>>>
>>> [INFO] Spark Project SQL .. SUCCESS [
>>> 12.440 s]
>>>
>>> [INFO] Spark Project ML Local Library . SUCCESS [
>>> 0.498 s]
>>>
>>> [INFO] Spark Project ML Library ... SUCCESS [
>>> 8.594 s]
>>>
>>> [INFO] Spark Project Tools  SUCCESS [
>>> 0.162 s]
>>>
>>> [INFO] Spark Project Hive . SUCCESS [
>>> 9.834 s]
>>>
>>> [INFO] Spark Project HiveContext Compatibility  SUCCESS [
>>> 1.428 s]
>>>
>>> [INFO] Spark Project Docker Integration Tests . FAILURE
>>> [02:32 min]
>>>
>>> [INFO] Spark Project REPL . SKIPPED
>>>
>>> [INFO] Spark Project Assembly . SKIPPED
>>>
>>> [INFO] Spark Project External Flume Sink .. SKIPPED
>>>
>>> [INFO] Spark Project External Flume ... SKIPPED
>>>
>>> [INFO] Spark Project External Flume Assembly .. SKIPPED
>>>
>>> [INFO] Spark Project External Kafka ... SKIPPED
>>>
>>> [INFO] Spark Project Examples . SKIPPED
>>>
>>> [INFO] Spark Project External Kafka Assembly .. SKIPPED
>>>
>>> [INFO] Spark Project Java 8 Tests . SKIPPED
>>>
>>> [INFO]
>>> 
>>>
>>> [INFO] BUILD FAILURE
>>>
>>> [INFO]
>>> 
>>>
>>> [INFO] Total time: 03:53 min
>>>
>>> [INFO] Finished at: 2016-05-02T17:44:57-07:00
>>>
>>> [INFO] Final Memory: 80M/1525M
>>>
>>> [INFO]
>>> 
>>>
>>> [ERROR] Failed to execute goal on project
>>> spark-docker-integration-tests_2.11: Could not resolve dependencies for
>>> project
>>> org.apache.spark:spark-docker-integration-tests_2.11:jar:2.0.0-SNAPSHOT:
>>> Failed to collect dependencies at com.oracle:ojdbc6:jar:11.2.0.1.0: Failed
>>> to read artifact descriptor for com.oracle:ojdbc6:jar:11.2.0.1.0: Could not
>>> transfer artifact com.oracle:ojdbc6:pom:11.2.0.1.0 from/to uber-artifactory
>>> (http://artifactory.uber.internal:4587/artifactory/repo/): Connect to
>>> artifactory.uber.internal:4587 [artifactory.uber.internal/10.162.11.61]
>>> failed: Operation timed out -> [Help 1]
>>>
>>>
>
>
> --
> Regards,
>



-- 
Luciano Resende
http://twitter.com/lresende1975
http://lresende.blogspot.com/


Re: Spark build failure with com.oracle:ojdbc6:jar:11.2.0.1.0

2016-05-02 Thread Hien Luu
No, I am not. I am considering downloading it manually and placing it in my
local repository.

On Mon, May 2, 2016 at 5:54 PM, ☼ R Nair (रविशंकर नायर) <
ravishankar.n...@gmail.com> wrote:

> The Oracle JDBC driver is not part of the Maven Central repository; are you
> keeping a downloaded file in your local repo?
>
> Best, RS
> On May 2, 2016 8:51 PM, "Hien Luu"  wrote:
>
>> Hi all,
>>
>> I am running into a build problem with com.oracle:ojdbc6:jar:11.2.0.1.0.
>> The build keeps failing with "Operation timed out" while building the
>> Spark Project Docker Integration Tests module (see the error below).
>>
>> Has anyone run into this problem before? If so, how did you resolve it?
>>
>> [INFO] Reactor Summary:
>>
>> [INFO]
>>
>> [INFO] Spark Project Parent POM ... SUCCESS [
>> 2.423 s]
>>
>> [INFO] Spark Project Test Tags  SUCCESS [
>> 0.712 s]
>>
>> [INFO] Spark Project Sketch ... SUCCESS [
>> 0.498 s]
>>
>> [INFO] Spark Project Networking ... SUCCESS [
>> 1.743 s]
>>
>> [INFO] Spark Project Shuffle Streaming Service  SUCCESS [
>> 0.587 s]
>>
>> [INFO] Spark Project Unsafe ... SUCCESS [
>> 0.503 s]
>>
>> [INFO] Spark Project Launcher . SUCCESS [
>> 4.894 s]
>>
>> [INFO] Spark Project Core . SUCCESS [
>> 17.953 s]
>>
>> [INFO] Spark Project GraphX ... SUCCESS [
>> 3.480 s]
>>
>> [INFO] Spark Project Streaming  SUCCESS [
>> 6.022 s]
>>
>> [INFO] Spark Project Catalyst . SUCCESS [
>> 8.664 s]
>>
>> [INFO] Spark Project SQL .. SUCCESS [
>> 12.440 s]
>>
>> [INFO] Spark Project ML Local Library . SUCCESS [
>> 0.498 s]
>>
>> [INFO] Spark Project ML Library ... SUCCESS [
>> 8.594 s]
>>
>> [INFO] Spark Project Tools  SUCCESS [
>> 0.162 s]
>>
>> [INFO] Spark Project Hive . SUCCESS [
>> 9.834 s]
>>
>> [INFO] Spark Project HiveContext Compatibility  SUCCESS [
>> 1.428 s]
>>
>> [INFO] Spark Project Docker Integration Tests . FAILURE
>> [02:32 min]
>>
>> [INFO] Spark Project REPL . SKIPPED
>>
>> [INFO] Spark Project Assembly . SKIPPED
>>
>> [INFO] Spark Project External Flume Sink .. SKIPPED
>>
>> [INFO] Spark Project External Flume ... SKIPPED
>>
>> [INFO] Spark Project External Flume Assembly .. SKIPPED
>>
>> [INFO] Spark Project External Kafka ... SKIPPED
>>
>> [INFO] Spark Project Examples . SKIPPED
>>
>> [INFO] Spark Project External Kafka Assembly .. SKIPPED
>>
>> [INFO] Spark Project Java 8 Tests . SKIPPED
>>
>> [INFO]
>> 
>>
>> [INFO] BUILD FAILURE
>>
>> [INFO]
>> 
>>
>> [INFO] Total time: 03:53 min
>>
>> [INFO] Finished at: 2016-05-02T17:44:57-07:00
>>
>> [INFO] Final Memory: 80M/1525M
>>
>> [INFO]
>> 
>>
>> [ERROR] Failed to execute goal on project
>> spark-docker-integration-tests_2.11: Could not resolve dependencies for
>> project
>> org.apache.spark:spark-docker-integration-tests_2.11:jar:2.0.0-SNAPSHOT:
>> Failed to collect dependencies at com.oracle:ojdbc6:jar:11.2.0.1.0: Failed
>> to read artifact descriptor for com.oracle:ojdbc6:jar:11.2.0.1.0: Could not
>> transfer artifact com.oracle:ojdbc6:pom:11.2.0.1.0 from/to uber-artifactory
>> (http://artifactory.uber.internal:4587/artifactory/repo/): Connect to
>> artifactory.uber.internal:4587 [artifactory.uber.internal/10.162.11.61]
>> failed: Operation timed out -> [Help 1]
>>
>>


-- 
Regards,


Re: Spark build failure with com.oracle:ojdbc6:jar:11.2.0.1.0

2016-05-02 Thread Ted Yu
From the output of dependency:tree on the master branch:

[INFO]

[INFO] Building Spark Project Docker Integration Tests 2.0.0-SNAPSHOT
[INFO]

[WARNING] The POM for com.oracle:ojdbc6:jar:11.2.0.1.0 is missing, no
dependency information available
[INFO]
...
[INFO] +- com.oracle:ojdbc6:jar:11.2.0.1.0:test

Are you building behind a proxy?
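If a proxy is in the way, Maven can usually be pointed at it from the command line; a rough sketch, where the proxy host and port are placeholders for your environment:

```shell
# Placeholder proxy coordinates; substitute your corporate proxy.
./build/mvn -DskipTests \
  -Dhttp.proxyHost=proxy.example.com  -Dhttp.proxyPort=8080 \
  -Dhttps.proxyHost=proxy.example.com -Dhttps.proxyPort=8080 \
  package
```

Alternatively, a <proxies> section in ~/.m2/settings.xml achieves the same thing persistently.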

On Mon, May 2, 2016 at 5:51 PM, Hien Luu  wrote:

> Hi all,
>
> I am running into a build problem with com.oracle:ojdbc6:jar:11.2.0.1.0.
> The build keeps failing with "Operation timed out" while building the Spark
> Project Docker Integration Tests module (see the error below).
>
> Has anyone run into this problem before? If so, how did you resolve it?
>
> [INFO] Reactor Summary:
>
> [INFO]
>
> [INFO] Spark Project Parent POM ... SUCCESS [
> 2.423 s]
>
> [INFO] Spark Project Test Tags  SUCCESS [
> 0.712 s]
>
> [INFO] Spark Project Sketch ... SUCCESS [
> 0.498 s]
>
> [INFO] Spark Project Networking ... SUCCESS [
> 1.743 s]
>
> [INFO] Spark Project Shuffle Streaming Service  SUCCESS [
> 0.587 s]
>
> [INFO] Spark Project Unsafe ... SUCCESS [
> 0.503 s]
>
> [INFO] Spark Project Launcher . SUCCESS [
> 4.894 s]
>
> [INFO] Spark Project Core . SUCCESS [
> 17.953 s]
>
> [INFO] Spark Project GraphX ... SUCCESS [
> 3.480 s]
>
> [INFO] Spark Project Streaming  SUCCESS [
> 6.022 s]
>
> [INFO] Spark Project Catalyst . SUCCESS [
> 8.664 s]
>
> [INFO] Spark Project SQL .. SUCCESS [
> 12.440 s]
>
> [INFO] Spark Project ML Local Library . SUCCESS [
> 0.498 s]
>
> [INFO] Spark Project ML Library ... SUCCESS [
> 8.594 s]
>
> [INFO] Spark Project Tools  SUCCESS [
> 0.162 s]
>
> [INFO] Spark Project Hive . SUCCESS [
> 9.834 s]
>
> [INFO] Spark Project HiveContext Compatibility  SUCCESS [
> 1.428 s]
>
> [INFO] Spark Project Docker Integration Tests . FAILURE [02:32
> min]
>
> [INFO] Spark Project REPL . SKIPPED
>
> [INFO] Spark Project Assembly . SKIPPED
>
> [INFO] Spark Project External Flume Sink .. SKIPPED
>
> [INFO] Spark Project External Flume ... SKIPPED
>
> [INFO] Spark Project External Flume Assembly .. SKIPPED
>
> [INFO] Spark Project External Kafka ... SKIPPED
>
> [INFO] Spark Project Examples . SKIPPED
>
> [INFO] Spark Project External Kafka Assembly .. SKIPPED
>
> [INFO] Spark Project Java 8 Tests . SKIPPED
>
> [INFO]
> 
>
> [INFO] BUILD FAILURE
>
> [INFO]
> 
>
> [INFO] Total time: 03:53 min
>
> [INFO] Finished at: 2016-05-02T17:44:57-07:00
>
> [INFO] Final Memory: 80M/1525M
>
> [INFO]
> 
>
> [ERROR] Failed to execute goal on project
> spark-docker-integration-tests_2.11: Could not resolve dependencies for
> project
> org.apache.spark:spark-docker-integration-tests_2.11:jar:2.0.0-SNAPSHOT:
> Failed to collect dependencies at com.oracle:ojdbc6:jar:11.2.0.1.0: Failed
> to read artifact descriptor for com.oracle:ojdbc6:jar:11.2.0.1.0: Could not
> transfer artifact com.oracle:ojdbc6:pom:11.2.0.1.0 from/to uber-artifactory
> (http://artifactory.uber.internal:4587/artifactory/repo/): Connect to
> artifactory.uber.internal:4587 [artifactory.uber.internal/10.162.11.61]
> failed: Operation timed out -> [Help 1]
>
>


Re: Spark build failure with com.oracle:ojdbc6:jar:11.2.0.1.0

2016-05-02 Thread रविशंकर नायर
The Oracle JDBC driver is not part of the Maven Central repository; are you
keeping a downloaded file in your local repo?

Best, RS
On May 2, 2016 8:51 PM, "Hien Luu"  wrote:

> Hi all,
>
> I am running into a build problem with com.oracle:ojdbc6:jar:11.2.0.1.0.
> The build keeps failing with "Operation timed out" while building the Spark
> Project Docker Integration Tests module (see the error below).
>
> Has anyone run into this problem before? If so, how did you resolve it?
>
> [INFO] Reactor Summary:
>
> [INFO]
>
> [INFO] Spark Project Parent POM ... SUCCESS [
> 2.423 s]
>
> [INFO] Spark Project Test Tags  SUCCESS [
> 0.712 s]
>
> [INFO] Spark Project Sketch ... SUCCESS [
> 0.498 s]
>
> [INFO] Spark Project Networking ... SUCCESS [
> 1.743 s]
>
> [INFO] Spark Project Shuffle Streaming Service  SUCCESS [
> 0.587 s]
>
> [INFO] Spark Project Unsafe ... SUCCESS [
> 0.503 s]
>
> [INFO] Spark Project Launcher . SUCCESS [
> 4.894 s]
>
> [INFO] Spark Project Core . SUCCESS [
> 17.953 s]
>
> [INFO] Spark Project GraphX ... SUCCESS [
> 3.480 s]
>
> [INFO] Spark Project Streaming  SUCCESS [
> 6.022 s]
>
> [INFO] Spark Project Catalyst . SUCCESS [
> 8.664 s]
>
> [INFO] Spark Project SQL .. SUCCESS [
> 12.440 s]
>
> [INFO] Spark Project ML Local Library . SUCCESS [
> 0.498 s]
>
> [INFO] Spark Project ML Library ... SUCCESS [
> 8.594 s]
>
> [INFO] Spark Project Tools  SUCCESS [
> 0.162 s]
>
> [INFO] Spark Project Hive . SUCCESS [
> 9.834 s]
>
> [INFO] Spark Project HiveContext Compatibility  SUCCESS [
> 1.428 s]
>
> [INFO] Spark Project Docker Integration Tests . FAILURE [02:32
> min]
>
> [INFO] Spark Project REPL . SKIPPED
>
> [INFO] Spark Project Assembly . SKIPPED
>
> [INFO] Spark Project External Flume Sink .. SKIPPED
>
> [INFO] Spark Project External Flume ... SKIPPED
>
> [INFO] Spark Project External Flume Assembly .. SKIPPED
>
> [INFO] Spark Project External Kafka ... SKIPPED
>
> [INFO] Spark Project Examples . SKIPPED
>
> [INFO] Spark Project External Kafka Assembly .. SKIPPED
>
> [INFO] Spark Project Java 8 Tests . SKIPPED
>
> [INFO]
> 
>
> [INFO] BUILD FAILURE
>
> [INFO]
> 
>
> [INFO] Total time: 03:53 min
>
> [INFO] Finished at: 2016-05-02T17:44:57-07:00
>
> [INFO] Final Memory: 80M/1525M
>
> [INFO]
> 
>
> [ERROR] Failed to execute goal on project
> spark-docker-integration-tests_2.11: Could not resolve dependencies for
> project
> org.apache.spark:spark-docker-integration-tests_2.11:jar:2.0.0-SNAPSHOT:
> Failed to collect dependencies at com.oracle:ojdbc6:jar:11.2.0.1.0: Failed
> to read artifact descriptor for com.oracle:ojdbc6:jar:11.2.0.1.0: Could not
> transfer artifact com.oracle:ojdbc6:pom:11.2.0.1.0 from/to uber-artifactory
> (http://artifactory.uber.internal:4587/artifactory/repo/): Connect to
> artifactory.uber.internal:4587 [artifactory.uber.internal/10.162.11.61]
> failed: Operation timed out -> [Help 1]
>
>


Spark build failure with com.oracle:ojdbc6:jar:11.2.0.1.0

2016-05-02 Thread Hien Luu
Hi all,

I am running into a build problem with com.oracle:ojdbc6:jar:11.2.0.1.0. The
build keeps failing with "Operation timed out" while building the Spark
Project Docker Integration Tests module (see the error below).

Has anyone run into this problem before? If so, how did you resolve it?

[INFO] Reactor Summary:

[INFO]

[INFO] Spark Project Parent POM ... SUCCESS [
2.423 s]

[INFO] Spark Project Test Tags  SUCCESS [
0.712 s]

[INFO] Spark Project Sketch ... SUCCESS [
0.498 s]

[INFO] Spark Project Networking ... SUCCESS [
1.743 s]

[INFO] Spark Project Shuffle Streaming Service  SUCCESS [
0.587 s]

[INFO] Spark Project Unsafe ... SUCCESS [
0.503 s]

[INFO] Spark Project Launcher . SUCCESS [
4.894 s]

[INFO] Spark Project Core . SUCCESS [
17.953 s]

[INFO] Spark Project GraphX ... SUCCESS [
3.480 s]

[INFO] Spark Project Streaming  SUCCESS [
6.022 s]

[INFO] Spark Project Catalyst . SUCCESS [
8.664 s]

[INFO] Spark Project SQL .. SUCCESS [
12.440 s]

[INFO] Spark Project ML Local Library . SUCCESS [
0.498 s]

[INFO] Spark Project ML Library ... SUCCESS [
8.594 s]

[INFO] Spark Project Tools  SUCCESS [
0.162 s]

[INFO] Spark Project Hive . SUCCESS [
9.834 s]

[INFO] Spark Project HiveContext Compatibility  SUCCESS [
1.428 s]

[INFO] Spark Project Docker Integration Tests . FAILURE [02:32
min]

[INFO] Spark Project REPL . SKIPPED

[INFO] Spark Project Assembly . SKIPPED

[INFO] Spark Project External Flume Sink .. SKIPPED

[INFO] Spark Project External Flume ... SKIPPED

[INFO] Spark Project External Flume Assembly .. SKIPPED

[INFO] Spark Project External Kafka ... SKIPPED

[INFO] Spark Project Examples . SKIPPED

[INFO] Spark Project External Kafka Assembly .. SKIPPED

[INFO] Spark Project Java 8 Tests . SKIPPED

[INFO]


[INFO] BUILD FAILURE

[INFO]


[INFO] Total time: 03:53 min

[INFO] Finished at: 2016-05-02T17:44:57-07:00

[INFO] Final Memory: 80M/1525M

[INFO]


[ERROR] Failed to execute goal on project
spark-docker-integration-tests_2.11: Could not resolve dependencies for
project
org.apache.spark:spark-docker-integration-tests_2.11:jar:2.0.0-SNAPSHOT:
Failed to collect dependencies at com.oracle:ojdbc6:jar:11.2.0.1.0: Failed
to read artifact descriptor for com.oracle:ojdbc6:jar:11.2.0.1.0: Could not
transfer artifact com.oracle:ojdbc6:pom:11.2.0.1.0 from/to uber-artifactory
(http://artifactory.uber.internal:4587/artifactory/repo/): Connect to
artifactory.uber.internal:4587 [artifactory.uber.internal/10.162.11.61]
failed: Operation timed out -> [Help 1]