Disabled the Hadoop jobs on Jenkins because we still have an outstanding bug:
https://issues.apache.org/jira/browse/MESOS-480


@vinodkone


On Wed, Jun 19, 2013 at 9:15 AM, Apache Jenkins Server <
[email protected]> wrote:

> See <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/20/>
>
> ------------------------------------------
> [...truncated 7353 lines...]
>       [get] To: <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivy-2.1.0.jar
> >
>       [get] Not modified - so not downloaded
>
> ivy-probe-antlib:
>
> ivy-init-antlib:
>
> ivy-init:
> [ivy:configure] :: loading settings :: file = <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml
> >
>
> ivy-resolve-common:
> [ivy:resolve] :: resolving dependencies ::
> org.apache.hadoop#streaming;working@janus
> [ivy:resolve]   confs: [common]
> [ivy:resolve]   found commons-cli#commons-cli;1.2 in default
> [ivy:resolve]   found commons-logging#commons-logging;1.0.4 in maven2
> [ivy:resolve]   found junit#junit;4.5 in maven2
> [ivy:resolve]   found org.mortbay.jetty#jetty-util;6.1.26 in maven2
> [ivy:resolve]   found org.mortbay.jetty#jetty;6.1.26 in maven2
> [ivy:resolve]   found org.mortbay.jetty#servlet-api;2.5-20081211 in default
> [ivy:resolve]   found asm#asm;3.2 in default
> [ivy:resolve]   found com.sun.jersey#jersey-core;1.8 in default
> [ivy:resolve]   found com.sun.jersey#jersey-json;1.8 in default
> [ivy:resolve]   found com.sun.jersey#jersey-server;1.8 in default
> [ivy:resolve]   found commons-httpclient#commons-httpclient;3.0.1 in maven2
> [ivy:resolve]   found log4j#log4j;1.2.15 in maven2
> [ivy:resolve]   found commons-codec#commons-codec;1.4 in default
> [ivy:resolve]   found org.codehaus.jackson#jackson-mapper-asl;1.0.1 in
> default
> [ivy:resolve]   found org.codehaus.jackson#jackson-core-asl;1.0.1 in
> default
> [ivy:resolve]   found commons-configuration#commons-configuration;1.6 in
> default
> [ivy:resolve]   found commons-collections#commons-collections;3.2.1 in
> default
> [ivy:resolve]   found commons-lang#commons-lang;2.4 in default
> [ivy:resolve]   found commons-logging#commons-logging;1.1.1 in default
> [ivy:resolve]   found commons-digester#commons-digester;1.8 in default
> [ivy:resolve]   found commons-beanutils#commons-beanutils;1.7.0 in default
> [ivy:resolve]   found commons-beanutils#commons-beanutils-core;1.8.0 in
> default
> [ivy:resolve]   found org.apache.commons#commons-math;2.1 in maven2
> [ivy:resolve] :: resolution report :: resolve 129ms :: artifacts dl 7ms
> [ivy:resolve]   :: evicted modules:
> [ivy:resolve]   commons-logging#commons-logging;1.0.4 by
> [commons-logging#commons-logging;1.1.1] in [common]
> [ivy:resolve]   commons-logging#commons-logging;1.0.3 by
> [commons-logging#commons-logging;1.1.1] in [common]
> [ivy:resolve]   commons-logging#commons-logging;1.1 by
> [commons-logging#commons-logging;1.1.1] in [common]
>
> ---------------------------------------------------------------------
>         |                  |            modules            ||   artifacts   |
>         |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
>
> ---------------------------------------------------------------------
>         |      common      |   25  |   0   |   0   |   3   ||   22  |   0   |
>
> ---------------------------------------------------------------------
>
> ivy-retrieve-common:
> [ivy:retrieve] :: retrieving :: org.apache.hadoop#streaming [sync]
> [ivy:retrieve]  confs: [common]
> [ivy:retrieve]  0 artifacts copied, 22 already retrieved (0kB/5ms)
> [ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use
> 'ivy.settings.file' instead
> [ivy:cachepath] :: loading settings :: file = <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml
> >
>
> compile:
>      [echo] contrib: streaming
>     [javac] <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/src/contrib/build-contrib.xml>:185:
> warning: 'includeantruntime' was not set, defaulting to
> build.sysclasspath=last; set to false for repeatable builds
>
> jar:
>       [jar] Building jar: <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/contrib/streaming/hadoop-streaming-0.20.205.0.jar
> >
>
> compile-examples:
>
> jar-examples:
>
> package:
>     [mkdir] Created dir: <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/contrib/streaming
> >
>      [copy] Copying 1 file to <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/contrib/streaming
> >
>
> check-contrib:
>
> init:
>      [echo] contrib: thriftfs
>
> init-contrib:
>
> ivy-download:
>       [get] Getting:
> http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
>       [get] To: <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivy-2.1.0.jar
> >
>       [get] Not modified - so not downloaded
>
> ivy-probe-antlib:
>
> ivy-init-antlib:
>
> ivy-init:
> [ivy:configure] :: loading settings :: file = <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml
> >
>
> ivy-resolve-common:
> [ivy:resolve] :: resolving dependencies ::
> org.apache.hadoop#thriftfs;working@janus
> [ivy:resolve]   confs: [common]
> [ivy:resolve]   found commons-logging#commons-logging;1.0.4 in maven2
> [ivy:resolve]   found log4j#log4j;1.2.15 in maven2
> [ivy:resolve]   found commons-configuration#commons-configuration;1.6 in
> default
> [ivy:resolve]   found commons-collections#commons-collections;3.2.1 in
> default
> [ivy:resolve]   found commons-lang#commons-lang;2.4 in default
> [ivy:resolve]   found commons-logging#commons-logging;1.1.1 in default
> [ivy:resolve]   found commons-digester#commons-digester;1.8 in default
> [ivy:resolve]   found commons-beanutils#commons-beanutils;1.7.0 in default
> [ivy:resolve]   found commons-beanutils#commons-beanutils-core;1.8.0 in
> default
> [ivy:resolve]   found org.apache.commons#commons-math;2.1 in maven2
> [ivy:resolve] :: resolution report :: resolve 52ms :: artifacts dl 3ms
> [ivy:resolve]   :: evicted modules:
> [ivy:resolve]   commons-logging#commons-logging;1.0.4 by
> [commons-logging#commons-logging;1.1.1] in [common]
> [ivy:resolve]   commons-logging#commons-logging;1.0.3 by
> [commons-logging#commons-logging;1.1.1] in [common]
> [ivy:resolve]   commons-logging#commons-logging;1.1 by
> [commons-logging#commons-logging;1.1.1] in [common]
>
> ---------------------------------------------------------------------
>         |                  |            modules            ||   artifacts   |
>         |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
>
> ---------------------------------------------------------------------
>         |      common      |   12  |   0   |   0   |   3   ||   9   |   0   |
>
> ---------------------------------------------------------------------
>
> ivy-retrieve-common:
> [ivy:retrieve] :: retrieving :: org.apache.hadoop#thriftfs [sync]
> [ivy:retrieve]  confs: [common]
> [ivy:retrieve]  0 artifacts copied, 9 already retrieved (0kB/3ms)
> [ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use
> 'ivy.settings.file' instead
> [ivy:cachepath] :: loading settings :: file = <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml
> >
>
> compile:
>      [echo] contrib: thriftfs
>     [javac] <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/src/contrib/build-contrib.xml>:185:
> warning: 'includeantruntime' was not set, defaulting to
> build.sysclasspath=last; set to false for repeatable builds
>
> jar:
>       [jar] Building jar: <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/contrib/thriftfs/hadoop-thriftfs-0.20.205.0.jar
> >
>
> compile-examples:
>
> jar-examples:
>
> package:
>      [copy] Copying 1 file to <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/lib
> >
>
> init:
>
> ivy-download:
>       [get] Getting:
> http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
>       [get] To: <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivy-2.1.0.jar
> >
>       [get] Not modified - so not downloaded
>
> ivy-probe-antlib:
>
> ivy-init-antlib:
>
> ivy-init:
> [ivy:configure] :: loading settings :: file = <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml
> >
>
> ivy-resolve-common:
> [ivy:resolve] :: resolving dependencies ::
> org.apache.hadoop#vaidya;working@janus
> [ivy:resolve]   confs: [common]
> [ivy:resolve]   found commons-logging#commons-logging;1.0.4 in maven2
> [ivy:resolve]   found log4j#log4j;1.2.15 in maven2
> [ivy:resolve] :: resolution report :: resolve 10ms :: artifacts dl 1ms
>
> ---------------------------------------------------------------------
>         |                  |            modules            ||   artifacts   |
>         |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
>
> ---------------------------------------------------------------------
>         |      common      |   2   |   0   |   0   |   0   ||   2   |   0   |
>
> ---------------------------------------------------------------------
>
> ivy-retrieve-common:
> [ivy:retrieve] :: retrieving :: org.apache.hadoop#vaidya [sync]
> [ivy:retrieve]  confs: [common]
> [ivy:retrieve]  0 artifacts copied, 2 already retrieved (0kB/1ms)
> [ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use
> 'ivy.settings.file' instead
> [ivy:cachepath] :: loading settings :: file = <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml
> >
>
> compile:
>      [echo] contrib: vaidya
>     [javac] <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/src/contrib/build-contrib.xml>:185:
> warning: 'includeantruntime' was not set, defaulting to
> build.sysclasspath=last; set to false for repeatable builds
>
> jar:
>      [echo] contrib: vaidya
>       [jar] Building jar: <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/contrib/vaidya/hadoop-vaidya-0.20.205.0.jar
> >
>
> package:
>     [mkdir] Created dir: <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/contrib/vaidya
> >
>      [copy] Copying 3 files to <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/contrib/vaidya
> >
>      [copy] Copying 35 files to <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/webapps
> >
>      [copy] Copied 13 empty directories to 2 empty directories under <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/webapps
> >
>      [copy] Copying 5 files to <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop
> >
>      [copy] Copying 1 file to <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/bin
> >
>      [copy] Copying 16 files to <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/sbin
> >
>      [copy] Copying 1 file to <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/libexec
> >
>      [copy] Copying 16 files to <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/etc/hadoop
> >
>      [copy] Copying 4 files to <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/doc/hadoop
> >
>      [copy] Copying 7 files to <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/sbin
> >
>
> BUILD SUCCESSFUL
> Total time: 1 minute 4 seconds
>
> To build the Mesos executor package, we first copy the
> necessary Mesos libraries.
>
>
>   $ cd build/hadoop-0.20.205.0
>   $ mkdir -p lib/native/Linux-amd64-64
>   $ cp <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/src/.libs/libmesos.so>
> lib/native/Linux-amd64-64
>
>
>
>   Finally, we will build the Mesos executor package as follows:
>
>
>   $ cd ..
>   $ mv hadoop-0.20.205.0 hadoop-0.20.205.0-mesos
>   $ tar czf hadoop-0.20.205.0-mesos.tar.gz hadoop-0.20.205.0-mesos/
>
>
>
> Build success!
>
> The Mesos distribution is now built in 'hadoop-0.20.205.0-mesos'
>
> Now let's run something!
>
> We'll try and start the JobTracker from the Mesos distribution path via:
>   $ cd hadoop-0.20.205.0-mesos
>   $ ./bin/hadoop jobtracker
>
>
>
> JobTracker started at 4661.
>
> Waiting 5 seconds for it to start. . . . . .
> Alright, now let's run the "wordcount" example via:
>
>   $ ./bin/hadoop jar hadoop-examples-0.20.205.0.jar wordcount   <
> https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/src/mesos>
> out
>
>
> Exception in thread "main" java.io.IOException: Error opening job jar:
> hadoop-examples-0.20.205.0.jar
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:90)
> Caused by: java.util.zip.ZipException: error in opening zip file
>         at java.util.zip.ZipFile.open(Native Method)
>         at java.util.zip.ZipFile.<init>(ZipFile.java:114)
>         at java.util.jar.JarFile.<init>(JarFile.java:135)
>         at java.util.jar.JarFile.<init>(JarFile.java:72)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:88)
>
> Oh no, it failed! Try running the JobTracker and wordcount
> example manually ... it might be an issue with your environment that
> this tutorial didn't cover (if you find this to be the case, please
> create a JIRA for us and/or send us a code review).
>
> ./TUTORIAL.sh: line 704: kill: (4661) - No such process
> make: *** [hadoop-0.20.205.0] Error 1
> Build step 'Execute shell' marked build as failure
>
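
For anyone debugging this locally: the `java.util.zip.ZipException` above means RunJar could not open `hadoop-examples-0.20.205.0.jar` as a zip archive, which usually indicates the jar is missing, truncated, or corrupt on the build machine. As a rough illustration only (a Python sketch with hypothetical paths, not part of the tutorial or the Jenkins job), the same class of failure can be checked for before handing a jar to `hadoop jar`:

```python
# Sketch: detect the failure mode RunJar hit above -- a file that is not a
# valid zip/jar. Python's zipfile raises BadZipFile for the same inputs that
# make Java's java.util.zip.ZipFile throw ZipException.
import os
import tempfile
import zipfile


def jar_is_valid(path):
    """Return True if `path` is a readable zip/jar with no corrupt entries."""
    try:
        with zipfile.ZipFile(path) as zf:
            # testzip() re-reads every entry and returns the first bad
            # member name, or None if all CRC checks pass.
            return zf.testzip() is None
    except (zipfile.BadZipFile, OSError):
        return False


# A truncated "jar" (here: arbitrary bytes, a stand-in for a corrupt
# hadoop-examples jar) fails the check.
with tempfile.NamedTemporaryFile(suffix=".jar", delete=False) as f:
    f.write(b"\x00\x01 not a zip")
    bad = f.name
print(jar_is_valid(bad))  # False
os.unlink(bad)
```

If the check fails on the real `hadoop-examples-0.20.205.0.jar`, rebuilding it (or re-running the `ant` packaging step) is the likely fix rather than anything Mesos-specific.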
