See
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/16/changes>
Changes:
[vinod] Fixed scheduler driver to call disconnected() when master fails over.
[vinod] Fixed master to send a FrameworkReregistered message when the
[vinod] Fixed release script to use git instead of svn.
[vinod] Fixed a bug in the release script regarding tag creation.
[benh] Duration related refactoring changes.
[benh] Time related refactoring changes.
[benh] Duration-Time related refactoring changes.
[benh] Performed GTEST_IS_THREADSAFE check.
[benh] Removed unnecessary use of Option.
[benh] Refactored zookeeper_tests.cpp into master_detector_tests.cpp and
[benh] Made tests flags inherit from logging.
[benh] Moved 'tests::mkdtemp' to Environment.
[benh] Minor cleanup in stout/exit.hpp.
[benh] Improved library for using JVM/JNI and updated uses (in tests).
[benh] Removed unused variables in AllocatorZooKeeperTest.
[benh] Renamed tests/zookeeper_test.hpp|cpp to zookeeper.hpp|cpp.
[benh] Refactored MesosTest/MesosClusterTest into a generic fixture for
[benh] Cleaned up the output from running and automagically disabling the
[benh] Used Milliseconds rather than Duration::parse.
[benh] Removed unnecessary TestingIsolators.
[benh] A little spring cleaning in the allocator tests.
[benh] More spring cleaning, this time in the slave recovery tests.
[benh] Replaced local::launch in tests with MesosTest.
[benh] Removed unused local::launch overload.
[benh] Cleanups in configure.ac.
[benh] Refactored base 'State' implementation to be serialization agnostic
[benh] Added a 'port' field to SlaveInfo and updated default master and slave
[benh] Fixed output bug with CHECK_SOME.
[benh] Added some helpers for failing a collection of futures.
[benh] Fixed synchronization bug when waiting for a process.
[benh] Fixed bug where we didn't stop all MesosExecutorDrivers when using the
[benh] Updated MonitorTest.WatchUnwatch to be deterministic.
[benh] Removed a using directive that causes compilation to fail when
[benh] Only build group tests with Java.
[vinod] Added DISCLAIMER to the distribution.
[vinod] Fixed NOTICE and LICENSE.
[benh] Fix for bug using 'TRUE' and 'FALSE' as identifiers on OS X.
[benh] Moved flags to stout.
[benh] Replaced flags and configurator in Mesos with flags in stout.
[benh] Added a 'Logging' process to libprocess.
[benh] Removed logging process from Mesos (now in libprocess).
[benh] Updated libprocess to use '3rdparty' instead of 'third_party'.
[benh] Renamed 'third_party' to '3rdparty'.
[benh] Added stout specific 'CHECK' constructs.
[benh] Replaced Mesos CHECK_SOME with stout CHECK_SOME.
[benh] Added 'ThreadLocal' to stout/thread.hpp.
[benh] Used ThreadLocal from stout.
[benh] Fixed os::environ for OS X.
[vinod] Updated version to 0.14.0.
[benh] Add Slave and Framework struct to HierarchicalAllocatorProcess. Cleans
[benh] Added a retry option to cgroups::mount in order to deal with a bug
[vinod] Fixed Zookeeper to recursively create parent paths as necessary.
[vinod] Exposed version in "/vars" and "/state.json" endpoints.
[vinod] Fixed slave to not send tasks and executor info of a terminated
executor,
[bmahler] Updated the NOTICE to include the correct year, and to fix line
[vinod] Fixed slave to properly handle terminated tasks that have pending
[vinod] Fixed master to properly do task reconciliation when slave re-registers.
[vinod] Added a new 'statistics.json' endpoint to the ResourceMonitor, this
[bmahler] Updated 3rd_party licenses in the LICENSE file.
[bmahler] Updated the CHANGELOG for 0.12.0.
[bmahler] Added a master detector document.
[bmahler] Fixed the name of the master detector document.
[bmahler] Added an Upgrade document.
[bmahler] Updated the CHANGELOG with additional tickets fixed in 0.12.0.
[bmahler] Fixed a typo in the Master Detection filename.
[vinod] Updated CHANGELOG for 0.13.0.
[bmahler] Updated release tag format in the release script to use the new
[brenden.matthews] Run Hadoop tutorial binaries from within build dir.
[brenden.matthews] Build fix for HadoopPipes.cc with GCC 4.7.
[brenden.matthews] Hadoop tutorial version bump (CDH4.2.0 -> 4.2.1).
[vinod] Fixed slave to properly handle duplicate terminal updates for the
[vinod] Updated CHANGELOG for 0.13.0 (rc2).
[brenden.matthews] WebUI: Use slave hostname rather than libprocess.
[bmahler] Fixed libprocess tests to write to stderr when using newer versions
[bmahler] Fixed mesos tests to write to stderr when using newer versions of
[bmahler] Fixed a bug in the Slave's logging.
[woggle] Update deploy scripts.
[benh] Added os::sysctl to stout.
------------------------------------------
[...truncated 7348 lines...]
[get] To:
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivy-2.1.0.jar>
[get] Not modified - so not downloaded
ivy-probe-antlib:
ivy-init-antlib:
ivy-init:
[ivy:configure] :: loading settings :: file =
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml>
ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#streaming;working@janus
[ivy:resolve] confs: [common]
[ivy:resolve] found commons-cli#commons-cli;1.2 in default
[ivy:resolve] found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve] found junit#junit;4.5 in maven2
[ivy:resolve] found org.mortbay.jetty#jetty-util;6.1.26 in maven2
[ivy:resolve] found org.mortbay.jetty#jetty;6.1.26 in maven2
[ivy:resolve] found org.mortbay.jetty#servlet-api;2.5-20081211 in maven2
[ivy:resolve] found asm#asm;3.2 in maven2
[ivy:resolve] found com.sun.jersey#jersey-core;1.8 in maven2
[ivy:resolve] found com.sun.jersey#jersey-json;1.8 in maven2
[ivy:resolve] found com.sun.jersey#jersey-server;1.8 in maven2
[ivy:resolve] found commons-httpclient#commons-httpclient;3.0.1 in maven2
[ivy:resolve] found log4j#log4j;1.2.15 in maven2
[ivy:resolve] found commons-codec#commons-codec;1.4 in maven2
[ivy:resolve] found org.codehaus.jackson#jackson-mapper-asl;1.0.1 in maven2
[ivy:resolve] found org.codehaus.jackson#jackson-core-asl;1.0.1 in maven2
[ivy:resolve] found commons-configuration#commons-configuration;1.6 in maven2
[ivy:resolve] found commons-collections#commons-collections;3.2.1 in default
[ivy:resolve] found commons-lang#commons-lang;2.4 in default
[ivy:resolve] found commons-logging#commons-logging;1.1.1 in default
[ivy:resolve] found commons-digester#commons-digester;1.8 in maven2
[ivy:resolve] found commons-beanutils#commons-beanutils;1.7.0 in maven2
[ivy:resolve] found commons-beanutils#commons-beanutils-core;1.8.0 in maven2
[ivy:resolve] found org.apache.commons#commons-math;2.1 in maven2
[ivy:resolve] :: resolution report :: resolve 113ms :: artifacts dl 7ms
[ivy:resolve] :: evicted modules:
[ivy:resolve] commons-logging#commons-logging;1.0.4 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] commons-logging#commons-logging;1.0.3 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] commons-logging#commons-logging;1.1 by [commons-logging#commons-logging;1.1.1] in [common]
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| common | 25 | 0 | 0 | 3 || 22 | 0 |
---------------------------------------------------------------------
ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#streaming [sync]
[ivy:retrieve] confs: [common]
[ivy:retrieve] 0 artifacts copied, 22 already retrieved (0kB/4ms)
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file =
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml>
compile:
[echo] contrib: streaming
[javac]
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/src/contrib/build-contrib.xml>:185:
warning: 'includeantruntime' was not set, defaulting to
build.sysclasspath=last; set to false for repeatable builds
jar:
[jar] Building jar:
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/contrib/streaming/hadoop-streaming-0.20.205.0.jar>
compile-examples:
jar-examples:
package:
[mkdir] Created dir:
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/contrib/streaming>
[copy] Copying 1 file to
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/contrib/streaming>
check-contrib:
init:
[echo] contrib: thriftfs
init-contrib:
ivy-download:
[get] Getting:
http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
[get] To:
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivy-2.1.0.jar>
[get] Not modified - so not downloaded
ivy-probe-antlib:
ivy-init-antlib:
ivy-init:
[ivy:configure] :: loading settings :: file =
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml>
ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#thriftfs;working@janus
[ivy:resolve] confs: [common]
[ivy:resolve] found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve] found log4j#log4j;1.2.15 in maven2
[ivy:resolve] found commons-configuration#commons-configuration;1.6 in maven2
[ivy:resolve] found commons-collections#commons-collections;3.2.1 in default
[ivy:resolve] found commons-lang#commons-lang;2.4 in default
[ivy:resolve] found commons-logging#commons-logging;1.1.1 in default
[ivy:resolve] found commons-digester#commons-digester;1.8 in maven2
[ivy:resolve] found commons-beanutils#commons-beanutils;1.7.0 in maven2
[ivy:resolve] found commons-beanutils#commons-beanutils-core;1.8.0 in maven2
[ivy:resolve] found org.apache.commons#commons-math;2.1 in maven2
[ivy:resolve] :: resolution report :: resolve 54ms :: artifacts dl 3ms
[ivy:resolve] :: evicted modules:
[ivy:resolve] commons-logging#commons-logging;1.0.4 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] commons-logging#commons-logging;1.0.3 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] commons-logging#commons-logging;1.1 by [commons-logging#commons-logging;1.1.1] in [common]
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| common | 12 | 0 | 0 | 3 || 9 | 0 |
---------------------------------------------------------------------
ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#thriftfs [sync]
[ivy:retrieve] confs: [common]
[ivy:retrieve] 0 artifacts copied, 9 already retrieved (0kB/3ms)
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file =
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml>
compile:
[echo] contrib: thriftfs
[javac]
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/src/contrib/build-contrib.xml>:185:
warning: 'includeantruntime' was not set, defaulting to
build.sysclasspath=last; set to false for repeatable builds
jar:
[jar] Building jar:
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/contrib/thriftfs/hadoop-thriftfs-0.20.205.0.jar>
compile-examples:
jar-examples:
package:
[copy] Copying 1 file to
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/lib>
init:
ivy-download:
[get] Getting:
http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
[get] To:
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivy-2.1.0.jar>
[get] Not modified - so not downloaded
ivy-probe-antlib:
ivy-init-antlib:
ivy-init:
[ivy:configure] :: loading settings :: file =
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml>
ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#vaidya;working@janus
[ivy:resolve] confs: [common]
[ivy:resolve] found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve] found log4j#log4j;1.2.15 in maven2
[ivy:resolve] :: resolution report :: resolve 14ms :: artifacts dl 1ms
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| common | 2 | 0 | 0 | 0 || 2 | 0 |
---------------------------------------------------------------------
ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#vaidya [sync]
[ivy:retrieve] confs: [common]
[ivy:retrieve] 0 artifacts copied, 2 already retrieved (0kB/1ms)
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file =
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/ivy/ivysettings.xml>
compile:
[echo] contrib: vaidya
[javac]
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/src/contrib/build-contrib.xml>:185:
warning: 'includeantruntime' was not set, defaulting to
build.sysclasspath=last; set to false for repeatable builds
jar:
[echo] contrib: vaidya
[jar] Building jar:
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/contrib/vaidya/hadoop-vaidya-0.20.205.0.jar>
package:
[mkdir] Created dir:
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/contrib/vaidya>
[copy] Copying 3 files to
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/contrib/vaidya>
[copy] Copying 35 files to
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/webapps>
[copy] Copied 13 empty directories to 2 empty directories under
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop/webapps>
[copy] Copying 5 files to
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/hadoop>
[copy] Copying 1 file to
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/bin>
[copy] Copying 16 files to
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/sbin>
[copy] Copying 1 file to
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/libexec>
[copy] Copying 16 files to
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/etc/hadoop>
[copy] Copying 4 files to
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/share/doc/hadoop>
[copy] Copying 7 files to
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/hadoop/hadoop-0.20.205.0/build/hadoop-0.20.205.0/sbin>
BUILD SUCCESSFUL
Total time: 1 minute 1 second
To build the Mesos executor package, we first copy the
necessary Mesos libraries.
$ cd build/hadoop-0.20.205.0
$ mkdir -p lib/native/Linux-amd64-64
$ cp
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/src/.libs/libmesos.so>
lib/native/Linux-amd64-64
Finally, we will build the Mesos executor package as follows:
$ cd ..
$ mv hadoop-0.20.205.0 hadoop-0.20.205.0-mesos
$ tar czf hadoop-0.20.205.0-mesos.tar.gz hadoop-0.20.205.0-mesos/
Build success!
The Mesos distribution is now built in 'hadoop-0.20.205.0-mesos'
Now let's run something!
We'll try to start the JobTracker from the Mesos distribution path via:
$ cd hadoop-0.20.205.0-mesos
$ ./bin/hadoop jobtracker
JobTracker started at 4901.
Waiting 5 seconds for it to start. . . . . .
Alright, now let's run the "wordcount" example via:
$ ./bin/hadoop jar hadoop-examples-0.20.205.0.jar wordcount
<https://builds.apache.org/job/Mesos-Trunk-Ubuntu-Hadoop-0.20.205.0/ws/build/src/mesos>
out
Exception in thread "main" java.io.IOException: Error opening job jar:
hadoop-examples-0.20.205.0.jar
at org.apache.hadoop.util.RunJar.main(RunJar.java:90)
Caused by: java.util.zip.ZipException: error in opening zip file
at java.util.zip.ZipFile.open(Native Method)
at java.util.zip.ZipFile.<init>(ZipFile.java:114)
at java.util.jar.JarFile.<init>(JarFile.java:135)
at java.util.jar.JarFile.<init>(JarFile.java:72)
at org.apache.hadoop.util.RunJar.main(RunJar.java:88)
Oh no, it failed! Try running the JobTracker and wordcount
example manually ... it might be an issue with your environment that
this tutorial didn't cover (if you find this to be the case, please
create a JIRA for us and/or send us a code review).
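When debugging this kind of ZipException manually, a quick first check is whether the examples jar is actually a valid zip archive at all (an interrupted copy or mis-built jar fails in java.util.zip.ZipFile.open exactly as above). A minimal sketch, using a deliberately broken stand-in file named broken.jar rather than the real examples jar:

```shell
# Create a stand-in "jar" that is not a valid zip archive, to
# reproduce the symptom. A well-formed jar/zip file begins with
# the two magic bytes "PK"; a truncated or corrupt one usually
# does not.
printf 'not a zip file' > broken.jar

if [ "$(head -c 2 broken.jar)" = "PK" ]; then
  echo "jar has a zip header"
else
  echo "jar corrupt or truncated"
fi
```

Against the real hadoop-examples-0.20.205.0.jar you would run the same check (or `unzip -t` / `jar tf` for a full integrity test); a missing "PK" header points at a bad copy or build step rather than an environment problem.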
./TUTORIAL.sh: line 704: kill: (4901) - No such process
make: *** [hadoop-0.20.205.0] Error 1
Build step 'Execute shell' marked build as failure