Well, I can't get into the Hadoop installation to answer the question … This 
afternoon, the build is failing earlier:

make[4]: Entering directory `/usr/local/var/mesos/third_party/leveldb'
g++ -pthread -shared -Wl,-soname 
-Wl,/usr/local/var/mesos/third_party/leveldb/libleveldb.so.1 -I. -I./include 
-fno-builtin-memcmp -pthread -DOS_LINUX -DLEVELDB_PLATFORM_POSIX -g -g2 -O2 
-fPIC -fPIC db/builder.cc db/c.cc db/dbformat.cc db/db_impl.cc db/db_iter.cc 
db/filename.cc db/log_reader.cc db/log_writer.cc db/memtable.cc db/repair.cc 
db/table_cache.cc db/version_edit.cc db/version_set.cc db/write_batch.cc 
table/block_builder.cc table/block.cc table/filter_block.cc table/format.cc 
table/iterator.cc table/merger.cc table/table_builder.cc table/table.cc 
table/two_level_iterator.cc util/arena.cc util/bloom.cc util/cache.cc 
util/coding.cc util/comparator.cc util/crc32c.cc util/env.cc util/env_posix.cc 
util/filter_policy.cc util/hash.cc util/histogram.cc util/logging.cc 
util/options.cc util/status.cc port/port_posix.cc -o libleveldb.so.1.4
ln -fs libleveldb.so.1.4 libleveldb.so
ln -fs libleveldb.so.1.4 libleveldb.so.1
g++ -I. -I./include -fno-builtin-memcmp -pthread -DOS_LINUX 
-DLEVELDB_PLATFORM_POSIX -g -g2 -O2 -fPIC -c db/builder.cc -o db/builder.o
g++ -I. -I./include -fno-builtin-memcmp -pthread -DOS_LINUX 
-DLEVELDB_PLATFORM_POSIX -g -g2 -O2 -fPIC -c db/c.cc -o db/c.o
g++ -I. -I./include -fno-builtin-memcmp -pthread -DOS_LINUX 
-DLEVELDB_PLATFORM_POSIX -g -g2 -O2 -fPIC -c db/dbformat.cc -o db/dbformat.o
g++ -I. -I./include -fno-builtin-memcmp -pthread -DOS_LINUX 
-DLEVELDB_PLATFORM_POSIX -g -g2 -O2 -fPIC -c db/db_impl.cc -o db/db_impl.o
Assembler messages:
Fatal error: can't create db/db_impl.o: No such file or directory
make[4]: *** [db/db_impl.o] Error 1
make[4]: Leaving directory `/usr/local/var/mesos/third_party/leveldb'
make[3]: *** [leveldb/libleveldb.a] Error 2
make[3]: Leaving directory `/usr/local/var/mesos/third_party'
make[2]: *** [all-recursive] Error 1
make[2]: Leaving directory `/usr/local/var/mesos/third_party'
make[1]: *** [all] Error 2
make[1]: Leaving directory `/usr/local/var/mesos/third_party'
make: *** [all-recursive] Error 1
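
For what it's worth, when as(1) suddenly can't create its output file after a
string of identical compiles succeeded, the usual suspects are environmental:
a full disk or exhausted inodes under the build tree or /tmp. A sketch of what
I plan to check next; the compile line is copied verbatim from the log above:

   $ df -h /usr/local/var/mesos /tmp     # out of space?
   $ df -i /usr/local/var/mesos          # out of inodes?
   $ ls -ld /usr/local/var/mesos/third_party/leveldb/db   # output dir intact?

   # Retry just the failing step verbosely to see which stage dies:
   $ cd /usr/local/var/mesos/third_party/leveldb
   $ g++ -I. -I./include -fno-builtin-memcmp -pthread -DOS_LINUX \
         -DLEVELDB_PLATFORM_POSIX -g -g2 -O2 -fPIC -c db/db_impl.cc \
         -o db/db_impl.o -v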



Jim


On 4/19/13 7:23 PM, "Benjamin Mahler" 
<[email protected]> wrote:

This was committed. As for your TUTORIAL.sh issues, can you confirm those
are consistently failing?

'make hadoop-0.20.205.0' is working correctly on my end.
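
If it does keep failing, note that the 'Connection refused' retries in your
log just mean nothing was listening on 54311; in other words the JobTracker
never came up, and the interesting question is why. Something along these
lines would help narrow it down (just a sketch, run from the
hadoop-0.20.205.0 directory while the tutorial is waiting; the log filename
is my guess at the standard Hadoop naming):

   $ jps                               # is a JobTracker process alive at all?
   $ netstat -tln | grep 54311         # is anything listening on the RPC port?
   $ tail -n 50 logs/hadoop-*-jobtracker-*.log   # if it died, the log should say why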


On Fri, Apr 19, 2013 at 7:18 PM, Benjamin Mahler
<[email protected]> wrote:

Alright, I'm seeing a separate issue with the patch:

Patch conf/hadoop-env.sh? [N]

   $ patch -p1 <../hadoop-0.20.205.0_hadoop-env.sh.patch

Hit enter to continue.
patching file conf/hadoop-env.sh
Hunk #1 FAILED at 9.
1 out of 1 hunk FAILED -- saving rejects to file conf/hadoop-env.sh.rej

I have a fix that I will be committing shortly:
https://reviews.apache.org/r/10668/
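
If anyone wants to see exactly what's mismatched before that lands, patch can
report the failure without touching the tree. A sketch, run from inside the
hadoop-0.20.205.0 directory the tutorial sets up:

   $ patch -p1 --dry-run < ../hadoop-0.20.205.0_hadoop-env.sh.patch
   # After a real failed run, the rejected hunk is kept for inspection;
   # compare its context against the file around line 9:
   $ cat conf/hadoop-env.sh.rej
   $ sed -n '1,20p' conf/hadoop-env.sh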


On Fri, Apr 19, 2013 at 6:45 PM, Jim Donahue 
<[email protected]> wrote:

Yup, I tried it on both a 32-bit and a 64-bit Amazon Linux instance and got
the same behavior.  It's possible that I inadvertently made some change in
my build scripts that caused the failure, but the scripts are pretty stable,
and I can't think of any change I made to both of them (32- and 64-bit) that
would cause a problem.

Jim

On 4/19/13 6:39 PM, "Benjamin Mahler" 
<[email protected]> wrote:

>Well that is unexpected. Is that consistently failing for you?
>
>
>On Fri, Apr 19, 2013 at 6:29 PM, Jim Donahue 
><[email protected]> wrote:
>
>> Also the example seems to have stopped working …  That's not a serious
>> problem (I can just ignore it), but it did work last week as far as I can
>> remember. :-)
>>
>> Waiting 5 seconds for it to start. . . . . .
>> Alright, now let's run the "wordcount" example via:
>>
>> $ ./bin/hadoop jar hadoop-examples-0.20.205.0.jar wordcount
>> src/contrib/mesos/src/java/org/apache/hadoop/mapred out
>>
>> Hit enter to continue.
>> 13/04/19 23:31:54 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54311. Already tried 0 time(s).
>> 13/04/19 23:31:55 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54311. Already tried 1 time(s).
>> 13/04/19 23:31:56 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54311. Already tried 2 time(s).
>> 13/04/19 23:31:57 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54311. Already tried 3 time(s).
>> 13/04/19 23:31:58 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54311. Already tried 4 time(s).
>> 13/04/19 23:31:59 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54311. Already tried 5 time(s).
>> 13/04/19 23:32:00 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54311. Already tried 6 time(s).
>> 13/04/19 23:32:01 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54311. Already tried 7 time(s).
>> 13/04/19 23:32:02 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54311. Already tried 8 time(s).
>> 13/04/19 23:32:03 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54311. Already tried 9 time(s).
>> java.net.ConnectException: Call to localhost/127.0.0.1:54311 failed on
>> connection exception: java.net.ConnectException: Connection refused
>> at org.apache.hadoop.ipc.Client.wrapException(Client.java:1095)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1071)
>> at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>> at org.apache.hadoop.mapred.$Proxy1.getProtocolVersion(Unknown Source)
>> at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
>> at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
>> at org.apache.hadoop.mapred.JobClient.createRPCProxy(JobClient.java:478)
>> at org.apache.hadoop.mapred.JobClient.init(JobClient.java:472)
>> at org.apache.hadoop.mapred.JobClient.<init>(JobClient.java:455)
>> at org.apache.hadoop.mapreduce.Job$1.run(Job.java:478)
>> at java.security.AccessController.doPrivileged(Native Method)
>> at javax.security.auth.Subject.doAs(Subject.java:416)
>> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
>> at org.apache.hadoop.mapreduce.Job.connect(Job.java:476)
>> at org.apache.hadoop.mapreduce.Job.submit(Job.java:464)
>> at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:495)
>> at org.apache.hadoop.examples.WordCount.main(WordCount.java:67)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:616)
>> at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
>> at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>> at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:616)
>> at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>> Caused by: java.net.ConnectException: Connection refused
>> at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>> at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:592)
>> at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
>> at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:604)
>> at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:434)
>> at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:560)
>> at org.apache.hadoop.ipc.Client$Connection.access$2000(Client.java:184)
>> at org.apache.hadoop.ipc.Client.getConnection(Client.java:1202)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1046)
>> ... 27 more
>>
>> Oh no, it failed! Try running the JobTracker and wordcount
>> example manually ... it might be an issue with your environment that
>> this tutorial didn't cover (if you find this to be the case, please
>> create a JIRA for us and/or send us a code review).
>>
>> ./TUTORIAL.sh: line 662: kill: (1522) - No such process
>> make: *** [hadoop-0.20.205.0] Error 1
>>
>>
>> On 4/19/13 4:29 PM, "Benjamin Mahler"
>> <[email protected]> wrote:
>>
>> Brenden: It looks like Maven isn't required when building
>> hadoop-0.20.205.0; can you send a patch to fix your change to only check
>> for Maven when building the CDH releases?
>>
>> Jim: Thanks for the report.
>>
>> I committed a recent change by Brenden here, which enforces that both
>> 'ant' and 'mvn' are present when building the hadoop port:
>> https://reviews.apache.org/r/10558/
>>
>>
>> On Fri, Apr 19, 2013 at 3:51 PM, Jim Donahue
>> <[email protected]> wrote:
>>
>> I was -- the last build I did was ten days ago.  Somebody broke the build
>> scripts that I've been using for quite a while.
>>
>> Jim
>>
>>
>>
>> On 4/19/13 3:48 PM, "Benjamin Mahler"
>> <[email protected]> wrote:
>>
>> >You can fix this by installing Maven.
>> >
>> >However, I was under the assumption that we required Maven in order to run
>> >the Hadoop tutorial. You were successfully building hadoop without Maven
>> >installed?
>> >
>> >
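
For anyone else who hits the missing-mvn failure below on a stock Amazon
Linux AMI, a minimal way to get mvn onto the PATH is an Apache binary
release; a sketch, where the version and the /opt prefix are arbitrary
choices, not anything the tutorial requires:

   $ curl -O https://archive.apache.org/dist/maven/maven-3/3.0.5/binaries/apache-maven-3.0.5-bin.tar.gz
   $ sudo tar -xzf apache-maven-3.0.5-bin.tar.gz -C /opt
   $ export PATH=/opt/apache-maven-3.0.5/bin:$PATH
   $ mvn -version    # sanity check before re-running 'make hadoop-0.20.205.0'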
>> >On Fri, Apr 19, 2013 at 3:44 PM, Jim Donahue
>> ><[email protected]> wrote:
>> >
>> >> I'm trying to build Mesos on Amazon Linux and it appears that the Hadoop
>> >> build script has changed.  It worked just fine a few days ago, but now
>> >> I'm getting:
>> >>
>> >> sudo make hadoop-0.20.205.0
>> >> if test ".." != ".."; then \
>> >> cp -p ./TUTORIAL.sh .; \
>> >> cp -p ./hadoop-gridmix.patch .; \
>> >> cp -p ./hadoop-7698-1.patch .; \
>> >> cp -p ./hadoop-0.20.205.0_hadoop-env.sh.patch .; \
>> >> cp -p ./hadoop-0.20.205.0_mesos.patch .; \
>> >> cp -p ./mapred-site.xml.patch .; \
>> >> cp -rp ./mesos .; \
>> >> cp -p ./mesos-executor .; \
>> >> fi
>> >> rm -rf hadoop-0.20.205.0
>> >> which: no mvn in (/sbin:/bin:/usr/sbin:/usr/bin)
>> >>
>> >> We seem to be missing mvn from the path. Please install
>> >> mvn and re-run this tutorial. If you still have troubles, please report
>> >> this to:
>> >>
>> >> [email protected]
>> >>
>> >> (Remember to include as much debug information as possible.)
>> >>
>> >> Help, please!
>> >>
>> >>
>> >> Jim
>> >>
>>
>>
>>
>>



