Hi,
Is there any good documentation for getting started with fault injection? Please
share a link to any examples that demonstrate its use.
Thanks,
Amit
I suggest starting with the fault injection tests. They can be found under
src/test/aop/org/apache/hadoop
for HDFS in 0.22. HDFS has the best coverage by fault injection.
Tests exist in a similar location on trunk, but they aren't hooked up to the
maven build system yet.
Cos
On Thu, Dec
To follow up on what I have found:
I opened up some of the logs on the datanodes and found this message:
Can not start task tracker because java.net.BindException: Address
already in use
It was using the default port setting from mapred-default.xml, which was 50060.
I decided to try and add
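For reference, the colliding port can be moved in mapred-site.xml; a minimal sketch (the replacement port below is just an illustrative value):

```xml
<!-- mapred-site.xml: move the TaskTracker HTTP server off the default
     port 50060 (50061 here is illustrative, pick any free port) -->
<property>
  <name>mapred.task.tracker.http.address</name>
  <value>0.0.0.0:50061</value>
</property>
```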
Hey Praveenesh,
What do you mean by multiuser? Do you want to support multiple users
starting/stopping daemons?
-Joey
On Dec 29, 2011, at 2:49, praveenesh kumar praveen...@gmail.com wrote:
Guys,
Did someone try this thing?
Thanks
On Tue, Dec 27, 2011 at 4:36 PM, praveenesh kumar
yup.. exactly that... :-) And I also want multiple users to submit jobs.
Thanks,
Praveenesh
On Thu, Dec 29, 2011 at 4:46 PM, Joey Echeverria j...@cloudera.com wrote:
Hey Praveenesh,
What do you mean by multiuser? Do you want to support multiple users
starting/stopping daemons?
-Joey
Why do you want multiple users starting daemons?
As for submitting jobs, that should work out of the box. If you want the child
JVMs running the map and reduce tasks to execute as the submitting user, then
you need to configure your cluster with Kerberos.
The CDH3 security guide
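For reference, the central switch lives in core-site.xml; a minimal sketch only (a working secure cluster also needs principals, keytabs, and a task controller configured, which the security guide covers):

```xml
<!-- core-site.xml: turn on Kerberos authentication (sketch only;
     a real deployment also needs principals and keytabs set up) -->
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hadoop.security.authorization</name>
  <value>true</value>
</property>
```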
I got hadoop 0.22.0 running with Windows. The most useful instructions
I found were these:
http://knowlspace.wordpress.com/2011/06/21/setting-up-hadoop-on-windows/
I was able to run examples grep, pi and WordCount.
Saw that 1.0 just came out. Downloaded it; installed; and tried it
out.
We don't want newcomers to JIRA to shy away at the colossal numbers and
react with expressions like this:
http://www.youtube.com/watch?v=SiMHTK15Pik :-)
Sorry about all the email noise though; I can't find a way to avoid
it. There's still a lot left in each. But I think this is good
learning for
Hi everyone,
I am trying to build Pig from SVN trunk on hadoop 0.20.205.
While doing that, I am getting the following error. Any idea why it's
happening?
Thanks,
Praveenesh
root@lxe [/usr/local/hadoop/pig/new/trunk] $ -- ant jar-withouthadoop
-verbose
Apache Ant version 1.6.5 compiled on June
Try pinging http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.2.0/ivy-2.2.0.jar
to see if your server can connect to that URL.
If not, you have some kind of connection issue with outgoing requests.
--Joey
On Thu, Dec 29, 2011 at 11:28 PM, praveenesh kumar praveen...@gmail.com wrote:
Hi
Check the `mapreduce.job.reduce.slowstart.completedmaps` parameter. The
reducers cannot start processing the data from the mappers until all
the map tasks are complete, but the reducers can start fetching the data
from the nodes on which the map tasks have completed.
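A sketch of how that parameter might be set in mapred-site.xml (the name above is the 0.21+ form; on 0.20 the property was called mapred.reduce.slowstart.completed.maps):

```xml
<!-- mapred-site.xml: delay reducer launch until 80% of maps have
     finished (the default is 0.05; 0.80 here is illustrative) -->
<property>
  <name>mapreduce.job.reduce.slowstart.completedmaps</name>
  <value>0.80</value>
</property>
```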
Praveen
On Thu, Dec 29,
@Praveen
Thanks. I got it.
--
**
JAGANADH G
http://jaganadhg.in
*ILUGCBE*
http://ilugcbe.org.in
When I ping, it says Unknown host...
Is there any kind of proxy setting we need to do when building with ant?
Thanks,
Praveenesh
On Fri, Dec 30, 2011 at 11:02 AM, Joey Krabacher jkrabac...@gmail.com wrote:
Try pinging
Can you ping any URL successfully?
Try www.google.com, www.yahoo.com or something like that.
If you can't ping any of those then you are probably behind a firewall
and you'll have to poke a hole into it to get to the outside world.
Or you can download the jar that it is trying to find
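On the proxy question above: Ant's Ivy download goes through the JVM, which reads the standard proxy system properties, so one common approach is to pass them via ANT_OPTS. A sketch, with a hypothetical proxy host and port:

```shell
# Point the JVM that runs Ant at an HTTP proxy.
# proxy.example.com:8080 is a placeholder, not a real host.
export ANT_OPTS="-Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=8080"
echo "$ANT_OPTS"   # verify the setting before re-running the build
```

After exporting, re-run the same `ant jar-withouthadoop` command in that shell.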
I set up the proxy. Now I am getting the following error:
root@lxe9700 [/usr/local/hadoop/pig/new/trunk] $ -- ant jar-withouthadoop
-verbose
Apache Ant version 1.6.5 compiled on June 5 2007
Buildfile: build.xml
Detected Java version: 1.5 in: /usr/java/jdk1.6.0_25/jre
Detected OS: Linux
parsing
Looks like you may have an unsupported/older version of Ant.
You might try upgrading Ant to 1.7 or later.
--Joey
On Thu, Dec 29, 2011 at 11:54 PM, praveenesh kumar praveen...@gmail.com wrote:
I set up the proxy. Now I am getting the following error:
root@lxe9700 [/usr/local/hadoop/pig/new/trunk]
Yeah.. it was an Ant issue. Upgrading Ant to 1.7 worked fine.
Thanks,
Praveenesh
On Fri, Dec 30, 2011 at 12:13 AM, Joey Krabacher jkrabac...@gmail.com wrote:
Looks like you may have an unsupported/older version of Ant.
You might try upgrading Ant to 1.7 or later.
--Joey
On Thu, Dec 29, 2011 at