Chris MacKenzie
telephone: 0131 332 6967
email: stu...@chrismackenziephotography.co.uk
http://www.chrismackenziephotography.co.uk/
http://plus.google.com/+ChrismackenziephotographyCoUk/posts
http://www.linkedin.com/in/chrismackenziephotography/
From: Krish Donald gotomyp...@gmail.com
Hi, have you set a class in your code?
WARN mapred.JobClient: No job jar file set. User classes may not be found.
See JobConf(Class) or JobConf#setJar(String).
Also, you need to check the path for your input file:
Input path does not exist: hdfs://latdevweb02:9000/home/hadoop/hadoop/input
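For what it's worth, both problems usually come down to the driver. A minimal sketch of a driver that sets the job jar (which silences the "No job jar file set" warning) and takes its input path from the command line — the class and argument names here are hypothetical, not from the original poster's code:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class MyDriver {
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "my job");
        // Ships the jar containing this class with the job; this is the
        // new-API equivalent of JobConf#setJar(String).
        job.setJarByClass(MyDriver.class);
        // The input path must already exist in HDFS, otherwise the job
        // fails with "Input path does not exist: hdfs://...".
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

This needs a Hadoop installation to run, so treat it as a sketch rather than a tested driver.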
Thanks for the update ;O)
Regards,
Chris MacKenzie
It's my understanding that you don't get map tasks as such but containers.
My experience is with version 2+, and if that's true, containers are sized
based on memory tuning in mapred-site.xml.
Otherwise I'd love to learn more.
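The memory tuning I mean lives in properties like these in mapred-site.xml (the values below are only illustrative, not recommendations):

```xml
<configuration>
  <!-- Memory requested from YARN for each map and reduce container -->
  <property>
    <name>mapreduce.map.memory.mb</name>
    <value>1024</value>
  </property>
  <property>
    <name>mapreduce.reduce.memory.mb</name>
    <value>2048</value>
  </property>
  <!-- JVM heap inside each map container, kept below the container size -->
  <property>
    <name>mapreduce.map.java.opts</name>
    <value>-Xmx820m</value>
  </property>
</configuration>
```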
Sent from my iPhone
On 27 Aug 2014, at 12:14, Stijn De Weirdt
Hi,
The requirement is simply to have the slaves and masters files on the resource
manager; they're used by the shell script that starts the daemons :-)
Sent from my iPhone
On 23 Aug 2014, at 16:02, S.L simpleliving...@gmail.com wrote:
OK, I'll copy the slaves file to the other slave nodes as
/mapreduce/job/
job_1408007466921_0002
Drill down through the application master to the job.
If you don't have the history server running, the job data is not
persistent.
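In Hadoop 2.x the history server is started with the daemon script that ships with the distribution (run from the Hadoop install directory); a sketch, assuming the standard layout:

```shell
# Starts the MapReduce job history server so finished-job data
# (the /mapreduce/job/... entries above) survives job completion.
sbin/mr-jobhistory-daemon.sh start historyserver
```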
Hope this helps.
Regards,
Chris MacKenzie
to solve the problems you
encounter.
Buy Tom White's book; it isn't perfect and a couple of years out of date,
but it gives you enough detail and structure to build an impression you
can work from. The downloadable source code is a great help when trying to
get started.
Good luck.
Regards,
Chris
Hi Ravi,
I resolved this. Many thanks.
Regards,
Chris MacKenzie
Hi Zhijie,
The ulimit mechanism covers both hard and soft limits.
The hard limit can only be set by a sysadmin; it is what guards against a
fork-bomb DoS attack.
The sysadmin can set the hard ulimit per user, e.g. for hadoop_user.
A user can add a line to their .profile file setting a soft ulimit up to
the hard limit.
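As a concrete sketch (the file paths are the usual Linux conventions, and hadoop_user is just an example account name):

```
# /etc/security/limits.conf -- the sysadmin sets the per-user hard limit
hadoop_user  hard  nofile  16384

# ~/.profile -- the user raises their own soft limit, up to the hard limit
ulimit -S -n 13172
```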
Hi,
I've scrabbled around looking for a fix for a while and have set the soft
ulimit size to 13172.
I'm using Hadoop 2.4.1
Thanks in advance,
Chris MacKenzie
storage spec
Network speed.
Can you help me out with that?
Thanks in advance,
Regards,
Chris MacKenzie
Hi,
I'd really appreciate it if someone could let me know the current
preferred specification for a cluster set up.
On average how many nodes
Disk space
Memory
Switch size
A link to a paper or discussion would be much appreciated.
Thanks in advance
Regards,
Chris MacKenzie
Hi,
I can probably help you out with that. I don't want to sound patronising,
though. What is your IDE, and have you included the Hadoop libraries in
your jar?
Regards,
Chris MacKenzie
it existed. I plan to get round to trying that at some point
in the near future.
Many thanks,
Chris MacKenzie
feel like a real idiot sometimes, but there is so much conflicting
information out there that you later realise the questions you asked were
nonsensical, though at the time they felt valid ;O)
Thanks for your tolerance,
Chris MacKenzie
/hadoop/masters
Thanks in advance,
Chris MacKenzie
/capacity-scheduler.xml
etc/hadoop/hadoop-env.sh
etc/hadoop/slaves
etc/hadoop/masters
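For reference, the slaves and masters files are just one hostname per line; the names below are placeholders:

```
# etc/hadoop/masters -- host(s) where the secondary namenode is started
master01

# etc/hadoop/slaves -- hosts where the worker daemons are started
slave01
slave02
```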
Thanks in advance,
Chris MacKenzie
Thanks Ozawa
Regards,
Chris MacKenzie
would use settings I had no idea existed and that may not be how I would
choose to set them up.
Regards,
Chris MacKenzie
just get the message “Killed” without any explanation.
Unfortunately, whereas before I was applying changes incrementally and
testing, this time I applied all the changes at once.
I'm now backing the changes out slowly to see where it starts to
reflect what I expect.
Regards,
Chris MacKenzie
/application_1405538067846
_0006/container_1405538067846_0006_01_04/stderr
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
Regards,
Chris MacKenzie
);
control.addJob(doConcatenation);
When it comes to an end I have:
jobcontrol.ControlledJob (ControlledJob.java:submit(338)) - Concordance
Phase got an error while submitting
java.lang.IllegalStateException: Job in state RUNNING instead of DEFINE
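That exception means the underlying Job had already been submitted: a Job can only be submitted while it is still in the DEFINE state, so reusing the same Job/ControlledJob instance for a second run fails. A hedged sketch of the usual fix, building a fresh pair for each submission (the job name and driver class here are illustrative):

```java
// Build a new Job (state DEFINE) and a new ControlledJob wrapper for
// every run, instead of re-adding one that has already been submitted.
Job concatJob = Job.getInstance(conf, "concatenation");
concatJob.setJarByClass(MyDriver.class);
ControlledJob doConcatenation = new ControlledJob(concatJob, null);
control.addJob(doConcatenation);
```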
Thanks in advance,
Chris MacKenzie
on your first big Hadoop MapReduce job, what would you do differently?
What advice would you give me starting out?
Thanks again, I really appreciate your support.
Best,
Chris MacKenzie
");
When I test the value in the driver, it isn't updated following the reduce
Best,
Chris MacKenzie
Hi,
What is the anticipated usage of the above with the new API? Is there
another way to remove the empty part-r files?
When using it with MultipleOutputs to remove empty part-r files, I have no
output ;O)
Regards,
Chris MacKenzie
Hi Markus And Shahab,
Thanks for getting back to me, I really appreciate it. LazyOutputFormat did
the trick. I tried NullOutputFormat
(job.setOutputFormatClass(NullOutputFormat.class);) before writing to the
group but was getting an empty folder.
I looked at LazyOutputFormat, in fact, my mos is
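For anyone searching later, the combination that did the trick looks roughly like this (the named output "matches" and the key/value types are hypothetical):

```java
// Declare the named output written through MultipleOutputs, and make the
// default output lazy so empty part-r-* files are never created.
MultipleOutputs.addNamedOutput(job, "matches", TextOutputFormat.class,
        Text.class, Text.class);
LazyOutputFormat.setOutputFormatClass(job, TextOutputFormat.class);
```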
, 2014, at 5:15 AM, Chris MacKenzie
stu...@chrismackenziephotography.co.uk wrote:
Hi,
I realise my previous question may have been a bit naïve, and I also realise I
am asking an awful lot here; any advice would be greatly appreciated.
* I have been using Hadoop 2.4 in local mode and am sticking
and setup errors
Probably my fault. I was looking for the
extends Configurable implements Tool
part. I will double check when I get home rather than send you on a wild
goose chase.
Cheers
Chris
On Jun 27, 2014 8:16 AM, Chris MacKenzie
stu...@chrismackenziephotography.co.uk wrote:
Hi,
I realise my
issues I have had is staying true to the mapreduce.*
format
Best wishes,
Chris MacKenzie
From: Chris Mawata chris.maw...@gmail.com
Reply-To: user@hadoop.apache.org
Date: Friday, 27 June 2014 14:11
To: user@hadoop.apache.org
Subject: Re: Partitioning and setup errors
The new Configuration
got one. Is there a way to
split the reduce output in the same way that I can split the map input?
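A sketch of the two usual levers, assuming the new (mapreduce.*) API: the number of part-r files follows the reducer count, and a custom Partitioner decides which keys land in which file (MyPartitioner below is a hypothetical name):

```java
// One output file per reducer: part-r-00000 ... part-r-00003
job.setNumReduceTasks(4);
// Optionally route particular keys to particular reducers/files
job.setPartitionerClass(MyPartitioner.class);
```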
Thanks in advance,
Regards,
Chris MacKenzie