Hadoop YARN Cluster Setup Questions

2014-08-22 Thread S.L
Hi Folks, I was not able to find a clear answer to this. I know that on the master node we need to have a slaves file listing all the slaves, but do the slave nodes need to have a masters file listing the single name node (I am not using a secondary name node)? I only have the slaves fil…
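For context, a sketch of how Hadoop 2.x worker nodes typically locate the NameNode: no masters file is required on the slaves. Each node reads fs.defaultFS from its own core-site.xml, while the slaves file is only consulted by the start-dfs.sh/start-yarn.sh scripts on the node where they are run. The host name below is a placeholder, not from the original thread:

```xml
<!-- core-site.xml on every node (master and slaves alike); replace
     namenode-host with the actual NameNode host name -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://namenode-host:8020</value>
  </property>
</configuration>
```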

Re: Appending to HDFS file

2014-08-22 Thread rab ra
Hi, By default it is true in Hadoop 2.4.1. Nevertheless, I have set it to true explicitly in hdfs-site.xml. Still, I am not able to achieve append. Regards On 23 Aug 2014 11:20, "Jagat Singh" wrote: > What is the value of dfs.support.append in hdfs-site.xml? > > > https://hadoop.apache.org/docs/r2.3…

Re: Appending to HDFS file

2014-08-22 Thread Jagat Singh
What is the value of dfs.support.append in hdfs-site.xml? https://hadoop.apache.org/docs/r2.3.0/hadoop-project-dist/hadoop-hdfs/hdfs-default.xml On Sat, Aug 23, 2014 at 1:41 AM, rab ra wrote: > Hello, > > I am currently using Hadoop 2.4.1. I am running a MR job using hadoop > streaming utility. >…
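For reference, a sketch of the setting the thread is discussing. In Hadoop 2.x append support is enabled by default and dfs.support.append is effectively deprecated, so an explicit entry like the one below should be redundant but harmless:

```xml
<!-- hdfs-site.xml; in Hadoop 2.x this already defaults to true -->
<property>
  <name>dfs.support.append</name>
  <value>true</value>
</property>
```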

Re: hdfs dfsclient, possible to force storage datanode ?

2014-08-22 Thread norbi
I can't find info about that in the 2.5 documentation or changelog. On 22.08.2014 09:27, Tirupati Reddy wrote: Hadoop 2.5 On Thursday, August 21, 2014, norbi wrote: hadoop 2.0 (cloudera cdh 4.7) On 21.08.2014 16:23, Liu, Yi A wrote: Which versi…

Re: job.getCounters returns null in Yarn-based job

2014-08-22 Thread Shahab Yunus
For those who are interested, this got resolved. The issue was that I was creating more counters than the configured maximum. I upped the mapreduce.job.counters.max property to a larger number; the default was 120. The job finishes now and I am able to print and get the counters as well. On…
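For anyone hitting the same null-counters symptom, a sketch of the fix described above; the value 500 is an arbitrary example, not from the original message (the stock default at the time was 120):

```xml
<!-- mapred-site.xml: raise the per-job counter limit -->
<property>
  <name>mapreduce.job.counters.max</name>
  <value>500</value>
</property>
```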

Re: How to serialize very large object in Hadoop Writable?

2014-08-22 Thread Alexander Pivovarov
Usually Hadoop MapReduce deals with row-based data. From the ReduceContext, if you need to write a lot to an HDFS file, you can get an OutputStream to the HDFS file and write bytes. On Fri, Aug 22, 2014 at 3:30 PM, Yuriy wrote: > Thank you, Alexander. That, at least, explains the problem. And what > should be the…

Re: How to serialize very large object in Hadoop Writable?

2014-08-22 Thread Yuriy
Thank you, Alexander. That, at least, explains the problem. And what should be the workaround if the combined set of data is larger than 2 GB? On Fri, Aug 22, 2014 at 1:50 PM, Alexander Pivovarov wrote: > Max array size is max integer. So, byte array can not be bigger than 2GB > On Aug 22, 2014
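One possible workaround, sketched below with plain java.io so it is self-contained: rather than materializing the whole payload as one byte[] (which cannot exceed Integer.MAX_VALUE elements), a Writable's write(DataOutput) can stream the data as a sequence of length-prefixed chunks. The class and method names here are illustrative, not part of any Hadoop API:

```java
import java.io.*;

// Sketch: stream a large payload through DataOutput as length-prefixed
// chunks, so no single byte[] ever has to hold the whole thing.
public class ChunkedSerializer {
    static final int CHUNK_SIZE = 64 * 1024; // 64 KB per chunk; tune as needed

    // Writes: total length (long), then repeated [int length][bytes], then -1.
    public static void writeChunked(DataOutput out, InputStream data, long totalLen)
            throws IOException {
        out.writeLong(totalLen);               // a long, not an array size
        byte[] buf = new byte[CHUNK_SIZE];
        long remaining = totalLen;
        while (remaining > 0) {
            int n = data.read(buf, 0, (int) Math.min(buf.length, remaining));
            if (n < 0) throw new EOFException("stream ended early");
            out.writeInt(n);
            out.write(buf, 0, n);
            remaining -= n;
        }
        out.writeInt(-1);                      // end marker
    }

    public static byte[] readChunked(DataInput in) throws IOException {
        long total = in.readLong();
        // For a true >2 GB payload you would stream each chunk to disk/HDFS
        // instead of collecting in memory; this sketch collects for small demos.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        int len;
        while ((len = in.readInt()) != -1) {
            byte[] chunk = new byte[len];
            in.readFully(chunk);
            bos.write(chunk);
        }
        if (bos.size() != total) throw new IOException("length mismatch");
        return bos.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] payload = new byte[200_000];
        new java.util.Random(42).nextBytes(payload);
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        writeChunked(new DataOutputStream(sink),
                new ByteArrayInputStream(payload), payload.length);
        byte[] roundTrip = readChunked(
                new DataInputStream(new ByteArrayInputStream(sink.toByteArray())));
        System.out.println(java.util.Arrays.equals(payload, roundTrip)); // prints true
    }
}
```

The same framing works inside a reducer: the receiving side reads chunk by chunk and forwards each to its destination, so memory use stays bounded by the chunk size.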

Re: How to serialize very large object in Hadoop Writable?

2014-08-22 Thread Alexander Pivovarov
The max array size is the max integer, so a byte array cannot be bigger than 2 GB. On Aug 22, 2014 1:41 PM, "Yuriy" wrote: > The Hadoop Writable interface relies on the "public void write(DataOutput out)" > method. > It looks like behind the DataOutput interface, Hadoop uses DataOutputStream, > which uses a simple ar…

How to serialize very large object in Hadoop Writable?

2014-08-22 Thread Yuriy
The Hadoop Writable interface relies on the "public void write(DataOutput out)" method. It looks like behind the DataOutput interface, Hadoop uses DataOutputStream, which uses a simple array under the covers. When I try to write a lot of data to DataOutput in my reducer, I get: Caused by: java.lang.OutOfMemor…

Job keeps running in LocalJobRunner under Cloudera 5.1

2014-08-22 Thread Something Something
Need some quick help. Our job runs fine under MapR, but when we start the same job on Cloudera 5.1, it keeps running in Local mode. I am sure this is some kind of configuration issue. Any quick tips? 14/08/22 12:16:58 INFO mapreduce.Job: map 0% reduce 0% 14/08/22 12:17:03 INFO mapred.LocalJobRun
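One common cause of a job falling back to LocalJobRunner is that mapreduce.framework.name resolves to its default of "local", e.g. because the submitting client's classpath does not include the cluster's configuration directory. A minimal sketch of the relevant entry, assuming a YARN cluster (this is a general diagnosis, not confirmed as the poster's root cause):

```xml
<!-- mapred-site.xml on the client submitting the job -->
<property>
  <name>mapreduce.framework.name</name>
  <value>yarn</value>
</property>
```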

Issues installing Cloudera Manager 5.1.1 on Amazon EC2 - Cloud Express Wizard

2014-08-22 Thread Adam Pritchard
Hi everyone, *Problem* I am having some trouble spinning up additional instances on Amazon using Cloudera Express / Cloudera Manager 5.1.1. I am able to install Cloudera manager on the Host machine through the Cloudera installation wizard. But I cannot spin up additional machines due to an autho

Basic Hadoop 2.3 32-bit VM for general Hadoop Users

2014-08-22 Thread Support Team
We have released a very basic 32-bit VM (VirtualBox Image) for those users who want to get started with Hadoop, without worrying about configuration and dependencies. We have used CDH5.1 for this release which contains Hadoop 2.3 (YARN), Pig 0.12, Hive 0.12, Sqoop 1.4.4 along with MySQL with 8

Appending to HDFS file

2014-08-22 Thread rab ra
Hello, I am currently using Hadoop 2.4.1. I am running a MR job using the hadoop streaming utility. The executable needs to write a large amount of information to a file. However, this write is not done in a single attempt. The file needs to be appended with the streams of information as they are generated. In the code,…

Hadoop 2.5.0 - HDFS browser-based file view

2014-08-22 Thread Brian C. Huffman
All, I noticed that on Hadoop 2.5.0, when browsing the HDFS filesystem on port 50070, you can't view a file in the browser. Clicking a file gives a little popup with metadata and a download link. Can HDFS be configured to show plaintext file contents in the browser? Thanks, Brian

job.getCounters returns null in Yarn-based job

2014-08-22 Thread Shahab Yunus
Hello. I am trying to access custom counters that I have created in a MapReduce job on Yarn. After the job.waitForCompletion(true) call, I try to do job.getCounters() but I get a null. This only happens if I run a heavy job, meaning a) a lot of data and b) a lot of reducers. E.g. for 10 million record…

WebHdfs config problem

2014-08-22 Thread Charles Robertson
Hi all, I've installed HDP 2.1 on CentOS 6.5, but I'm having a problem with WebHDFS. When I try to use the file browser or design an oozie workflow in Hue, I get a WebHdfs error. Attached is the error for the filebrowser. It appears to be some kind of permissions error, but I have hdfs security t

Re: hdfs dfsclient, possible to force storage datanode ?

2014-08-22 Thread Tirupati Reddy
Hadoop 2.5 On Thursday, August 21, 2014, norbi wrote: > hadoop 2.0 (cloudera cdh 4.7) > > Am 21.08.2014 um 16:23 schrieb Liu, Yi A: > >> Which version are you using? >> >> Regards, >> Yi Liu >> >> >> -Original Message- >> From: norbi [mailto:no...@rocknob.de] >> Sent: Wednesday, August 2