Hadoop 2.4.0 to Hadoop 2.7.1

2016-04-29 Thread kumar, Senthil(AWF)
Hi everyone, we are planning to upgrade our clusters to the Hadoop 2.7.1 stable release. Can someone help me get a list of deprecated APIs or configurations? Hopefully the jobs currently running against 2.4.0 will run smoothly on 2.7.1. --Senthil
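Not part of the original thread, but as a hedged illustration of how such a check could be automated: Hadoop's Configuration class keeps the deprecation table itself (the release documentation also ships a "Deprecated Properties" page), so a small driver can report whether the keys a job uses are deprecated in the target version. The sample keys below are only examples; run it with hadoop-common 2.7.1 on the classpath.

```java
import org.apache.hadoop.conf.Configuration;

// Sketch only: Configuration.isDeprecated() reports whether a property key is
// marked deprecated by the Hadoop version on the classpath. The sample keys
// are illustrative, not a complete inventory of a real job's configuration.
public class DeprecationCheck {
    public static void main(String[] args) {
        String[] keys = args.length > 0 ? args
                : new String[] { "fs.default.name", "mapred.job.name", "dfs.data.dir" };
        for (String key : keys) {
            System.out.println(key + " deprecated: " + Configuration.isDeprecated(key));
        }
    }
}
```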

Re: hadoop 2.4.0 streaming generic parser options using TAB as separator

2015-06-10 Thread Kiran Dangeti
On Jun 10, 2015 10:58 AM, anvesh ragi annunarc...@gmail.com wrote: Hello all, I know that tab is the default input separator for these fields: stream.map.output.field.separator, stream.reduce.input.field.separator, stream.reduce.output.field.separator, mapreduce.textoutputformat.separator …

Re: hadoop 2.4.0 streaming generic parser options using TAB as separator

2015-06-10 Thread anvesh ragi
That did not work either. Thanks & Regards, Anvesh R. On Tue, Jun 9, 2015 at 11:12 PM, Kiran Dangeti kirandkumar2...@gmail.com wrote: On Jun 10, 2015 10:58 AM, anvesh ragi annunarc...@gmail.com wrote: Hello all, I know that tab is the default input separator for these fields: …

hadoop 2.4.0 streaming generic parser options using TAB as separator

2015-06-09 Thread anvesh ragi
Hello all, I know that tab is the default input separator for these fields: stream.map.output.field.separator, stream.reduce.input.field.separator, stream.reduce.output.field.separator, mapreduce.textoutputformat.separator, but if I try to write the generic parser option: …
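Not from the thread, but for orientation: these separator keys are ordinary configuration properties, so a plain Java MapReduce job can set them directly, and Hadoop Streaming accepts the same keys as generic -D key=value options placed before the streaming-specific flags (-input, -output, -mapper, -reducer). A minimal hedged sketch; the separator values are arbitrary examples:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

// Sketch: the separator keys from the thread, set programmatically. On the
// streaming command line the equivalent is, for example,
//   hadoop jar hadoop-streaming-2.4.0.jar -D stream.map.output.field.separator=. ...
// with every -D generic option given before -input/-output/-mapper/-reducer.
public class SeparatorConfig {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("stream.map.output.field.separator", ".");     // splits map output into key/value
        conf.set("stream.reduce.input.field.separator", ".");   // how reduce input lines are split
        conf.set("stream.reduce.output.field.separator", "\t"); // tab is the default
        // In 2.x the TextOutputFormat key is mapreduce.output.textoutputformat.separator
        // (the thread abbreviates it as mapreduce.textoutputformat.separator).
        conf.set("mapreduce.output.textoutputformat.separator", "\t");
        Job job = Job.getInstance(conf, "separator-demo");
        System.out.println(job.getConfiguration().get("stream.map.output.field.separator"));
    }
}
```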

Why won't Hadoop 2.4.0 kill my containers using more memory than allocated?

2014-11-06 Thread Eric Jacobson
I'm running a single-node Apache Hadoop 2.4.0 cluster and trying to test my application's behavior when it exceeds the memory allocated for its containers. No matter what I do, I can't seem to get the containers killed when they start exceeding their allocated physical memory. Any suggestions?
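As an aside that is not from the thread: whether containers are killed for exceeding their allocation is governed by a few NodeManager properties, and a quick hedged sketch like the one below reads back what the cluster is actually configured with (the property names are the standard YarnConfiguration constants; the values fall back to the shipped defaults):

```java
import org.apache.hadoop.yarn.conf.YarnConfiguration;

// Sketch: print the NodeManager settings that control whether containers are
// killed for exceeding their memory allocation. If pmem/vmem checking is
// disabled, or the vmem-pmem ratio is very large, containers will not be killed.
public class MemoryEnforcementCheck {
    public static void main(String[] args) {
        YarnConfiguration conf = new YarnConfiguration(); // loads yarn-site.xml if on the classpath
        System.out.println("pmem check enabled: "
                + conf.getBoolean(YarnConfiguration.NM_PMEM_CHECK_ENABLED,
                                  YarnConfiguration.DEFAULT_NM_PMEM_CHECK_ENABLED));
        System.out.println("vmem check enabled: "
                + conf.getBoolean(YarnConfiguration.NM_VMEM_CHECK_ENABLED,
                                  YarnConfiguration.DEFAULT_NM_VMEM_CHECK_ENABLED));
        System.out.println("vmem-pmem ratio: "
                + conf.getFloat(YarnConfiguration.NM_VMEM_PMEM_RATIO,
                                YarnConfiguration.DEFAULT_NM_VMEM_PMEM_RATIO));
    }
}
```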

[HDFS] DFSClient does not close a closed socket, resulting in thousands of CLOSE_WAIT sockets with HDP 2.1/HBase 0.98.0/Hadoop 2.4.0

2014-08-29 Thread Steven Xu
Hello Hadoopers, when I run HDP 2.1/HBase 0.98.0/Hadoop 2.4.0, I keep hitting a fatal problem: DFSClient does not close a closed socket, resulting in thousands of CLOSE_WAIT sockets. Have you seen the same issue? If so, please share. Thanks a lot. I have also created issue HDFS-6973 …

Re: Hadoop 2.4.0 How to change Configured Capacity

2014-08-02 Thread arthur.hk.c...@gmail.com
arthur.hk.c...@gmail.com wrote: Hi, I have installed Hadoop 2.4.0 with 5 nodes; each node physically has a 4 TB hard disk. When checking the configured capacity, I found it is about 49.22 GB per node. Can anyone advise how to set a bigger "configured capacity", e.g. 2 TB?

Re: Hadoop 2.4.0 How to change Configured Capacity

2014-08-02 Thread Harsh J
On Jul 28, 2014 5:14 AM, arthur.hk.c...@gmail.com wrote: Hi, I have installed Hadoop 2.4.0 with 5 nodes; each node physically has a 4 TB hard disk. When checking the configured capacity, I found it is about 49.22 GB per node. Can anyone advise how to set a bigger "configured …

Re: Hadoop 2.4.0 How to change Configured Capacity

2014-07-28 Thread hadoop hive
You need to list each disk inside the dfs.datanode.data.dir parameter. On Jul 28, 2014 5:14 AM, arthur.hk.c...@gmail.com wrote: Hi, I have installed Hadoop 2.4.0 with 5 nodes; each node physically has a 4 TB hard disk. When checking the configured capacity, I found it is about …
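Not from the thread, but a small diagnostic in the same spirit: configured capacity is the sum of the volumes listed in dfs.datanode.data.dir, and the default location sits under hadoop.tmp.dir on the (usually small) root partition, which would explain 49 GB instead of 4 TB. A hedged sketch that prints each configured data directory and its usable space, assuming hdfs-site.xml is on the classpath:

```java
import java.io.File;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hdfs.HdfsConfiguration;

// Sketch: show where the DataNode will store blocks and how much space each
// volume offers. If the 4 TB disks are not listed here, they do not count
// toward the configured capacity.
public class DataDirCheck {
    public static void main(String[] args) {
        Configuration conf = new HdfsConfiguration(); // loads hdfs-site.xml if present
        String dirs = conf.get("dfs.datanode.data.dir",
                conf.get("dfs.data.dir", "<unset>"));  // dfs.data.dir is the deprecated key
        for (String dir : dirs.split(",")) {
            File f = new File(dir.trim().replaceFirst("^file://", ""));
            System.out.printf("%s -> %.2f GB usable%n", dir.trim(),
                    f.getUsableSpace() / (1024.0 * 1024 * 1024));
        }
    }
}
```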

Re: Bugs while installing apache hadoop 2.4.0

2014-07-06 Thread Ritesh Kumar Singh
… the library in 'hadoop-dist/target/hadoop-2.4.0/lib/native'. Thanks, Akira. (2014/07/03 15:32), Ritesh Kumar Singh wrote: @Akira: if I delete my native library, how exactly do I generate my own copy of it? @Chris: This is the content of my /etc/hosts file: 127.0.0.1 localhost 127.0.1.1 hduser …

Re: Bugs while installing apache hadoop 2.4.0

2014-07-06 Thread Akira AJISAKA
Did you move your native library to /usr/local/hadoop/lib/native? Thanks, Akira. (2014/07/07 0:59), Ritesh Kumar Singh wrote: My Hadoop is still giving the above-mentioned error. Please help. On Thu, Jul 3, 2014 at 12:50 PM, Akira AJISAKA ajisa...@oss.nttdata.co.jp …

Re: Bugs while installing apache hadoop 2.4.0

2014-07-06 Thread Vikas Srivastava
… wrote: You can download the source code and generate your own native library with $ mvn package -Pdist,native -Dtar -DskipTests. You should see the library in 'hadoop-dist/target/hadoop-2.4.0/lib/native'. Thanks, Akira. (2014/07/03 15:32), Ritesh Kumar Singh wrote: @Akira: if I …

Bugs while installing apache hadoop 2.4.0

2014-07-03 Thread Ritesh Kumar Singh
When I try to start DFS using start-dfs.sh, I get this error message: 14/07/03 11:03:21 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable. Starting namenodes on [OpenJDK 64-Bit Server VM warning: You have loaded library …

Re: Bugs while installing apache hadoop 2.4.0

2014-07-03 Thread Akira AJISAKA
It looks like the native library is not compatible with your environment. You should delete the '/usr/local/hadoop/lib/native' directory or compile the source code to get your own native library. Thanks, Akira. (2014/07/03 15:05), Ritesh Kumar Singh wrote: When I try to start DFS using start-dfs.sh I …

Re: Bugs while installing apache hadoop 2.4.0

2014-07-03 Thread Ritesh Kumar Singh
@Akira: if I delete my native library, how exactly do I generate my own copy of it? @Chris: This is the content of my /etc/hosts file: 127.0.0.1 localhost 127.0.1.1 hduser # The following lines are desirable for IPv6 capable hosts ::1 localhost ip6-localhost ip6-loopback fe00::0 …

Re: Bugs while installing apache hadoop 2.4.0

2014-07-03 Thread Akira AJISAKA
You can download the source code and generate your own native library with $ mvn package -Pdist,native -Dtar -DskipTests. You should see the library in 'hadoop-dist/target/hadoop-2.4.0/lib/native'. Thanks, Akira. (2014/07/03 15:32), Ritesh Kumar Singh wrote: @Akira: if I delete my native …

Re: Install Hadoop 2.4.0 from Source - Compile error

2014-05-12 Thread Silvina Caíno Lores
...@gmail.com wrote: Are you sure that CMake is installed? Best, Silvina. On 28 April 2014 13:05, ascot.m...@gmail.com wrote: Hi, I am trying to install Hadoop 2.4.0 from source and got the following error, please help! Can anyone share the apache-maven-3.1.1/conf …

Re: HttpConfig API changes in hadoop 2.4.0

2014-05-01 Thread Hardik Pandya
You are hitting HDFS-5308 (see http://hadoop.apache.org/docs/r2.4.0/hadoop-project-dist/hadoop-hdfs/CHANGES.txt): "Replace HttpConfig#getSchemePrefix with implicit schemes in HDFS JSP. (Haohui Mai via jing9)". On Wed, Apr 30, 2014 at 6:08 PM, Gaurav Gupta gaurav.gopi...@gmail.com wrote: I am trying …

Re: HttpConfig API changes in hadoop 2.4.0

2014-05-01 Thread Hardik Pandya
https://issues.apache.org/jira/browse/HDFS-5308 On Thu, May 1, 2014 at 8:58 AM, Hardik Pandya smarty.ju...@gmail.com wrote: You are hitting HDFS-5308 (see http://hadoop.apache.org/docs/r2.4.0/hadoop-project-dist/hadoop-hdfs/CHANGES.txt): "Replace HttpConfig#getSchemePrefix with implicit …

HttpConfig API changes in hadoop 2.4.0

2014-04-30 Thread Gaurav Gupta
Hi, I was using Hadoop 2.2.0 to build my application and was calling the HttpConfig.getSchemePrefix() API. When I updated Hadoop to 2.4.0, the compilation fails for my application, and I see that the HttpConfig (org.apache.hadoop.http.HttpConfig) APIs have changed. How do I get the scheme prefix in …

Re: HttpConfig API changes in hadoop 2.4.0

2014-04-30 Thread Haohui Mai
Hi, can you describe your use case, that is, how the prefix is used? Usually you can get around it by generating relative URLs, which start with //. ~Haohui. On Wed, Apr 30, 2014 at 2:31 PM, Gaurav Gupta gaurav.gopi...@gmail.com wrote: Hi, I was using Hadoop 2.2.0 to build my …

Re: HttpConfig API changes in hadoop 2.4.0

2014-04-30 Thread Gaurav Gupta
I am trying to get the container logs URL; here is the code snippet: containerLogsUrl = HttpConfig.getSchemePrefix() + this.container.nodeHttpAddress + "/node/containerlogs/" + id + "/" + System.getenv(ApplicationConstants.Environment.USER.toString()); Thanks, Gaurav. On Wed, Apr 30, 2014 at …
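Combining Haohui's suggestion with the snippet above, a hedged sketch of the scheme-relative form; the variable names simply mirror Gaurav's code and are not taken from any particular API:

```java
import org.apache.hadoop.yarn.api.ApplicationConstants;

// Sketch: after HDFS-5308 removed HttpConfig.getSchemePrefix(), one workaround
// is a scheme-relative URL ("//host:port/path"), which inherits http or https
// from whatever page links to it.
public class ContainerLogsLink {
    static String containerLogsUrl(String nodeHttpAddress, String containerId) {
        return "//" + nodeHttpAddress + "/node/containerlogs/" + containerId + "/"
                + System.getenv(ApplicationConstants.Environment.USER.toString());
    }
}
```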

Install Hadoop 2.4.0 from Source - Compile error

2014-04-28 Thread ascot.m...@gmail.com
Hi, I am trying to install Hadoop 2.4.0 from source and got the following error, please help! Can anyone share the apache-maven-3.1.1/conf/settings.xml setting? Regards. O/S: Ubuntu 12.04 (64-bit); Java: java version 1.6.0_45; protoc --version: libprotoc 2.5.0; Command: mvn package -Pdist …

Re: Install Hadoop 2.4.0 from Source - Compile error

2014-04-28 Thread Silvina Caíno Lores
Are you sure that CMake is installed? Best, Silvina. On 28 April 2014 13:05, ascot.m...@gmail.com wrote: Hi, I am trying to install Hadoop 2.4.0 from source and got the following error, please help! Can anyone share the apache-maven-3.1.1/conf/settings.xml setting …

Re: Install Hadoop 2.4.0 from Source - Compile error

2014-04-28 Thread ascot.m...@gmail.com
… Hadoop 2.4.0 from source, I got the following error, please help! Can anyone share the apache-maven-3.1.1/conf/settings.xml setting? Regards. O/S: Ubuntu 12.04 (64-bit); Java: java version 1.6.0_45; protoc --version: libprotoc 2.5.0; Command: mvn package -Pdist,native -DskipTests -Dtar …

Re: hadoop 2.4.0?

2014-04-18 Thread Tsuyoshi OZAWA
-1934 can be critical if you're using RM HA in some situations. Thanks, - Tsuyoshi. On Fri, Apr 18, 2014 at 2:07 AM, MrAsanjar . afsan...@gmail.com wrote: Hi all, how stable is Hadoop 2.4.0? What are the known issues? Has anyone done extensive testing on it? Thanks in advance.

hadoop 2.4.0?

2014-04-17 Thread MrAsanjar .
Hi all, how stable is Hadoop 2.4.0? What are the known issues? Has anyone done extensive testing on it? Thanks in advance.

Re: hadoop 2.4.0?

2014-04-17 Thread Azuryy Yu
Hadoop 2.4.0 doesn't have any known issues now. I think it's a stable release even though it's not in the stable download list. The only issue I hit is that you should upgrade Hive to Hive 0.12.0 after upgrading to 2.4.0, for API compatibility. On Fri, Apr 18, 2014 at 1:07 AM, MrAsanjar . afsan …