Re: Building Hadoop 0.21.0 on Eclipse

2012-02-01 Thread brisk
Hi Harsh,

So by trunk, do you mean the 1.0.0 version?

Thanks,
Ethan

On Wed, Feb 1, 2012 at 7:34 AM, Harsh J ha...@cloudera.com wrote:

 Pavlos,

 0.21 was abandoned and you'll find hardly any support for it from most
 of the community. What are you looking to build it for? Perhaps
 you wanted to use the trunk or the 0.23 branch?

 On Fri, Jan 20, 2012 at 5:15 AM, Pavlos Mitsoulis - Ntompos
 p.mitsou...@gmail.com wrote:
  Hello,
 
  I am trying to compile Hadoop 0.21.0 in Eclipse but I get some errors.
 In particular, the Java classes in the package
 org.apache.hadoop.mapreduce.jobhistory complain that some classes do not
 exist. For instance, in the class EventReader.java I get the following
 error: "Event cannot be resolved".
 
  Thanks,
  Pavlos



 --
 Harsh J
 Customer Ops. Engineer
 Cloudera | http://tiny.cloudera.com/about



Re: Building Hadoop 0.21.0 on Eclipse

2012-02-01 Thread Harsh J
Brisk,

No, trunk is currently numbered 0.24.

The branch-1 line is a renumbering of the earlier
branch-0.20-security stable line, and is not trunk. Check out this
blog post on the Bigtop project:
https://blogs.apache.org/bigtop/entry/all_you_wanted_to_know
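
For reference, a minimal sketch of checking out the lines mentioned above,
assuming the ASF Subversion layout of that time (paths may change as branches
get cut):

  # trunk (currently numbered 0.24)
  svn checkout http://svn.apache.org/repos/asf/hadoop/common/trunk hadoop-trunk
  # the 0.23 branch
  svn checkout http://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.23 hadoop-0.23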

On Wed, Feb 1, 2012 at 11:31 PM, brisk mylinq...@gmail.com wrote:
 Hi Harsh,

 So by trunk, do you mean the 1.0.0 version?

 Thanks,
 Ethan

 On Wed, Feb 1, 2012 at 7:34 AM, Harsh J ha...@cloudera.com wrote:

 Pavlos,

 0.21 was abandoned and you'll find hardly any support for it from most
 of the community. What are you looking to build it for? Perhaps
 you wanted to use the trunk or the 0.23 branch?

 On Fri, Jan 20, 2012 at 5:15 AM, Pavlos Mitsoulis - Ntompos
 p.mitsou...@gmail.com wrote:
  Hello,
 
  I am trying to compile Hadoop 0.21.0 in Eclipse but I get some errors.
 In particular, the Java classes in the package
 org.apache.hadoop.mapreduce.jobhistory complain that some classes do not
 exist. For instance, in the class EventReader.java I get the following
 error: "Event cannot be resolved".
 
  Thanks,
  Pavlos



 --
 Harsh J
 Customer Ops. Engineer
 Cloudera | http://tiny.cloudera.com/about




-- 
Harsh J
Customer Ops. Engineer
Cloudera | http://tiny.cloudera.com/about


Issues while building hadoop trunk

2012-01-23 Thread rajesh putta
While building the Hadoop trunk I came across the following error. Can
anyone tell me what the issue behind this failure is?

main:
 [exec] protoc: error while loading shared libraries:
libprotobuf.so.7: cannot open shared object file: No such file or
directory
[INFO] 
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main  SUCCESS [1:20.348s]
[INFO] Apache Hadoop Project POM . SUCCESS [1:13.751s]
[INFO] Apache Hadoop Annotations . SUCCESS [1:26.350s]
[INFO] Apache Hadoop Project Dist POM  SUCCESS [3.190s]
[INFO] Apache Hadoop Assemblies .. SUCCESS [0.569s]
[INFO] Apache Hadoop Auth  SUCCESS [2:07.427s]
[INFO] Apache Hadoop Auth Examples ... SUCCESS [20.210s]
[INFO] Apache Hadoop Common .. FAILURE [11:11.204s]
[INFO] Apache Hadoop Common Project .. SKIPPED
[INFO] Apache Hadoop HDFS  SKIPPED
[INFO] Apache Hadoop HttpFS .. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal . SKIPPED
[INFO] Apache Hadoop HDFS Project  SKIPPED
[INFO] hadoop-yarn ... SKIPPED
[INFO] hadoop-yarn-api ... SKIPPED
[INFO] hadoop-yarn-common  SKIPPED
[INFO] hadoop-yarn-server  SKIPPED
[INFO] hadoop-yarn-server-common . SKIPPED
[INFO] hadoop-yarn-server-nodemanager  SKIPPED
[INFO] hadoop-yarn-server-web-proxy .. SKIPPED
[INFO] hadoop-yarn-server-resourcemanager  SKIPPED
[INFO] hadoop-yarn-server-tests .. SKIPPED
[INFO] hadoop-mapreduce-client ... SKIPPED
[INFO] hadoop-mapreduce-client-core .. SKIPPED
[INFO] hadoop-yarn-applications .. SKIPPED
[INFO] hadoop-yarn-applications-distributedshell . SKIPPED
[INFO] hadoop-yarn-site .. SKIPPED
[INFO] hadoop-mapreduce-client-common  SKIPPED
[INFO] hadoop-mapreduce-client-shuffle ... SKIPPED
[INFO] hadoop-mapreduce-client-app ... SKIPPED
[INFO] hadoop-mapreduce-client-hs  SKIPPED
[INFO] hadoop-mapreduce-client-jobclient . SKIPPED
[INFO] Apache Hadoop MapReduce Examples .. SKIPPED
[INFO] hadoop-mapreduce .. SKIPPED
[INFO] Apache Hadoop MapReduce Streaming . SKIPPED
[INFO] Apache Hadoop Archives  SKIPPED
[INFO] Apache Hadoop Rumen ... SKIPPED
[INFO] Apache Hadoop Extras .. SKIPPED
[INFO] Apache Hadoop Tools Dist .. SKIPPED
[INFO] Apache Hadoop Tools ... SKIPPED
[INFO] Apache Hadoop Distribution  SKIPPED
[INFO] 
[INFO] BUILD FAILURE
[INFO] 
[INFO] Total time: 17:46.677s
[INFO] Finished at: Tue Jan 24 00:51:45 IST 2012
[INFO] Final Memory: 32M/177M
[INFO] 
[ERROR] Failed to execute goal
org.apache.maven.plugins:maven-antrun-plugin:1.6:run (compile-proto)
on project hadoop-common: An Ant BuildException has occured: exec
returned: 127 - [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with
the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions,
please read the following articles:
[ERROR] [Help 1]
http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-common

Thanks
Rajesh Putta


Re: Issues while building hadoop trunk

2012-01-23 Thread Harsh J
Hi,

On Tue, Jan 24, 2012 at 1:00 AM, rajesh putta rajesh.p...@gmail.com wrote:
 While building the Hadoop trunk I came across the following error. Can
 anyone tell me what the issue behind this failure is?

 main:
     [exec] protoc: error while loading shared libraries:
 libprotobuf.so.7: cannot open shared object file: No such file or
 directory

You need the dir that contains libprotobuf.so.7 on your
LD_LIBRARY_PATH. For instance, if /usr/local/lib carries it, ensure
you do:

export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib

before you run the Maven build.
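
If you are unsure where the library actually landed, a quick check first
(only a sketch; /usr/local/lib is merely the usual default for a source
install of protobuf):

  ldconfig -p | grep libprotobuf
  find /usr -name 'libprotobuf.so.7' 2>/dev/null

and then export the containing directory and run Maven from that same shell:

  export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib
  mvn install -DskipTests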

-- 
Harsh J
Customer Ops. Engineer, Cloudera


Re: Issues while building hadoop trunk

2012-01-23 Thread rajesh putta
Hi Harsh,
 Even after setting the LD_LIBRARY_PATH variable, I am still
getting the same error.

Thanks
Rajesh Putta

On Tue, Jan 24, 2012 at 1:11 AM, Harsh J ha...@cloudera.com wrote:
 Hi,

 On Tue, Jan 24, 2012 at 1:00 AM, rajesh putta rajesh.p...@gmail.com wrote:
 While building the Hadoop trunk I came across the following error. Can
 anyone tell me what the issue behind this failure is?

 main:
     [exec] protoc: error while loading shared libraries:
 libprotobuf.so.7: cannot open shared object file: No such file or
 directory

 You need the dir that contains libprotobuf.so.7 on your
 LD_LIBRARY_PATH. For instance, if /usr/local/lib carries it, ensure
 you do:

 export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib

 before you run the Maven build.

 --
 Harsh J
 Customer Ops. Engineer, Cloudera
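
One more thing worth checking, on the assumption that protobuf was built from
source into /usr/local/lib: make sure the export happens in the very shell
that runs mvn, and consider refreshing the dynamic linker cache so protoc can
find the library even without LD_LIBRARY_PATH (the conf file name below is
arbitrary):

  echo /usr/local/lib | sudo tee /etc/ld.so.conf.d/protobuf.conf
  sudo ldconfig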


Building Hadoop 0.21.0 on Eclipse

2012-01-19 Thread Pavlos Mitsoulis - Ntompos
Hello,

I am trying to compile Hadoop 0.21.0 in Eclipse but I get some errors. In
particular, the Java classes in the package org.apache.hadoop.mapreduce.jobhistory
complain that some classes do not exist. For instance, in the class
EventReader.java I get the following error: "Event cannot be resolved".

Thanks,
Pavlos
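
A likely cause, offered as an assumption since 0.21 is no longer maintained:
the Event classes under org.apache.hadoop.mapreduce.jobhistory are generated
from Avro schemas during the build rather than checked into the source tree,
so a plain Eclipse import will not resolve them. Running the Ant build once in
the MapReduce sub-project and refreshing the Eclipse project afterwards
usually makes the generated sources visible, roughly:

  cd mapred      # the MapReduce sub-project directory; the exact path is illustrative
  ant compile    # generates the missing sources under build/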

[jira] [Resolved] (MAPREDUCE-2951) Problem while building hadoop trunk on Windows 7

2011-11-29 Thread Abhijit Suresh Shingate (Resolved) (JIRA)

 [ 
https://issues.apache.org/jira/browse/MAPREDUCE-2951?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Abhijit Suresh Shingate resolved MAPREDUCE-2951.


Resolution: Invalid

The issue is no longer happening.
Hence resolving it as Invalid.



 Problem while building hadoop trunk on Windows 7
 

 Key: MAPREDUCE-2951
 URL: https://issues.apache.org/jira/browse/MAPREDUCE-2951
 Project: Hadoop Map/Reduce
  Issue Type: Bug
  Components: build, mrv2
Affects Versions: 0.23.0, 0.24.0
 Environment: Windows 7
 Cygwin-1.7
 apache-maven-3.0.3
 java version 1.6.0_17
Reporter: Abhijit Suresh Shingate

 Hi All,
 I am facing a problem with generating tar files for all Hadoop modules.
 The generated tar files are not correct.
 For example, in hadoop-common-0.24.0-SNAPSHOT.tar.gz the /bin folder is
 missing all the .sh files.
 Because of this I am not able to launch HDFS or MapReduce.
 To generate the tar file I used the following command:
 *mvn package -Pdist -Dtar -DskipTests -P-cbuild
 -Dcommons.daemon.os.name=linux -Dcommons.daemon.os.arch=i686 -X*
 I am not a Maven expert, but the following part of the debug information
 generated by the above command talks about excluding *.sh files from the assembly.
 [DEBUG]   (s) siteDirectory = 
 D:\iSAP\Hadoop\SVN\trunk\hadoop-common-project\hadoop-common\target\site
 [DEBUG]   (f) skipAssembly = false
 [DEBUG]   (s) tarLongFileMode = warn
 [DEBUG]   (s) tempRoot = 
 D:\iSAP\Hadoop\SVN\trunk\hadoop-common-project\hadoop-common\target\archive-tmp
 [DEBUG]   (s) workDirectory = 
 D:\iSAP\Hadoop\SVN\trunk\hadoop-common-project\hadoop-common\target\assembly\work
 [DEBUG] -- end configuration --
 [DEBUG] Before assembly is interpolated:
 <?xml version="1.0" encoding="UTF-8"?>
 <assembly>
   <id>hadoop-distro</id>
   <formats>
     <format>dir</format>
   </formats>
   <includeBaseDirectory>false</includeBaseDirectory>
   <fileSets>
     <fileSet>
       <directory>${basedir}/src/main/bin</directory>
       <outputDirectory>/bin</outputDirectory>
       <excludes>
         <exclude>*.sh</exclude>
       </excludes>
       <fileMode>0755</fileMode>
     </fileSet>
 Does anyone have any idea about this?
 Thanks & Regards,
 Abhijit

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators: 
https://issues.apache.org/jira/secure/ContactAdministrators!default.jspa
For more information on JIRA, see: http://www.atlassian.com/software/jira




[jira] [Created] (MAPREDUCE-2951) Problem while building hadoop trunk on Windows 7

2011-09-08 Thread Abhijit Suresh Shingate (JIRA)
Problem while building hadoop trunk on Windows 7


 Key: MAPREDUCE-2951
 URL: https://issues.apache.org/jira/browse/MAPREDUCE-2951
 Project: Hadoop Map/Reduce
  Issue Type: Bug
  Components: build, mrv2
Affects Versions: 0.23.0, 0.24.0
 Environment: Windows 7
Cygwin-1.7
apache-maven-3.0.3
java version 1.6.0_17
Reporter: Abhijit Suresh Shingate


Hi All,
I am facing a problem with generating tar files for all Hadoop modules.
The generated tar files are not correct.
For example, in hadoop-common-0.24.0-SNAPSHOT.tar.gz the /bin folder is
missing all the .sh files.

Because of this I am not able to launch HDFS or MapReduce.

To generate the tar file I used the following command:

*mvn package -Pdist -Dtar -DskipTests -P-cbuild -Dcommons.daemon.os.name=linux 
-Dcommons.daemon.os.arch=i686 -X*

I am not a Maven expert, but the following part of the debug information
generated by the above command talks about excluding *.sh files from the assembly.

[DEBUG]   (s) siteDirectory = 
D:\iSAP\Hadoop\SVN\trunk\hadoop-common-project\hadoop-common\target\site
[DEBUG]   (f) skipAssembly = false
[DEBUG]   (s) tarLongFileMode = warn
[DEBUG]   (s) tempRoot = 
D:\iSAP\Hadoop\SVN\trunk\hadoop-common-project\hadoop-common\target\archive-tmp
[DEBUG]   (s) workDirectory = 
D:\iSAP\Hadoop\SVN\trunk\hadoop-common-project\hadoop-common\target\assembly\work
[DEBUG] -- end configuration --
[DEBUG] Before assembly is interpolated:

<?xml version="1.0" encoding="UTF-8"?>
<assembly>
  <id>hadoop-distro</id>
  <formats>
    <format>dir</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <fileSets>
    <fileSet>
      <directory>${basedir}/src/main/bin</directory>
      <outputDirectory>/bin</outputDirectory>
      <excludes>
        <exclude>*.sh</exclude>
      </excludes>
      <fileMode>0755</fileMode>
    </fileSet>


Does anyone have any idea about this?


Thanks & Regards,
Abhijit
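
For what it is worth, a quick way to confirm whether the scripts made it into
the generated tarball (the file name comes from the report above; adjust the
path to wherever the build wrote it):

  tar -tzf hadoop-common-0.24.0-SNAPSHOT.tar.gz | grep '\.sh$'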


--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira




Re: Error while building Hadoop-Yarn

2011-08-29 Thread Vinod Kumar Vavilapalli
Added this to the FAQ section in
http://wiki.apache.org/hadoop/DevelopingOnTrunkAfter279Merge

Thanks,
+Vinod


On Fri, Aug 19, 2011 at 1:39 PM, rajesh putta rajesh.p...@gmail.com wrote:

 Thanks Arun,
  Now it's working fine.




  Thanks & Regards
 Rajesh Putta
 Development Engineer
 Pramati Technologies

 On Fri, Aug 19, 2011 at 12:25 PM, Arun Murthy a...@hortonworks.com wrote:

  That means you don't have the Autotools toolchain necessary for building the
  native code.
 
  For now pass -P-cbuild to skip them.
 
  Arun
 
  Sent from my iPhone
 
  On Aug 18, 2011, at 11:26 PM, rajesh putta rajesh.p...@gmail.com
 wrote:
 
   Hi,
   I am using apache-maven-3.0.3 and I have set
  LD_LIBRARY_PATH=/usr/local/lib,
   which has the Google protobuf library.
   I am getting the following error while building hadoop-yarn using mvn clean
   install -DskipTests=true
  
   [INFO] hadoop-yarn-api ... SUCCESS
  [14.904s]
   [INFO] hadoop-yarn-common  SUCCESS
  [8.787s]
   [INFO] hadoop-yarn-server-common . SUCCESS
  [4.691s]
   [INFO] hadoop-yarn-server-nodemanager  FAILURE
  [6.051s]
   [INFO] hadoop-yarn-server-resourcemanager  SKIPPED
   [INFO] hadoop-yarn-server-tests .. SKIPPED
   [INFO] hadoop-yarn-server  SKIPPED
   [INFO] hadoop-yarn ... SKIPPED
   [INFO]
  
 
   [INFO] BUILD FAILURE
   [INFO]
  
 
   [INFO] Total time: 34.870s
   [INFO] Finished at: Fri Aug 19 11:48:22 IST 2011
   [INFO] Final Memory: 44M/107M
  
   [ERROR] Failed to execute goal
   org.codehaus.mojo:make-maven-plugin:1.0-beta-1:autoreconf (autoreconf)
 on
   project hadoop-yarn-server-nodemanager: autoreconf command returned an
  exit
   value != 0. Aborting build; see debug output for more information. -
  [Help
   1]
  
   Thanks in advance
  
   Thanks & Regards
   Rajesh Putta
 



Re: Error while building Hadoop-Yarn

2011-08-19 Thread rajesh putta
Thanks Arun,
 Now it's working fine.




Thanks & Regards
Rajesh Putta
Development Engineer
Pramati Technologies
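
For anyone hitting the same autoreconf failure, a sketch of the full
invocation with Arun's workaround applied; it simply combines the command from
the original report with the -P-cbuild profile switch, skipping the native
code instead of fixing the Autotools setup:

  mvn clean install -DskipTests=true -P-cbuild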

On Fri, Aug 19, 2011 at 12:25 PM, Arun Murthy a...@hortonworks.com wrote:

 That means you don't have the Autotools toolchain necessary for building the
 native code.

 For now pass -P-cbuild to skip them.

 Arun

 Sent from my iPhone

 On Aug 18, 2011, at 11:26 PM, rajesh putta rajesh.p...@gmail.com wrote:

  Hi,
  I am using apache-maven-3.0.3 and I have set
 LD_LIBRARY_PATH=/usr/local/lib,
  which has the Google protobuf library.
  I am getting the following error while building hadoop-yarn using mvn clean
  install -DskipTests=true
 
  [INFO] hadoop-yarn-api ... SUCCESS
 [14.904s]
  [INFO] hadoop-yarn-common  SUCCESS
 [8.787s]
  [INFO] hadoop-yarn-server-common . SUCCESS
 [4.691s]
  [INFO] hadoop-yarn-server-nodemanager  FAILURE
 [6.051s]
  [INFO] hadoop-yarn-server-resourcemanager  SKIPPED
  [INFO] hadoop-yarn-server-tests .. SKIPPED
  [INFO] hadoop-yarn-server  SKIPPED
  [INFO] hadoop-yarn ... SKIPPED
  [INFO]
  
  [INFO] BUILD FAILURE
  [INFO]
  
  [INFO] Total time: 34.870s
  [INFO] Finished at: Fri Aug 19 11:48:22 IST 2011
  [INFO] Final Memory: 44M/107M
 
  [ERROR] Failed to execute goal
  org.codehaus.mojo:make-maven-plugin:1.0-beta-1:autoreconf (autoreconf) on
  project hadoop-yarn-server-nodemanager: autoreconf command returned an
 exit
  value != 0. Aborting build; see debug output for more information. -
 [Help
  1]
 
  Thanks in advance
 
  Thanks & Regards
  Rajesh Putta



Building hadoop

2011-07-12 Thread Sudharsan Sampath

Hi,

I am getting the following error while attempting to run the Ant build in Eclipse.
I connect to the internet through a proxy and have configured ANT_OPTS with
the proxy settings.

Can someone guide me on how to resolve this?

Buildfile: C:\workspace\hadoop-trunk\hadoop-trunk\mapreduce\build.xml
clover.setup:
clover.info:
 [echo]
 [echo]  Clover not found. Code coverage reports disabled.
 [echo]
clover:
ivy-download:
  [get] Getting: 
http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
  [get] To: 
C:\workspace\hadoop-trunk\hadoop-trunk\mapreduce\ivy\ivy-2.1.0.jar
ivy-init-dirs:
[mkdir] Created dir: 
C:\workspace\hadoop-trunk\hadoop-trunk\mapreduce\build\ivy
[mkdir] Created dir: 
C:\workspace\hadoop-trunk\hadoop-trunk\mapreduce\build\ivy\lib
[mkdir] Created dir: 
C:\workspace\hadoop-trunk\hadoop-trunk\mapreduce\build\ivy\report
[mkdir] Created dir: 
C:\workspace\hadoop-trunk\hadoop-trunk\mapreduce\build\ivy\maven
ivy-probe-antlib:
ivy-init-antlib:
ivy-init:
[ivy:configure] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ ::
[ivy:configure] :: loading settings :: file = 
C:\workspace\hadoop-trunk\hadoop-trunk\mapreduce\ivy\ivysettings.xml
ivy-resolve-common:
[ivy:resolve] :: problems summary ::
[ivy:resolve]  WARNINGS
[ivy:resolve]   module not found: 
org.apache.hadoop#hadoop-common;0.23.0-SNAPSHOT
[ivy:resolve]    apache-snapshot: tried
[ivy:resolve] 
https://repository.apache.org/content/repositories/snapshots/org/apache/hadoop/hadoop-common/0.23.0-SNAPSHOT/hadoop-common-0.23.0-SNAPSHOT.pom
[ivy:resolve] -- artifact 
org.apache.hadoop#hadoop-common;0.23.0-SNAPSHOT!hadoop-common.jar:
[ivy:resolve] 
https://repository.apache.org/content/repositories/snapshots/org/apache/hadoop/hadoop-common/0.23.0-SNAPSHOT/hadoop-common-0.23.0-SNAPSHOT.jar
[ivy:resolve]    maven2: tried
[ivy:resolve] 
http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-common/0.23.0-SNAPSHOT/hadoop-common-0.23.0-SNAPSHOT.pom
[ivy:resolve] -- artifact 
org.apache.hadoop#hadoop-common;0.23.0-SNAPSHOT!hadoop-common.jar:
[ivy:resolve] 
http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-common/0.23.0-SNAPSHOT/hadoop-common-0.23.0-SNAPSHOT.jar
[ivy:resolve]   module not found: 
org.apache.hadoop#hadoop-common-test;0.23.0-SNAPSHOT
[ivy:resolve]    apache-snapshot: tried
[ivy:resolve] 
https://repository.apache.org/content/repositories/snapshots/org/apache/hadoop/hadoop-common-test/0.23.0-SNAPSHOT/hadoop-common-test-0.23.0-SNAPSHOT.pom
[ivy:resolve] -- artifact 
org.apache.hadoop#hadoop-common-test;0.23.0-SNAPSHOT!hadoop-common-test.jar:
[ivy:resolve] 
https://repository.apache.org/content/repositories/snapshots/org/apache/hadoop/hadoop-common-test/0.23.0-SNAPSHOT/hadoop-common-test-0.23.0-SNAPSHOT.jar
[ivy:resolve]    maven2: tried
[ivy:resolve] 
http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-common-test/0.23.0-SNAPSHOT/hadoop-common-test-0.23.0-SNAPSHOT.pom
[ivy:resolve] -- artifact 
org.apache.hadoop#hadoop-common-test;0.23.0-SNAPSHOT!hadoop-common-test.jar:
[ivy:resolve] 
http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-common-test/0.23.0-SNAPSHOT/hadoop-common-test-0.23.0-SNAPSHOT.jar
[ivy:resolve]   module not found: 
org.apache.hadoop#hadoop-hdfs;0.23.0-SNAPSHOT
[ivy:resolve]    apache-snapshot: tried
[ivy:resolve] 
https://repository.apache.org/content/repositories/snapshots/org/apache/hadoop/hadoop-hdfs/0.23.0-SNAPSHOT/hadoop-hdfs-0.23.0-SNAPSHOT.pom
[ivy:resolve] -- artifact 
org.apache.hadoop#hadoop-hdfs;0.23.0-SNAPSHOT!hadoop-hdfs.jar:
[ivy:resolve] 
https://repository.apache.org/content/repositories/snapshots/org/apache/hadoop/hadoop-hdfs/0.23.0-SNAPSHOT/hadoop-hdfs-0.23.0-SNAPSHOT.jar
[ivy:resolve]    maven2: tried
[ivy:resolve] 
http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-hdfs/0.23.0-SNAPSHOT/hadoop-hdfs-0.23.0-SNAPSHOT.pom
[ivy:resolve] -- artifact 
org.apache.hadoop#hadoop-hdfs;0.23.0-SNAPSHOT!hadoop-hdfs.jar:
[ivy:resolve] 
http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-hdfs/0.23.0-SNAPSHOT/hadoop-hdfs-0.23.0-SNAPSHOT.jar
[ivy:resolve]   ::
[ivy:resolve]   ::  UNRESOLVED DEPENDENCIES ::
[ivy:resolve]   ::
[ivy:resolve]   :: org.apache.hadoop#hadoop-common;0.23.0-SNAPSHOT: not 
found
[ivy:resolve]   :: 
org.apache.hadoop#hadoop-common-test;0.23.0-SNAPSHOT: not found
[ivy:resolve]   :: org.apache.hadoop#hadoop-hdfs;0.23.0-SNAPSHOT: not 
found
[ivy:resolve]   ::
[ivy:resolve]
[ivy:resolve]  ERRORS
[ivy:resolve]   Server access Error: Connection timed out: connect 
url=https://repository.apache.org/content/repositories/snapshots/org/apache/hadoop/hadoop-common/0.23.0-SNAPSHOT/maven-metadata.xml
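
For reference, a sketch of the proxy settings ANT_OPTS would need for a build
like this (host and port are placeholders; note that the snapshot repository
URLs in the log are HTTPS, so the https.* properties matter in addition to the
http.* ones):

  export ANT_OPTS="-Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=8080 -Dhttps.proxyHost=proxy.example.com -Dhttps.proxyPort=8080"

and then re-run the Ant build from that same shell.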