You will need almost the entire Hadoop client-side JAR set and its dependencies for this, I'm afraid.
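To make that concrete, below is a minimal, self-contained sketch of the FileSystem call from the snippet you quote, with the imports spelled out. The fs.default.name URI is an assumption, so substitute your namenode's host and port (or just drop the cluster's core-site.xml/hdfs-site.xml onto the classpath). Every class it touches comes from the Hadoop client JARs (e.g. the hadoop-core or hadoop-client artifact plus transitive dependencies), and inside Tomcat those have to be on the webapp classpath, typically under WEB-INF/lib; the NoClassDefFoundError means they are not there yet.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsUpload {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Either put the cluster's core-site.xml/hdfs-site.xml on the
            // classpath, or point at the namenode explicitly. The URI below
            // is an assumption -- use your own host and port.
            conf.set("fs.default.name", "hdfs://namenode-host:9000");

            FileSystem fs = FileSystem.get(conf);
            // Same call as in the quoted snippet: copy a local file into HDFS.
            fs.copyFromLocalFile(new Path("E:/test/GANI.jpg"),
                                 new Path("/user/TestDir/"));
            fs.close();
        }
    }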
The new WebHDFS filesystem (HDFS over HTTP) is designed to be lighter weight and to need only an HTTP client, but I'm not aware of any ultra-thin client yet; Apache HttpComponents should suffice (a rough sketch using it is at the end of this message). If you are using any of the build tools with dependency management (Ant+Ivy, Maven, Gradle), ask for the Hadoop JARs and have the dependencies pulled in for you. If you aren't using a build tool with dependency management, now is the time.

On 30 August 2012 09:32, Visioner Sadak <[email protected]> wrote:

> Hi,
>
> I have a WAR which is deployed on a Tomcat server. The WAR contains some
> Java classes which upload files. Will I be able to upload directly into
> Hadoop? I am using the code below in one of my Java classes:
>
>     Configuration hadoopConf = new Configuration();
>     // get the default associated file system
>     FileSystem fileSystem = FileSystem.get(hadoopConf);
>     // HarFileSystem harFileSystem = new HarFileSystem(fileSystem);
>     // copy from lfs to hdfs
>     fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),
>             new Path("/user/TestDir/"));
>
> but it is throwing up this error:
>
>     java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
>
> When this code is run independently using a single JAR deployed in the
> Hadoop bin, it works fine.
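For the WebHDFS route, here is the rough sketch mentioned above. It assumes WebHDFS is enabled on the cluster (dfs.webhdfs.enabled) and that the namenode HTTP port is the default 50070; the host, target path and user.name value are also assumptions. File creation is a two-step PUT against the REST API: op=CREATE on the namenode returns a 307 redirect, and the file bytes are then PUT to the datanode URL it points at. Only Apache HttpClient 4.x is needed on the classpath.

    import java.io.File;

    import org.apache.http.HttpResponse;
    import org.apache.http.client.HttpClient;
    import org.apache.http.client.methods.HttpPut;
    import org.apache.http.entity.FileEntity;
    import org.apache.http.impl.client.DefaultHttpClient;
    import org.apache.http.util.EntityUtils;

    public class WebHdfsUpload {
        public static void main(String[] args) throws Exception {
            // Assumed namenode HTTP address, target path and user -- adjust to your cluster.
            String createUrl = "http://namenode-host:50070/webhdfs/v1/user/TestDir/GANI.jpg"
                    + "?op=CREATE&overwrite=true&user.name=hadoop";

            HttpClient client = new DefaultHttpClient();

            // Step 1: PUT with no body; the namenode answers 307 with the datanode URL.
            // HttpClient 4.x does not auto-follow redirects for PUT, so read it manually.
            HttpResponse redirect = client.execute(new HttpPut(createUrl));
            String dataNodeUrl = redirect.getFirstHeader("Location").getValue();
            EntityUtils.consume(redirect.getEntity());

            // Step 2: PUT the file bytes to the datanode; 201 Created means success.
            HttpPut put = new HttpPut(dataNodeUrl);
            put.setEntity(new FileEntity(new File("E:/test/GANI.jpg"),
                    "application/octet-stream"));
            HttpResponse response = client.execute(put);
            System.out.println(response.getStatusLine());
            EntityUtils.consume(response.getEntity());
        }
    }

Error handling is omitted, and on a secured cluster you would need SPNEGO or delegation tokens rather than the plain user.name parameter.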
