Hi,
Can anyone offer any advice, please? We are successfully running a Hadoop cluster (HBase, Hive, etc.) but need to access it from a Windows operating system (for various legacy reasons!). We need to be able to put/get files in and out of HDFS and also access HBase. Has anyone achieved this? Am I on to a non-starter?

I have managed to build a Java application using the latest hadoop-core-0.20.2+320.jar etc. This compiles and runs, but gives various errors when trying to reference Hadoop, e.g. when opening an InputStream:

Exception in thread "main" java.net.MalformedURLException: unknown protocol: hdfs

Would we need the full Hadoop installation on the client (which isn't part of the cluster)? Will this actually work?

Any advice would be gratefully received.

Regards,
Stuart Scott
System Architect
emis intellectual technology
Fulford Grange, Micklefield Lane, Rawdon, Leeds LS19 6BA
E-mail: [email protected]
Website: www.emisit.com
