> config.addResource(new Path("/usr/local/hadoop-2.6.0/etc/hadoop/core-site.xml"));

The path here is an org.apache.hadoop.fs.Path, so the resource should be in
HDFS. Do you have the resource in HDFS? Can you try the API
config.addResource(InputStream in) instead?
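
A minimal sketch of that variant (assuming the servlet process can read the
local Hadoop config directory; the path is just the one from your mail):

import java.io.FileInputStream;
import java.io.InputStream;
import org.apache.hadoop.conf.Configuration;

Configuration config = new Configuration();
// Read the file as a plain local stream, so no Hadoop filesystem
// lookup is involved in locating the resource.
try (InputStream in =
        new FileInputStream("/usr/local/hadoop-2.6.0/etc/hadoop/core-site.xml")) {
    config.addResource(in);
    // addResource(InputStream) parses lazily; touch a property while
    // the stream is still open so the XML is actually read here.
    config.get("fs.defaultFS");
}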
2015-05-25 18:36 GMT+08:00 Carmen Manzulli <[email protected]>:
>
> Hi,
>
> I'm trying to run a servlet (QueryServlet) - using Tomcat 8 - which
> submits a job called "ArqOnHadoop2" to Hadoop 2.6.0; the latter is
> configured as a single-node setup in the /usr/local folder. The job works
> if I start it from the command line, but when I try to execute the
> following code from NetBeans, I receive "HTTP Status 500 - Cannot
> initialize Cluster. Please check your configuration for
> mapreduce.framework.name and the correspond server addresses."
>
> Configuration config = new Configuration();
> config.addResource(new Path("/usr/local/hadoop-2.6.0/etc/hadoop/core-site.xml"));
> config.addResource(new Path("/usr/local/hadoop-2.6.0/etc/hadoop/hdfs-site.xml"));
> config.addResource(new Path("/usr/local/hadoop-2.6.0/etc/hadoop/yarn-site.xml"));
> config.addResource(new Path("/usr/local/hadoop-2.6.0/etc/hadoop/mapred-site.xml"));
>
> config.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
> config.set("yarn.resourcemanager.address", "master:8032");
> config.set("mapreduce.framework.name", "yarn");
> config.set("fs.defaultFS", "hdfs://master:9000");
>
> // input query parameter as a string
> config.set(ArqReducerConstants.MY_QUERY, args[0]);
>
> Job job = Job.getInstance(config);
> job.setJarByClass(QueryServlet.class);
>
> // number of lines per mapper as an int, used to reduce the number of
> // splits so that map() runs quickly
> String N = args[4];
> int n = Integer.parseInt(N);
> job.getConfiguration().setInt(NLineInputFormat.LINES_PER_MAP, n);
>
> job.setMapperClass(MapperDoesNothing.class);
> job.setMapOutputKeyClass(NullWritable.class);
> job.setMapOutputValueClass(TripleWritable.class);
>
> job.setReducerClass(QueryReducer.class);
>
> job.setInputFormatClass(BlockedNTriplesInputFormat.class);
> job.setOutputFormatClass(TextOutputFormat.class);
>
> // input and output path parameters
> String in = "hdfs://master:9000" + args[1];
> String out = "hdfs://master:9000" + args[2];
> FileInputFormat.setInputPaths(job, new Path(in));
> FileOutputFormat.setOutputPath(job, new Path(out));
>
> job.waitForCompletion(true);
>
> (where args is a String array holding the input parameters)
>
> There are no problems with Hadoop or with the .xml files. There are no
> problems with HDFS permissions, because I've disabled them with
> dfs.permissions.enabled=false in hdfs-site.xml; and there are no
> permission problems on my Hadoop folder either, because I've used
> chmod -R 777.
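>
> For reference, the entry in my hdfs-site.xml (assuming I have the Hadoop
> 2.x property name right) is:
>
> <property>
>   <name>dfs.permissions.enabled</name>
>   <value>false</value>
> </property>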
>
> So... what is my project missing? I need help...
>
> I think something is missing when I set up the Configuration object, or
> the problem could be due to the Hadoop jars on the classpath, but in that
> case I don't know how to add all the Hadoop jars using Maven...
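>
> To be concrete, I guess the pom.xml dependency would be something like
> the following (assuming the stock org.apache.hadoop artifacts), but I'm
> not sure it is complete:
>
> <dependency>
>   <groupId>org.apache.hadoop</groupId>
>   <artifactId>hadoop-client</artifactId>
>   <version>2.6.0</version>
> </dependency>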
>
> Thanks in advance, Carmen.
>
>