OK, it was a silly question... I just removed NodeWritable as the output key and
that fixed it!
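For anyone who hits the same error: a minimal sketch of the corrected key/value declarations, under the assumption (suggested by the stack trace) that TripleFilterBySubjectUriMapper passes through the LongWritable position key it receives from TriplesInputFormat rather than emitting a NodeWritable key:

```java
// Sketch, not the verified fix from the thread: declare the map output key
// as the type the filter mapper actually emits.
job.setMapperClass(TripleFilterBySubjectUriMapper.class);

// Assumption: the filter mapper re-emits its input key, which for
// TriplesInputFormat is a LongWritable offset, paired with a TripleWritable.
job.setMapOutputKeyClass(LongWritable.class);   // was NodeWritable -> type mismatch
job.setMapOutputValueClass(TripleWritable.class);

// Drop the job.setOutputKeyClass(NodeWritable.class) call as well;
// NTriplesOutputFormat writes the TripleWritable values, so the default
// output key class suffices here.
```

This is a job-configuration fragment and needs the Hadoop and Jena Elephas jars on the classpath to compile; it is not runnable on its own.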

2015-03-06 17:06 GMT+01:00 Carmen Manzulli <[email protected]>:

> Hi all,
> I'm Carmen, and I'm getting this error when I try to execute
> TripleFilterBySubjectUriMapper; in particular, this is what I see:
>
>
> 15/03/06 16:57:15 INFO mapreduce.Job: Task Id :
> attempt_1425649175991_0004_m_000000_0, Status : FAILED
> Error: java.io.IOException: Type mismatch in key from map: expected
> org.apache.jena.hadoop.rdf.types.NodeWritable, received
> org.apache.hadoop.io.LongWritable
>     at
> org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:1069)
>     at
> org.apache.hadoop.mapred.MapTask$NewOutputCollector.write(MapTask.java:712)
>     at
> org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
>     at
> org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
>     at
> org.apache.jena.hadoop.rdf.mapreduce.filter.AbstractNodeTupleFilterMapper.map(AbstractNodeTupleFilterMapper.java:58)
>     at
> org.apache.jena.hadoop.rdf.mapreduce.filter.AbstractNodeTupleFilterMapper.map(AbstractNodeTupleFilterMapper.java:42)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
>     at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>     at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
>
> I think the problem could be in my code, which is the following:
>
> public class provaTool extends Configured implements Tool {
>
>     @Override
>     public int run(String[] args) throws Exception {
>
>             Configuration config = new Configuration(true);
>
>             config.set(RdfMapReduceConstants.FILTER_SUBJECT_URIS,
> "http://dbpedia.org/resource/Alfred_Hitchcock");
>             // Create job
>
>             Job job = Job.getInstance(config);
>             job.setJarByClass(prova.class);
>             job.setJobName("RDF Triples Node Usage Count");
>
>             // Map/Reduce classes
>             job.setMapperClass(TripleFilterBySubjectUriMapper.class);
>
>             job.setMapOutputKeyClass(NodeWritable.class);
>             job.setMapOutputValueClass(TripleWritable.class);
>             job.setOutputKeyClass(NodeWritable.class);
>             // Input and Output
>             job.setInputFormatClass(TriplesInputFormat.class);
>             job.setOutputFormatClass(NTriplesOutputFormat.class);
>             FileInputFormat.setInputPaths(job, new Path("/input/"));
>             FileOutputFormat.setOutputPath(job, new Path("/output/"));
>
>             return job.waitForCompletion(true)?0:1;
>
>     }
>     public static void main(String[] args) throws Exception {
>          // Let ToolRunner handle generic command-line options
>          int res = ToolRunner.run(new Configuration(), new provaTool(),
> args);
>
>          System.exit(res);
>        }
> }
>
> Could anyone help me?
>
