Hi Matei,

Thanks for your help, it works now.

I think this problem can be solved by making the last parameter just Class<?
extends OutputFormat> (without the <?,?> bound).
For example, the setOutputFormatClass method in the
org.apache.hadoop.mapreduce.Job class is declared that way.
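For anyone following along, here is a minimal, self-contained sketch of the generics mismatch. The OutputFormat and TextOutputFormat below are stand-in classes, not the real Hadoop types, so the example compiles without any Hadoop dependency; the two method signatures mirror the current and the proposed variants of saveAsHadoopFiles:

```java
// Stand-ins for the Hadoop types, to keep the example self-contained.
interface OutputFormat<K, V> {}

class TextOutputFormat<K, V> implements OutputFormat<K, V> {}

public class GenericsDemo {
    // Mirrors the current signature: the <?,?> bound rejects the raw
    // Class<TextOutputFormat> that the expression TextOutputFormat.class produces.
    static String save(Class<? extends OutputFormat<?, ?>> fmt) {
        return fmt.getSimpleName();
    }

    // Mirrors the proposed signature without <?,?>: it accepts the raw class,
    // the same way Job.setOutputFormatClass is declared.
    static String saveRelaxed(Class<? extends OutputFormat> fmt) {
        return fmt.getSimpleName();
    }

    public static void main(String[] args) {
        // save(TextOutputFormat.class);       // does not compile: raw Class<TextOutputFormat>
        save((Class) TextOutputFormat.class);  // the unchecked-cast workaround from this thread
        saveRelaxed(TextOutputFormat.class);   // compiles once the <?,?> bound is dropped
    }
}
```

The cast to raw (Class) compiles only because it downgrades the call to an unchecked one, which is why relaxing the bound in the API itself is the cleaner fix.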


Best regards,
Alex

2013/10/12 Matei Zaharia <[email protected]>

> Hi Alex,
>
> Unfortunately there seems to be something wrong with how the generics on
> that method get seen by Java. You can work around it by calling this with:
>
> plans.saveAsHadoopFiles("hdfs://localhost:8020/user/hue/output/completed",
> "csv", String.class, String.class, (Class) TextOutputFormat.class);
>
> The extra cast to (Class) erases the generics and makes this an unchecked
> call. Let's see if we can fix this in the next release (it might just
> require removing the "? extends OutputFormat" constraint).
>
> Matei
>
> On Oct 10, 2013, at 4:29 AM, Alex Levin <[email protected]> wrote:
>
> Hello!
>
> I'm using Spark Streaming right now, and I want to write output to an
> HDFS folder.
> I have JavaPairDStream<String, String> plans
> After some code, I want to configure output:
> plans.saveAsHadoopFiles("hdfs://localhost:8020/user/hue/output/completed",
> "csv", String.class, String.class, TextOutputFormat.class);
>
> But I get this error:
>
> The method saveAsHadoopFiles(String, String, Class<?>, Class<?>, Class<?
> extends OutputFormat<?,?>>) in the type JavaPairDStream<String,String> is
> not applicable for the arguments (String, String, Class<String>,
> Class<String>, Class<TextOutputFormat>)
>
> As I understand it, this is because the last argument is wrong, but I
> don't understand what I should pass here.
> I use TextOutputFormat from org.apache.hadoop.mapred.
>
> Please, help!)
>
>
>
> Best regards,
> Alex
>
>
>