I tried

val pairVarOriRDD = sc.newAPIHadoopFile(path,
    classOf[NetCDFFileInputFormat].asSubclass(
        classOf[org.apache.hadoop.mapreduce.lib.input.FileInputFormat[WRFIndex, WRFVariable]]),
    classOf[WRFIndex],
    classOf[WRFVariable],
    jobConf)

The compiler does not complain.
OK, from the declaration you sent me separately:
public class NetCDFFileInputFormat extends ArrayBasedFileInputFormat
public abstract class ArrayBasedFileInputFormat extends
org.apache.hadoop.mapreduce.lib.input.FileInputFormat
It looks like you do not declare any generic types that FileInputFormat expects.
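The reason the compiler stays quiet can be sketched without Hadoop at all: Class.asSubclass checks only the erased runtime class, so an unbound <K, V> never reaches it. Below is a minimal, self-contained mirror of the hierarchy posted above — every class name in it is a stand-in for illustration, not the real Hadoop or project type:

```java
// RawSubclassSketch.java — self-contained sketch, no Hadoop/Spark needed.
// Stand-in for org.apache.hadoop.mapreduce.lib.input.FileInputFormat<K, V>:
abstract class FileInputFormatStub<K, V> { }

// Extends the raw supertype, like ArrayBasedFileInputFormat in the post:
@SuppressWarnings("rawtypes")
abstract class ArrayBasedStub extends FileInputFormatStub { }

// Stand-in for NetCDFFileInputFormat:
class NetCDFStub extends ArrayBasedStub { }

public class RawSubclassSketch {
    public static void main(String[] args) {
        // asSubclass checks only the erased runtime class, so the cast
        // succeeds even though NetCDFStub never binds <K, V> — which is
        // why the Scala call above compiles without complaint.
        Class<? extends FileInputFormatStub> c =
                NetCDFStub.class.asSubclass(FileInputFormatStub.class);
        System.out.println(c.getName()); // prints NetCDFStub
    }
}
```

If the diagnosis here is right, binding <K, V> in ArrayBasedFileInputFormat (or in NetCDFFileInputFormat itself) would carry the key/value types through to Spark, rather than relying on the cast to supply them.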
This is the declaration of my custom InputFormat:
public class NetCDFFileInputFormat extends ArrayBasedFileInputFormat
public abstract class ArrayBasedFileInputFormat extends
org.apache.hadoop.mapreduce.lib.input.FileInputFormat
Best,
Patcharee
On 25. feb. 2015 10:15, patcharee wrote:
Hi,
I am new to Spark and Scala. I have a custom InputFormat (used before
with MapReduce) and I am trying to use it in Spark.
In the Java API (where the syntax is correct):

JavaPairRDD<WRFIndex, WRFVariable> pairVarOriRDD = sc.newAPIHadoopFile(
    path,
    NetCDFFileInputFormat.class,
    WRFIndex.class,
    WRFVariable.class,
    jobConf);