Hi All,

I am using a custom DataSourceV2 implementation (*Spark version 2.3.2*).

Here is how I am trying to pass in a *date type* from spark-shell:

scala> val df = sc.parallelize(Seq("2019-02-05")).toDF("datetype").withColumn("datetype", col("datetype").cast("date"))
scala> df.write.format("com.shubham.MyDataSource").save


Below is a minimal write() method from my DataWriter implementation.

@Override
public void write(InternalRow record) throws IOException {
  ByteArrayOutputStream formatted = streamingRecordFormatter.format(record);
  System.out.println("MyDataWriter.write: " + record.get(0, DataTypes.DateType));
}

It prints an integer as output:

MyDataWriter.write: 17039
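(As I understand it, Spark's InternalRow stores DateType values as a plain int counting days since the Unix epoch, 1970-01-01, which would explain why write() sees an integer rather than a Date. Here is a small standalone check I put together with java.time; the class name DateTypeDemo is just illustrative:)

```java
import java.time.LocalDate;

public class DateTypeDemo {
  public static void main(String[] args) {
    // Encoding: a calendar date as days since 1970-01-01, which is how
    // Spark's InternalRow represents a DateType column internally.
    long days = LocalDate.parse("2019-02-05").toEpochDay();
    System.out.println(days); // prints 17932

    // Decoding: a raw epoch-day count back into a calendar date.
    LocalDate decoded = LocalDate.ofEpochDay(days);
    System.out.println(decoded); // prints 2019-02-05
  }
}
```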


Is this a bug, or am I doing something wrong?

Thanks,
Shubham
