A little more color on this.
That field is nested inside an Avro record, roughly like this (simplified;
record names added here only so the schema is valid):
{
  "type": "record",
  "name": "Outer",
  "fields": [
    {
      "name": "whatever",
      "type": {
        "type": "record",
        "name": "Inner",
        "fields": [
          {
            "name": "somedate",
            "type": {"type": "int", "logicalType": "date"}
          }
        ]
      }
    }
  ]
}
The outer record uses SchemaCoderHelpers.coderForFieldType on all of its
fields to create the coder. However, when that method is called for the inner
record field, it just returns SchemaCoder.of(fieldType.getRowSchema()), which
doesn't take logical types into account.
I think that's where the problem is. If anyone who knows that code could
have a look and let me know their thoughts, I can try to fix the issue if
we agree that there is one.
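For context, here is a minimal JDK-only sketch of the mismatch (no Beam or
joda-time on the classpath, so Number stands in for org.joda.time.Instant;
the class and variable names are made up for illustration). Avro's "date"
logical type is an int of days since the epoch, which generated classes
surface as java.time.LocalDate, while a DATETIME coder expects an
instant-like value:

```java
import java.time.LocalDate;

public class LogicalDateMismatch {
    public static void main(String[] args) {
        // Avro "date" is days since the epoch; generated code uses LocalDate.
        LocalDate somedate = LocalDate.of(2021, 10, 14);
        System.out.println(somedate.toEpochDay()); // 18914 days since 1970-01-01

        // An instant-based coder casts the field value to its expected class.
        // LocalDate is unrelated to that class, so the cast fails at runtime,
        // mirroring "LocalDate cannot be cast to Instant" in the pipeline.
        Object fieldValue = somedate;
        try {
            Number millis = (Number) fieldValue; // stand-in for the Instant cast
            System.out.println("cast succeeded: " + millis);
        } catch (ClassCastException e) {
            System.out.println("ClassCastException, as in the pipeline");
        }
    }
}
```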
On Thu, Oct 14, 2021 at 7:12 AM Cristian Constantinescu <[email protected]>
wrote:
> Hi all,
>
> I have the following field in one of my avro schemas:
>
> {
> "name": "somedate",
> "type": {"type": "int", "logicalType": "date"}
> }
>
> This generates a java.time.LocalDate field in the corresponding java class
> (call it Foo).
>
> AvroUtils.toBeamSchema(FooClass.getSchema()) returns that field as
> DATETIME in the Beam schema, because of AvroUtils.toFieldType (around line
> 275, where TimestampMillis and Date are both mapped to DATETIME).
>
> My problem is that in SchemaCoderHelpers, DATETIME is set to use
> InstantCoder, which expects a joda Instant as input, not a LocalDate. So when
> my PTransform returns Foo objects it crashes with "class
> java.time.LocalDate cannot be cast to class org.joda.time.Instant..." when
> trying to encode that field using InstantCoder.encode().
>
> Is there a workaround for this issue?
>
> Thank you,
> Cristian
>
> PS: I did search the mailing list and google, but didn't find anything
> related except a thread on AvroCoder.JodaTimestampConversion, which I don't
> think applies here.
>