Github user RotemShaul commented on the issue:
https://github.com/apache/spark/pull/13761
Indeed it is, but then you lose the GenericAvroSerializer capabilities
that come out of the box with Spark (caching and registration of static
schemas).
As Spark already chose to (partially) support Avro from within Spark Core,
it makes sense to me that it would also support schema repositories, since
they are commonly used by Avro users to handle schema evolution.
It was this partial support that actually sparked the idea of "if they
support registering Avro schemas, why not go all the way?" and that's
why I created the PR in the first place.
Users of Avro generic records with Spark Core will always face the
schema serialization problem: some can solve it with static schemas,
while others need the dynamic solution. It makes sense for Spark Core to
either provide a solution for both use cases or for neither (leaving it
to a custom serializer).
Just my opinion. At my current workplace I took your
GenericAvroSerializer, added a few lines of code to it, and used it as a
custom serializer. But it could be generalized, hence the PR.
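For context, the "registration of static schemas" discussed above refers to Spark's existing `SparkConf.registerAvroSchemas` API, which lets GenericAvroSerializer ship a compact fingerprint instead of the full schema text for pre-registered schemas. A minimal sketch of that existing mechanism (the `User` schema is hypothetical, for illustration only):

```scala
import org.apache.avro.{Schema, SchemaBuilder}
import org.apache.spark.SparkConf

// Hypothetical static schema, defined up front for illustration.
val userSchema: Schema = SchemaBuilder.record("User")
  .fields()
  .requiredString("name")
  .requiredInt("age")
  .endRecord()

// Registering the schema lets Spark's GenericAvroSerializer send a
// fingerprint over the wire instead of the full schema with every record.
val conf = new SparkConf()
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .registerAvroSchemas(userSchema)
```

Schemas fetched dynamically from a schema repository have no such up-front registration point, which is the gap the PR is about.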
On Sat, Jul 23, 2016 at 5:14 AM, Reynold Xin <[email protected]>
wrote:
> @RotemShaul <https://github.com/RotemShaul> is this something doable by
> implementing a custom serializer outside Spark?
>