That's a pretty advanced example that uses experimental APIs. I'd suggest
looking at https://github.com/databricks/spark-avro as a reference.
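For context, the stable data source API that spark-avro builds on boils down to a RelationProvider that constructs a BaseRelation with a TableScan. A minimal sketch, assuming Spark 1.5-era interfaces (the `PointRelation` name, the `n` option, and the single-column schema are invented for illustration, not part of any real package):

```scala
// Hypothetical minimal data source against org.apache.spark.sql.sources
// (the stable interfaces, not the experimental ones mentioned above).
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.{Row, SQLContext}
import org.apache.spark.sql.sources.{BaseRelation, RelationProvider, TableScan}
import org.apache.spark.sql.types._

class DefaultSource extends RelationProvider {
  // Spark calls this when a user runs sqlContext.read.format(...).load().
  override def createRelation(
      sqlContext: SQLContext,
      parameters: Map[String, String]): BaseRelation =
    new PointRelation(parameters("n").toInt)(sqlContext)
}

// Illustrative relation: a full scan produces the ids 0 until n.
class PointRelation(n: Int)(@transient val sqlContext: SQLContext)
    extends BaseRelation with TableScan {
  override def schema: StructType =
    StructType(StructField("id", IntegerType, nullable = false) :: Nil)
  override def buildScan(): RDD[Row] =
    sqlContext.sparkContext.parallelize(0 until n).map(Row(_))
}
```

Because the source is resolved by its format string on the JVM side, the same implementation would then be reachable from PySpark with no extra binding code, e.g. `sqlContext.read.format("com.example.points").option("n", "10").load()` (package name invented here).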
On Mon, Sep 28, 2015 at 9:00 PM, Ted Yu wrote:
> See this thread:
>
> http://search-hadoop.com/m/q3RTttmiYDqGc202
>
> And:
>
>
Yep, we've designed it so that we take care of any translation that needs
to be done for you.
On Tue, Sep 29, 2015 at 10:39 AM, Jerry Lam wrote:
> Hi Michael and Ted,
>
> Thank you for the reference. Is it true that once I implement a custom
> data source, it can be used
Hi Michael and Ted,
Thank you for the reference. Is it true that once I implement a custom data
source, it can be used in all Spark-supported languages, that is, Scala,
Java, Python and R? :)
I want to take advantage of the interoperability that is already built into
Spark.
Thanks!
Jerry
On Tue,
See this thread:
http://search-hadoop.com/m/q3RTttmiYDqGc202
And:
http://spark.apache.org/docs/latest/sql-programming-guide.html#data-sources
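The data sources section linked above describes generic load/save through format strings; in Scala (Spark 1.5-era API, using the example files bundled with the Spark distribution) that looks like:

```scala
// Read with one data source, write with another; the format string
// selects the implementation, built-in or third-party alike.
val df = sqlContext.read.format("json")
  .load("examples/src/main/resources/people.json")
df.select("name", "age").write.format("parquet").save("namesAndAges.parquet")
```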
> On Sep 28, 2015, at 8:22 PM, Jerry Lam wrote:
>
> Hi spark users and developers,
>
> I'm trying to learn how to implement a