Hi,

Thanks for the rapid response.

I'd appreciate it if there were some more documentation for this within
the Spark documentation, covering questions like these:

- I'm a Source/Output format developer - should I add this?
- I'm an internal company library developer with logic that runs
  automatically (for instance, adding monitoring and metrics, or
  calculating efficiency) - should I use it?
- I'm an internal company library developer with a custom ETL pipeline
  (for instance, read from Kafka and save to Delta/Iceberg) that people
  reuse so the same logic isn't rewritten by dozens of developers across
  the company - they just provide some params and the pipeline runs for
  them. Should/can we do something like that to make upgrades and bug
  fixes easier? (See the sketches after this list for what I mean.)
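To make that last scenario concrete, here is a minimal sketch of what
such a company-internal wrapper might look like today with plain Spark
APIs. All the names (KafkaToDelta, PipelineParams) are made up for
illustration, and it assumes the spark-sql-kafka and delta-spark
artifacts are on the classpath; the open question is how a library like
this should be packaged once everyone is on Spark Connect:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.streaming.StreamingQuery

    // Teams pass params; the library owns the pipeline logic, so
    // upgrades and bug fixes happen in one place.
    final case class PipelineParams(
        bootstrapServers: String,
        topic: String,
        targetPath: String,
        checkpointPath: String)

    object KafkaToDelta {
      def run(spark: SparkSession, p: PipelineParams): StreamingQuery = {
        val source = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", p.bootstrapServers)
          .option("subscribe", p.topic)
          .load()

        // This is where the shared, centrally maintained logic lives:
        // parsing, monitoring hooks, metrics, and so on.
        source.writeStream
          .format("delta")
          .option("checkpointLocation", p.checkpointPath)
          .start(p.targetPath)
      }
    }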
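And for reference, this is roughly what the server side of a Spark
Connect extension looks like in the Spark 3.5-era developer API: a
plugin that recognizes a custom protobuf message and turns it into a
logical plan. MyRelationProto and its getter are a hypothetical
generated protobuf class; the RelationPlugin trait and the
spark.connect.extensions.relation.classes config are the real hooks,
but the exact signatures should be checked against the version you
target:

    import com.google.protobuf.{Any => ProtoAny}
    import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, Range}
    import org.apache.spark.sql.connect.planner.SparkConnectPlanner
    import org.apache.spark.sql.connect.plugin.RelationPlugin

    class MyCompanyRelationPlugin extends RelationPlugin {
      override def transform(
          relation: ProtoAny,
          planner: SparkConnectPlanner): Option[LogicalPlan] = {
        if (relation.is(classOf[MyRelationProto])) {
          val msg = relation.unpack(classOf[MyRelationProto])
          // Turn the custom message into whatever plan it describes;
          // a trivial Range stands in for real logic here.
          Some(Range(0, msg.getNumRows, 1, None))
        } else {
          None // not our message type; let other registered plugins try
        }
      }
    }

    // Registered on the Connect server via:
    //   spark.connect.extensions.relation.classes=com.example.MyCompanyRelationPlugin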

Thanks!
Nimrod

On Mon, Feb 3, 2025 at 4:23 PM Herman van Hovell <her...@databricks.com>
wrote:

> Hi Nimrod,
>
> We are working on this as we speak.
>
> There is already a PR out for the extensions use case:
> https://github.com/apache/spark/pull/49604
>
> Kind regards,
> Herman
>
> On Mon, Feb 3, 2025 at 10:10 AM Nimrod Ofek <ofek.nim...@gmail.com> wrote:
>
>> Hi,
>>
>> In https://spark.apache.org/spark-connect/ - at the bottom it says:
>>
>> Check out the guide on migrating from Spark JVM to Spark Connect to learn
>> more about how to write code that works with Spark Connect. Also, check out
>> how to build Spark Connect custom extensions to learn how to use
>> specialized logic.
>>
>> I think there should be links to those guides, but I couldn't find
>> them. I did find a quick start that shows how to write code in Python/
>> Scala - but not a migration guide (maybe I just didn't find it). More
>> importantly, I couldn't find anything about "how to build Spark Connect
>> custom extensions to learn how to use specialized logic".
>> Is there such a guide? If so, in which cases would one need it?
>>
>> On a side note - there is a small documentation bug here
>> <https://spark.apache.org/docs/latest/spark-connect-overview.html#what-is-supported-in-spark-34>
>> - it states Spark 3.4 although we are already at 3.5, with 4.0 coming
>> soon.
>>
>>
>> Thanks!
>> Nimrod
>>
>
