+1
Yeah, I tried to use Apache Livy so that we could run interactive queries, but 
the Spark Driver in Livy looks heavy.

The SPIP may resolve the issue.







At 2022-06-14 18:11:21, "Wenchen Fan" <cloud0...@gmail.com> wrote:

+1



On Tue, Jun 14, 2022 at 9:38 AM Ruifeng Zheng <ruife...@foxmail.com> wrote:

+1





------------------ Original Message ------------------
From: "huaxin gao" <huaxin.ga...@gmail.com>;
Date: Tuesday, June 14, 2022, 8:47 AM
To: "L. C. Hsieh"<vii...@gmail.com>;
Cc: "Spark dev list"<dev@spark.apache.org>;
Subject: Re: [VOTE][SPIP] Spark Connect


+1



On Mon, Jun 13, 2022 at 5:42 PM L. C. Hsieh <vii...@gmail.com> wrote:

+1

On Mon, Jun 13, 2022 at 5:41 PM Chao Sun <sunc...@apache.org> wrote:
>
> +1 (non-binding)
>
> On Mon, Jun 13, 2022 at 5:11 PM Hyukjin Kwon <gurwls...@gmail.com> wrote:
>>
>> +1
>>
>> On Tue, 14 Jun 2022 at 08:50, Yuming Wang <wgy...@gmail.com> wrote:
>>>
>>> +1.
>>>
>>> On Tue, Jun 14, 2022 at 2:20 AM Matei Zaharia <matei.zaha...@gmail.com> 
>>> wrote:
>>>>
>>>> +1, very excited about this direction.
>>>>
>>>> Matei
>>>>
>>>> On Jun 13, 2022, at 11:07 AM, Herman van Hovell 
>>>> <her...@databricks.com.INVALID> wrote:
>>>>
>>>> Let me kick off the voting...
>>>>
>>>> +1
>>>>
>>>> On Mon, Jun 13, 2022 at 2:02 PM Herman van Hovell <her...@databricks.com> 
>>>> wrote:
>>>>>
>>>>> Hi all,
>>>>>
>>>>> I’d like to start a vote for SPIP: "Spark Connect"
>>>>>
>>>>> The goal of the SPIP is to introduce a DataFrame-based client/server API 
>>>>> for Spark.
>>>>>
>>>>> Please also refer to:
>>>>>
>>>>> - Previous discussion in dev mailing list: [DISCUSS] SPIP: Spark Connect 
>>>>> - A client and server interface for Apache Spark.
>>>>> - Design doc: Spark Connect - A client and server interface for Apache 
>>>>> Spark.
>>>>> - JIRA: SPARK-39375
>>>>>
>>>>> Please vote on the SPIP for the next 72 hours:
>>>>>
>>>>> [ ] +1: Accept the proposal as an official SPIP
>>>>> [ ] +0
>>>>> [ ] -1: I don’t think this is a good idea because …
>>>>>
>>>>> Kind Regards,
>>>>> Herman
>>>>
>>>>

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org