+1

On 2025/02/24 05:30:34 Jungtaek Lim wrote:
> +1 (non-binding)
>
> On Mon, Feb 24, 2025 at 2:26 PM Gengliang Wang <ltn...@gmail.com> wrote:
>
> > +1
> >
> > On Sun, Feb 23, 2025 at 8:40 PM Hyukjin Kwon <gurwls...@apache.org> wrote:
> >
> >> +1
> >>
> >> On Mon, 24 Feb 2025 at 11:47, Wenchen Fan <cloud0...@gmail.com> wrote:
> >>
> >>> +1
> >>>
> >>> On Mon, Feb 24, 2025 at 7:51 AM John Zhuge <jzh...@apache.org> wrote:
> >>>
> >>>> +1 (non-binding)
> >>>>
> >>>> John Zhuge
> >>>>
> >>>> On Sun, Feb 23, 2025 at 2:37 PM huaxin gao <huaxin.ga...@gmail.com> wrote:
> >>>>
> >>>>> +1
> >>>>>
> >>>>> On Sun, Feb 23, 2025 at 1:51 PM serge rielau.com <se...@rielau.com> wrote:
> >>>>>
> >>>>>> +1, it's about time.
> >>>>>> Sent from my iPhone
> >>>>>>
> >>>>>> > On Feb 23, 2025, at 12:25 PM, L. C. Hsieh <vii...@gmail.com> wrote:
> >>>>>> >
> >>>>>> > +1
> >>>>>> >
> >>>>>> >> On Sun, Feb 23, 2025 at 7:51 AM Max Gekk <max.g...@gmail.com> wrote:
> >>>>>> >>
> >>>>>> >> Hi Spark devs,
> >>>>>> >>
> >>>>>> >> Following the discussion [1], I'd like to start the vote for the SPIP [2]. The SPIP aims to add a new data type, TIME, to the Spark SQL types. The new type should conform to TIME(n) WITHOUT TIME ZONE as defined by the SQL standard.
> >>>>>> >>
> >>>>>> >> This thread will be open for at least the next 72 hours. Please vote accordingly:
> >>>>>> >> [ ] +1: Accept the proposal as an official SPIP
> >>>>>> >> [ ] +0
> >>>>>> >> [ ] -1: I don't think this is a good idea because ...
> >>>>>> >>
> >>>>>> >> [1] https://lists.apache.org/thread/892vkskktqrx1czk9wm6l8vchpydrny2
> >>>>>> >> [2] https://issues.apache.org/jira/browse/SPARK-51162
> >>>>>> >>
> >>>>>> >> Yours faithfully,
> >>>>>> >> Max Gekk
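
For quick context on what is being voted on: TIME(n) WITHOUT TIME ZONE in the SQL standard is a wall-clock time of day with up to n fractional-second digits and no date or time-zone component. A minimal sketch of those semantics, using only Python's standard-library datetime.time (this is not a Spark API; the actual Spark-side surface is whatever the SPIP defines), might look like this:

    from datetime import time

    # A time-of-day value with microsecond precision and no time zone,
    # i.e. the kind of value the proposed TIME type would hold.
    t = time(hour=12, minute=34, second=56, microsecond=123456)

    print(t)         # 12:34:56.123456
    print(t.tzinfo)  # None -- no time-zone information is attached

The precision parameter n in TIME(n) bounds the number of fractional-second digits; microsecond=123456 above corresponds to n = 6.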