Hi, I'm kind of new to the whole Apache process (not specifically Spark). Does that mean that DataSourceV2 is dead, or that it stays experimental? Thanks for clarifying for a newbie.
jg

> On Mar 3, 2019, at 11:21, Ryan Blue <rb...@netflix.com.invalid> wrote:
>
> This vote fails with the following counts:
>
> 3 +1 votes:
>
> * Matt Cheah
> * Ryan Blue
> * Sean Owen (binding)
>
> 1 -0 vote:
>
> * Jose Torres
>
> 2 -1 votes:
>
> * Mark Hamstra (binding)
> * Mridul Muralidharan (binding)
>
> Thanks for the discussion, everyone. It sounds to me that the main objection
> is simply that we’ve already committed to a release that removes deprecated
> APIs and we don’t want to commit to features at the same time. While I’m a
> bit disappointed, I think that’s a reasonable position for the community to
> take, and at least it is a clear result.
>
> rb
>
>> On Thu, Feb 28, 2019 at 8:38 AM Ryan Blue <rb...@netflix.com> wrote:
>>
>> I’d like to call a vote for committing to getting DataSourceV2 into a
>> functional state for Spark 3.0.
>>
>> For more context, please see the discussion thread, but here is a quick
>> summary of what this commitment means:
>>
>> * We think that a “functional DSv2” is an achievable goal for the Spark 3.0
>>   release
>> * We will consider this a blocker for Spark 3.0 and take reasonable steps
>>   to make it happen
>> * We will not delay the release without a community discussion
>>
>> Here’s what we’ve defined as a functional DSv2:
>>
>> * Add a plugin system for catalogs
>> * Add an interface for table catalogs (see the ongoing SPIP vote)
>> * Add an implementation of the new interface that calls SessionCatalog to
>>   load v2 tables
>> * Add a resolution rule to load v2 tables from the v2 catalog
>> * Add CTAS logical and physical plan nodes
>> * Add conversions from SQL parsed plans to v2 logical plans (e.g., INSERT
>>   INTO support)
>>
>> Please vote in the next 3 days on whether you agree with committing to
>> this goal.
>>
>> [ ] +1: Agree that we should consider a functional DSv2 implementation a
>>         blocker for Spark 3.0
>> [ ] +0: . . .
>> [ ] -1: I disagree with this goal because . . .
>>
>> Thank you!
>>
>> --
>> Ryan Blue
>> Software Engineer
>> Netflix
>
> --
> Ryan Blue
> Software Engineer
> Netflix
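For other newcomers following the thread: the "plugin system for catalogs" and "interface for table catalogs" items in the quoted proposal can be pictured as a small pluggable interface that Spark would initialize by name and then ask to load or create tables. The sketch below is purely illustrative, in the spirit of the SPIP; the interface and method names here are hypothetical, not the actual Spark API.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical, simplified table abstraction: just a name and a
// DDL-style schema string (the real proposal uses a structured schema).
interface Table {
    String name();
    String schema();
}

// Hypothetical catalog plugin interface: Spark would look up an
// implementation by catalog name, call initialize() with its options,
// and route table resolution through it.
interface TableCatalog {
    void initialize(String catalogName, Map<String, String> options);
    Table createTable(String identifier, String schema);
    Table loadTable(String identifier); // throws if the table is missing
}

// A trivial in-memory implementation, standing in for e.g. an adapter
// that delegates to SessionCatalog for existing (v1) tables.
class InMemoryTableCatalog implements TableCatalog {
    private final Map<String, Table> tables = new HashMap<>();

    @Override
    public void initialize(String catalogName, Map<String, String> options) {
        // A real plugin would read connection details from the options map.
    }

    @Override
    public Table createTable(String identifier, String schema) {
        Table t = new Table() {
            public String name() { return identifier; }
            public String schema() { return schema; }
        };
        tables.put(identifier, t);
        return t;
    }

    @Override
    public Table loadTable(String identifier) {
        Table t = tables.get(identifier);
        if (t == null) {
            throw new IllegalArgumentException("Table not found: " + identifier);
        }
        return t;
    }
}

public class CatalogDemo {
    public static void main(String[] args) {
        // A resolution rule would pick the catalog from a qualified name
        // (e.g. "demo.db.events") and delegate the rest of the lookup to it.
        TableCatalog catalog = new InMemoryTableCatalog();
        catalog.initialize("demo", new HashMap<>());
        catalog.createTable("db.events", "id BIGINT, ts TIMESTAMP");
        Table t = catalog.loadTable("db.events");
        System.out.println(t.name() + ": " + t.schema());
    }
}
```

The point of the design, as described in the quoted list, is that once resolution goes through such an interface, multiple catalogs (session, Hive, external metastores) can coexist behind one lookup path.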