Thanks. Proceeding with the release, except Google and Databricks, which
will get RC3.
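
For reference, the `asDict` failures quoted below come down to the Databricks provider RC returning plain Python lists where callers expected pyspark-style Row objects. A minimal sketch of the mismatch — the `Row` class here is an illustrative stand-in, not the provider's or pyspark's actual implementation:

```python
from dataclasses import dataclass
from typing import Any


@dataclass
class Row:
    """Illustrative stand-in for a pyspark-style Row (not the real class)."""

    col_name: str
    data_type: str
    comment: Any = None

    def asDict(self) -> dict:
        # A Row exposes its named fields as a dict; a plain list cannot.
        return {
            "col_name": self.col_name,
            "data_type": self.data_type,
            "comment": self.comment,
        }


# Behaviour downstream code relied on before the change:
row = Row("id", "int", None)
assert row.asDict() == {"col_name": "id", "data_type": "int", "comment": None}

# Behaviour reported against RC2 - results came back as plain lists,
# so downstream .asDict() calls raised AttributeError:
result = ["id", "int", None]
try:
    result.asDict()  # type: ignore[attr-defined]
except AttributeError as exc:
    print(exc)  # 'list' object has no attribute 'asDict'
```

This also matches the assertion diff in the quoted log (`['id', 'int', None]` vs `Row(col_name='id', data_type='int', comment=None)`).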

On Sun, Dec 17, 2023 at 3:03 PM Aritra Basu <[email protected]>
wrote:

> +1 (non binding)
>
> --
> Regards,
> Aritra Basu
>
> On Sun, Dec 17, 2023, 4:55 PM Amogh Desai <[email protected]>
> wrote:
>
> > +1 non binding
> >
> > On Sun, 17 Dec 2023 at 4:05 PM, Bolke de Bruin <[email protected]> wrote:
> >
> > > +1 binding
> > >
> > > Let's get this out.
> > >
> > > B
> > >
> > > Sent from my iPhone
> > >
> > > > On 17 Dec 2023, at 09:20, Rahul Vats <[email protected]> wrote:
> > > >
> > > > +1 non-binding
> > > >
> > > > Regards,
> > > > Rahul Vats
> > > > 9953794332
> > > >
> > > >
> > > >> On Sun, 17 Dec 2023 at 13:18, utkarsh sharma <[email protected]> wrote:
> > > >>
> > > >> +1 non-binding
> > > >>
> > > >> On Sun, 17 Dec 2023 at 1:15 PM, Pankaj Singh <[email protected]> wrote:
> > > >>
> > > >>> +1 (non-binding) since we have excluded the Google and Databricks
> > > >>> providers from this release.
> > > >>>
> > > >>>> On Sun, Dec 17, 2023 at 12:48 PM Jarek Potiuk <[email protected]> wrote:
> > > >>>
> > > >>>> Bump ..
> > > >>>>
> > > >>>>> On Fri, Dec 15, 2023 at 6:26 PM Jarek Potiuk <[email protected]> wrote:
> > > >>>>
> > > >>>>> One more +1 needed, please :)
> > > >>>>>
> > > >>>>>> On Thu, Dec 14, 2023 at 1:39 PM Hussein Awala <[email protected]> wrote:
> > > >>>>>
> > > >>>>>> +1 (binding) for daskexecutor, docker and odbc providers
> > > >>>>>>
> > > >>>>>>> On Thu 14 Dec 2023 at 13:37, Jarek Potiuk <[email protected]> wrote:
> > > >>>>>>
> > > >>>>>>> A small reminder. Despite the -1s on Google and Databricks, I
> > > >>>>>>> would love to have some +1s on the remaining ones (daskexecutor,
> > > >>>>>>> docker, odbc) - we could release them now while we iterate on
> > > >>>>>>> the last remaining databricks fix and release RC3 for the
> > > >>>>>>> removed ones.
> > > >>>>>>>
> > > >>>>>>> Need two PMC +1s, pretty please :) .
> > > >>>>>>>
> > > >>>>>>> J
> > > >>>>>>>
> > > >>>>>>> On Wed, Dec 13, 2023 at 2:08 PM Wei Lee <[email protected]> wrote:
> > > >>>>>>>
> > > >>>>>>>> -1 (non-binding) for the google and databricks providers. Sent
> > > >>>>>>>> https://github.com/apache/airflow/pull/36202 to fix the google
> > > >>>>>>>> issue.
> > > >>>>>>>>
> > > >>>>>>>> Best,
> > > >>>>>>>> Wei
> > > >>>>>>>>
> > > >>>>>>>> On Wed, Dec 13, 2023 at 6:34 PM, Jarek Potiuk <[email protected]> wrote:
> > > >>>>>>>>
> > > >>>>>>>>> Google and Databricks will get RC3 (PRs to fix them are
> > > >>>>>>>>> already created).
> > > >>>>>>>>> I'd still love to have +1 PMC votes on the remaining 3 :)
> > > >>>>>>>>>
> > > >>>>>>>>> J
> > > >>>>>>>>>
> > > >>>>>>>>> On Wed, Dec 13, 2023 at 1:59 PM Pankaj Singh <[email protected]> wrote:
> > > >>>>>>>>>
> > > >>>>>>>>>> -1 (non-binding) for the google and databricks providers,
> > > >>>>>>>>>> since a bug has been found.
> > > >>>>>>>>>>
> > > >>>>>>>>>> On Wed, Dec 13, 2023 at 3:45 PM Utkarsh Sharma
> > > >>>>>>>>>> <[email protected]> wrote:
> > > >>>>>>>>>>
> > > >>>>>>>>>>> -1 (non-binding)
> > > >>>>>>>>>>>
> > > >>>>>>>>>>> I looked into the Databricks provider: the PR
> > > >>>>>>>>>>> <https://github.com/apache/airflow/pull/32319> introduced
> > > >>>>>>>>>>> breaking changes. I have added my concern in a comment
> > > >>>>>>>>>>> <https://github.com/apache/airflow/pull/36161/files#r1425078748>.
> > > >>>>>>>>>>> Also, I don't think there is an easy way to make row objects
> > > >>>>>>>>>>> serializable.
> > > >>>>>>>>>>>
> > > >>>>>>>>>>> On Wed, Dec 13, 2023 at 12:28 PM Rahul Vats <[email protected]> wrote:
> > > >>>>>>>>>>>
> > > >>>>>>>>>>>> -1 (non-binding). Our example DAGs are failing for the
> > > >>>>>>>>>>>> Google and Databricks providers; we are working on fixes.
> > > >>>>>>>>>>>>
> > > >>>>>>>>>>>>   - https://pypi.org/project/apache-airflow-providers-google/10.13.0rc2/
> > > >>>>>>>>>>>>   - https://pypi.org/project/apache-airflow-providers-databricks/5.1.0rc2/
> > > >>>>>>>>>>>>
> > > >>>>>>>>>>>> Regards,
> > > >>>>>>>>>>>> Rahul Vats
> > > >>>>>>>>>>>> 9953794332
> > > >>>>>>>>>>>>
> > > >>>>>>>>>>>>
> > > >>>>>>>>>>>> On Wed, 13 Dec 2023 at 11:35, Phani Kumar <[email protected]> wrote:
> > > >>>>>>>>>>>>
> > > >>>>>>>>>>>>> -1 (non-binding). We are finding failures in astro-sdk
> > > >>>>>>>>>>>>> DAGs after using the databricks and google RCs.
> > > >>>>>>>>>>>>> We are working on creating PRs for the fixes.
> > > >>>>>>>>>>>>>
> > > >>>>>>>>>>>>> FAILED tests_integration/databases/databricks_tests/test_delta.py::test_delta_run_sql
> > > >>>>>>>>>>>>> - AttributeError: 'list' object has no attribute 'asDict'
> > > >>>>>>>>>>>>> FAILED tests_integration/databases/databricks_tests/test_delta.py::test_delta_run_sql_with_parameters
> > > >>>>>>>>>>>>> - AttributeError: 'list' object has no attribute 'asDict'
> > > >>>>>>>>>>>>> FAILED tests_integration/databases/databricks_tests/test_delta.py::test_delta_create_table_with_columns[delta]
> > > >>>>>>>>>>>>> - AssertionError: assert ['id', 'int', None] == Row(col_name='id', data_type='int', comment=None)
> > > >>>>>>>>>>>>>   Full diff:
> > > >>>>>>>>>>>>>   - Row(col_name='id', data_type='int', comment=None)
> > > >>>>>>>>>>>>>   + ['id', 'int', None]
> > > >>>>>>>>>>>>> ERROR tests_integration/databases/databricks_tests/test_delta.py::test_create_table_from_select_statement[delta]
> > > >>>>>>>>>>>>> - airflow.exceptions.AirflowException: Databricks job failed. Job info
> > > >>>>>>>>>>>>> ***'job_id': 438976959785934, 'run_id': 268958235083047,
> > > >>>>>>>>>>>>> 'creator_user_name': '[email protected]', 'number_in_job': 268958235083047,
> > > >>>>>>>>>>>>> 'state': ***'life_cycle_state': 'TERMINATED', 'result_state': 'FAILED',
> > > >>>>>>>>>>>>> 'state_message': 'Workload failed, see run output for details',
> > > >>>>>>>>>>>>> 'user_cancelled_or_timedout': False***, 'task': ***'spark_python_task':
> > > >>>>>>>>>>>>> ***'python_file': 'dbfs:/mnt/pyscripts/load_file__tmp_en0ra8imcxfee4sp8b9y4hzqj4bjy5y2fvgpo9gv0s3pzvmzqv1tfaza9.py'***,
> > > >>>>>>>>>>>>> 'cluster_spec': ***'existing_cluster_id': '***'***, 'cluster_instance':
> > > >>>>>>>>>>>>> ***'cluster_id': '***', 'spark_context_id': '4902558347078657686'***,
> > > >>>>>>>>>>>>> 'start_time': 1702436831591, 'setup_duration': 1000,
> > > >>>>>>>>>>>>> 'execution_duration': 12000, 'cleanup_duration': 0,
> > > >>>>>>>>>>>>> 'end_time': 1702436844777, 'run_name': 'Untitled',
> > > >>>>>>>>>>>>> 'run_page_url': 'https://dbc-9c390870-65ef.cloud.databricks.com/?o=4256138892007661#job/438976959785934/run/268958235083047',
> > > >>>>>>>>>>>>> 'run_type': 'SUBMIT_RUN', 'attempt_number': 0, 'format': 'SINGLE_TASK'***
> > > >>>>>>>>>>>>>
> > > >>>>>>>>>>>>> On Wed, Dec 13, 2023 at 4:02 AM Jarek Potiuk <[email protected]> wrote:
> > > >>>>>>>>>>>>>
> > > >>>>>>>>>>>>>> Hey all,
> > > >>>>>>>>>>>>>>
> > > >>>>>>>>>>>>>> [Filling-in for Elad, who had no time to do it this time]
> > > >>>>>>>>>>>>>>
> > > >>>>>>>>>>>>>> I have just cut the new wave of Airflow Providers
> > > >>>>>>>>>>>>>> packages. This email is calling a vote on the release,
> > > >>>>>>>>>>>>>> which will last for 24 hours - which means that it will
> > > >>>>>>>>>>>>>> end on December 13, 2023 at 22:00 UTC, and until 3
> > > >>>>>>>>>>>>>> binding +1 votes have been received.
> > > >>>>>>>>>>>>>>
> > > >>>>>>>>>>>>>> Following our processes, this is an accelerated vote,
> > > >>>>>>>>>>>>>> taking into account that the RC1 version has been tested
> > > >>>>>>>>>>>>>> and it only contains incremental changes.
> > > >>>>>>>>>>>>>>
> > > >>>>>>>>>>>>>> Consider this my (binding) +1.
> > > >>>>>>>>>>>>>>
> > > >>>>>>>>>>>>>> This release contains fixes to issues found in rc1 in the
> > > >>>>>>>>>>>>>> google, docker, odbc, and databricks providers, and the
> > > >>>>>>>>>>>>>> last (after scheduling for removal) daskexecutor provider
> > > >>>>>>>>>>>>>> release.
> > > >>>>>>>>>>>>>>
> > > >>>>>>>>>>>>>> Airflow Providers are available at:
> > > >>>>>>>>>>>>>> https://dist.apache.org/repos/dist/dev/airflow/providers/
> > > >>>>>>>>>>>>>>
> > > >>>>>>>>>>>>>> *apache-airflow-providers-<PROVIDER>-*.tar.gz* are the
> > > >>>>>>>>>>>>>> Python "sdist" release - they are also the official
> > > >>>>>>>>>>>>>> "sources" for the provider packages.
> > > >>>>>>>>>>>>>>
> > > >>>>>>>>>>>>>> *apache_airflow_providers_<PROVIDER>-*.whl are the binary
> > > >>>>>>>>>>>>>> Python "wheel" release.
> > > >>>>>>>>>>>>>>
> > > >>>>>>>>>>>>>> The test procedure for PMC members is described in:
> > > >>>>>>>>>>>>>> https://github.com/apache/airflow/blob/main/dev/README_RELEASE_PROVIDER_PACKAGES.md#verify-the-release-candidate-by-pmc-members
> > > >>>>>>>>>>>>>>
> > > >>>>>>>>>>>>>> The test procedure for Contributors who would like to
> > > >>>>>>>>>>>>>> test this RC is described in:
> > > >>>>>>>>>>>>>> https://github.com/apache/airflow/blob/main/dev/README_RELEASE_PROVIDER_PACKAGES.md#verify-the-release-candidate-by-contributors
> > > >>>>>>>>>>>>>>
> > > >>>>>>>>>>>>>> Public keys are available at:
> > > >>>>>>>>>>>>>> https://dist.apache.org/repos/dist/release/airflow/KEYS
> > > >>>>>>>>>>>>>>
> > > >>>>>>>>>>>>>> Please vote accordingly:
> > > >>>>>>>>>>>>>>
> > > >>>>>>>>>>>>>> [ ] +1 approve
> > > >>>>>>>>>>>>>> [ ] +0 no opinion
> > > >>>>>>>>>>>>>> [ ] -1 disapprove with the reason
> > > >>>>>>>>>>>>>>
> > > >>>>>>>>>>>>>>
> > > >>>>>>>>>>>>>> Only votes from PMC members are binding, but members of
> > > >>>>>>>>>>>>>> the community are encouraged to test the release and vote
> > > >>>>>>>>>>>>>> with "(non-binding)".
> > > >>>>>>>>>>>>>>
> > > >>>>>>>>>>>>>> Please note that the version number excludes the 'rcX'
> > > >>>>>>>>>>>>>> string. This will allow us to rename the artifact without
> > > >>>>>>>>>>>>>> modifying the artifact checksums when we actually release.
> > > >>>>>>>>>>>>>>
> > > >>>>>>>>>>>>>> The status of testing the providers by the community is
> > > >>>>>>>>>>>>>> kept here:
> > > >>>>>>>>>>>>>>
> > > >>>>>>>>>>>>>> https://github.com/apache/airflow/issues/36194
> > > >>>>>>>>>>>>>>
> > > >>>>>>>>>>>>>> You can find the packages as well as detailed changelogs
> > > >>>>>>>>>>>>>> following the links below:
> > > >>>>>>>>>>>>>>
> > > >>>>>>>>>>>>>> https://pypi.org/project/apache-airflow-providers-daskexecutor/1.1.1rc2/
> > > >>>>>>>>>>>>>> https://pypi.org/project/apache-airflow-providers-google/10.13.0rc2/
> > > >>>>>>>>>>>>>> https://pypi.org/project/apache-airflow-providers-databricks/5.1.0rc2/
> > > >>>>>>>>>>>>>> https://pypi.org/project/apache-airflow-providers-docker/3.9.0rc2/
> > > >>>>>>>>>>>>>> https://pypi.org/project/apache-airflow-providers-odbc/4.3.0rc2/
> > > >>>>>>>>>>>>>>
> > > >>>>>>>>>>>>>> Cheers,
> > > >>>>>>>>>>>>>>
> > > >>>>>>>>>>>>>> J.
> > > >>>>>>>>>>>>>>
> > > >>>>>>>>>>>>>
> > > >>>>>>>>>>>>
> > > >>>>>>>>>>>
> > > >>>>>>>>>>
> > > >>>>>>>>>
> > > >>>>>>>>
> > > >>>>>>>
> > > >>>>>>
> > > >>>>>
> > > >>>>
> > > >>>
> > > >>
> > >
> > > ---------------------------------------------------------------------
> > > To unsubscribe, e-mail: [email protected]
> > > For additional commands, e-mail: [email protected]
> > >
> > >
> >
>
