Hi.

I discussed this topic with Kaxil and we have a solution that could be good
for everyone.

What do you think about releasing everything except Kubernetes now, and
next week releasing two packages - Google and Kubernetes? This way, users
will be able to use all the operators quickly, and we will not ship
packages with known defects.

Best regards,
Kamil Breguła

On Tue, Oct 6, 2020 at 9:33 AM Jarek Potiuk <[email protected]>
wrote:

> Kamil - I understand your concerns but we discussed it yesterday (at the
> 2.0 meeting - notes will be posted soon) that the whole point of the
> backport packages and separate providers is that we CAN release a subset of
> those separately.
>
> And the process that seems to be converging for releases is that we can
> exclude particular providers from the release without impacting
> the releasing of all the other providers.
>
> The only reason why we are releasing all the packages together is
> efficiency in voting (and we only release all of them together because we
> black-reformatted all of them and they have modifications that change
> source line numbers etc, so it makes sense to release all of them).
>
> I believe the -1 is not justified here. Can you please reconsider? I'd
> love for us all to agree on the process before I make the decision as
> release manager.
>
> Also just to remind the rules on package releases (
> https://www.apache.org/foundation/voting.html) - voting on package
> releases cannot be vetoed:
>
> VOTES ON PACKAGE RELEASES
>
> Votes on whether a package is ready to be released use majority approval
> -- i.e. at least three PMC members must vote affirmatively for release, and
> there must be more positive than negative votes. Releases *may not be
> vetoed*. Generally the community will cancel the release vote if anyone
> identifies serious problems, but in most cases the ultimate decision lies
> with the individual serving as release manager. The specifics of the
> process may vary from project to project, but the 'minimum quorum of three
> +1 votes' rule is universal.
>
> J.
>
>
> On Tue, Oct 6, 2020 at 12:14 AM Kamil Breguła <[email protected]>
> wrote:
>
>> Is there any time pressure, or any other pressure, for us to release this
>> package now? I think Kubernetes is highly anticipated by users, so it is
>> worth considering the arguments that motivate this decision.
>>
>> -1 to release without cncf.kubernetes and an incomplete google package
>> (without GKEPodOperator). Sorry.
>>
>> On Mon, Oct 5, 2020 at 10:29 PM Kaxil Naik <[email protected]> wrote:
>>
>>> +1 to release without cncf.kubernetes
>>>
>>> On Mon, Oct 5, 2020 at 6:27 PM Jarek Potiuk <[email protected]>
>>> wrote:
>>>
>>>> (though of course releasing it without Kubernetes might be a little
>>>> less exciting, Daniel ;)
>>>>
>>>> On Mon, Oct 5, 2020 at 7:25 PM Jarek Potiuk <[email protected]>
>>>> wrote:
>>>>
>>>>> We have 2 +1s now, and a comment that the "cncf.kubernetes" package is
>>>>> not yet working.
>>>>>
>>>>> My proposal:
>>>>>
>>>>> * we continue with the voting (hopefully one more +1) and release all
>>>>> the sources + backport packages, but do not publish the cncf.kubernetes one.
>>>>> * we fix the cncf.kubernetes package (who can take a look?) and vote on
>>>>> and release it separately.
>>>>>
>>>>> I think this will also happen in the future: we will be releasing a
>>>>> batch of providers and detect problems with one of them. We need an
>>>>> agreed process for that, and it might be the one I described above.
>>>>>
>>>>> J.
>>>>>
>>>>>
>>>>>
>>>>> On Mon, Oct 5, 2020 at 1:56 AM Kamil Breguła <
>>>>> [email protected]> wrote:
>>>>>
>>>>>> I see another problem. The kubernetes_engine.GKEStartPodOperator
>>>>>> operator is not included in the Google package and the kubernetes_engine
>>>>>> module contains references to the
>>>>>> airflow.contrib.operators.kubernetes_pod_operator.KubernetesPodOperator
>>>>>> operator instead of the one from the providers.cncf.kubernetes package.
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Mon, Oct 5, 2020 at 1:47 AM Kamil Breguła <
>>>>>> [email protected]> wrote:
>>>>>>
>>>>>>> I started testing and encountered three problems - two minor and one
>>>>>>> serious. I used the official Docker image for
>>>>>>> testing: apache/airflow:1.10.12, with gcloud installed for GKE
>>>>>>> authorization.
>>>>>>>
>>>>>>> 1. The configuration for StackdriverTaskHandler is not obvious and
>>>>>>> is not documented anywhere.
>>>>>>> I managed to get it working by creating the
>>>>>>> /opt/airflow/config/log_config.py file and setting an environment variable:
>>>>>>> AIRFLOW__LOGGING__LOGGING_CONFIG_CLASS=log_config.LOGGING_CONFIG
>>>>>>> The log_config.py file:
>>>>>>> https://gist.github.com/mik-laj/f94e9940bfd3f08cf69287ac62652bc5
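[Editor's note: the approach described above - a custom log_config.py that swaps the task handler for a Stackdriver one - can be sketched roughly as follows. This is a minimal sketch, not the contents of the gist; the stub DEFAULT_LOGGING_CONFIG and the handler class path are assumptions for illustration.]

```python
# Minimal sketch of a log_config.py routing task logs to Stackdriver.
# In a real deployment you would start from Airflow's default config:
#   from airflow.config_templates.airflow_local_settings import DEFAULT_LOGGING_CONFIG
# A stub stands in here so the sketch is self-contained.
import copy

DEFAULT_LOGGING_CONFIG = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {"airflow": {"format": "[%(asctime)s] %(levelname)s - %(message)s"}},
    "handlers": {"task": {"class": "logging.StreamHandler", "formatter": "airflow"}},
    "loggers": {"airflow.task": {"handlers": ["task"], "level": "INFO"}},
}

# Copy the default config and swap the task handler. The handler class path
# below is an assumption based on the backport google provider layout:
LOGGING_CONFIG = copy.deepcopy(DEFAULT_LOGGING_CONFIG)
LOGGING_CONFIG["handlers"]["task"] = {
    "class": "airflow.providers.google.cloud.log."
             "stackdriver_task_handler.StackdriverTaskHandler",
    "formatter": "airflow",
}
```

With this file on the Python path (e.g. in /opt/airflow/config), setting AIRFLOW__LOGGING__LOGGING_CONFIG_CLASS=log_config.LOGGING_CONFIG, as described above, makes Airflow load the modified config at startup.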
>>>>>>>
>>>>>>> However, I had more problems when I tried to use
>>>>>>> KubernetesPodOperator.
>>>>>>>
>>>>>>> 2. The following files are not included in the backport package
>>>>>>> cncf.kubernetes:
>>>>>>> - airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py
>>>>>>> - airflow/providers/cncf/kubernetes/example_dags/example_spark_kubernetes_spark_pi.yaml
>>>>>>>
>>>>>>> I downloaded the example_spark_kubernetes_spark_pi.yaml file from our
>>>>>>> repository and the example DAG works.
>>>>>>>
>>>>>>> 3. I had less success with another example,
>>>>>>> example_kubernetes_operator. I performed the following steps, which
>>>>>>> ended with an error.
>>>>>>> $ curl -LO
>>>>>>> https://raw.githubusercontent.com/apache/airflow/master/airflow/providers/cncf/kubernetes/example_dags/example_kubernetes.py
>>>>>>> $ cat example_kubernetes.py | sed
>>>>>>> 's/operators.bash/operators.bash_operator/g' | sed
>>>>>>> 's/in_cluster=True/in_cluster=False/g' > a.py
>>>>>>> $ mv a.py example_kubernetes.py
>>>>>>> $ airflow test example_kubernetes_operator write-xcom 2010-01-01
>>>>>>> .....
>>>>>>> [2020-10-04 23:29:06,623] {taskinstance.py:1150} ERROR - type object
>>>>>>> 'PodGenerator' has no attribute 'add_xcom_sidecar'
>>>>>>> Traceback (most recent call last):
>>>>>>>   File
>>>>>>> "/home/airflow/.local/lib/python3.6/site-packages/airflow/models/taskinstance.py",
>>>>>>> line 984, in _run_raw_task
>>>>>>>     result = task_copy.execute(context=context)
>>>>>>>   File
>>>>>>> "/home/airflow/.local/lib/python3.6/site-packages/airflow/providers/cncf/kubernetes/operators/kubernetes_pod.py",
>>>>>>> line 269, in execute
>>>>>>>     self.pod = self.create_pod_request_obj()
>>>>>>>   File
>>>>>>> "/home/airflow/.local/lib/python3.6/site-packages/airflow/providers/cncf/kubernetes/operators/kubernetes_pod.py",
>>>>>>> line 405, in create_pod_request_obj
>>>>>>>     pod = PodGenerator.add_xcom_sidecar(pod)
>>>>>>> AttributeError: type object 'PodGenerator' has no attribute
>>>>>>> 'add_xcom_sidecar'
>>>>>>>
>>>>>>> Have I missed an extra step needed to run this example?
>>>>>>>
>>>>>>> On Sun, Oct 4, 2020 at 7:11 PM Daniel Imberman <
>>>>>>> [email protected]> wrote:
>>>>>>>
>>>>>>>> +1 excited to finally encourage the Kubernetes providers!
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Oct 2, 2020 at 11:00 AM, Jarek Potiuk <
>>>>>>>> [email protected]> wrote:
>>>>>>>>
>>>>>>>> Hey all,
>>>>>>>>
>>>>>>>> I have cut Airflow Backport Providers 2020.10.5rc1. This email is
>>>>>>>> calling a vote on the release,
>>>>>>>> which will last for 72 hours - which means that it will end on Mon,
>>>>>>>> 5th of October 2020, 19:59:56 CEST.
>>>>>>>>
>>>>>>>> This is the first time we are releasing the "cncf.kubernetes" package
>>>>>>>> after the big backporting of kubernetes changes
>>>>>>>> to 1.10.11 and 1.10.12, so I'd appreciate it if some of our users
>>>>>>>> following the devlist could test this provider
>>>>>>>> - especially if you are waiting for it.
>>>>>>>>
>>>>>>>> Consider this my (binding) +1.
>>>>>>>>
>>>>>>>> Airflow Backport Providers 2020.10.5rc1 are available at:
>>>>>>>>
>>>>>>>> https://dist.apache.org/repos/dist/dev/airflow/backport-providers/2020.10.5rc1/
>>>>>>>>
>>>>>>>> *apache-airflow-backport-providers-2020.10.5rc1-source.tar.gz* is a
>>>>>>>> source release that comes
>>>>>>>> with INSTALL instructions.
>>>>>>>>
>>>>>>>> *apache-airflow-backport-providers-<PROVIDER>-2020.10.5rc1-bin.tar.gz*
>>>>>>>> are the binary
>>>>>>>> Python "sdist" releases.
>>>>>>>>
>>>>>>>> Public keys are available at:
>>>>>>>> https://dist.apache.org/repos/dist/release/airflow/KEYS
>>>>>>>>
>>>>>>>> Please vote accordingly:
>>>>>>>>
>>>>>>>> [ ] +1 approve
>>>>>>>> [ ] +0 no opinion
>>>>>>>> [ ] -1 disapprove with the reason
>>>>>>>>
>>>>>>>>
>>>>>>>> Only votes from PMC members are binding, but members of the
>>>>>>>> community are
>>>>>>>> encouraged to test the release and vote with "(non-binding)".
>>>>>>>>
>>>>>>>> Please note that the version number excludes the 'rcX' string, so
>>>>>>>> it's now
>>>>>>>> simply 2020.10.5. This will allow us to rename the artifact without
>>>>>>>> modifying
>>>>>>>> the artifact checksums when we actually release.
>>>>>>>>
>>>>>>>> Each of the packages contains a detailed changelog. Here is the
>>>>>>>> list of links to
>>>>>>>> the released packages and changelogs:
>>>>>>>>
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-amazon/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-apache-cassandra/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-apache-druid/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-apache-hdfs/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-apache-hive/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-apache-kylin/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-apache-livy/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-apache-pig/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-apache-pinot/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-apache-spark/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-apache-sqoop/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-celery/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-cloudant/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-cncf-kubernetes/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-databricks/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-datadog/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-dingding/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-discord/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-docker/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-elasticsearch/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-exasol/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-facebook/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-ftp/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-google/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-grpc/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-hashicorp/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-http/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-imap/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-jdbc/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-jenkins/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-jira/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-microsoft-azure/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-microsoft-mssql/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-microsoft-winrm/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-mongo/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-mysql/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-odbc/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-openfaas/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-opsgenie/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-oracle/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-pagerduty/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-plexus/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-postgres/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-presto/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-qubole/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-redis/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-salesforce/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-samba/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-segment/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-sftp/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-singularity/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-slack/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-snowflake/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-sqlite/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-ssh/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-vertica/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-yandex/2020.10.5rc1/
>>>>>>>>
>>>>>>>> https://pypi.org/project/apache-airflow-backport-providers-zendesk/2020.10.5rc1/
>>>>>>>>
>>>>>>>>
>>>>>>>> Cheers,
>>>>>>>> Jarek
>>>>>>>>
>>>>>>>> --
>>>>>>>>
>>>>>>>> Jarek Potiuk
>>>>>>>> Polidea <https://www.polidea.com/> | Principal Software Engineer
>>>>>>>>
>>>>>>>> M: +48 660 796 129
>>>>>>>>
>>>>>>>>
>>>>>
>>>>> --
>>>>>
>>>>> Jarek Potiuk
>>>>> Polidea <https://www.polidea.com/> | Principal Software Engineer
>>>>>
>>>>> M: +48 660 796 129
>>>>>
>>>>>
>>>>
>>>> --
>>>>
>>>> Jarek Potiuk
>>>> Polidea <https://www.polidea.com/> | Principal Software Engineer
>>>>
>>>> M: +48 660 796 129
>>>>
>>>>
>
> --
>
> Jarek Potiuk
> Polidea <https://www.polidea.com/> | Principal Software Engineer
>
> M: +48 660 796 129
>
>
