Re: [D] Apache Spark Provider not working with python3.8 anymore [airflow]

2024-10-31 Thread via GitHub


GitHub user potiuk edited a comment on the discussion: Apache Spark Provider 
not working with python3.8 anymore

This is a result of our policy: 
https://github.com/apache/airflow/blob/main/README.md#support-for-python-and-kubernetes-versions

Providers are released from `main`, and the policy is clear about removing 
support for Python versions in `main`.

I heartily recommend that you upgrade to Python 3.9+ - because Python 3.8 will 
not receive any more security fixes - so if you want to stick with it, you are 
exposing your employer to any security issue that the Python team might fix 
only for 3.9+.

If - despite knowing the risk - you want to stay with Python 3.8 and still 
install the latest provider version, there is nothing stopping you from checking 
out the right tag, backporting the provider to 3.8, releasing it locally in your 
own registry and using it from there. You are absolutely not blocked, and it 
makes sense that the burden of doing so falls on those who want to follow 
less-than-best practices for commercial reasons (for example because an upgrade 
would cost them something in their own systems) rather than expecting volunteer 
maintainers to handle their outdated installations.
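
To make that concrete, here is a rough sketch of what such a backport-and-publish 
flow could look like. It is only an illustration, not an official procedure: the 
tag name, the provider path inside the checkout and the registry URL are 
placeholders, it assumes `git`, the `build` package and `twine` are installed, 
and the actual backporting (step 2) still has to be done by hand.

```python
# Illustrative sketch only - NOT an official Airflow procedure. The tag name,
# provider path and registry URL below are placeholders you must adapt.
import glob
import subprocess

TAG = "providers-apache-spark/4.11.1"                 # placeholder: tag to backport
REGISTRY = "https://pypi.internal.example.com"        # placeholder: private package index
PROVIDER_DIR = "airflow-src/providers/apache/spark"   # placeholder: provider location in that tag


def run(*cmd):
    """Run a command and raise CalledProcessError if it fails."""
    subprocess.run(cmd, check=True)


# 1. Fetch the sources at the chosen tag.
run("git", "clone", "--branch", TAG, "--depth", "1",
    "https://github.com/apache/airflow.git", "airflow-src")

# 2. Do the actual backport by hand: relax `requires-python` in the provider's
#    pyproject.toml and adjust any 3.9-only code. This is where the real effort
#    lies and cannot be automated blindly.

# 3. Build a wheel from the (modified) provider directory.
run("python", "-m", "build", PROVIDER_DIR)

# 4. Upload the wheel to your own registry, then install it from there
#    with pip pointed at that index.
wheels = glob.glob(f"{PROVIDER_DIR}/dist/*")
run("python", "-m", "twine", "upload", "--repository-url", REGISTRY, *wheels)
```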

We are even about to merge a short chapter explaining in detail how to build 
your own version of the provider packages from Airflow's source code, as that 
information was a little bit buried: see #43524

This is a very deliberate decision of the community, made more than 3 years ago 
when we adopted the policy and voted on it on the devlist. Shifting the effort 
specifically to those users who prefer to stay behind was the driver for that 
decision, so the effort you need to put in to keep up with older versions of 
Python is expected - we deliberately designed our release policy that way.

Generally, in the very near future (~2027), when various regulations in the EU, 
US and China come into effect, you will be required to do this by law (i.e. 
upgrade to versions of software with the latest security fixes). You can read 
about it, for example, at 
https://ubuntu.com/blog/the-cyber-resilience-act-what-it-means-for-open-source - 
this is just a small preview of what is coming for your updates, and I heartily 
recommend you start upgrading your software early and often. That will decrease 
the pain later, when it will be cheaper for your company to upgrade than not to.

GitHub link: 
https://github.com/apache/airflow/discussions/43542#discussioncomment-1363


This is an automatically sent email for commits@airflow.apache.org.
To unsubscribe, please send an email to: commits-unsubscr...@airflow.apache.org



Re: [D] Apache Spark Provider not working with python3.8 anymore [airflow]

2024-10-31 Thread via GitHub


GitHub user duhizjame edited a discussion: Apache Spark Provider not working 
with python3.8 anymore

Hi everyone, 

I have noticed that since 4.11.1 the Apache Spark provider no longer works with 
Python 3.8. What was the reason for dropping it in a minor version, given that 
the provider requires Airflow >= 2.8.0, which is also still available on Python 3.8?

Supporting Python 3.8 would at least make the provider easier to use in all 
Spark Docker distributions (the official Spark 3.5.* Docker image uses Python 
3.8), and I am sure there are some distributions in the wild also running on 
this version 😄

GitHub link: https://github.com/apache/airflow/discussions/43542


This is an automatically sent email for commits@airflow.apache.org.
To unsubscribe, please send an email to: commits-unsubscr...@airflow.apache.org