potiuk commented on issue #41937:
URL: https://github.com/apache/airflow/issues/41937#issuecomment-2507724501

   > Thanks, to my understanding is this one to be implemented for all the packages that we release today? airflow, providers, airflow python client?
   
   Absolutely. We can start with providers - then we will have a chance to test 
it quickly - and then use it for the others.
    
   > I’m planning to implement it in a generalized way, so that it can be useful for other Apache projects. This would allow them to integrate their own custom scripts, if they have any, and run it as a GitHub Action, WDYT? :)
   
   Perfect. ASF already has a repo for shared actions: https://github.com/apache/infrastructure-actions. We are currently discussing splitting it into separate actions, but the idea is to have something that is reusable across many projects.
   
   I think the best (and most reusable) way of publishing is to use the packages released in SVN.
   
   We should be able to plug in this step in the release process. And we have 
two different steps there:
   
   1) RC candidates.
   
   * 
https://github.com/apache/airflow/blob/main/dev/README_RELEASE_PROVIDER_PACKAGES.md#publish-the-regular-convenience-package-to-pypi
   
   We should just download all packages from 
https://dist.apache.org/repos/dist/dev/airflow/providers/pypi/
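   A minimal sketch of that download step, assuming the proposed `pypi/` subfolder exists in the dev dist area (the `run` helper and `DRY_RUN` flag are illustrative additions so the script can be exercised without network access):

   ```shell
   #!/usr/bin/env bash
   # Sketch only: pull the RC "pypi" artifacts from the dev dist area.
   # DRY_RUN defaults to true so nothing is actually fetched.
   set -euo pipefail

   SVN_DEV_URL="https://dist.apache.org/repos/dist/dev/airflow/providers/pypi/"
   TARGET_DIR="${TARGET_DIR:-./dist}"
   DRY_RUN="${DRY_RUN:-true}"

   run() {
       # Print the command instead of executing it when DRY_RUN is enabled
       if [ "${DRY_RUN}" = "true" ]; then
           echo "DRY RUN: $*"
       else
           "$@"
       fi
   }

   # "svn export" fetches the files without creating a working copy
   run svn export --force "${SVN_DEV_URL}" "${TARGET_DIR}"
   ```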
   
   But we need to make sure that the "pypi" packages are also stored in SVN - the little difficulty is that we currently do not upload the "pypi" RC packages to SVN. The "rc" candidates in SVN are different from the ones we publish to PyPI, because they do not contain "rc" in the version (since they might potentially become the final packages to upload).
   
   So we should modify the process and make sure that we upload the "pypi" packages to SVN. I propose to add a "pypi" subfolder - the release process should be updated to clean/recreate the `pypi` subfolder and push the pypi packages there.
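   That clean/recreate step could look roughly like this (a sketch only; `SVN_DEV_CHECKOUT` and `DIST_DIR` are hypothetical names for the release manager's local checkout of the dev dist area and the freshly built artifacts, and `DRY_RUN` is illustrative):

   ```shell
   #!/usr/bin/env bash
   # Sketch: recreate the pypi/ subfolder in the dev dist working copy and
   # push the freshly built PyPI artifacts (the ones that DO carry the rc
   # suffix) there. Path names are hypothetical.
   set -euo pipefail

   SVN_DEV_CHECKOUT="${SVN_DEV_CHECKOUT:-./asf-dist-dev-airflow-providers}"
   DIST_DIR="${DIST_DIR:-./dist}"
   DRY_RUN="${DRY_RUN:-true}"

   run() {
       if [ "${DRY_RUN}" = "true" ]; then echo "DRY RUN: $*"; else "$@"; fi
   }

   # Drop the previous pypi/ content and recreate the folder
   run svn rm --force "${SVN_DEV_CHECKOUT}/pypi"
   run mkdir -p "${SVN_DEV_CHECKOUT}/pypi"
   # Copy the PyPI-ready artifacts into the working copy
   run cp -v "${DIST_DIR}"/*.tar.gz "${DIST_DIR}"/*.whl "${SVN_DEV_CHECKOUT}/pypi/"
   run svn add "${SVN_DEV_CHECKOUT}/pypi"
   run svn commit -m "Add pypi RC packages" "${SVN_DEV_CHECKOUT}"
   ```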
   
   Then our action should be as simple as: a) download the right packages from SVN, b) push them to PyPI via trusted publishing. I have a feeling that this might simply be a standard YAML composite action using existing actions from GitHub - nothing fancy. We might just add a few options:
   
   a) test mode - do everything except the final step, which would just print what would be done
   b) verification options - it should be possible to also download and verify the signatures and checksums (and later maybe licences) of the downloaded artifacts.
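   The verification option could be sketched like this, assuming each artifact is accompanied by the usual ASF `.asc` signature and `.sha512` checksum files (the function names are made up; in a real composite action the PyPI upload itself would be handled by the trusted-publishing action, not by this script):

   ```shell
   #!/usr/bin/env bash
   # Sketch: verify signatures and checksums of downloaded artifacts.
   # Assumes the usual ASF layout: <artifact>, <artifact>.asc, <artifact>.sha512.
   set -euo pipefail

   verify_checksum() {
       # The .sha512 file is expected in "sha512sum -c" format: "<hash>  <path>"
       sha512sum -c "$1.sha512"
   }

   verify_signature() {
       # Requires the release KEYS to be imported into the local gpg keyring
       gpg --verify "$1.asc" "$1"
   }

   verify_all() {
       local artifact
       for artifact in "$1"/*.tar.gz "$1"/*.whl; do
           [ -e "${artifact}" ] || continue   # skip unmatched globs
           verify_checksum "${artifact}"
           verify_signature "${artifact}"
       done
   }
   ```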
   
   2) Final packages:
   
   Final packages can be downloaded directly from https://dist.apache.org/repos/dist/release/airflow/providers/ - see https://github.com/apache/airflow/blob/main/dev/README_RELEASE_PROVIDER_PACKAGES.md#publish-release-to-svn - i.e. download the packages from there and push them to PyPI.
   
   There, the little difficulty is that we need to know WHICH packages to upload. And this, I think, means that the easiest will be to do it just before this step:
   
   ```bash
   for file in "${SOURCE_DIR}"/*
   do
       base_file=$(basename "${file}")
       # Strip the rcN suffix (e.g. foo-1.0.0rc1.tar.gz -> foo-1.0.0.tar.gz)
       final_file="${base_file//rc[0-9]/}"
       # Keep a local copy of the final artifact for the PyPI upload step
       cp -v "${file}" "${AIRFLOW_REPO_ROOT}/dist/${final_file}"
       # Rename the artifact in SVN so it no longer carries the rc suffix
       svn mv "${file}" "${final_file}"
   done
   ```
   
   Because at this moment the release manager has already removed the files that were excluded during voting - so "dist" contains only the packages that will, in a moment, be promoted to "final" packages.
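   For illustration, the `${base_file//rc[0-9]/}` substitution in the loop above strips the rc suffix from a candidate file name:

   ```shell
   # The glob pattern "rc[0-9]" matches "rc" followed by one digit, and the
   # "//" form of the substitution removes every such occurrence
   base_file="apache_airflow_providers_google-10.0.0rc1.tar.gz"
   final_file="${base_file//rc[0-9]/}"
   echo "${final_file}"   # apache_airflow_providers_google-10.0.0.tar.gz
   ```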
   
   There, we likely need some controls - for example, being able to manually override which packages we want to publish.
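   A manual override could be as simple as an allow-list of glob patterns (a sketch; `PACKAGES_FILTER` is a hypothetical input of the action, empty meaning "publish everything"):

   ```shell
   #!/usr/bin/env bash
   # Sketch: decide whether a given artifact should be published, based on a
   # space-separated list of glob patterns in PACKAGES_FILTER (hypothetical
   # action input; empty means "publish everything").
   set -euo pipefail

   PACKAGES_FILTER="${PACKAGES_FILTER:-}"

   should_publish() {
       local base pattern result
       base="$(basename "$1")"
       # No filter configured: publish everything
       [ -z "${PACKAGES_FILTER}" ] && return 0
       result=1
       set -f   # keep the patterns from being expanded against the filesystem
       for pattern in ${PACKAGES_FILTER}; do
           # An unquoted case pattern gives us glob matching
           case "${base}" in
               ${pattern}) result=0 ;;
           esac
       done
       set +f
       return "${result}"
   }
   ```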
   
   We might start with a simple set of features, and later on the action might become a little more feature-rich.

