potiuk commented on a change in pull request #20664:
URL: https://github.com/apache/airflow/pull/20664#discussion_r786712445



##########
File path: scripts/ci/libraries/_build_images.sh
##########
@@ -800,6 +624,12 @@ function build_images::build_prod_images() {
         echo
         exit 1
     fi
+    if [[ ${PREPARE_BUILDX_CACHE} == "true" ]]; then
+        # Cache for prod image contains also build stage for buildx when mode=max specified!
+        docker_cache_prod_directive+=(
+            "--cache-to=type=registry,ref=${AIRFLOW_PROD_IMAGE}:cache,mode=max"

Review comment:
       Yep. I did consider it, but the size of the GHA cache is FAR too small for us: the 10GB limit will easily be exceeded 3x by the latest CI + prod images plus the v2-2 and v2-3 branches, and we have a lot more caches that are used directly by GHA (for various virtualenvs and the like). Our images are huge (for a good reason, though). Also, the PROD image cache is far bigger than the PROD image itself, because it contains the "build" stage cache, which is many times larger than the PROD image.
   
   > GitHub Actions cache saves both cache metadata and layers to GitHub's Cache service. This cache currently has a size limit of 10GB that is shared across different caches in the repo. If you exceed this limit, GitHub will save your cache but will begin evicting caches until the total size is less than 10 GB. Recycling caches too often can result in slower runtimes overall.
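The trade-off described above can be sketched with the two buildx cache backends side by side. This is an illustrative sketch only, not the exact Airflow CI invocation: the image reference `ghcr.io/example/airflow-prod` is a placeholder.

```shell
# Alternative rejected here: GitHub Actions cache backend (type=gha).
# Simple, but all layers count against the repo-wide 10GB limit, which
# these images alone would exceed several times over.
docker buildx build \
    --cache-from=type=gha \
    --cache-to=type=gha,mode=max \
    -t ghcr.io/example/airflow-prod .

# Approach used in this PR: registry cache backend. Layer cache is pushed
# to a separate ":cache" tag, so its size is bounded only by the registry.
# mode=max also exports layers from intermediate stages (e.g. "build"),
# which is why this cache is much larger than the final PROD image.
docker buildx build \
    --cache-from=type=registry,ref=ghcr.io/example/airflow-prod:cache \
    --cache-to=type=registry,ref=ghcr.io/example/airflow-prod:cache,mode=max \
    -t ghcr.io/example/airflow-prod .
```

With `mode=min` (the default) only the final stage's layers would be exported, which keeps the cache small but defeats caching of the heavy "build" stage.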




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
