assignUser commented on code in PR #13379:
URL: https://github.com/apache/arrow/pull/13379#discussion_r898176742


##########
ci/scripts/ccache_setup.sh:
##########
@@ -23,4 +23,4 @@ echo "ARROW_USE_CCACHE=ON" >> $GITHUB_ENV
 echo "CCACHE_COMPILERCHECK=content" >> $GITHUB_ENV
 echo "CCACHE_COMPRESS=1" >> $GITHUB_ENV
 echo "CCACHE_COMPRESSLEVEL=6" >> $GITHUB_ENV
-echo "CCACHE_MAXSIZE=500M" >> $GITHUB_ENV
+echo "CCACHE_MAXSIZE=2G" >> $GITHUB_ENV

Review Comment:
   But isn't that exactly what we want? With a cache size big enough, the files from different workflows would simply coexist in the cache until max-size is reached, at which point some would be evicted.
   
   We seem to have 80G of cache capacity, so we could use a large max-size like 20G, *but* due to the `actions/cache` issue this would almost guarantee that any less frequently used non-ccache caches would be evicted very quickly. To me, sccache seems to be the best solution for this issue at the moment...
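   
   For illustration only, a minimal sketch of what an sccache-based setup script could look like, mirroring the style of `ci/scripts/ccache_setup.sh`; the script name, the `ARROW_USE_SCCACHE` flag, and the bucket name are hypothetical, and an S3 backend is assumed so the compilation cache would live outside `actions/cache` entirely and not compete with other caches for the 80G quota:
   
   ```bash
   # Hypothetical ci/scripts/sccache_setup.sh -- a sketch, not part of this PR.
   # Assumes an S3 bucket exists; sccache reads these variables directly, so the
   # compiled-object cache is stored remotely instead of in actions/cache.
   echo "ARROW_USE_SCCACHE=ON" >> $GITHUB_ENV                  # hypothetical flag
   echo "SCCACHE_BUCKET=arrow-sccache-example" >> $GITHUB_ENV  # hypothetical bucket
   echo "SCCACHE_REGION=us-east-1" >> $GITHUB_ENV
   echo "SCCACHE_S3_KEY_PREFIX=sccache" >> $GITHUB_ENV
   ```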



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
