Jay-Lokhande commented on code in PR #5454:
URL: https://github.com/apache/couchdb/pull/5454#discussion_r1982826270


##########
configure:
##########
@@ -473,11 +492,19 @@ check_local_clouseau_dir() {
 fetch_file() {
     _file_name="$1"
     _file_url="$2"
+    _max_attempts=3
+    _attempt=1
 
-    if ! curl -sSL --max-redirs 1 -o "$_file_name" "$_file_url"; then
-       printf "ERROR: %s could not be downloaded.\n" "$_file_url" >&2
-       exit 1
-    fi
+    while [ $_attempt -le $_max_attempts ]; do
+        if curl -sSL --max-redirs 1 -o "$_file_name" "$_file_url"; then

Review Comment:
   > As far as I am aware `curl` already offers [a built-in retry 
mechanism](https://curl.se/docs/manpage.html#--retry).
   
   You’re absolutely right: curl does support retries via `--retry <num>` and `--retry-delay <seconds>`. I added the custom retry loop in `fetch_file` thinking it provided more control and explicit per-attempt feedback (e.g., warnings after each failed attempt), but I now see it’s redundant given curl’s native capabilities. I’ll remove the custom loop and update `fetch_file` to use curl’s built-in retry mechanism instead. Here’s the proposed change:
   
   ```shell
   fetch_file() {
       _file_name="$1"
       _file_url="$2"

       if ! curl -sSL --max-redirs 1 --retry 3 --retry-delay 2 -o "$_file_name" "$_file_url"; then
           printf "ERROR: Failed to download %s after retries.\n" "$_file_url" >&2
           exit 1
       fi
   }
   ```
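   For context (not part of the PR), `--retry 3` makes curl perform the initial attempt plus up to three retries on transient errors, waiting between attempts; `--retry-delay 2` replaces curl’s default exponential backoff with a fixed two-second wait. The fixed-delay semantics can be sketched in plain POSIX shell, using a hypothetical `flaky_fetch` stand-in (an assumption for illustration) that fails twice before succeeding:

   ```shell
   #!/bin/sh
   # Sketch of the retry-with-fixed-delay behavior of `--retry 3 --retry-delay 2`.
   # `flaky_fetch` is a made-up stand-in for the curl call: it fails on the
   # first two invocations and succeeds on the third.
   attempts=0
   flaky_fetch() {
       attempts=$((attempts + 1))
       [ "$attempts" -ge 3 ]
   }

   retries_left=3   # matches --retry 3: one initial attempt + up to 3 retries
   delay=0          # the real curl call would wait 2s (--retry-delay 2); 0 keeps the demo instant

   until flaky_fetch; do
       if [ "$retries_left" -le 0 ]; then
           printf "ERROR: gave up after retries.\n" >&2
           exit 1
       fi
       retries_left=$((retries_left - 1))
       sleep "$delay"
   done
   printf "fetched on attempt %s\n" "$attempts"
   ```

   With the stand-in above, the loop gives up only after four failed attempts in total, mirroring how curl counts the initial attempt separately from its retries.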



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
