When copying many files to a snapshot directory, try a mass copy
first, as it is much faster. It is, however, not portable and may
thus fail. If it does, fall back to per-file processing, which
always works.

This change results in a huge performance boost on systems where
the cp command supports all the required options (which includes
all systems using GNU coreutils).

Signed-off-by: Jean Delvare <[email protected]>
Reviewed-by: Raphael Hertzog <[email protected]>
---
 quilt/scripts/backup-files.in |   22 ++++++++++++++++++----
 1 file changed, 18 insertions(+), 4 deletions(-)

--- a/quilt/scripts/backup-files.in
+++ b/quilt/scripts/backup-files.in
@@ -261,10 +261,24 @@ copy_many()
        done
        exec 3>&-
 
-       while read -d $'\0' -r
-       do
-               copy "$REPLY"
-       done < "$NONEMPTY_FILES"
+       if [ -s "$NONEMPTY_FILES" ]; then
+               # Try a mass copy first, as it is much faster.
+               # It is however not portable and may thus fail. If it fails,
+               # fall back to per-file processing, which always works.
+
+               if xargs -0 cp -p --parents --target-directory="$OPT_PREFIX" \
+                  < "$NONEMPTY_FILES" 2> /dev/null; then
+                       while read -d $'\0' -r
+                       do
+                               $ECHO "Copying $REPLY"
+                       done < "$NONEMPTY_FILES"
+               else
+                       while read -d $'\0' -r
+                       do
+                               copy "$REPLY"
+                       done < "$NONEMPTY_FILES"
+               fi
+       fi
 }
 
 # Test if some backed up files have a link count greater than 1


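For reference, the fast path and its fallback can be sketched in
isolation (the file names and the dest directory below are made up
for the sketch; the script itself reads NUL-separated names from
$NONEMPTY_FILES and copies into $OPT_PREFIX):

```shell
#!/bin/sh
# Hypothetical stand-ins for the script's file list and target prefix.
set -- a.txt b.txt
mkdir -p dest
touch "$@"

# Fast path: GNU cp understands --parents and --target-directory.
# Errors are silenced because on non-GNU systems this is expected to fail.
if printf '%s\0' "$@" \
   | xargs -0 cp -p --parents --target-directory=dest 2> /dev/null; then
        echo "mass copy succeeded"
else
        # Portable fallback: copy one file at a time, recreating
        # each file's directory under dest by hand.
        for f in "$@"; do
                mkdir -p "dest/$(dirname "$f")"
                cp -p "$f" "dest/$f"
        done
fi
```

Either branch leaves the same tree under dest; the xargs invocation
simply batches many files into few cp processes, which is where the
speedup comes from.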