The "check for hard links" use case is almost always a no-op. First check quickly whether any work is needed at all; only if so, walk the list of files and unlink the faulty ones. This gives a large performance gain in the common case, at the cost of a very small loss in the uncommon case.
Signed-off-by: Jean Delvare <[email protected]>
Reviewed-by: Raphael Hertzog <[email protected]>
---
 quilt/scripts/backup-files.in | 14 ++++++++++++++
 1 file changed, 14 insertions(+)

--- a/quilt/scripts/backup-files.in
+++ b/quilt/scripts/backup-files.in
@@ -187,6 +187,13 @@ noop_nolinks()
 	fi
 }
 
+# Test if some backed up files have a link count greater than 1
+some_files_have_links()
+{
+	(cd "$OPT_PREFIX" && find . -type f -print0) \
+	| xargs -0 stat @STAT_HARDLINK@ 2> /dev/null | grep -qv '^1$'
+}
+
 ECHO=echo
 
 while [ $# -gt 0 ]; do
@@ -258,6 +265,13 @@ if [ "$1" = - ]; then
 		exit
 	fi
 
+	# We typically expect the link count of backed up files to be 1
+	# already, so check quickly that this is the case, and only if not,
+	# take the slow path and walk the file list in search of files to fix.
+	if [ "$OPT_WHAT" = noop_nolinks ] && ! some_files_have_links; then
+		exit
+	fi
+
 	find "$OPT_PREFIX" -type f -print \
 	| while read
 	do

_______________________________________________
Quilt-dev mailing list
[email protected]
http://lists.nongnu.org/mailman/listinfo/quilt-dev
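For readers outside the quilt build tree: the fast-path check above can be sketched as a standalone script. This is a minimal sketch, assuming GNU coreutils, where the `@STAT_HARDLINK@` configure substitution would expand to `-c %h` (print each file's hard-link count); on BSD stat the flag differs. The directory handling is simplified relative to the real backup-files script.

```shell
#!/bin/sh
# Minimal standalone sketch of the fast-path hard-link check.
# Assumes GNU coreutils: "stat -c %h" prints a file's link count,
# which is what quilt's @STAT_HARDLINK@ substitution provides on Linux.

# Succeed (exit 0) only if at least one regular file under $1
# has a link count other than 1.
some_files_have_links()
{
	(cd "$1" && find . -type f -print0) \
	| xargs -0 stat -c %h 2> /dev/null | grep -qv '^1$'
}

# Demonstration on a throwaway directory:
tmp=$(mktemp -d)
touch "$tmp/a"
some_files_have_links "$tmp" || echo "all link counts are 1"
ln "$tmp/a" "$tmp/b"          # a and b now share an inode (count 2)
some_files_have_links "$tmp" && echo "found a file with extra links"
rm -rf "$tmp"
```

The point of the patch is that this single `find | xargs stat | grep` pipeline is far cheaper than the per-file loop it short-circuits, so the common all-counts-are-1 case exits early.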
