To do this I keep an externally stored table of contents for the tarfile, listing files in the order in which they were added when the archive was created. My extract command then looks like
tar -xzf archive.tar.gz --files-from=filelist.txt
where filelist.txt has the format
-C /dir0
file0
-C /dir2
file2
-C /dir3
file3
-C /dir4
file4
-C /dir0
file5
-C /dir1
file6
etc.
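For context, the list is generated from that stored table of contents. A simplified sketch of the idea (the round-robin mapping to /dest0../dest4 is purely illustrative; my real script assigns each file its proper directory):

tar -tzf archive.tar.gz > toc.txt
# prefix each entry with a -C line, cycling through five destination dirs
awk '{ printf "-C /dest%d\n%s\n", NR % 5, $0 }' toc.txt > filelist.txt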
Looking at the actual stderr, the errors look like
tar: html/group__pangommEnums.html: Cannot open: Too many open files
tar: Skipping to next header
tar: html/group__PatternMatching.html: Cannot open: Too many open files
tar: Skipping to next header
tar: html/group__Random.html: Cannot open: Too many open files
...and so on. I do not see this problem if I omit --files-from and simply extract everything into one directory. The problem also appears to be related to the many small (1-10 KB) files being extracted round-robin across directories, so it may be a timing issue of some sort. I'm probably capable of doing a bit of test-and-debug myself, but I would certainly appreciate a clue as to where to start. Thanks!
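In case it helps anyone reproduce this, my first thought is to watch how many descriptors the tar process holds as it runs. A rough sketch, using only standard tools (bash, Linux-specific /proc check; paths as above):

# per-process open-file limit in effect
ulimit -n
# run the extraction in the background and sample its descriptor count once a second
tar -xzf archive.tar.gz --files-from=filelist.txt & pid=$!
while kill -0 $pid 2>/dev/null; do ls /proc/$pid/fd | wc -l; sleep 1; done

If the count climbs steadily with each -C entry rather than staying flat, that would suggest descriptors are being held per directory rather than a timing issue.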
