Re: [SLUG] trivial, but banging head on wall ...

2013-12-04 Thread Mark Suter
James,  

> First I think that having spaces in filenames is like wearing a
> tee shirt saying "hit me!".

Please remember that a pathname on a valid POSIX filesystem may
contain anything except the null character.
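For instance, these are all perfectly legal names (a throwaway
demonstration you can try in an empty directory):

  # spaces, tabs and even newlines are allowed inside a filename
  touch 'holiday photo 01.jpg'
  touch "$(printf 'very\tweird\nname.jpg')"
  # only NUL is impossible; '/' is reserved as the path separator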

> I'm trying to back up all my wife's pictures and although I can
> do any one file on the CLI, doing it in a script is humbling me. If
> anyone can help I'd be grateful. Thanks

If possible, just back everything up.  I'd much rather waste a
bit of disk space than have to tell someone that I didn't back
something up because they didn't ask for it.

That said, here's a quick command based on your script - try putting
this into http://explainshell.com/ if you don't grok the mechanics: 

  find . -type f \( -iname \*.jpg -o -iname \*.tif -o -iname \*.jpeg \
      -o -iname \*.qrf -o -iname \*.nef \) -print0 |
    cpio --null --format=crc --create |
    ssh j...@dvr.home cd /mnt/photos \; cpio --make-directories \
      --preserve-modification-time --extract

The first command, find, just lists all the matching files with
a null character between each name.  This will handle all kinds of
weird characters in the filenames.
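If you want to eyeball that list before archiving anything, something
like this prints one name per line (purely for inspection; a filename
that itself contains a newline will appear split in two):

  find . -type f -print0 | tr '\0' '\n' | less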

The second command, cpio, reads a list of filenames from standard
input, expecting them to be separated by null characters, and
creates an archive on standard output.
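If you'd rather inspect the archive before sending it anywhere, you
could write it to a local file first and list its contents (photos.cpio
is just an example name):

  find . -type f -print0 | cpio --null --format=crc --create > photos.cpio
  cpio --list < photos.cpio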

The third command, ssh, executes the given command on the remote
system.  That command is in two parts: first change directory   
into /mnt/photos and then extract the archive.  
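The backslash before the semicolon stops your local shell from eating
it, so the remote shell sees the two commands.  An equivalent way to
write it, which some people find easier to read, is to quote the whole
remote command (the && also means cpio never runs if the cd fails):

  ssh j...@dvr.home 'cd /mnt/photos && cpio --make-directories \
      --preserve-modification-time --extract'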

If you wanted to trade off CPU and RAM to save network bandwidth, this
might be a suitable variant, adding compression and decompression on
either side of the ssh connection:

  find . -type f \( -iname \*.jpg -o -iname \*.tif -o -iname \*.jpeg \
      -o -iname \*.qrf -o -iname \*.nef \) -print0 |
    cpio --null --format=crc --create |
    xz -9 --compress |
    ssh j...@dvr.home cd /mnt/photos \; xz --decompress \| cpio \
      --make-directories --preserve-modification-time --extract
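If installing xz on both ends is a hassle, another option is ssh's
built-in -C flag (weaker compression than xz -9, but no extra setup),
keeping the rest of the pipeline exactly as in the first command:

  ... | cpio --null --format=crc --create |
    ssh -C j...@dvr.home 'cd /mnt/photos && cpio --make-directories \
        --preserve-modification-time --extract'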

--  
Mark Suter http://zwitterion.org/ | I have often regretted my   
email addr su...@zwitterion.org | speech, never my silence.   
mobile 0411 262 316  gpg FB1BA7E9 | Xenocrates (396-314 B.C.)   
-- 
SLUG - Sydney Linux User's Group Mailing List - http://slug.org.au/
Subscription info and FAQs: http://slug.org.au/faq/mailinglists.html


Re: [SLUG] trivial, but banging head on wall ...

2013-12-04 Thread Zenaan Harkness
On 12/4/13, Matt Hope <matt.h...@gmail.com> wrote:
> Random tips:

cp's -t argument is sometimes useful

rsync -avu src/ host:/dest/ might be preferable
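e.g. something along these lines to copy only the image types (the
paths and host here are placeholders; note rsync patterns are
case-sensitive):

  rsync -avu --prune-empty-dirs \
      --include='*/' \
      --include='*.jpg' --include='*.jpeg' --include='*.tif' \
      --include='*.nef' --exclude='*' \
      ~/pictures/ user@host:/mnt/photos/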

when using find's -print0, also use:
 [e]grep's -z and -Z (or -zZ) options
 xargs' -0 option
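e.g. to filter the null-separated list and still hand it safely to
another command (the pattern and destination are just placeholders):

  find . -type f -print0 | grep -z 'IMG_' | xargs -0 cp -t /mnt/photos/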

Something I cut and pasted off the internet years ago:
  Way to solve it if you can't use GNU utilities:
find -name "*.txt" | sed 's/"/\\"/g; s/.*/"&"/' | xargs grep whatever
  This escapes out all the double quotes in the pathname and
  then wraps the entire pathname in double quotes.

And if your directory is excessively large (this was a reminder to
myself years ago, probably no longer applicable with modern
filesystems and proper direntry hashing):
For very large directories (>12K files), for some reason piping find
into sed can cause each iteration (i.e. each file) to take a second or
two. Bizarre. So use the following:
ls > zzz; then do the following:
cat zzz | xargs sed -i -e 's%http://wiki.jboss.org/wiki/Wiki.jsp?page=%%g' \
    -e 's%<base href="http://www.jboss.com/"/>%%g'

If you end up on your character-fishing expedition, you might prefer
to use the ls > tmp.file form, followed by some sed or perl.
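For example (the sed expression is a placeholder; note this simple
form only copes with names that contain no spaces or newlines):

  ls > tmp.file
  xargs sed -i -e 's/old-text/new-text/g' < tmp.file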

Good luck.
-- 
SLUG - Sydney Linux User's Group Mailing List - http://slug.org.au/
Subscription info and FAQs: http://slug.org.au/faq/mailinglists.html


Re: [SLUG] trivial, but banging head on wall ...

2013-12-03 Thread Matt Hope
Random tips:

- You can use find in one command, rather than looping over the file extensions:
find . -type f ! -name '*thumb*' \( -iname '*.tif' -o -iname '*.jpg' \
    -o -iname '*.jpeg' -o -name '*.orf' -o -name '*.nef' \)

- Might be a good idea to combine find's -print0 argument with a
null-delimited read (a shell variable can't hold a literal NUL, so
setting IFS to '\0' doesn't really work), or set IFS to $'\n' if
newlines in names aren't a concern.
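A sketch of the null-delimited loop, which sidesteps IFS entirely:

  find . -type f -print0 |
  while IFS= read -r -d '' file; do
      printf 'found: %s\n' "$file"
  done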
-- 
SLUG - Sydney Linux User's Group Mailing List - http://slug.org.au/
Subscription info and FAQs: http://slug.org.au/faq/mailinglists.html