Re: [BackupPC-users] Search for File

2011-10-01 Thread Jeffrey J. Kosowsky
Tim Connors wrote at about 11:15:31 +1000 on Thursday, September 29, 2011:
  On Wed, 28 Sep 2011, Timothy J Massey wrote:
  
   Arnold Krille arn...@arnoldarts.de wrote on 09/28/2011 11:20:57 AM:
  
 I'm sure someone with more shell-fu will give you a much better
   command
 line (and I look forward to learning something!).
   
Here you are:
   
find path_where_to_start -iname string_to_search
  ...
Using find you will realize that its rather slow and has your disk
   rattling
away. Better to use the indexing services, for example locate:
   
locate string_to_search
  
   Yeah, that's great if you update the locate database (as you mention).  On
   a backup server, with millions of files and lots of work to do pretty much
   around the clock?  That's one of the first things I disable!  So no
   locate.
  
  Hmmm.
  
  When I want to search for a file (half the time I don't even know what
  machine or from what time period, so I have to search the entire pool), I
  look at the mounted backuppcfs fuse filesystem (I mount onto /snapshots):
  https://svn.ulyssis.org/repos/sipa/backuppc-fuse/backuppcfs.pl

I too would recommend backuppc-fuse - though the one disadvantage is
that it is a lot slower than a native search through the pc tree, since
the directories need to be reconstructed from the relevant partials and
fulls (which is a *good* thing, but slow).
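
A minimal sketch of what that looks like in practice, assuming the fuse
filesystem is mounted at /snapshots as in the quoted message (the host and
file names below are made up):

  # The fuse view presents the reconstructed, unmangled names, so no
  # leading "f" or %hex encoding is needed here:
  find /snapshots/myhost -iname 'somefile.txt'

  # If you don't even know the host, search the whole mount (slower still,
  # since every backup of every host has to be reconstructed):
  find /snapshots -iname 'somefile.txt'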



Re: [BackupPC-users] Search for File

2011-10-01 Thread Jeffrey J. Kosowsky
Timothy J Massey wrote at about 10:30:18 -0400 on Wednesday, September 28, 2011:
  Gerald Brandt g...@majentis.com wrote on 09/28/2011 10:15:12 AM:
  
   I need to search for a specific file on a host, via backuppc.  Is 
   there a way to search a host backup, so I don't have to manually go 
   through all directories via the web interface?
  
  The easiest, most direct way of doing that would be:
  
  cd /path/to/host/pc/directory
  find . | grep ffilename
  

I think it would generally be faster to do:
  find . -name ffilename

This still may have a problem in that the f-mangling *also* converts
non-printable ASCII characters (and also whitespace and /) into %hex
codes. So, if your filename contains any of those characters, you need
to write the search term the same way.
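
A minimal sketch of working around the mangling by hand, with made-up paths
and names (check an actual entry in the pc tree to see exactly how your
version encodes things):

  cd /var/lib/backuppc/pc/myhost       # hypothetical TopDir and host
  # Plain names just get the leading "f":
  find . -name 'fsomefile.txt'
  # A name containing an encoded character (here "%", stored as %25) has
  # to be written in its %hex form:
  find . -name 'f50%25*'               # would match a file named "50%..."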

Also, you need to be careful about incrementals vs. fulls, since incrementals
will include only the most recently changed files, while fulls might
not include the latest version if there are subsequent incrementals.
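
A minimal sketch of searching every backup of a host, fulls and incrementals
alike, assuming the standard layout where each backup sits in its own
numbered directory under the pc tree (host, paths and file name are made up):

  HOST=myhost
  TOPDIR=/var/lib/backuppc             # hypothetical; adjust to your install
  for n in "$TOPDIR/pc/$HOST"/[0-9]*; do
      echo "== backup $(basename "$n") =="
      find "$n" -name 'fsomefile.txt'  # don't forget the leading "f"
  done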

You can avoid both of the above problems by using backuppc-fuse as
pointed out by another respondent, though it may be slower.



Re: [BackupPC-users] Search for File

2011-09-29 Thread Bernd Rilling
On Wednesday, 28 September 2011, Gerald Brandt wrote:
 Hi,
 
 I need to search for a specific file on a host, via backuppc.  Is there a
 way to search a host backup, so I don't have to manually go through all
 directories via the web interface?

Maybe another solution: you can simply look at the last full XferLOG for
that host and use the browser's search function to locate the desired file.
Then you can use the history function of BackupPC to look at the different
versions of that file.
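
A command-line equivalent, for what it's worth: BackupPC ships BackupPC_zcat
for reading its compressed log files. A minimal sketch with made-up paths,
host name and backup number (adjust InstallDir and TopDir to your
installation):

  /usr/share/backuppc/bin/BackupPC_zcat \
      /var/lib/backuppc/pc/myhost/XferLOG.123.z | grep -i 'somefile.txt'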

Bye, Bernd



[BackupPC-users] Search for File

2011-09-28 Thread Gerald Brandt
Hi,

I need to search for a specific file on a host, via backuppc.  Is there a way 
to search a host backup, so I don't have to manually go through all directories 
via the web interface?

Gerald



Re: [BackupPC-users] Search for File

2011-09-28 Thread Timothy J Massey
Gerald Brandt g...@majentis.com wrote on 09/28/2011 10:15:12 AM:

 I need to search for a specific file on a host, via backuppc.  Is 
 there a way to search a host backup, so I don't have to manually go 
 through all directories via the web interface?

The easiest, most direct way of doing that would be:

cd /path/to/host/pc/directory
find . | grep ffilename

I'm sure someone with more shell-fu will give you a much better command 
line (and I look forward to learning something!).  I'm sure there's a way 
to do it simply with the find command alone, but I've had limited success 
trying to limit the find command to find specific files.  For me, it's 
easier to use grep as above.  My way will work, if a bit slowly: there are 
lots of files in there...

Don't forget the leading f in the filename:  BackupPC puts an f in front 
of every filename in the directory structure.

Tim Massey

 
Out of the Box Solutions, Inc. 
Creative IT Solutions Made Simple!
http://www.OutOfTheBoxSolutions.com
tmas...@obscorp.com 
 
22108 Harper Ave.
St. Clair Shores, MI 48080
Office: (800)750-4OBS (4627)
Cell: (586)945-8796 


Re: [BackupPC-users] Search for File

2011-09-28 Thread Gerald Brandt
Hi Tim, 

That's basically what I did, but I have a couple of BackupPC users who have no 
clue about command-line stuff, so I was hoping for a BackupPC web-based 
solution. 

Gerald 

- Original Message -

 From: Timothy J Massey tmas...@obscorp.com
 To: General list for user discussion, questions and support
 backuppc-users@lists.sourceforge.net
 Sent: Wednesday, September 28, 2011 9:30:18 AM
 Subject: Re: [BackupPC-users] Search for File

 Gerald Brandt g...@majentis.com wrote on 09/28/2011 10:15:12 AM:

  I need to search for a specific file on a host, via backuppc. Is
  there a way to search a host backup, so I don't have to manually go
  through all directories via the web interface?

 The easiest, most direct way of doing that would be:

 cd /path/to/host/pc/directory
 find . | grep ffilename

 I'm sure someone with more shell-fu will give you a much better
 command line (and I look forward to learning something!). I'm sure
 there's a way to do it simply with the find command alone, but I've
 had limited success trying to limit the find command to find
 specific files. For me, it's easier to use grep as above. My way
 will work, if a bit slowly: there's lots of files in there...

 Don't forget the leading f in the filename: BackupPC puts an f in
 front of every filename in the directory structure.

 Tim Massey
 



Re: [BackupPC-users] Search for File

2011-09-28 Thread Steve
Don't know if it's faster than your way or not, but I've used:
find . -type f -name '*thing_i_want'
Note that you can use wildcards (quote the pattern so the shell doesn't expand
it before find sees it)...

a.

On Wed, Sep 28, 2011 at 10:52 AM, Gerald Brandt g...@majentis.com wrote:

 Hi Tim,

 That's basically what I did, but I have a couple of BackupPC users that have 
 no clue about command line stuff, so I was hoping for a BackupPC web based 
 solution.

 Gerald


 

 From: Timothy J Massey tmas...@obscorp.com
 To: General list for user discussion, questions and support 
 backuppc-users@lists.sourceforge.net
 Sent: Wednesday, September 28, 2011 9:30:18 AM
 Subject: Re: [BackupPC-users] Search for File

 Gerald Brandt g...@majentis.com wrote on 09/28/2011 10:15:12 AM:

  I need to search for a specific file on a host, via backuppc.  Is
  there a way to search a host backup, so I don't have to manually go
  through all directories via the web interface?

 The easiest, most direct way of doing that would be:

 cd /path/to/host/pc/directory
 find . | grep ffilename

 I'm sure someone with more shell-fu will give you a much better command line 
 (and I look forward to learning something!).  I'm sure there's a way to do it 
 simply with the find command alone, but I've had limited success trying to 
 limit the find command to find specific files.  For me, it's easier to use 
 grep as above.  My way will work, if a bit slowly:  there's lots of files in 
 there...

 Don't forget the leading f in the filename:  BackupPC puts an f in front of 
 every filename in the directory structure.

 Tim Massey






--
The universe is probably littered with the one-planet graves of
cultures which made the sensible economic decision that there's no
good reason to go into space--each discovered, studied, and remembered
by the ones who made the irrational decision. - Randall Munroe



Re: [BackupPC-users] Search for File

2011-09-28 Thread Arnold Krille
On Wednesday 28 September 2011 16:30:18 Timothy J Massey wrote:
 Gerald Brandt g...@majentis.com wrote on 09/28/2011 10:15:12 AM:
  I need to search for a specific file on a host, via backuppc.  Is
  there a way to search a host backup, so I don't have to manually go
  through all directories via the web interface?
 
 The easiest, most direct way of doing that would be:
 
 cd /path/to/host/pc/directory
 find . | grep ffilename
 
 I'm sure someone with more shell-fu will give you a much better command
 line (and I look forward to learning something!).

Here you are:

find path_where_to_start -iname string_to_search

-iname means case-insensitive matching, so you don't have to care about case.
If you want to search for a combination of directory and filename, you have to
account for the 'f' BackupPC puts in front of each name.

Using find you will realize that it's rather slow and has your disk rattling 
away. Better to use an indexing service, for example locate:

locate string_to_search

gives a list of hits. But only from the state when locate last rebuilt its
index (should happen daily/nightly). That is good enough to find files last
seen two weeks ago, but doesn't find that file you just downloaded and can't
remember where you saved it.

There are also disk-indexing services with web frontends; htdig comes to mind.
Those even find content inside the files.

Have fun,

Arnold




Re: [BackupPC-users] Search for File

2011-09-28 Thread Timothy J Massey
Arnold Krille arn...@arnoldarts.de wrote on 09/28/2011 11:20:57 AM:

  I'm sure someone with more shell-fu will give you a much better 
command
  line (and I look forward to learning something!).
 
 Here you are:
 
 find path_where_to_start -iname string_to_search

Now I remember why I stick with the grep form: remembering the different 
syntax of the find command.  As someone who is *not* an old-time UNIX guru 
(but a long-time, though not full-time, *Linux* user), I think that any 
parameter of multiple letters (like -name) should have two dashes!  :)  I am 
often frustrated by the unusual find command syntax, so I simply stick with 
grep, which has many more uses beyond finding files.

 Using find you will realize that its rather slow and has your disk 
rattling 
 away. Better to use the indexing services, for example locate:
 
 locate string_to_search

Yeah, that's great if you update the locate database (as you mention).  On 
a backup server, with millions of files and lots of work to do pretty much 
around the clock?  That's one of the first things I disable!  So no 
locate.

Timothy J. Massey

 
Out of the Box Solutions, Inc. 
Creative IT Solutions Made Simple!
http://www.OutOfTheBoxSolutions.com
tmas...@obscorp.com 
 
22108 Harper Ave.
St. Clair Shores, MI 48080
Office: (800)750-4OBS (4627)
Cell: (586)945-8796 


Re: [BackupPC-users] Search for File

2011-09-28 Thread Arnold Krille
On Wednesday 28 September 2011 17:23:17 Timothy J Massey wrote:
 Arnold Krille arn...@arnoldarts.de wrote on 09/28/2011 11:20:57 AM:
  Using find you will realize that its rather slow and has your disk
 rattling
  away. Better to use the indexing services, for example locate:
  
  locate string_to_search
 
 Yeah, that's great if you update the locate database (as you mention).  On
 a backup server, with millions of files and lots of work to do pretty much
 around the clock?  That's one of the first things I disable!  So no
 locate.

You could limit locate to the paths you want indexed, or you could exclude
BackupPC's (c)pool and still get the information you need.
And adding a few minutes of updatedb indexing the filesystem tree (it's not
even indexing the contents) to BackupPC's nightly run shouldn't hurt that much.
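
A minimal sketch of that, assuming mlocate's updatedb and made-up paths
(hooking it into a nightly cron job is left to taste):

  # Build a separate database covering only the pc tree, leaving the
  # system-wide /etc/updatedb.conf untouched:
  updatedb -U /var/lib/backuppc/pc -o /var/lib/backuppc/backuppc.locate.db

  # Query it later (note the leading "f" from BackupPC's name mangling):
  locate -d /var/lib/backuppc/backuppc.locate.db fsomefile.txt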

Have fun,

Arnold




Re: [BackupPC-users] Search for File

2011-09-28 Thread Tim Connors
On Wed, 28 Sep 2011, Timothy J Massey wrote:

 Arnold Krille arn...@arnoldarts.de wrote on 09/28/2011 11:20:57 AM:

   I'm sure someone with more shell-fu will give you a much better
 command
   line (and I look forward to learning something!).
 
  Here you are:
 
  find path_where_to_start -iname string_to_search
...
  Using find you will realize that its rather slow and has your disk
 rattling
  away. Better to use the indexing services, for example locate:
 
  locate string_to_search

 Yeah, that's great if you update the locate database (as you mention).  On
 a backup server, with millions of files and lots of work to do pretty much
 around the clock?  That's one of the first things I disable!  So no
 locate.

Hmmm.

When I want to search for a file (half the time I don't even know what
machine or from what time period, so I have to search the entire pool), I
look at the mounted backuppcfs fuse filesystem (I mount onto /snapshots):
https://svn.ulyssis.org/repos/sipa/backuppc-fuse/backuppcfs.pl

What if you let mlocate index /snapshots?

I haven't tested getting it to index /snapshots, but mlocate doesn't descend
into directories whose mtime hasn't changed.  If backuppcfs correctly
preserves mtimes for directories, then updatedb.mlocate will do the right
thing and be a lot quicker than regular old updatedb.  Then make sure that
cron runs it at a time appropriate for you (when I was doing night shift,
this *wasn't* at 4am!), and you won't even notice that it's busy.
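
A minimal sketch of the cron side, assuming an /etc/cron.d entry, a dedicated
database for the fuse mount, and a made-up time outside the backup window:

  # /etc/cron.d/backuppc-locate -- refresh the snapshot index once a day
  30 13 * * * root updatedb -U /snapshots -o /var/lib/backuppc/snapshots.locate.db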

Then wrap locate up in a simple CGI script to present to your users instead
of training them how to use locate on the command line.
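
A minimal sketch of such a wrapper, as a plain shell CGI against the
dedicated database from the cron sketch above; it deliberately skips %XX
decoding and any access control, both of which a real deployment would want:

  #!/bin/sh
  # search.cgi -- hypothetical wrapper: call it as search.cgi?q=filename
  echo "Content-Type: text/plain"
  echo ""
  # Take everything after "q=" and turn "+" back into spaces.
  query=$(printf '%s\n' "$QUERY_STRING" | sed 's/^q=//; s/+/ /g')
  if [ -n "$query" ]; then
      locate -d /var/lib/backuppc/snapshots.locate.db -- "$query"
  else
      echo "usage: search.cgi?q=filename"
  fi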

-- 
Tim Connors
