On Thu, Mar 23, 2006 at 09:07:18PM -0800, Dennis Peterson wrote:
> It does as you say. You may get around it using a tool like tripwire to 
> limit your scan to the files of interest. Really, scanning every file on 
> a system disk is draconian. User space is another thing entirely and 
> your own best practices should prevail. Here's an alternative to your 
> quandary:
> 
> tar cf - / |/usr/local/bin/clamscan
> 
> Use your path to clamscan. Tar does not follow symlinks that are 
> directories, and this usage invokes a single instance of clamscan which 
> is used for the entire scan. Tar also allows include and exclude files 
> which can help you fine tune your scope (man tar). If you run tar as 
> root then you get around all the messy permission problems. This is 
> processor intensive, regardless, especially when scanning gzip'd tar files 
> or other archives. Tar itself is not much of a factor. But this too is 
> nuts. Do a global scan once to baseline things, then scan only changed 
> files thereafter. Build a new baseline every 45 days or so. When you 
> work smart your boss loves it and rewards you, and you are an instant 
> babe magnet. Better than owning a Harley. Ok - I made up that last part.
> 
> Adding options to clamscan and to tar to ignore benign file types can be 
> a plus. If there's a problem then tripwire would be your first line 
> anyway. Not enough people use tripwire or cfengine which has a tripwire 
> like capacity and runs just fine in OS X. Far less overhead than virus 
> scanning the entire system over and over. Just remember to keep your 
> masters on a read-only CD or nfs mount.
> 
> dp ... Harley owner

Hey dp,

I appreciate the reply and the verification.

I would love to do more intelligent scanning... Unfortunately, my boss doesn't
see things that way and he constantly comes back with, "it worked fine with
Virex!" He wants a full scan.

I guess I should explain how things worked before. Nightly, /Users gets
scanned. That seems to work just perfectly with clamscan. However, the weekly
scans were done on / and we were excluding /Users... since it got scanned
earlier that day. This is the same thing Virex was doing, and my boss doesn't
want to move away from that.
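Based on your suggestion, I figure the weekly scan with /Users excluded would
look something like the sketch below. It's untested, and I've pointed it at a
throwaway directory so it can be dry-run anywhere; in the real thing ROOT
would be / and SCANNER would be /usr/local/bin/clamscan - (both are
placeholders here):

```shell
# Untested sketch of the weekly scan logic. ROOT and SCANNER are
# stand-ins so this can be dry-run safely; the real invocation would
# be: cd / && tar -cf - -X "$EXCLUDES" . | /usr/local/bin/clamscan -
ROOT=$(mktemp -d)
mkdir -p "$ROOT/Users" "$ROOT/etc"
echo "nightly territory" > "$ROOT/Users/skip-me"
echo "system file"       > "$ROOT/etc/scan-me"

EXCLUDES=$(mktemp)
printf '%s\n' './Users' > "$EXCLUDES"   # one pattern per line, for tar -X

SCANNER=cat   # stand-in scanner; real run: SCANNER='/usr/local/bin/clamscan -'

# One tar, one scanner process for the whole tree, /Users pruned out.
(cd "$ROOT" && tar -cf - -X "$EXCLUDES" . | $SCANNER > /dev/null)

# List what actually landed in the archive, to sanity-check the exclude.
ARCHIVED=$(cd "$ROOT" && tar -cf - -X "$EXCLUDES" . | tar -tf -)
echo "$ARCHIVED"
```

The exclude file could grow whatever other patterns we want skipped, which
seems like the fine-tuning knob you were describing.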

I had debated using find(1) to do a similar thing (find all real files, pass
them to clamscan, etc.), but the tar method would probably be better. I'd be
interested to see how the reporting works, though. I'll give it a shot
tomorrow.
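For completeness, the find(1) version I'd been sketching would be roughly the
following — again untested, run here against a throwaway tree, with the
scanner swapped for a stand-in (in production, ROOT=/ and
SCANNER=/usr/local/bin/clamscan):

```shell
# Untested sketch of the find(1) approach: prune /Users, hand only
# regular files to the scanner in batches via xargs.
ROOT=$(mktemp -d)
mkdir -p "$ROOT/Users" "$ROOT/private/etc"
touch "$ROOT/Users/home-file" "$ROOT/private/etc/hosts"

SCANNER=cat   # stand-in for /usr/local/bin/clamscan

# -prune stops descent into /Users; every other regular file is
# printed NUL-delimited and batched onto the scanner's command line.
FILES=$(find "$ROOT" -path "$ROOT/Users" -prune -o -type f -print)
find "$ROOT" -path "$ROOT/Users" -prune -o -type f -print0 \
    | xargs -0 $SCANNER > /dev/null
echo "$FILES"
```

On the reporting question, I'd expect this is where the two approaches
differ: with find, clamscan is handed real filenames, so any hit should be
reported against a path, whereas with the tar pipe it only ever sees one
stream on stdin. I'll confirm when I test.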

I'm curious what's causing clamscan to loop infinitely, though. Also, what's
with the odd double slash at the beginning of each path?

Thanks again,
Josh ('03 Suzuki SV650 owner)
-- 
Josh Tolbert
[EMAIL PROTECTED]  ||  http://www.puresimplicity.net/~hemi/

Security is mostly a superstition. It does not exist in nature, nor
do the children of men as a whole experience it. Avoiding danger
is no safer in the long run than outright exposure. Life is either
a daring adventure, or nothing.
    -- Helen Keller
_______________________________________________
http://lurker.clamav.net/list/clamav-users.html
