The problem with only scanning files that have changed since they were
last scanned is that there usually have been virus signature updates in
the meantime. So you could have an "old" file that contains what was a
zero-day virus at the time it was scanned, and now there is a signature
that would detect it.
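
One way around that: force a full rescan whenever any signature database
is newer than the last completed scan. Here is a minimal sketch in Python;
the database directory is the typical Linux default and the timestamp
file is hypothetical, so adjust both for your setup:

  #!/usr/bin/env python3
  """Force a full rescan when the signatures are newer than the last scan."""
  import os
  import subprocess
  import sys

  DB_DIR = "/var/lib/clamav"            # typical DatabaseDirectory (assumption)
  STAMP = "/var/lib/myscan/last-scan"   # hypothetical timestamp file
  SCAN_ROOT = "/home"                   # whatever you actually scan

  def newest_db_mtime():
      # daily/main/bytecode databases; .cld files are incrementally
      # updated versions of the .cvd archives
      times = [os.path.getmtime(os.path.join(DB_DIR, n))
               for n in os.listdir(DB_DIR) if n.endswith((".cvd", ".cld"))]
      return max(times, default=0.0)

  def last_scan_time():
      try:
          return os.path.getmtime(STAMP)
      except FileNotFoundError:
          return 0.0   # never scanned: force a full scan

  if newest_db_mtime() > last_scan_time():
      # new signatures: a changed-files-only scan would miss old files
      # that only the new signatures can detect, so scan everything
      rc = subprocess.run(
          ["clamscan", "--infected", "--recursive", SCAN_ROOT]).returncode
      if rc in (0, 1):   # 0 = clean, 1 = virus found; either way the scan ran
          open(STAMP, "w").close()   # touch the stamp
      sys.exit(rc)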


On Wed, 06 Jan 2021 11:56:47 +0100
"Pierre Dehaen" <deha...@drever.be> wrote:

> Hi,
> 
> On 6 Jan 2021 at 9:58, G.W. Haywood via clamav-users wrote:
> 
> > > My goal is to terminate a scan of a big number of files, like '/',
> > > during CPU-busy hours.
> > Do not scan everything under the root directory.  
> 
> Use zfs, make regular snapshots, scan once, then use zfs diff to find the 
> new/changed(/removed) files, scan these only.
> 
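For what it's worth, a minimal sketch of that zfs diff approach in
Python. The dataset and snapshot names are placeholders, zfs diff needs
root (or a delegated 'diff' permission), and the tab-separated output
format is an assumption:

  #!/usr/bin/env python3
  """Scan only files added or modified between two ZFS snapshots."""
  import subprocess
  import tempfile

  DATASET = "tank/home"                 # hypothetical dataset
  OLD, NEW = "daily-1", "daily-2"       # hypothetical snapshot names

  # zfs diff prints one change per line: a change type (+ - M R) and the
  # path, assumed here to be separated by a tab
  out = subprocess.run(
      ["zfs", "diff", f"{DATASET}@{OLD}", f"{DATASET}@{NEW}"],
      capture_output=True, text=True, check=True).stdout

  to_scan = []
  for line in out.splitlines():
      change, _, path = line.partition("\t")
      if change.strip() in ("+", "M"):   # new or modified; removals need no scan
          to_scan.append(path)

  if to_scan:
      with tempfile.NamedTemporaryFile("w", suffix=".list") as fl:
          fl.write("\n".join(to_scan) + "\n")
          fl.flush()
          # --file-list makes clamscan scan exactly the listed paths
          subprocess.run(["clamscan", "--infected", f"--file-list={fl.name}"])
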
> Or make a full scan every week if desired, then use an auditing
> program to regularly search for the files that were
> added/updated(/removed), and scan only these. These auditing programs
> use hash signatures, which are faster to compute than a full virus
> scan, but they will still generate a lot of I/O because they read
> every file. If you are really constrained by I/O, you could run a
> less secure but lighter audit based on the file attributes (size,
> ownership, mode, dates...) and once a day/week a full audit...
> 
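And a rough sketch of that lighter attribute-based audit in Python:
record (size, mtime, mode, owner) for every file, compare with the
previous run, and scan only what changed. The scan root and manifest
path are placeholders, and a real auditing tool such as AIDE or
Tripwire would do this far more robustly:

  #!/usr/bin/env python3
  """Attribute-based change detection: scan only files whose metadata moved."""
  import json
  import os
  import subprocess
  import tempfile

  ROOT = "/home"                               # hypothetical scan root
  MANIFEST = "/var/lib/myscan/manifest.json"   # hypothetical state file

  def snapshot(root):
      attrs = {}
      for dirpath, _dirs, files in os.walk(root):
          for name in files:
              path = os.path.join(dirpath, name)
              try:
                  st = os.lstat(path)    # lstat: do not follow symlinks
              except OSError:
                  continue
              attrs[path] = [st.st_size, st.st_mtime, st.st_mode,
                             st.st_uid, st.st_gid]
      return attrs

  try:
      with open(MANIFEST) as f:
          old = json.load(f)
  except FileNotFoundError:
      old = {}                 # first run: everything counts as changed

  new = snapshot(ROOT)
  changed = [p for p, a in new.items() if old.get(p) != a]

  if changed:
      with tempfile.NamedTemporaryFile("w", suffix=".list") as fl:
          fl.write("\n".join(changed) + "\n")
          fl.flush()
          subprocess.run(["clamscan", "--infected", f"--file-list={fl.name}"])

  os.makedirs(os.path.dirname(MANIFEST), exist_ok=True)
  with open(MANIFEST, "w") as f:
      json.dump(new, f)
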
> There are many options...
> 
> HTH,
> Pierre
