I hacked up this script to monitor the rate at which a file grows:

#!/bin/bash
# awk instead of cut: copes with ls padding the size field with spaces
LASTSIZE=$(ls -s "$1" | awk '{print $1}')    # size now (1K blocks), init

while true; do
    sleep 1                                  # wait a sec
    SIZENOW=$(ls -s "$1" | awk '{print $1}') # new size

    DIFF=$((SIZENOW - LASTSIZE))
    #echo "$LASTSIZE-$SIZENOW $DIFF"

    if [ "$DIFF" -eq 0 ]; then               # -eq: numeric test, not string ==
        echo stalled.
    else
        DIFF_IN_MB=$(echo "scale=2; $DIFF/1024" | bc)
        echo "File grows at $DIFF_IN_MB MByte/s"
    fi

    LASTSIZE=$SIZENOW                        # remember for next iteration
done
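
Called with the file to watch as its only argument, a typical run looks
like this (the script name is made up for the example):

$ ./growthwatch.sh some-growing-file
File grows at 3.00 MByte/s
File grows at 4.00 MByte/s
stalled.
File grows at 3.00 MByte/s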


So far, so good. The script as such works, but it always gives me integers.
When I uncommented the line that echoes the raw values, I noticed that the
difference from one second to the next is *always* 3072k or 4096k,
occasionally plus another 4k.
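
For what it's worth, feeding those raw differences through the same bc
call reproduces the effect; even the occasional extra 4k disappears at
scale=2:

$ echo "scale=2; 3072/1024" | bc
3.00
$ echo "scale=2; 4096/1024" | bc
4.00
$ echo "scale=2; 3076/1024" | bc
3.00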
Where's my concept error?



