hi

Dale wrote, at 08/05/2012 04:45 PM:
> Howdy,
>
>
> I have heard of bonnie and friends.  I also think dd could do some
> testing too.  Is there any other way to give this a good workout and see if
> it holds up?  Oh, helpful hints with Bonnie would be great too.  I have
> never used it before.  Maybe someone has some test that is really brutal.
>

Some time ago I played with bonnie++ to figure out my hard disk performance
using different filesystems and io schedulers. The script below starts 3 bonnie++
instances in parallel; each instance runs the same set of tests: write/read/rewrite
of a 30 GB file, followed by various file operations on 48k small files spread over
32 sub-directories:

#!/bin/bash
#
# run 3 parallel bonnie++ instances against the given directory
# (defaults to the current directory) and log the results
scratch=${1:-$(pwd)}
#
date
# figure out the mount point, device and active io scheduler of the
# filesystem under test; use them to build the log file name
ft=$(df -Pl "$scratch"|tail -1|awk '{print $6}')
mnt=($(mount|grep "$ft "))
dev=$(basename $(readlink -fn ${mnt[0]}))
sched=$(cat /sys/block/$dev/queue/scheduler|sed -e 's/.*\[//' -e 's/\].*//')
log="$dev-${sched}-${mnt[4]}"
echo $log
rm -f ${log} ${log}.html
# create a semaphore for 3 processes; -y s makes each instance wait on it,
# so all three start their tests at the same time
/usr/sbin/bonnie++ -p 3
/usr/sbin/bonnie++ -qd $scratch -n48:128K:16K:32 -ys -s30g -mg1 >> ${log} &
/usr/sbin/bonnie++ -qd $scratch -n48:128K:16K:32 -ys -s30g -mg2 >> ${log} &
/usr/sbin/bonnie++ -qd $scratch -n48:128K:16K:32 -ys -s30g -mg3 >> ${log} &
wait
# remove the semaphore and convert the csv results to html
/usr/sbin/bonnie++ -p -1
bon_csv2html ${log} >> ${log}.html
date
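
To run it, save it as e.g. bonnie-stress.sh (the name and the path below are
just examples) and point it at a directory on the disk under test:

./bonnie-stress.sh /mnt/scratch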

This script ran for around 30 minutes on a 1 TB SATA3 drive. Since yours is 3 TB,
you may want to adjust the file size and/or the number of bonnie++ instances to
fill up the disk space, then start the script and leave it running for a day. I
guess such a test would be brutal enough, and if the disk completes it, it might
be considered good :)
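
For a rough idea, here is a sketch of how the parallel section might be scaled
up for a bigger drive; the 100g file size and the 6 instances are only guesses
to tune to taste, and $scratch/$log are assumed to be set as in the script above:

# hypothetical scale-up for a ~3 TB drive: larger files, more instances
/usr/sbin/bonnie++ -p 6
for i in 1 2 3 4 5 6; do
    /usr/sbin/bonnie++ -qd $scratch -n48:128K:16K:32 -ys -s100g -mg$i >> ${log} &
done
wait
/usr/sbin/bonnie++ -p -1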

victor
