Ike,
I have done this a little differently.  What I do is identify all the costs
associated with the backup process: backup servers, additional switch
capacity, disk pools, tape drives, and libraries.  Add those up to a total
dollar figure.  Tapes are figured separately.  This infrastructure will
support so many GB of disk storage on the clients involved, in round
numbers, and that can be used to calculate a normalized average cost/GB.
You can play with the number of tapes and the amount of data stored per
node to adjust a client up or down.  Ultimately, the cost relates to the
total number of GB stored; the tape portion is adjusted based on the
version requirements of the application/customer.
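
A minimal sketch of that math in Python, assuming made-up dollar figures
and capacities (none of these numbers come from a real site):

    # Hypothetical annual infrastructure costs; tapes are figured separately.
    infrastructure = {
        "backup_servers":  40000.0,
        "switch_capacity": 10000.0,
        "disk_pools":      25000.0,
        "tape_drives":     30000.0,
        "library":         50000.0,
    }
    total_infra = sum(infrastructure.values())

    supported_gb = 5000.0      # round-number GB of client disk this supports
    cost_per_gb = total_infra / supported_gb

    cost_per_tape = 45.0       # media cost, charged separately

    def client_charge(stored_gb, tapes_used):
        """Normalized infrastructure $/GB plus the client's own tape media."""
        return stored_gb * cost_per_gb + tapes_used * cost_per_tape

    print(f"Normalized cost: ${cost_per_gb:.2f}/GB")
    # A client keeping many versions burns more tapes and is adjusted up.
    print(f"Heavy client (800 GB, 30 tapes): ${client_charge(800, 30):.2f}")
    print(f"Light client (100 GB, 2 tapes):  ${client_charge(100, 2):.2f}")

The two sample clients show the up/down adjustment: the tape count carries
the version requirements, while the GB figure carries the shared
infrastructure.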

If you come to Share in Nashville, I will show you how I do this for a tape
library solution.

-----Original Message-----
From: Hunley, Ike [mailto:[EMAIL PROTECTED]]
Sent: Monday, February 25, 2002 12:08 PM
To: [EMAIL PROTECTED]
Subject: Where would I begin to answer these questions?


The problem I have is that NT servers don't report on filenames, so there is
no easy way to single out *.PST files...
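
One workaround is to walk the shares from a machine that can see them and
total the *.PST files directly; a rough Python sketch, where the share path
is hypothetical:

    import os

    def pst_usage(root):
        """Return (file_count, total_bytes) for all *.pst files under root."""
        count, total = 0, 0
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                if name.lower().endswith(".pst"):
                    try:
                        total += os.path.getsize(os.path.join(dirpath, name))
                        count += 1
                    except OSError:
                        pass  # skip files we can't stat
        return count, total

    count, total = pst_usage(r"\\ntserver01\users")  # hypothetical share
    print(f"{count} PST files, {total / 2**30:.1f} GB")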


How much data is currently stored on file servers? - $ per GB stored
How much is backed up through TSM (was daily, now weekly)? - soft $'s
associated with network traffic
Does TSM store to DASD first, and if so, for how long? - $ per terabyte
stored
When archived to tape silos, for how long? - $ per terabyte stored


