I am looking for a method to calculate the cost of running multiple
simultaneous processing jobs on a computer cluster.

For example, I am renting cluster time at $XX/hour. I can run multiple jobs,
but the charge is $XX per hour of wall-clock time regardless of how many jobs
are running in parallel: whether one job or ten jobs are running during the
same period, I am still charged $XX/hour. In other words, overlapping jobs
should not be double-counted; I only pay for time during which at least one
job is running.

I have the start time/date and end time/date of each job. 

Anyone have any idea how to code this kind of calculation?
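
The direction I've been considering, in case it helps frame the question:
since the charge depends only on whether at least one job is running, the
cost should be the total length of the union of all the job intervals, times
the hourly rate. Here is a minimal sketch of that idea in Python (the rate
and the dates below are made-up placeholders, and billed_hours is just a name
I picked):

from datetime import datetime

def billed_hours(jobs):
    # jobs: list of (start, end) datetime pairs, with end >= start.
    # Returns total wall-clock hours during which at least one job ran.
    if not jobs:
        return 0.0
    jobs = sorted(jobs)          # sort by start time
    total_seconds = 0.0
    cur_start, cur_end = jobs[0]
    for start, end in jobs[1:]:
        if start <= cur_end:
            # Overlaps (or touches) the current merged interval: extend it.
            cur_end = max(cur_end, end)
        else:
            # Gap with no jobs running: close out the merged interval.
            total_seconds += (cur_end - cur_start).total_seconds()
            cur_start, cur_end = start, end
    total_seconds += (cur_end - cur_start).total_seconds()
    return total_seconds / 3600.0

# Example: two overlapping jobs and one separate job.
jobs = [
    (datetime(2006, 5, 1, 9, 0),  datetime(2006, 5, 1, 12, 0)),
    (datetime(2006, 5, 1, 11, 0), datetime(2006, 5, 1, 14, 0)),
    (datetime(2006, 5, 1, 16, 0), datetime(2006, 5, 1, 17, 0)),
]
rate = 10.0                       # stand-in for $XX/hour
print(billed_hours(jobs))         # 6.0 (9:00-14:00 merged, plus 16:00-17:00)
print(billed_hours(jobs) * rate)  # cost at the hourly rate

One detail I'm unsure about: if the provider bills in whole-hour increments,
each merged interval's duration would presumably need to be rounded up
before multiplying by the rate, but the interval-merging idea stays the same.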

Thanks if you can provide some clues.

Richard Colman
Institute for Genomics and Bioinformatics
949-824-1816, 701-5330
[EMAIL PROTECTED]
