Hi fellow Galaxy devs,
I am trying to understand how to set up the Galaxy database and get an idea
of how big it could get. Currently we are running Galaxy on a web server, and
we want to keep the PostgreSQL db on a locally mounted partition rather than
on an NFS partition. This limits us to around 100 GB of storage for the db.
We will create data libraries so users can load their data without copying it
into Galaxy, so input files won't be duplicated. Is there anything similar we
can do about the output files? Do these files need to end up in the database
itself, or can we put them somewhere on the NFS partition, with the db
holding only information about their location?
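In case it helps clarify what I'm asking, here is a sketch of the split I'm hoping is possible. The option names are my assumption from skimming the sample config (`database_connection`, `file_path`, `new_file_path` in universe_wsgi.ini), so please correct me if these aren't the right knobs:

```ini
; universe_wsgi.ini -- sketch of the layout I'm hoping for
; (option names are my assumption; paths are examples only)

; Metadata only, on the local partition (~100 GB available)
database_connection = postgres://galaxy:secret@localhost/galaxy

; Actual dataset (output) files on the NFS partition
file_path = /nfs/galaxy/files

; Temporary/new files kept off the local disk as well
new_file_path = /nfs/galaxy/tmp
```

If something like this works, the 100 GB local limit would only need to cover the PostgreSQL metadata, not the datasets themselves.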
I noticed that even a routine small analysis can easily produce 20 GB or more
of output files and history data, and all of this seems to end up in the
database. If output and history files really are written to the database, are
they cleaned up daily (or on some other schedule) to avoid storage issues?
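On the cleanup question, the closest thing I've found so far is the cleanup_datasets.py script under scripts/cleanup_datasets/ in the Galaxy tree. A cron setup along these lines is roughly what I have in mind; the flag meanings (-1, -3, -r, -d) are my guess from reading the script, so please correct me if I've misread them:

```
# crontab fragment (sketch; flags are my assumption from cleanup_datasets.py)
# 02:00 - delete userless histories older than 10 days
0 2 * * * cd /opt/galaxy && python ./scripts/cleanup_datasets/cleanup_datasets.py ./universe_wsgi.ini -d 10 -1
# 03:00 - purge deleted datasets and remove their files from disk
0 3 * * * cd /opt/galaxy && python ./scripts/cleanup_datasets/cleanup_datasets.py ./universe_wsgi.ini -d 10 -3 -r
```

Is something like this the recommended way to keep the storage from growing without bound, or is there a built-in scheduled cleanup I'm missing?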