I am running a local instance of Galaxy and I've been trying to sort out an issue with dataset cleanup. For the most part, things work fine when I run the shell scripts in the recommended order (a rough sketch of the underlying invocation is below):

    delete_userless_histories.sh
    purge_histories.sh
    purge_libraries.sh
    purge_folders.sh
    delete_datasets.sh
    purge_datasets.sh

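Each of those is, as far as I can tell, a thin wrapper around scripts/cleanup_datasets/cleanup_datasets.py. As a rough sketch (the option names, config file, and paths here are my best guess from my checkout, so treat them as assumptions rather than the canonical invocation), purge_datasets.sh boils down to something like:

    # Approximation of purge_datasets.sh on my instance; the option
    # names, config file, and paths may differ in other Galaxy versions.
    cd /path/to/galaxy-dist
    python ./scripts/cleanup_datasets/cleanup_datasets.py ./universe_wsgi.ini \
        -d 10 -r --purge_datasets \
        >> ./scripts/cleanup_datasets/purge_datasets.log 2>&1
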
I have the number of days set to 10. When I look at the reports webapp, however, it says "62 datasets were deleted more than 15 days ago, but have not yet been purged, disk space: 12975717335." Those datasets have now stuck around for 45 days (and counting). I have even tried running the scripts with the -f option to force Galaxy to re-evaluate the datasets, to no avail.
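
For what it's worth, my guess (and it is only a guess, not the actual reports code) is that the report's count corresponds to something like the following query against the dataset table:

    # My approximation of what the report counts; the database name
    # ("galaxy" here) and the exact columns are assumptions on my part.
    psql galaxy -c "
        SELECT count(*), sum(file_size)
        FROM dataset
        WHERE deleted AND NOT purged
          AND update_time < now() - interval '15 days';"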

Any suggestions?  Thanks.

--
Lance Parsons - Scientific Programmer
134 Carl C. Icahn Laboratory
Lewis-Sigler Institute for Integrative Genomics
Princeton University

