Simon Lank wrote:
> Hi.
> 
> Our current Galaxy database is ~600 GB, most of which is user-deleted
> datasets.
> 
> I followed the instructions here:
> https://bitbucket.org/galaxy/galaxy-central/wiki/Config/PurgeHistoriesAndDatasets
> 
> and ran the shell scripts in the recommended order. One of them in
> particular (I think it was purge_histories.sh) took almost 24 hours to
> complete. However, it doesn't appear that many (or any) of the files were
> actually deleted, since we still have ~600 GB of dataset files. Is there
> something obvious I can try to get the files purged correctly?

Hi Simon,

I've moved this conversation to the -dev list since it involves a local
installation.

Datasets are not removed until the number of days specified with the -d
flag to cleanup_datasets.py has elapsed.  To remove all deleted data
immediately, rerun the scripts outside of the provided wrappers and set
-d to 0.
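For example, from the Galaxy root directory, something like the following
should work (the exact action flags and config filename may differ for your
Galaxy version, so check the wrappers and the script's --help output first):

```shell
# Run the cleanup script directly with -d 0 so deleted items are
# purged immediately instead of after the default grace period.
# The action flags below mirror the order used by the wrapper scripts;
# -r actually removes the dataset files from disk.
python ./scripts/cleanup_datasets/cleanup_datasets.py universe_wsgi.ini \
    -d 0 -r --delete_userless_histories
python ./scripts/cleanup_datasets/cleanup_datasets.py universe_wsgi.ini \
    -d 0 -r --purge_histories
python ./scripts/cleanup_datasets/cleanup_datasets.py universe_wsgi.ini \
    -d 0 -r --purge_datasets
```

Run them in that order, since purging a history marks its datasets as
deleted, and the final pass is what frees the disk space.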

--nate

> 
> Thanks.
> 
> Simon
> 
> Simon Lank
> Research Specialist
> O'Connor Lab, WNPRC
> 555 Science Dr. Madison WI
> (608) 265-3389

> ___________________________________________________________
> The Galaxy User list should be used for the discussion of
> Galaxy analysis and other features on the public server
> at usegalaxy.org.  Please keep all replies on the list by
> using "reply all" in your mail client.  For discussion of
> local Galaxy instances and the Galaxy source code, please
> use the Galaxy Development list:
> 
>   http://lists.bx.psu.edu/listinfo/galaxy-dev
> 
> To manage your subscriptions to this and other Galaxy lists,
> please use the interface at:
> 
>   http://lists.bx.psu.edu/

