Neil,

It'd be pretty easy to fix that yourself. In
scripts/cleanup_datasets/cleanup_datasets.py there is a filter which
selects which datasets should be removed - check around line 350 of that
script. You could simply remove the other conditions from the and_()
query and delete everything older than N days. Make sure your users know
ahead of time!
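Roughly, the relevant query looks something like the sketch below. I'm
paraphrasing it from memory, so the exact attribute names in your checkout
may differ - treat it as a sketch of the change, not a drop-in patch:

    # scripts/cleanup_datasets/cleanup_datasets.py, around line 350
    # NOTE: paraphrased from memory -- verify the names against your copy.
    from datetime import datetime, timedelta
    from sqlalchemy import and_

    days = 20  # whatever you pass with -d
    cutoff_time = datetime.utcnow() - timedelta(days=days)

    # As shipped, the query only picks up histories that the user has
    # already marked as deleted (and not yet purged) AND that are older
    # than the cutoff:
    histories = app.sa_session.query(app.model.History) \
        .filter(and_(app.model.History.table.c.deleted == True,
                     app.model.History.table.c.purged == False,
                     app.model.History.update_time < cutoff_time))

    # To delete purely by age, drop the other conditions from the and_()
    # call and keep only the date comparison:
    histories = app.sa_session.query(app.model.History) \
        .filter(app.model.History.update_time < cutoff_time)

If memory serves, the dataset and library queries further down in the same
script follow the same pattern, so the equivalent change would apply there
too.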
Cheers,
Eric

On 07/17/2014 02:55 PM, [email protected] wrote:
> How does the main Galaxy server deal with it? Users' data can't remain
> on the project's free server forever, can it? And if not, then there
> must be some automated way of deleting data - is that correct?
>
> Thanks
> Neil
> ________________________________________
> From: Hans-Rudolf Hotz [[email protected]]
> Sent: Thursday, July 17, 2014 9:23 PM
> To: Burdett, Neil (DP&S, Herston - RBWH); [email protected]
> Subject: Re: [galaxy-dev] cleanup_datasets.py not deleting files...
>
> On 07/17/2014 12:38 PM, [email protected] wrote:
>> Thanks. Is there a script that will delete all files older than a
>> certain date, even if they are not marked as deleted by the user?
>>
>
> I am not aware of such a script.
>
> Hans-Rudolf
>
>> Thanks
>> Neil
>> ________________________________________
>> From: Hans-Rudolf Hotz [[email protected]]
>> Sent: Thursday, July 17, 2014 5:42 PM
>> To: Burdett, Neil (DP&S, Herston - RBWH); [email protected]
>> Subject: Re: [galaxy-dev] cleanup_datasets.py not deleting files...
>>
>> Hi Neil
>>
>> The cleanup_datasets.py script only removes files that are older than
>> the time given (you did this, i.e. older than 20 days) AND that have
>> been marked as 'deleted' by the user - have you done that?
>>
>> see also:
>> https://wiki.galaxyproject.org/Admin/Config/Performance/Purge%20Histories%20and%20Datasets
>>
>> Hope this helps
>> Hans-Rudolf
>>
>>
>> On 07/17/2014 02:42 AM, [email protected] wrote:
>>> Hi,
>>> I'm trying to use the cleanup_datasets.py script to remove all files
>>> on my system older than 20 days. My crontab looks like this:
>>>
>>> # m h dom mon dow command
>>> 34 10 * * * cd /export/barium-data3/galaxy-suvr && python
>>> scripts/cleanup_datasets/cleanup_datasets.py universe_wsgi.ini -d 20 -1
>>> > /home/galaxy/crontab_purge_milxcloud.log 2>&1 && python
>>> scripts/cleanup_datasets/cleanup_datasets.py universe_wsgi.ini -d 20 -2
>>> -r >> /home/galaxy/crontab_purge_milxcloud.log 2>&1 && python
>>> scripts/cleanup_datasets/cleanup_datasets.py universe_wsgi.ini -d 20 -3
>>> -r >> /home/galaxy/crontab_purge_milxcloud.log 2>&1 && python
>>> scripts/cleanup_datasets/cleanup_datasets.py universe_wsgi.ini -d 20 -5
>>> -r >> /home/galaxy/crontab_purge_milxcloud.log 2>&1 && python
>>> scripts/cleanup_datasets/cleanup_datasets.py universe_wsgi.ini -d 20 -4
>>> -r >> /home/galaxy/crontab_purge_milxcloud.log 2>&1 && python
>>> scripts/cleanup_datasets/cleanup_datasets.py universe_wsgi.ini -d 20 -6
>>> -r >> /home/galaxy/crontab_purge_milxcloud.log 2>&1
>>>
>>> The crontab executes, and the contents of
>>> /home/galaxy/crontab_purge_milxcloud.log are:
>>>
>>> cat /home/galaxy/crontab_purge_milxcloud.log
>>> psycopg2 egg successfully loaded for postgres dialect
>>> ##########################################
>>>
>>> # 2014-07-17 10:34:02 - Handling stuff older than 20 days
>>> Datasets will NOT be removed from disk.
>>>
>>> Deleted 0 histories
>>> Elapsed time: 0.170083045959
>>> ##########################################
>>> psycopg2 egg successfully loaded for postgres dialect
>>> ##########################################
>>>
>>> # 2014-07-17 10:34:02 - Handling stuff older than 20 days
>>> Datasets will be removed from disk.
>>>
>>> Purged 0 histories.
>>> Elapsed time: 0.174137830734
>>> ##########################################
>>> psycopg2 egg successfully loaded for postgres dialect
>>> ##########################################
>>>
>>> # 2014-07-17 10:34:03 - Handling stuff older than 20 days
>>> Datasets will be removed from disk.
>>>
>>> Purged 0 datasets
>>> Freed disk space: 0
>>> Elapsed time: 0.168104887009
>>> ##########################################
>>> psycopg2 egg successfully loaded for postgres dialect
>>> ##########################################
>>>
>>> # 2014-07-17 10:34:04 - Handling stuff older than 20 days
>>> Datasets will be removed from disk.
>>>
>>> # Purged 0 folders.
>>> Elapsed time: 0.168635129929
>>> ##########################################
>>> psycopg2 egg successfully loaded for postgres dialect
>>> ##########################################
>>>
>>> # 2014-07-17 10:34:05 - Handling stuff older than 20 days
>>> Datasets will be removed from disk.
>>>
>>> # Purged 0 libraries .
>>> Elapsed time: 0.166506052017
>>> ##########################################
>>> psycopg2 egg successfully loaded for postgres dialect
>>> ##########################################
>>>
>>> # 2014-07-17 10:34:06 - Handling stuff older than 20 days
>>> Datasets will be removed from disk.
>>>
>>> Examined 0 datasets, marked 0 datasets and 0 dataset instances (HDA)
>>> as deleted
>>> Total elapsed time: 0.00744795799255
>>> ##########################################
>>>
>>> However, today is the 17th of July, and when I look into the database
>>> directory (~/database/files/000) I still have files from the 18th of
>>> June, i.e. older than 20 days:
>>>
>>> drwxr-xr-x 2 galaxy nogroup 4096 Jun 18 09:19 dataset_213_files
>>> -rw-r--r-- 1 galaxy nogroup 1061 Jun 18 09:19 dataset_213.dat
>>> drwxr-xr-x 2 galaxy nogroup 4096 Jun 18 09:19 dataset_215_files
>>> -rw-r--r-- 1 galaxy nogroup  270 Jun 18 09:19 dataset_215.dat
>>> drwxr-xr-x 2 galaxy nogroup 4096 Jun 18 09:19 dataset_221_files
>>> drwxr-xr-x 2 galaxy nogroup 4096 Jun 18 09:19 dataset_219_files
>>> drwxr-xr-x 2 galaxy nogroup 4096 Jun 18 09:19 dataset_220_files
>>> drwxr-xr-x 2 galaxy nogroup 4096 Jun 18 09:19 dataset_222_files
>>> drwxr-xr-x 2 galaxy nogroup 4096 Jun 18 09:19 dataset_218_files
>>> -rw-r--r-- 1 galaxy nogroup  994 Jun 18 09:19 dataset_216.dat
>>> -rw-r--r-- 1 galaxy nogroup  161 Jun 18 09:19 dataset_214.dat
>>>
>>> Am I doing something wrong? Am I using the wrong arguments/file, etc.?
>>>
>>> Thanks
>>> Neil
--
Eric Rasche
Programmer II
Center for Phage Technology
Texas A&M University
College Station, TX 77843
404-692-2048
[email protected]
[email protected]

___________________________________________________________
Please keep all replies on the list by using "reply all"
in your mail client. To manage your subscriptions to this
and other Galaxy lists, please use the interface at:
  http://lists.bx.psu.edu/

To search Galaxy mailing lists use the unified search at:
  http://galaxyproject.org/search/mailinglists/
