Problem with datasets not being cleaned


C. Ch.

Hi all,


I have a problem: there are a lot of files in the database/files directory that never get deleted.

These include failed uploads and duplicated library files; the latter do not correspond to new versions of existing datasets.


I have run the cleanup_datasets.py script, but I still have files in this condition dating back a year.
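
For context, my understanding of the documented multi-pass cleanup, wrapped in a small Python driver, is along these lines. This is only a sketch: the script and config paths are assumed from a standard 16.01 layout, and the pass order is my reading of the purge documentation.

import subprocess

CONFIG = "./config/galaxy.ini"  # assumed config path; adjust for your instance
SCRIPT = "./scripts/cleanup_datasets/cleanup_datasets.py"

# Each pass handles one object type: -d 10 restricts to items deleted at
# least 10 days ago, -r actually removes the files from disk.
passes = [
    ["-d", "10", "-1"],        # delete userless histories
    ["-d", "10", "-2", "-r"],  # purge deleted histories
    ["-d", "10", "-4", "-r"],  # purge deleted libraries
    ["-d", "10", "-5", "-r"],  # purge deleted library folders
    ["-d", "10", "-6", "-r"],  # mark datasets deleted once nothing references them
    ["-d", "10", "-3", "-r"],  # purge deleted datasets (this frees the files)
]
for flags in passes:
    subprocess.check_call(["python", SCRIPT, CONFIG] + flags)

As I understand it, these passes only touch datasets that the database already marks deleted, so files with no live database row (the failed uploads, perhaps) would never be removed by them.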


I have no clue as to why or how this could have happened.


The Galaxy instance is installed on a cluster, with the filesystem on a NAS.

I have been running Galaxy 16.01 on this instance (I will upgrade as soon as I can stop the service).



Could anyone suggest a way to identify the files that actually belong in database/files, so that I can delete everything else?
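
To make the question concrete, this is the kind of cross-check I have in mind, as a minimal sketch. It assumes the default disk object store layout (database/files/NNN/dataset_<id>.dat); DB_URL and FILES_DIR are placeholders for the database_connection and file_path values from my galaxy.ini.

import os
from sqlalchemy import create_engine, text

DB_URL = "postgresql://galaxy@localhost/galaxy"  # placeholder: database_connection
FILES_DIR = "database/files"                     # placeholder: file_path

# Any dataset row that is not purged should still own a file on disk
# (deleted-but-not-yet-purged datasets keep their files).
engine = create_engine(DB_URL)
with engine.connect() as conn:
    rows = conn.execute(text("SELECT id FROM dataset WHERE NOT purged"))
    expected = set("dataset_%d.dat" % row[0] for row in rows)

# Flag only plain dataset files; metadata files and dataset_<id>_files
# directories are skipped to avoid false positives.
for root, _dirs, names in os.walk(FILES_DIR):
    for name in names:
        if name.startswith("dataset_") and name.endswith(".dat") \
                and name not in expected:
            print(os.path.join(root, name))  # orphan candidate, review before deleting

The idea is only to print candidates for manual review, not to delete anything automatically.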


Thanks in advance!

Cristian

