On Feb 8, 2012, at 9:32 PM, Fields, Christopher J wrote:

> 'samtools sort' seems to be running on our server end as well (not on the 
> cluster).  I may look into it a bit more myself.  Here's a snapshot of top from 
> our server (you can see our local runner as well):
> 
>  PID USER      PR  NI  VIRT  RES  SHR S %CPU %MEM    TIME+  COMMAND
> 3950 galaxy    20   0 1303m 1.2g  676 R 99.7 15.2 234:48.07 samtools sort /home/a-m/galaxy/dist-database/file/000/dataset_587.dat /home/a-m/galaxy/dist-database/tmp/tmp9tv6zc/sorted
> 5417 galaxy    20   0 1186m 104m 5384 S  0.3  1.3   0:15.08 python ./scripts/paster.py serve universe_wsgi.runner.ini --server-name=runner0 --pid-file=runner0.pid --log-file=runner0.log --daemon

Hi Chris,

'samtools sort' is run by groom_dataset_contents(), which should only be called 
from the upload tool.  The upload tool should run on the cluster unless you still 
have the default local runner override for it in your job runner's config file.
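
For reference, the stock sample config ships with an override along these lines 
in universe_wsgi.ini (the exact section name and tool id may differ on your 
version; this is just a sketch of what to look for):

    [galaxy:tool_runners]
    # upload1 is the upload tool's id; local:/// keeps it on the head node.
    # Remove or change this line (e.g. to your pbs:/// URL) to send uploads
    # to the cluster instead.
    upload1 = local:///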

Ryan's instance is running 'samtools index', which is called from set_meta(). 
set_meta() is supposed to run on the cluster when set_metadata_externally = True, 
but it can still be run locally under certain conditions.
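
The settings involved all live in universe_wsgi.ini; roughly (check the names 
against your copy, this is just a sketch):

    # Run the metadata-setting step (samtools index, etc.) as part of the cluster job:
    set_metadata_externally = True
    # Don't silently redo metadata on the head node if the external attempt fails:
    retry_metadata_internally = False
    # Keep job files around while debugging so the metadata outputs can be inspected:
    cleanup_job = never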

--nate

> 
> chris
> 
> On Jan 20, 2012, at 10:43 AM, Shantanu Pavgi wrote:
> 
>> 
>> Just wanted to add that we have consistently seen this issue of 'samtools 
>> index' running locally on our install.  We are using the SGE scheduler.  Thanks 
>> for pointing out the details in the code, Nate. 
>> 
>> --
>> Shantanu.
>> 
>> 
>> 
>> On Jan 20, 2012, at 9:35 AM, Nate Coraor wrote:
>> 
>>> On Jan 18, 2012, at 11:54 AM, Ryan Golhar wrote:
>>> 
>>>> Nate - Is there a specific place in the Galaxy code that forks the 
>>>> samtools index on bam files on the cluster or the head node?  I really 
>>>> need to track this down.
>>> 
>>> Hey Ryan,
>>> 
>>> Sorry it's taken so long; I've been pretty busy.  The relevant code is in 
>>> galaxy-dist/lib/galaxy/datatypes/binary.py, in the Bam class.  When Galaxy 
>>> runs a tool, it creates a Job, which is placed inside a JobWrapper in 
>>> lib/galaxy/jobs/__init__.py.  After the job execution is complete, the 
>>> JobWrapper.finish() method is called, which contains:
>>> 
>>>     if not self.app.config.set_metadata_externally or \
>>>        ( not self.external_output_metadata.external_metadata_set_successfully( dataset, self.sa_session ) \
>>>          and self.app.config.retry_metadata_internally ):
>>>         dataset.set_meta( overwrite = False )
>>> 
>>> Somehow, this conditional is being entered.  Since set_metadata_externally 
>>> is set to True, presumably the problem is that 
>>> external_metadata_set_successfully() is returning False and 
>>> retry_metadata_internally is set to True.  If you leave behind the relevant 
>>> job files (cleanup_job = never) and have a look at the PBS and metadata 
>>> outputs, you may be able to see what's happening.  Also, you'll want to set 
>>> retry_metadata_internally = False.
>>> 
>>> --nate
>>> 
>>>> 
>>>> On Fri, Jan 13, 2012 at 12:54 PM, Ryan Golhar 
>>>> <ngsbioinformat...@gmail.com> wrote:
>>>> I re-uploaded 3 BAM files using "Upload system file paths".  
>>>> runner0.log shows:
>>>> 
>>>> galaxy.jobs DEBUG 2012-01-13 12:50:08,442 dispatching job 76 to pbs runner
>>>> galaxy.jobs INFO 2012-01-13 12:50:08,555 job 76 dispatched
>>>> galaxy.jobs.runners.pbs DEBUG 2012-01-13 12:50:08,697 (76) submitting file 
>>>> /home/galaxy/galaxy-dist-9/database/pbs/76.sh
>>>> galaxy.jobs.runners.pbs DEBUG 2012-01-13 12:50:08,697 (76) command is: 
>>>> python /home/galaxy/galaxy-dist-9/tools/data_source/upload.py 
>>>> /home/galaxy/galaxy-dist-9 /home/galaxy/galaxy-dist-9/datatypes_conf.xml 
>>>> /home/galaxy/galaxy-dist-9/database/tmp/tmpqrVYY7         
>>>> 208:/home/galaxy/galaxy-dist-9/database/job_working_directory/76/dataset_208_files:None
>>>>          
>>>> 209:/home/galaxy/galaxy-dist-9/database/job_working_directory/76/dataset_209_files:None
>>>>          
>>>> 210:/home/galaxy/galaxy-dist-9/database/job_working_directory/76/dataset_210_files:None;
>>>>  cd /home/galaxy/galaxy-dist-9; /home/galaxy/galaxy-dist-9/set_metadata.sh 
>>>> ./database/files ./database/tmp . datatypes_conf.xml 
>>>> ./database/job_working_directory/76/galaxy.json 
>>>> galaxy.jobs.runners.pbs DEBUG 2012-01-13 12:50:08,699 (76) queued in 
>>>> default queue as 114.localhost.localdomain
>>>> galaxy.jobs.runners.pbs DEBUG 2012-01-13 12:50:09,037 
>>>> (76/114.localhost.localdomain) PBS job state changed from N to R
>>>> galaxy.jobs.runners.pbs DEBUG 2012-01-13 12:51:09,205 
>>>> (76/114.localhost.localdomain) PBS job state changed from R to E
>>>> galaxy.jobs.runners.pbs DEBUG 2012-01-13 12:51:10,206 
>>>> (76/114.localhost.localdomain) PBS job state changed from E to C
>>>> galaxy.jobs.runners.pbs DEBUG 2012-01-13 12:51:10,206 
>>>> (76/114.localhost.localdomain) PBS job has completed successfully
>>>> 
>>>> 76.sh shows:
>>>> [galaxy@bic pbs]$ more 76.sh 
>>>> #!/bin/sh
>>>> GALAXY_LIB="/home/galaxy/galaxy-dist-9/lib"
>>>> if [ "$GALAXY_LIB" != "None" ]; then
>>>>  if [ -n "$PYTHONPATH" ]; then
>>>>      export PYTHONPATH="$GALAXY_LIB:$PYTHONPATH"
>>>>  else
>>>>      export PYTHONPATH="$GALAXY_LIB"
>>>>  fi
>>>> fi
>>>> cd /home/galaxy/galaxy-dist-9/database/job_working_directory/76
>>>> python /home/galaxy/galaxy-dist-9/tools/data_source/upload.py 
>>>> /home/galaxy/galaxy-dist-9 /home/galaxy/galaxy-dist-9/datatypes_conf.xml 
>>>> /home/galaxy/galaxy-dist-9/database/tmp/tmpqrVYY7 
>>>> 208:/home/galaxy/galaxy-dist-9/database/job_working_directory/76/dataset_208_files:None 
>>>> 209:/home/galaxy/galaxy-dist-9/database/job_working_directory/76/dataset_209_files:None 
>>>> 210:/home/galaxy/galaxy-dist-9/database/job_working_directory/76/dataset_210_files:None; 
>>>> cd /home/galaxy/galaxy-dist-9; /home/galaxy/galaxy-dist-9/set_metadata.sh 
>>>> ./database/files ./database/tmp . datatypes_conf.xml 
>>>> ./database/job_working_directory/76/galaxy.json 
>>>> 
>>>> Right as the job ended, I checked the job output files:
>>>> 
>>>> [galaxy@bic pbs]$ ll
>>>> total 4
>>>> -rw-rw-r-- 1 galaxy galaxy 950 Jan 13 12:50 76.sh
>>>> [galaxy@bic pbs]$ ll
>>>> total 4
>>>> -rw------- 1 galaxy galaxy   0 Jan 13 12:50 76.e
>>>> -rw------- 1 galaxy galaxy   0 Jan 13 12:50 76.o
>>>> -rw-rw-r-- 1 galaxy galaxy 950 Jan 13 12:50 76.sh
>>>> 
>>>> samtools is now running on the head node.
>>>> 
>>>> 
>>>> Where does Galaxy decide where to run samtools?  Maybe I can add a check of 
>>>> some sort to see what's going on?
>>>> 
>>>> 
>>>> On Fri, Jan 13, 2012 at 10:53 AM, Nate Coraor <n...@bx.psu.edu> wrote:
>>>> On Jan 12, 2012, at 11:41 PM, Ryan Golhar wrote:
>>>> 
>>>>> Any ideas as to how to fix this?  We are interested in using Galaxy to 
>>>>> host all our NGS data.  If indexing on the head node is going to happen, 
>>>>> then this is going to be an extremely slow process.
>>>> 
>>>> Could you post the contents of 
>>>> /home/galaxy/galaxy-dist-9/database/pbs/62.sh ?
>>>> 
>>>> Although I have to admit this is really baffling.  The presence of this 
>>>> line without an error:
>>>> 
>>>> galaxy.datatypes.metadata DEBUG 2012-01-11 10:22:40,162 Cleaning up 
>>>> external metadata files
>>>> 
>>>> indicates that metadata was set externally and the relevant metadata files 
>>>> were present on disk.
>>>> 
>>>> --nate
>>>> 
>>>> 
>>>> 
>>> 
>>> 
>> 
>> 
> 
> 


___________________________________________________________
Please keep all replies on the list by using "reply all"
in your mail client.  To manage your subscriptions to this
and other Galaxy lists, please use the interface at:

  http://lists.bx.psu.edu/
