Hi Katherine,
Probably the only way to get the job start time is job.create_time (a close
estimate of the actual start time), and the end time is job.update_time
(again, a close estimate), so you can calculate the job's estimated execution
time with something like this:
import datetime
# create_time and update_time are datetimes, so this is a datetime.timedelta:
execution_time = job.update_time - job.create_time
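A self-contained sketch of the same calculation, using hypothetical fixed timestamps in place of a live Galaxy job object:

```python
import datetime

# Hypothetical stand-ins for job.create_time and job.update_time,
# which Galaxy stores as datetime objects.
create_time = datetime.datetime(2016, 7, 8, 8, 55, 0)
update_time = datetime.datetime(2016, 7, 8, 9, 1, 30)

# Subtracting two datetimes yields a datetime.timedelta.
execution_time = update_time - create_time
print(execution_time.total_seconds())  # 390.0
```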
Actually, I just figured out the history content API id; now I just need the
extra stuff about the job, like the runtime and the start and end times.
On Fri, Jul 8, 2016 at 8:55 AM, Katherine Beaulieu <
katherine.beaulieu...@gmail.com> wrote:
> Hi Greg,
> Thanks for the link to your github repo, some of the inf
Hi Greg,
Thanks for the link to your GitHub repo; some of the information there was
very useful. Do you have any idea how to access some of the things you
don't pass to your Python script yourself, such as the history content API
id or the job runtime? For example, for the job runtime I feel like
Hi Katherine,
For the job-related stuff, I'm doing this in my tools that provide statistics
for the Galaxy ChIP-exo instance I'm setting up for a lab here at Penn State.
You can see variations of tool examples here:
https://github.com/gregvonkuster/cegr-galaxy/tree/master/tools/cegr_statistics
Hi everyone,
I am working on a tool that attempts to create a file storing all the
metadata associated with a job execution. The things I know how to access
right now are the file extension, name, history id, and dataset id. I'd like
to know how to access other things like the job id, uuid, file s