Just a quick note: one way to achieve this functionality (in a very simple way) is to run Kepler directly on the server (e.g., using ptexecute) as a background process. This would be a "poor man's" version of what you are asking for. Some limited monitoring could be built into the workflow itself, or possibly added using some of Ptolemy's logging capabilities.
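To make the "poor man's" approach concrete, here is a minimal sketch of launching a headless run in the background and monitoring it through a log file. The `ptexecute workflow.xml` invocation shown in the comment is an assumption about your setup (paths and arguments will differ); the runnable stand-in below just demonstrates the nohup/redirect/inspect pattern:

```shell
#!/bin/sh
# Sketch: run a long-lived workflow on the server in the background,
# capturing stdout/stderr to a log that the user can inspect later.

LOG=/tmp/kepler-run.log

# In a real deployment this line would be something like:
#   nohup ptexecute workflow.xml > "$LOG" 2>&1 &
# Here a trivial long-running command stands in so the pattern is runnable:
nohup sh -c 'echo "workflow started"; sleep 1; echo "workflow finished"' \
    > "$LOG" 2>&1 &
PID=$!

# "Monitoring" is then just reading the log from time to time,
# e.g. tail -f "$LOG". Here we wait for completion and dump it:
wait "$PID"
cat "$LOG"
```

Because nohup detaches the process from the terminal, the run survives the user logging out, and the log file is the monitoring interface.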
Also, there are some workflows defined in Kepler (TSI, etc.) that are designed to do what you are describing. That is, the workflow itself describes a control process for the execution of distributed, long-running (external) processes, with special-purpose logging, monitoring, and adaptation mechanisms (e.g., for fault tolerance).

-shawn

Chad Berkley wrote:
> Hi Srinath,
>
> This is something that has been discussed several times and a feature we
> definitely want. We do not have anything like that working yet, though,
> and probably won't for a while unless someone volunteers to work on it.
> I think this is probably a post-1.0-release set of functionality.
>
> chad
>
> Srinath Perera wrote:
>> Hi All;
>>
>> Can we run the workflow in offline mode with Kepler?
>>
>> By offline mode I mean something like:
>>
>> 1) Some services in the workflow are long running
>> 2) The workflow is composed, saved, and submitted to Kepler (some server)
>> 3) The server runs the workflow and the user can monitor the
>> workflow from time to time
>>
>> If yes, where can I find more information about how to do it?
>>
>> Thanks
>> Srinath
>> _______________________________________________
>> Kepler-users mailing list
>> Kepler-users at ecoinformatics.org
>> http://mercury.nceas.ucsb.edu/ecoinformatics/mailman/listinfo/kepler-users

