Both the Web Service (pipelines) and Asynchronous Web Service (workflows) calls support a content body as well as parameters.
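To illustrate, here is a minimal Python sketch of what the two calls could look like from a client. The service names, the PRM_CUSTOMER_ID parameter, host, port and credentials are made-up examples, and the /hop/webService/ and /hop/asyncRun/ paths are my reading of the docs linked below, so please double-check everything against your own server configuration:

import requests

HOP_SERVER = "http://localhost:8080"   # adjust to your Hop Server host/port
AUTH = ("cluster", "cluster")          # replace with your own credentials

# Synchronous web service: runs the pipeline and returns its output.
response = requests.post(
    HOP_SERVER + "/hop/webService/",
    params={"service": "my_service", "PRM_CUSTOMER_ID": "1234"},  # query parameters
    data=b"<orders><order>1</order></orders>",                    # the content body
    auth=AUTH,
)
print(response.status_code, response.text)

# Asynchronous web service: starts the workflow and returns immediately.
response = requests.post(
    HOP_SERVER + "/hop/asyncRun/",
    params={"service": "my_async_service"},
    data=b'{"orders": [1, 2, 3]}',                                # payload for the workflow
    auth=AUTH,
)
print(response.status_code, response.text)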
Cheers,
Matt

https://hop.apache.org/manual/latest/hop-server/web-service.html
https://hop.apache.org/manual/latest/metadata-types/asyncwebservice.html

On Tue, 21 Mar 2023 at 12:51, <[email protected]> wrote:

> Would be great if we could send some payload to Carte (I do not know
> what's the name in Hop) - not only parameters. That's what I've always
> missed in Kettle.
>
> *Sent:* Thursday, March 09, 2023 at 9:36 AM
> *From:* "Matt Casters" <[email protected]>
> *To:* [email protected]
> *Subject:* Re: HOP Server REST API
>
> Hi Phil,
>
> The easiest way to start a pipeline (or workflow) from another application
> is probably to use a named synchronous or asynchronous web service.
> See here for more information:
> https://hop.apache.org/manual/latest/hop-server/web-service.html
>
> Your point about the lacking REST API is taken though. We're actually in
> the process of making a better set of services.
> Please let us know what you need in terms of the interface so we can build
> this out over time. We'll have a docker container to go along with that as
> well.
> https://hop.apache.org/manual/next/hop-rest/index.html
>
> All the best,
> Matt
>
> On Thu, 9 Mar 2023 at 06:16, Phillip Brown <[email protected]> wrote:
>
>> Hi
>>
>> Trying to understand the Hop Server REST API, and not finding the user
>> manual all that helpful. (Note: the Pentaho documentation is not any
>> better in this regard.)
>>
>> First, there seems to be duplication without any explanation of when to
>> use some calls rather than others. For example, when would I use
>> addPipeline vs registerPipeline? Why would I use prepareExec and
>> startExec instead of startPipeline?
>>
>> Second, "Request body should contain xml containing
>> pipeline_configuration (pipeline and pipeline_execution_configuration
>> wrapped in pipeline_configuration tag)" is there without any real
>> explanation of how you go about creating that request body, or where the
>> pieces come from. There appears to be the implicit assumption that
>> people should just know what it means. And the example in
>> registerPipeline doesn't really help, and has an additional
>> "metastore_json is base64 encoded GZip content" which also isn't
>> explained.
>>
>> Finally, how would I go about doing something like running a pipeline
>> from another application like Oracle APEX (see, for example,
>> https://pretius.com/blog/pentaho-data-integration-oracle-apex/)?
>>
>> Regards
>>
>> Phil Brown
