In AI you create a job, into which one or more transformations are placed.
You can then set the job to run on an automated schedule, or use workflow
to run the job event-driven. All AI data is stored in the UDM forms.

Rick
On Feb 24, 2016 1:24 PM, "jham36" <[email protected]> wrote:

> Has this been figured out?  I opened a ticket with support on this.  With
> AIE I would call aiexfer.exe and pass my variables there and it would
> create an entry in Application Pending and only process the single record I
> wanted.
> Support is telling me to use this command for AI:
>
> cd C:\Program Files\BMC Software\ARSystem\diserver\data-integration
> kitchen.bat /level:Basic /server:W-SSARKATE-67 /port:0 /user:Demo
> /pass:Demo /job:Test_job1 /dir:/ > C:\Test\trans.log
>
>
> What they failed to tell me is how to pass in a PKE ID and if I need to
> add anything into the transformation to make use of the
> parameter/variable.  In AIE it just worked.
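> For what it's worth, Pentaho's kitchen launcher accepts named parameters
> on the command line (the /param option on Windows, -param on Unix), as
> long as the job declares a matching Parameter under Job Properties ->
> Parameters. A hedged sketch that just builds the command support gave,
> with a hypothetical PKE_ID parameter appended (untested against AI):

```python
# Sketch only: appends a /param option to the command from support.
# "PKE_ID" and its value are hypothetical; the job must declare a
# Parameter of the same name for kitchen to pass it through.
pke_id = "PKE000000000123"

cmd = (
    'kitchen.bat /level:Basic /server:W-SSARKATE-67 /port:0 '
    '/user:Demo /pass:Demo /job:Test_job1 /dir:/ '
    f'/param:"PKE_ID={pke_id}"'
)
print(cmd)
```

> Inside the job or transformation the value would then be available as
> ${PKE_ID}, assuming your PDI version supports named parameters.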
>
>
> Thanks,
>
> James
>
> On Monday, August 3, 2015 at 5:10:45 PM UTC-4, Andrew Hicox wrote:
>>
>> Thanks for the pointers everyone!
>> I'm still a long way from getting this working; however, I think I'm on
>> the right track now.
>>
>> For those who may be interested, here are some things I've
>> found so far:
>>
>> 1) there is a (very convoluted, yet apparently working) example of this
>> sort of thing already in the OOB code if you have SRM installed.
>> Check out the workflow on
>>
>> *SRS:ImportExportConsole*
>> 2) contrary to the documentation, you do NOT need to create a Job; you
>> can call transformations directly from workflow.
>> As Jarl pointed out, you can do this by pushing data into *UDM:Execution*
>> (though I suspect you still need a Job if you want to use *Application
>> Pending*).
>>
>> 3) The way you can send variables (i.e. "Parameters" in Pentaho-land)
>> into an AI Job or Transformation is via the *UDM:Variable* form.
>> The *Name* field corresponds to the *Parameter* (in Spoon, right-click,
>> select "Job Properties" or "Transformation Properties", these are defined
>> in the "Parameters" tab)
>> The *Value* field is the value you want to set on the parameter.
>> And this is apparently what *'Variable Set Name'* is for on
>> *UDM:Execution* and *UDM:ExecutionInstance*: you set the same GUID on
>> your *UDM:Variable* records and on *UDM:Execution*, and that's how it
>> binds those specific variables to the job or transformation you're
>> executing (this field is missing from *Application Pending*, so I've got
>> no idea how you can send parameters into a job from that form).
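>> To make the binding concrete, here is an illustration of the record
>> shapes described above. This is not a real API call; the form and field
>> names come from this thread, and every value is a placeholder:

```python
# Illustration only: the two records that share a Variable Set Name GUID.
# Form/field names are as described in the thread; values are hypothetical.
import uuid

variable_set = str(uuid.uuid4())  # the GUID shared by both forms

# One UDM:Variable record per Pentaho Parameter you want to set
udm_variable = {
    "Variable Set Name": variable_set,
    "Name": "PKE_ID",            # must match a Parameter defined in Spoon
    "Value": "PKE000000000123",  # the value the run should see
}

# The UDM:Execution record that triggers the run
udm_execution = {
    "Type": "Transformation",
    "Operation": "Start",
    "Name": "MyTransformation",          # hypothetical transformation name
    "Variable Set Name": variable_set,   # same GUID binds the variables
}
```

>> The shared GUID is the whole trick: any *UDM:Variable* records carrying
>> that *Variable Set Name* get applied to that execution.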
>>
>> thanks again everyone. I'll keep the thread updated as I figure this out
>> (for future googling posterity if nothing else).
>>
>> -Andy
>>
>>
>> On Mon, Aug 3, 2015 at 1:20 PM, Jarl Grøneng <[email protected]> wrote:
>>
>>> Hi.
>>>
>>> Push to the UDM:Execution form:
>>> Directory:  "<your AI job directory>"
>>> Log Level:  "Minimal"
>>> Type:   "Job"
>>> Operation:  "Start"
>>> Carte ObjectID: "<execution instance name>" (grab the name from
>>> UDM:ExecutionInstance)
>>> Name:   "<AI job name>"
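>>> Written out as a sketch (field names per the list above; all values are
>>> placeholders except the literal choices "Minimal", "Job", and "Start"):

```python
# Sketch of the UDM:Execution push described above; not a real API call.
# Directory, Carte ObjectID, and Name are placeholders for your own values.
udm_execution = {
    "Directory": "/MyJobs",               # hypothetical AI job directory
    "Log Level": "Minimal",
    "Type": "Job",
    "Operation": "Start",
    "Carte ObjectID": "MyCarteInstance",  # name from UDM:ExecutionInstance
    "Name": "Test_job1",                  # the AI job to run
}
```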
>>>
>>> Not sure about parameters, but take a look at the Variable Set Name
>>> field.
>>>
>>>
>>> Logging, take a look at:
>>> UDM:StepLog
>>> UDM:TransformationLog
>>> UDM:JobEntryLog
>>> UDM:JobLog
>>> UDM:ExecutionStatus
>>>
>>> --
>>> J
>>>
>>> 2015-08-03 18:46 GMT+02:00 Andrew Hicox <[email protected]>:
>>>
>>>> Aah yeah, I found this in the documentation, so it looks like the way
>>>> to trigger an AI job is basically to push the right values into
>>>> "Application Pending" (at least on 8.1.01):
>>>>
>>>>
>>>> https://docs.bmc.com/docs/display/public/ac81/Setting+up+event-driven+jobs+in+Atrium+Integrator
>>>>
>>>> I think from there, I can probably watch the various log forms
>>>> (UDM:TransformationLog, UDM:StepLog etc) to try and catch errors, status,
>>>> etc. Though identifying MY specific job versus any other random job that
>>>> might be running at the time ... I'm not quite clear on that (yet). But
>>>> really, what I'm struggling with now is how to send a variable into
>>>> the AI job.
>>>>
>>>> Just one would do. If I could send a GUID or something into the job,
>>>> that'd be good enough to glue everything together (for instance I can use
>>>> the GUID to reach back into the DB and find my attachment filename,
>>>> identify my logs from all the others, etc).  Thing is ... so far ... I just
>>>> don't see a way to do something like that, but maybe I'm just barking up
>>>> the wrong tree and there's a "right" way to do this?
>>>>
>>>> I see a few suspiciously "extra" fields on Application Pending ('Field
>>>> 1', 'Field 2', 'Other Short', 'Other Long', etc). That is certainly the
>>>> sort of thing I'd put on a form to send arguments to a job, but as to how
>>>> specifically these are used (if at all) ... it's just anyone's guess,
>>>> because there's absolutely zero mention of them in the documentation
>>>> (that I can find, at least).
>>>>
>>>> Any more help out there? LOL :-)
>>>>
>>>> thanks everyone!
>>>>
>>>> -Andy
>>>>
>>>>
>>>>
>>>>
>>>> On Mon, Aug 3, 2015 at 11:28 AM, Rick Cook <[email protected]> wrote:
>>>>
>>>>>
>>>>> I've done it, though my memory is fuzzy on the details.  Basically,
>>>>> when you schedule your AI job, it creates a record in a form.  The form
>>>>> name depends on the version of AI, but it's in the docs.
>>>>>
>>>>> You can have workflow look for records matching your criteria entering
>>>>> that form, and do what you want with them.
>>>>>
>>>>> Rick
>>>>> On Aug 3, 2015 8:04 AM, "Andrew Hicox" <[email protected]> wrote:
>>>>>
>>>>>>
>>>>>> Hi everyone,
>>>>>>
>>>>>> I have a requirement to create a user interface whereby a spreadsheet
>>>>>> is uploaded as an attachment, and used as an input to an atrium 
>>>>>> integrator
>>>>>> job that will import/mangle data appropriately.
>>>>>>
>>>>>> Offhand, it seems like this ought to be possible. I suspect I could
>>>>>> use a clever run-process to copy the attachment into a temp directory and
>>>>>> have AI pick it up from there.
>>>>>>
>>>>>> However, I need some way of not only triggering the AI job in real
>>>>>> time from workflow, but I also need a way to send a variable into the job
>>>>>> at runtime (i.e. "process this specific filename right now"). Also
>>>>>> (ideally) I'd need some way of capturing exceptions from AI and 
>>>>>> displaying
>>>>>> a sensible error message to the user ("missing required data", "bad
>>>>>> spreadsheet format", etc etc).
>>>>>>
>>>>>> Has anyone here ever attempted this sorta thing? If so, any pointers?
>>>>>>
>>>>>> -Andy
>>>>>> _ARSlist: "Where the Answers Are" and have been for 20 years_
>>>>>
>>>>>
>>>>
>>>>
>>>
>>>
>>
>
