Wei, I have looked at adding a processor for calling stored procedures, but the problem I encountered was how to allow the user to specify input and/or output parameters for stored procedures and functions. You can use ExecuteSQL to execute a stored procedure with no input/output parameters, as you suggested in option 2. We could allow user-defined properties, but since we're limited to key/value pairs, we can't represent the parameter name, the value, and the type (input, output, or inout).
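For context, here is a rough JDBC sketch of what a call with mixed parameter directions involves; this is not NiFi code, and the connection URL and the transform_batch procedure signature are made up for illustration. Each parameter needs a position or name, a direction, a JDBC type when registering OUT/INOUT, and a value for IN/INOUT, which is more than a single flat key/value property can express:

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Types;

public class CallProcedureSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection URL and procedure, for illustration only.
        // Assumed signature: transform_batch(IN batch_id BIGINT, INOUT row_count INT, OUT status VARCHAR)
        try (Connection conn = DriverManager.getConnection("jdbc:postgresql://localhost/meta");
             CallableStatement stmt = conn.prepareCall("{call transform_batch(?, ?, ?)}")) {

            stmt.setLong(1, 42L);                          // IN: a value is enough
            stmt.setInt(2, 0);                             // INOUT: needs a value...
            stmt.registerOutParameter(2, Types.INTEGER);   // ...and an output registration with a JDBC type
            stmt.registerOutParameter(3, Types.VARCHAR);   // OUT: direction and type, no value

            stmt.execute();

            int rowCount = stmt.getInt(2);
            String status = stmt.getString(3);
            System.out.println("rows=" + rowCount + ", status=" + status);
        }
    }
}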
Perhaps we could do something like having properties "Input Parameters", "Output Parameters", and "Inout Parameters" that accept a comma-delimited list of parameter names, along with user-defined properties where you would again specify the parameter name and the value would be the optional value to supply for an input or inout parameter. This doesn't seem like a good user experience to me, but perhaps users would accept the awkward configuration in exchange for the ability to call any kind of stored procedure or function. Our existing processors already support multiple ResultSets, so we can reuse that code.

I will give this some more thought, but I would welcome any and all opinions on the matter.

Thank you,
Matt

On Thu, Jun 13, 2024 at 12:13 PM Zhong Wei <sirzhong...@hotmail.com> wrote:

> Hi, I should apologize and beg your pardon in advance for my informal
> English expression, since English is not my home language.
>
> I am a developer and IT maintainer at a college. I am trying to use NiFi
> to transfer data between databases. The power of NiFi surprises me, and
> I want to use it as my main tool.
>
> However, for one of my cases I have not found a good solution, even
> after talking with ChatGPT4 and some digging on the internet.
>
> My case is:
>
> Multiple RDBMS => NiFi (extract) => meta database (transform) =>
> NiFi (load) => data warehouse.
>
> Why do I need the meta database (it is also my centralized transformer)?
> Because there are many circumstances where attaching, translating,
> converting, or removing columns from a large dataset is needed, and
> those transforming operations rely on a bunch of meta tables in the meta
> database.
>
> So I think that in my batch transforming scenario, no other tool is more
> powerful than stored procedures or functions in the meta database, and I
> would very much like NiFi to provide a flow like one of the two below:
>
> 1. Source RDBMS => ExecuteSQLRecord => NewNifiProcessor (call stored
> procedure/function with the records and fetch the transformed records
> back) => PutDatabaseRecord => data warehouse
>
> or
>
> 2. Source RDBMS => ExecuteSQLRecord => (1) ExecuteSQL (create temp table
> on the meta database) => (2) ExecuteSQL (insert records into the temp
> table) => (3) ExecuteSQL (call stored procedure/function and fetch
> records back) => PutDatabaseRecord => data warehouse
>
> In the 2nd flow, the very same open database connection must be
> guaranteed across the three steps (1), (2), (3) in any circumstance
> (i.e., multiple flows share one database pool controller service), so
> that all three steps are enclosed in one database connection session.
> However, I don't currently know how to ensure the same session across
> processors, so my workaround is:
>
> ... insert records into a fixed table => transform records in a
> function/stored procedure => truncate the fixed table => return records
> => ...
>
> That works, but it is a bit odd.
>
> Lastly, in my opinion, for batch transformation of datasets, a stored
> procedure or function in the database is more powerful and maintainable
> than any other solution, so I think more processors centered on
> utilizing this ability should be built.
>
> best regards
>
> Wei.
> 2024/6/13
>
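For anyone following along, below is a rough JDBC sketch of why steps (1), (2), and (3) in the 2nd flow would have to share one connection: a temporary table is scoped to the database session that created it, so it is invisible to any other connection handed out by the pool. The table name, DDL, and the transform_staging_rows procedure are hypothetical and database-specific.

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Statement;

public class TempTableSessionSketch {
    public static void main(String[] args) throws Exception {
        // All three steps run on ONE connection; the temp table only exists in this session.
        try (Connection conn = DriverManager.getConnection("jdbc:postgresql://localhost/meta")) {

            // (1) create a session-scoped temp table (syntax varies by RDBMS)
            try (Statement ddl = conn.createStatement()) {
                ddl.execute("CREATE TEMPORARY TABLE staging_rows (id BIGINT, payload VARCHAR(4000))");
            }

            // (2) insert the extracted records into the temp table
            try (PreparedStatement insert =
                     conn.prepareStatement("INSERT INTO staging_rows (id, payload) VALUES (?, ?)")) {
                insert.setLong(1, 1L);
                insert.setString(2, "example row");
                insert.executeUpdate();
            }

            // (3) call the (hypothetical) transform procedure and fetch the transformed rows back
            try (CallableStatement call = conn.prepareCall("{call transform_staging_rows()}")) {
                boolean hasResults = call.execute();
                if (hasResults) {
                    try (ResultSet rs = call.getResultSet()) {
                        while (rs.next()) {
                            System.out.println(rs.getLong("id") + " -> " + rs.getString("payload"));
                        }
                    }
                }
            }
        } // connection closes here; the temporary table disappears with the session
    }
}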