Maybe the ConsumeKafka processor caused one of the later processors to transform the attribute value differently?
Either that, or the fact that I was writing to a local postgres db rather than H2?

-Daniel

On Nov 7, 2016 3:00 AM, "Koji Kawamura" <[email protected]> wrote:
> Hi Daniel,
>
> Sorry to hear that you've encountered the issue, and that it took so long
> for you to get any feedback.
>
> PutSQL simply uses the Integer.parseInt() method to convert an attribute
> value to an Integer to map it to an integer table column.
> Integer.parseInt("0.0") threw a NumberFormatException, although we
> might expect it to automatically convert "0.0" to 0. I think it makes
> sense to throw an Exception, because it can't convert other values such
> as "5.5" to 5 or 6 automatically.
>
> I tried to find which processor and conversion actually convert the
> original 0 value into "0.0", but I couldn't reproduce it.
> I used AttributesToJSON, EvaluateJSONPath, ConvertJSONToSQL and PutSQL
> processors with an H2 database. The original 0 value was successfully
> inserted into an int column.
>
> Do you know which processor actually creates the value as a decimal
> number?
>
> Thanks,
> Koji
>
> On Mon, Oct 31, 2016 at 12:44 PM, Daniel Einspanjer
> <[email protected]> wrote:
> > I'm working on the following flow:
> >
> > ConsumeKafka_0_10 -- topic contains ~440 byte json blobs
> >   v
> > SplitText -- One line per file
> >   v
> > EvaluateJsonPath -- pulls out a subset of string and integer fields
> >   v
> > AttributesToJson -- Puts those fields back into the flow file content
> >   v
> > ConvertJsonToSQL -- Builds update statements out of the new JSON content
> >   v
> > PutSQL -- Attempts to execute the update statements.
> >
> > I'm getting errors in the PutSQL processor that seem unusual. The
> > original value for this field was a numeric value 0. The conversion from
> > JSON to attribute and back changes it to the string "0.0", but even
> > then, I'd still expect it to be recognizable, wouldn't it?
> >
> > 2016-10-30 23:34:28,413 ERROR [Timer-Driven Process Thread-1]
> > o.apache.nifi.processors.standard.PutSQL
> > PutSQL[id=18c12f53-0158-1000-2dee-0c595adbe081] Cannot update database for
> > StandardFlowFileRecord[uuid=3a934cba-7e1d-4df2-8457-d1858a90c50a,claim=StandardContentClaim
> > [resourceClaim=StandardResourceClaim[id=1477884567453-1722,
> > container=default, section=698], offset=242176,
> > length=128],offset=0,name=58640865062299,size=128] due to
> > org.apache.nifi.processor.exception.ProcessException: The value of the
> > sql.args.2.value is '0.0', which cannot be converted into the necessary
> > data type; routing to failure:
> > org.apache.nifi.processor.exception.ProcessException: The value of the
> > sql.args.2.value is '0.0', which cannot be converted into the necessary
> > data type
> > 2016-10-30 23:34:28,414 ERROR [Timer-Driven Process Thread-1]
> > o.apache.nifi.processors.standard.PutSQL
> > org.apache.nifi.processor.exception.ProcessException: The value of the
> > sql.args.2.value is '0.0', which cannot be converted into the necessary
> > data type
> > at org.apache.nifi.processors.standard.PutSQL.setParameters(PutSQL.java:628) ~[nifi-standard-processors-1.0.0.jar:1.0.0]
> > at org.apache.nifi.processors.standard.PutSQL.onTrigger(PutSQL.java:241) ~[nifi-standard-processors-1.0.0.jar:1.0.0]
> > at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27) [nifi-api-1.0.0.jar:1.0.0]
> > at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1064) [nifi-framework-core-1.0.0.jar:1.0.0]
> > at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:136) [nifi-framework-core-1.0.0.jar:1.0.0]
> > at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47) [nifi-framework-core-1.0.0.jar:1.0.0]
> > at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:132) [nifi-framework-core-1.0.0.jar:1.0.0]
> > at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_77]
> > at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [na:1.8.0_77]
> > at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) [na:1.8.0_77]
> > at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) [na:1.8.0_77]
> > at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_77]
> > at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_77]
> > at java.lang.Thread.run(Thread.java:745) [na:1.8.0_77]
> > Caused by: java.lang.NumberFormatException: For input string: "0.0"
> > at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65) ~[na:1.8.0_77]
> > at java.lang.Integer.parseInt(Integer.java:580) ~[na:1.8.0_77]
> > at java.lang.Integer.parseInt(Integer.java:615) ~[na:1.8.0_77]
> > at org.apache.nifi.processors.standard.PutSQL.setParameter(PutSQL.java:769) ~[nifi-standard-processors-1.0.0.jar:1.0.0]
> > at org.apache.nifi.processors.standard.PutSQL.setParameters(PutSQL.java:626) ~[nifi-standard-processors-1.0.0.jar:1.0.0]
> > ... 13 common frames omitted
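[Editor's note for readers of this thread] The Integer.parseInt() behavior Koji describes is easy to reproduce outside NiFi. The sketch below shows the failing call, plus a hypothetical, more tolerant coercion via BigDecimal.intValueExact() that accepts "0.0" but still rejects "5.5" (matching Koji's point that values with a real fractional part shouldn't be rounded silently). The helper `toIntExact` is illustrative only; it is not part of the NiFi 1.0.0 PutSQL code path.

```java
import java.math.BigDecimal;

public class IntCoercionDemo {
    // Hypothetical helper, not NiFi API: accept a decimal string only when
    // its fractional part is zero, so "0.0" -> 0, while "5.5" throws
    // ArithmeticException instead of being silently rounded to 5 or 6.
    static int toIntExact(String s) {
        return new BigDecimal(s).intValueExact();
    }

    public static void main(String[] args) {
        // This is the conversion PutSQL 1.0.0 performs; it rejects "0.0".
        try {
            Integer.parseInt("0.0");
        } catch (NumberFormatException e) {
            System.out.println("parseInt: " + e.getMessage());
        }
        // The tolerant variant succeeds on "0.0".
        System.out.println(toIntExact("0.0"));
    }
}
```

Until something like this exists in PutSQL, the practical workaround is to normalize the attribute upstream (e.g. with an expression-language update) so the value reaching PutSQL is "0" rather than "0.0".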
