Sticking with the original logicalType of timestamp-millis, I was able to
get it working using ${now():format('yyyy-MM-dd HH:mm:ss')} as the value
of the timestamp column.
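
For anyone following along outside NiFi, here is a rough Python sketch of what that Expression Language value produces (this is an analogy, not NiFi code; only the `yyyy-MM-dd HH:mm:ss` pattern is taken from the EL above):

```python
from datetime import datetime
import re

# Mimic NiFi's ${now():format('yyyy-MM-dd HH:mm:ss')}.
# Java's yyyy-MM-dd HH:mm:ss pattern corresponds to strftime's
# %Y-%m-%d %H:%M:%S.
value = datetime.now().strftime('%Y-%m-%d %H:%M:%S')

# The result looks like "2017-10-16 15:23:00", a string form that a
# record writer can parse as a timestamp when a matching date format
# is configured on it.
print(value)
```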
On Mon, Oct 16, 2017 at 3:23 PM, Bryan Bende <[email protected]> wrote:
> If it is helpful, here is a template of the working version where I
> changed to a long:
>
> https://gist.githubusercontent.com/bbende/3187d3905f868bef3143899841bb70f9/raw/f266b149eb845a371543a1b68f9797d5437c6a2d/update-record-timestamp.xml
>
> On Mon, Oct 16, 2017 at 3:19 PM, Bryan Bende <[email protected]> wrote:
>
>> Aruna,
>>
>> I think the issue here might be on the reader side of things...
>>
>> If your incoming data does not have SYS_CREAT_TS in it, then your
>> schema needs to allow null values for this field so you can create records
>> from the incoming data. Alternatively, you can create two versions of your
>> schema and have the reader use a version that doesn't have this field,
>> while the writer uses a version that does.
>>
>> I am currently testing this using these examples:
>>
>> {
>> "type": "record",
>> "name": "schema1",
>> "fields": [
>> { "name": "id", "type": "string" }
>> ]
>> }
>>
>> {
>> "type": "record",
>> "name": "schema2",
>> "fields": [
>> { "name": "id", "type": "string" },
>> { "name": "timestamp", "type" : { "type" : "long", "logicalType" :
>> "timestamp-millis" } }
>> ]
>> }
>>
>> I have a CSVReader using schema1 and a CSVRecordSetWriter using schema2.
>>
>> I'm able to get past the issue you are having, and now it gets to the
>> point where it tries to take the result of ${now():toNumber()} and
>> convert it to a timestamp, where I'm running into a different issue:
>>
>> org.apache.nifi.serialization.record.util.IllegalTypeConversionException:
>> Could not convert value [1508180727622] of type java.lang.String to
>> Timestamp for field timestamp because the value is not in the expected date
>> format: org.apache.nifi.serialization.record.util.DataTypeUtils$$Lambda$414/572291470@33d51c66
>>   at org.apache.nifi.serialization.record.util.DataTypeUtils.toTimestamp(DataTypeUtils.java:564)
>>   at org.apache.nifi.serialization.record.util.DataTypeUtils.convertType(DataTypeUtils.java:134)
>>   at org.apache.nifi.serialization.record.util.DataTypeUtils.convertType(DataTypeUtils.java:84)
>>   at org.apache.nifi.serialization.record.MapRecord.setValue(MapRecord.java:317)
>>   at org.apache.nifi.record.path.StandardFieldValue.updateValue(StandardFieldValue.java:132)
>>   at org.apache.nifi.processors.standard.UpdateRecord.lambda$process$1(UpdateRecord.java:177)
>>
>> I haven't figured out the issue, but it has something to do with the
>> logic of trying to convert a string to a timestamp and whether or not a
>> date format is provided.
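
The failure mode can be reproduced outside NiFi. A hedged Python sketch (an analogy, not NiFi's actual conversion code) of why an epoch-millis string trips up a date-format parser while a plain numeric conversion succeeds:

```python
from datetime import datetime

millis_string = "1508180727622"  # what ${now():toNumber()} yields, as a string

# Treating the value as a formatted date string fails: it does not
# match a pattern like yyyy-MM-dd HH:mm:ss.
try:
    datetime.strptime(millis_string, '%Y-%m-%d %H:%M:%S')
    parsed_as_date = True
except ValueError:
    parsed_as_date = False

# Treating it as epoch milliseconds works fine.
as_timestamp = datetime.fromtimestamp(int(millis_string) / 1000.0)

print(parsed_as_date, as_timestamp.year)
```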
>>
>> If I change schema2 to use a regular long for the timestamp field then it
>> works:
>>
>> {
>> "type": "record",
>> "name": "schema2",
>> "fields": [
>> { "name": "id", "type": "string" },
>> { "name": "timestamp", "type" : "long" }
>> ]
>> }
>>
>> I don't know what the ramifications of this would be in PutDatabaseRecord.
>>
>>
>> On Mon, Oct 16, 2017 at 2:23 PM, Aruna Sankaralingam <
>> [email protected]> wrote:
>>
>>> Matt,
>>>
>>>
>>>
>>> My issue is that when I am trying to assign the current date to sys_creat_ts
>>> for every record, I am getting the error that it cannot be null.
>>> SYS_CREAT_TS is not coming from the source. It is present only in the
>>> target table, and I need to load the current datetime into that field for
>>> every record that comes from the source.
>>>
>>>
>>>
>>> Mark,
>>>
>>>
>>>
>>> I am still getting the same error.
>>>
>>>
>>>
>>> Also, actv_ind must display just “Y” in the target table.
>>>
>>>
>>>
>>>
>>>
>>>
>>> *From:* Mark Payne [mailto:[email protected]]
>>> *Sent:* Monday, October 16, 2017 12:03 PM
>>> *To:* [email protected]
>>> *Subject:* Re: UpdateRecord Processor
>>>
>>>
>>>
>>> Hi Aruna,
>>>
>>>
>>>
>>> I think the issue is likely that you are setting /sys_create_ts but your
>>> schema has that field as SYS_CREAT_TS (i.e., it is in upper case).
>>>
>>> I would recommend you change the property names to /ACTV_IND and
>>> /SYS_CREAT_TS. Also, the value that you have for /sys_creat_ts is
>>> set to "{now()}", but I think what you really want is
>>> "${now():toNumber()}".
>>>
>>>
>>>
>>> Also, of note, /actv_ind is set to '/Y', so I just want to clarify that
>>> the literal value '/Y' (with the single quotes) is what will be placed
>>> there. Is that the intention? Or did you actually want just /Y to be there?
>>>
>>>
>>>
>>> Thanks
>>>
>>> -Mark
>>>
>>>
>>>
>>>
>>> On Oct 16, 2017, at 11:41 AM, Aruna Sankaralingam <
>>> [email protected]> wrote:
>>>
>>>
>>>
>>> Hi,
>>>
>>>
>>>
>>> I updated to version 1.4 now. I am using the UpdateRecord processor. I
>>> have assigned values to two of the columns in the target table as shown
>>> below, but I am getting the error that SYS_CREAT_TS cannot be NULL. Am I
>>> missing something?
>>> I have provided the screenshots of the CSVRecordSetWriter,
>>> AvroSchemaRegistry, and my NiFi flow below.
>>>
>>>
>>>
>>> <image004.png> <image005.png>
>>>
>>>
>>>
>>> CSVRecordSetWriter:
>>>
>>> <image001.png>
>>>
>>>
>>>
>>> AvroSchemaRegistry:
>>>
>>> <image002.png>
>>>
>>>
>>>
>>> <image006.png>
>>>
>>>
>>>
>>>
>>>
>>> *From:* Koji Kawamura [mailto:[email protected]]
>>> *Sent:* Thursday, October 12, 2017 8:28 PM
>>> *To:* [email protected]
>>> *Subject:* Re: Transformations using Nifi
>>>
>>>
>>>
>>> Hi Aruna,
>>>
>>>
>>>
>>> If you can not upgrade from NiFi 1.2.0, then I think the best bet is
>>> using ScriptedReader or ScriptedRecordSetWriter for data conversions #1,
>>> #2, and #3.
>>>
>>> As Russ mentioned, EL might be helpful when you implement the scripted
>>> components.
>>>
>>>
>>>
>>> #4 is a bit harder since it requires a database connection, but doable;
>>> I don't know whether it works efficiently, though.
>>>
>>>
>>>
>>> An alternative approach would be to create a temporary table, insert rows
>>> as they are, then perform an insert/update query using the temporary table
>>> and the postgresql lookup table, such as:
>>>
>>> "insert into X select a, b, c from T inner join L on L.j = T.j where d = ..."
>>>
>>> Probably data conversions #1, #2, and #3 can be performed in this query
>>> as well.
>>>
>>>
>>>
>>> Thanks,
>>>
>>> Koji
>>>
>>>
>>>
>>> On Thu, Oct 12, 2017 at 11:50 PM, Russell Bateman <[email protected]>
>>> wrote:
>>>
>>> Aruna,
>>>
>>> I don't think there is any generalized NiFi training course yet. I
>>> started writing a book a year ago, but it's a pretty thankless task and
>>> takes a lot of time. I mostly write custom processors so I tend to see the
>>> world differently from most "user" users of NiFi. When I answer questions,
>>> which I don't do too much, I often tell an impractical story that doesn't
>>> answer the question asked.
>>>
>>> Now, I haven't done much database work at all, so I'm not going to be
>>> too much help to you. However, looking at your questions, particularly the
>>> one about obtaining the current date when the flow is run (#2) and also #3,
>>> I would suggest that you study the NiFi Expression Language. This would
>>> allow you to insert the "now" date into your flow at some point.
>>>
>>> The other suggestions I have are to experiment, which I'm sure you're
>>> doing, and Google hard for help. For this forum, which is filled with very
>>> nice and helpful people, you'll tend to get a lot more and better help if
>>> you come in with a very specific question rather than a list of things or a
>>> general, "help me" sort of plea.
>>>
>>> Cheers,
>>>
>>> Russ
>>>
>>> P.S. NiFi absolutely rocks, which you'll see as soon as you get over
>>> your initial hump here. But, you're on the right track.
>>>
>>>
>>>
>>> On 10/12/2017 08:16 AM, Aruna Sankaralingam wrote:
>>>
>>> Hi,
>>>
>>>
>>>
>>> Could you someone please help me with these requirements in the email
>>> below?
>>>
>>>
>>>
>>> Thanks
>>>
>>> Aruna
>>>
>>>
>>>
>>> *From:* Aruna Sankaralingam [mailto:[email protected]]
>>> *Sent:* Wednesday, October 11, 2017 11:26 AM
>>> *To:* [email protected]
>>> *Subject:* Transformations using Nifi
>>>
>>>
>>>
>>> I am trying to see what kinds of transformations can be done in NiFi
>>> and how.
>>> Now I have a basic flow that takes a CSV from the local dir, puts it into
>>> S3, and loads it into a Postgres database.
>>> There are 4 columns in my test file, 3 of which are strings and one is an
>>> integer field. I would like to do the following before I load the data into
>>> Postgres. If someone can help me on how to go about these, it would be great.
>>>
>>> 1. Convert one of the string columns to upper case
>>>
>>> *For converting to upper case, I was told to use the UpdateRecord
>>> processor, but my version is 1.2.0 and the UpdateRecord processor is not
>>> available.*
>>>
>>>
>>>
>>> 2. Postgres has an extra column called “Load_Date” in which I
>>> would like to load the current date with timestamp when the flow is run
>>>
>>> 3. If the integer column has more than 5 digits, I would like to
>>> take only the first 5 digits and load to the table
>>>
>>> 4. There is a lookup table in postgres. I would like to check if
>>> the first column value is present in the lookup table and, if yes, proceed;
>>> if not, ignore the record
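
If it helps to see the four requirements concretely, here is a hedged Python sketch of the transformations outside NiFi (the column names 'code', 'name', 'amount' and the lookup set are invented for illustration; in NiFi these steps would map to record processors and a lookup against the postgres table):

```python
# Illustrative only: the four requested transformations as one function.
LOOKUP = {'A1', 'B2'}  # stands in for the postgres lookup table

def transform(row, load_date):
    # 4. Skip records whose first column is not in the lookup table.
    if row['code'] not in LOOKUP:
        return None
    out = dict(row)
    # 1. Upper-case one of the string columns.
    out['name'] = out['name'].upper()
    # 2. Add the Load_Date column with the flow's run timestamp.
    out['load_date'] = load_date
    # 3. Keep only the first 5 digits of the integer column.
    out['amount'] = int(str(out['amount'])[:5])
    return out

rows = [
    {'code': 'A1', 'name': 'aruna', 'amount': 1234567},
    {'code': 'Z9', 'name': 'skip', 'amount': 42},
]
result = [t for t in (transform(r, '2017-10-16 15:23:00') for r in rows) if t]
print(result)
```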
>>>
>>>
>>>
>>> I am trying to learn NiFi, so I would really appreciate any kind of help
>>> here. Is there any training available online that I can take in order to
>>> understand and do all of this?
>>>
>>>
>>>
>>> <image003.png>
>>>
>>>
>>>
>>
>>
>