Re: NiFi 0.5.1 "too many open files"

2016-04-27 Thread Joe Witt
Mike, OK, that is a good data point. In my case they are all in archive, but I do agree that isn't super meaningful because in reality nothing should ever be open for writing in the archive. If you can, and have enough logging enabled, try searching for that first part of the filename in your logs.

[GitHub] nifi pull request: NIFI-1817 Respecting when run.as is commented o...

2016-04-27 Thread asfgit
Github user asfgit closed the pull request at: https://github.com/apache/nifi/pull/383 --- If your project is set up for it, you can reply to this email and have your reply appear on GitHub as well. If your project does not have this feature enabled and wishes so, or if the feature is

[GitHub] nifi pull request: NIFI-981: Added ExecuteHiveQL and PutHiveQL pro...

2016-04-27 Thread bbende
Github user bbende commented on a diff in the pull request: https://github.com/apache/nifi/pull/384#discussion_r61316112 --- Diff: nifi-nar-bundles/nifi-hive-bundle/nifi-hive-processors/src/main/java/org/apache/nifi/processors/hive/ExecuteHiveQL.java

[GitHub] nifi pull request: NIFI-981: Added ExecuteHiveQL and PutHiveQL pro...

2016-04-27 Thread bbende
Github user bbende commented on a diff in the pull request: https://github.com/apache/nifi/pull/384#discussion_r61317929 --- Diff: nifi-nar-bundles/nifi-hive-bundle/nifi-hive-processors/src/main/java/org/apache/nifi/util/hive/HiveJdbcCommon.java

[GitHub] nifi pull request: NIFI-981: Added ExecuteHiveQL and PutHiveQL pro...

2016-04-27 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request: https://github.com/apache/nifi/pull/384#discussion_r61317547 --- Diff: nifi-nar-bundles/nifi-hive-bundle/nifi-hive-processors/src/main/java/org/apache/nifi/processors/hive/ExecuteHiveQL.java

Re: NiFi 0.5.1 "too many open files"

2016-04-27 Thread Michael Moser
I found something in the logs on the nodes where I had a problem. A ContentNotFoundException begins occurring on these nodes, and after many thousands of occurrences we eventually get "too many open files". Once I do surgery on the content repository so that the ContentNotFoundException stops happening,

Re: NiFi 0.5.1 "too many open files"

2016-04-27 Thread Joe Witt
Mike, Definitely does not sound familiar. However, I just looked up what you describe and I do see it. In my case there are only three files, but they are sitting there open for writing by the nifi process and yet have been deleted. So I do believe there is an issue...will dig in a bit but

Re: NiFi 0.5.1 "too many open files"

2016-04-27 Thread Michael Moser
Another data point ... we had archiving turned on at first, and then most (but not all) files that lsof reported were /content_repository/0/archive/123456789-123456 (deleted). We turned archiving off, hoping that was related in some way, but it was not. -- Mike

[GitHub] nifi pull request: NIFI-1818 Adjusting repository exception handli...

2016-04-27 Thread apiri
GitHub user apiri opened a pull request: https://github.com/apache/nifi/pull/385 NIFI-1818 Adjusting repository exception handling to reflect the appropriate repository instantiation that caused the issue. You can merge this pull

NiFi 0.5.1 "too many open files"

2016-04-27 Thread Michael Moser
Devs, We recently upgraded from NiFi 0.4.1 to 0.5.1 on a cluster. We noticed half of our cluster nodes getting "too many open files" errors that require a NiFi restart, while the other half works without this problem. Using 'lsof -p ' to identify the open file descriptors at the time of the
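For reference, a process's open-descriptor count can also be checked from inside the JVM itself, which is handy for graphing the leak over time. A minimal Java sketch, assuming a Linux /proc layout (this is not NiFi code; on non-Linux systems it returns -1):

```java
import java.io.File;

public class FdCount {
    // Count this JVM's open file descriptors by listing /proc/self/fd.
    // This is roughly the set of entries `lsof -p <pid>` would report
    // for the process. Linux only; returns -1 if /proc is unavailable.
    static int openFdCount() {
        File fdDir = new File("/proc/self/fd");
        String[] entries = fdDir.list();
        return entries == null ? -1 : entries.length;
    }

    public static void main(String[] args) {
        System.out.println("open fds: " + openFdCount());
    }
}
```

Sampling this periodically and comparing against `ulimit -n` gives early warning before "too many open files" is actually hit.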

[GitHub] nifi pull request: NIFI-981: Added ExecuteHiveQL and PutHiveQL pro...

2016-04-27 Thread bbende
Github user bbende commented on a diff in the pull request: https://github.com/apache/nifi/pull/384#discussion_r61343688 --- Diff: nifi-nar-bundles/nifi-hive-bundle/nifi-hive-processors/src/main/java/org/apache/nifi/processors/hive/ExecuteHiveQL.java --- @@ -0,0 +1,178 @@ +/*

[GitHub] nifi pull request: NIFI-1818 Adjusting repository exception handli...

2016-04-27 Thread alopresto
Github user alopresto commented on the pull request: https://github.com/apache/nifi/pull/385#issuecomment-215251460 Reviewing...

[GitHub] nifi pull request: Fixed typo in ScryptCipherProviderGroovyTest Ja...

2016-04-27 Thread asfgit
Github user asfgit closed the pull request at: https://github.com/apache/nifi/pull/380

Re: Sample to read file and insert into oracle based on the pattern in the line

2016-04-27 Thread Bryan Bende
Siva, If I am understanding your scenario correctly, the processors to look at would likely be the following...
GetFTP - to retrieve the remote file into NiFi
RouteText - to separate the lines of the file based on a pattern
SplitText - to split the output of RouteText into one line per flow file
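The RouteText-then-SplitText step above can be sketched in plain Java (a hypothetical stand-in for illustration, not NiFi's actual implementation): split the content into lines, then route each line by whether it matches a pattern.

```java
import java.util.*;
import java.util.regex.*;

public class RouteLines {
    // Split text into lines and route each line to "matched" or "unmatched"
    // based on a regex, mimicking what RouteText followed by SplitText
    // would produce (one flow file per line, per relationship).
    static Map<String, List<String>> route(String content, Pattern pattern) {
        Map<String, List<String>> routed = new HashMap<>();
        routed.put("matched", new ArrayList<>());
        routed.put("unmatched", new ArrayList<>());
        for (String line : content.split("\\R")) {
            String relationship = pattern.matcher(line).find() ? "matched" : "unmatched";
            routed.get(relationship).add(line);
        }
        return routed;
    }

    public static void main(String[] args) {
        // Hypothetical input: keep only "ORD," record lines.
        Map<String, List<String>> r =
            route("ORD,1,foo\nHDR,x\nORD,2,bar", Pattern.compile("^ORD,"));
        System.out.println(r.get("matched")); // prints [ORD,1,foo, ORD,2,bar]
    }
}
```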

Reading attributes

2016-04-27 Thread Jim Wagoner
Would it be worth adding a check to make sure required attributes are on a flow file when onTrigger is called? I find myself putting all the attributes I define in 'ReadAttributes' in a List and then verifying they exist on the flow file as one of the first steps of my onTrigger call (and of
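The validation Jim describes can be sketched outside NiFi with a plain Map standing in for the flow file's attribute map (the attribute names here are hypothetical, and this is an illustration of the pattern, not a proposed framework API):

```java
import java.util.*;

public class AttributeCheck {
    // Return the names of required attributes that are missing (or empty)
    // from a flow file's attribute map. A processor's onTrigger could call
    // this first and route the flow file to failure if the list is non-empty.
    static List<String> missingAttributes(Map<String, String> flowFileAttrs,
                                          Collection<String> required) {
        List<String> missing = new ArrayList<>();
        for (String name : required) {
            String value = flowFileAttrs.get(name);
            if (value == null || value.isEmpty()) {
                missing.add(name);
            }
        }
        return missing;
    }

    public static void main(String[] args) {
        Map<String, String> attrs = new HashMap<>();
        attrs.put("filename", "data.csv");
        List<String> missing =
            missingAttributes(attrs, Arrays.asList("filename", "schema.name"));
        System.out.println(missing); // prints [schema.name]
    }
}
```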

Re: Sample to read file and insert into oracle based on the pattern in the line

2016-04-27 Thread sivam1
Anybody have sample code for the above? I am new to NiFi and would like to explore more. -- View this message in context: http://apache-nifi-developer-list.39713.n7.nabble.com/Sample-to-read-file-and-insert-into-oracle-based-on-the-pattern-in-the-line-tp9620p9645.html Sent from the Apache NiFi

Re: Sample to read file and insert into oracle based on the pattern in the line

2016-04-27 Thread Pierre Villard
Hi, Please detail "remote location". How can the file be accessed? Then you can use processors to extract your patterns and use expression language to perform insertions into tables. Depending on the technology behind your "tables", you have multiple options regarding which processors to use.
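One way to picture the extract-and-insert step Pierre mentions: pull fields out of a line with a regex and pair them with a parameterized INSERT, much as ExtractText attributes feed PutSQL's sql.args.N.value attributes. A hypothetical plain-Java sketch (the table, columns, and line format are invented for illustration):

```java
import java.util.*;
import java.util.regex.*;

public class LineToInsert {
    // Extract three comma-separated fields from a line and pair them with a
    // parameterized INSERT statement. In a real flow, ExtractText would put
    // the captured groups into attributes and PutSQL would bind them.
    static Map.Entry<String, List<String>> toInsert(String line) {
        Matcher m = Pattern.compile("^(\\w+),(\\w+),(\\w+)$").matcher(line);
        if (!m.matches()) {
            throw new IllegalArgumentException("unexpected line: " + line);
        }
        String sql = "INSERT INTO orders (id, sku, qty) VALUES (?, ?, ?)";
        return Map.entry(sql, List.of(m.group(1), m.group(2), m.group(3)));
    }

    public static void main(String[] args) {
        Map.Entry<String, List<String>> e = toInsert("A1,SKU9,3");
        System.out.println(e.getValue()); // prints [A1, SKU9, 3]
    }
}
```

Binding values as parameters rather than concatenating them into the SQL string is what PutSQL does, and it avoids SQL injection and quoting problems.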

[GitHub] nifi pull request: NIFI-1817 Respecting when run.as is commented o...

2016-04-27 Thread apiri
GitHub user apiri opened a pull request: https://github.com/apache/nifi/pull/383 NIFI-1817 Respecting when run.as is commented out by ensuring the chosen line starts with run.as, optionally preceded by whitespace. You can merge this pull
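The matching rule the PR describes can be sketched with a small regex check (a plain-Java illustration of the idea, not the actual nifi.sh code, which is a shell script):

```java
import java.util.regex.Pattern;

public class RunAsCheck {
    // A run.as line counts only if run.as appears at the start of the line,
    // optionally preceded by whitespace. A commented-out line such as
    // "#run.as=nifi" does not match, because '#' precedes run.as.
    private static final Pattern RUN_AS = Pattern.compile("^\\s*run\\.as");

    static boolean isActiveRunAs(String line) {
        return RUN_AS.matcher(line).find();
    }

    public static void main(String[] args) {
        System.out.println(isActiveRunAs("  run.as=nifi")); // prints true
        System.out.println(isActiveRunAs("#run.as=nifi"));  // prints false
    }
}
```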

Re: Sample to read file and insert into oracle based on the pattern in the line

2016-04-27 Thread sivam1
Thanks Bryan. I'd love to see a sample if you could help me with one. Regards Siva.

Re: Sample to read file and insert into oracle based on the pattern in the line

2016-04-27 Thread sivam1
Thanks Pierre, Since I am a newbie to NiFi, I am struggling to get into the groove, especially with inserting into an Oracle DB, i.e. mapping values to attributes. Need some light on this. Any working sample would make my life simple. Regards Siva.

Re: Sample to read file and insert into oracle based on the pattern in the line

2016-04-27 Thread sivam1
Hi Pierre, Thanks for the reply. Remote location means FTP or a remote network file system accessed via file://. As I said, I have done all of this in Mule ESB and Spring Batch; I want to see it in NiFi, as I see a great future for NiFi in handling big data. Regards Siva.

Re: Sample to read file and insert into oracle based on the pattern in the line

2016-04-27 Thread Pierre Villard
You can get your file using the GetFTP processor in case of FTP, or GetFile in case the file is available locally. Then, if the job is done on a line-by-line basis, you could use the SplitContent processor to get one flow file per line. Once you have your line, depending on the format, you have multiple