Good catch. Can you please be sure to cover this in a JIRA?
That said, wouldn't we see that in the stack trace during the
problematic condition?
On Mon, Sep 18, 2017 at 9:16 AM, Bryan Bende wrote:
> The code in SocketChannelSender that Davy pointed out could definitely
> be the
Davy
Interesting. So in looking through the stack trace, I don't see
anything related to sockets NiFi has initiated to another service, and
nothing for PutTCP. I'm not saying that means there is nothing, but
the stack traces only show the custom GetTCP processors.
You can use netstat to show open
I just created a JIRA and will put up a PR shortly:
https://issues.apache.org/jira/browse/NIFI-4391
The processor is catching the exception while attempting to obtain a
connection, and then logs an error and transfers to failure which is
where we see this message:
2017-09-17 14:20:20,860 ERROR
The code in SocketChannelSender that Davy pointed out could definitely
be the problem...
It makes a non-blocking channel and calls connect(), then goes into a
loop waiting for finishConnect() to return true. If that doesn't
happen before the configured timeout, it throws an exception, but
A typical production setup is to use Kafka in the middle.
Andrew
On Mon, Sep 18, 2017, 3:02 AM Margus Roo wrote:
> Hi
>
> I need to take flow with Spark streaming from Nifi port. As we know
> Spark supports spark.streaming.receiver.maxRate and
>
Hello,
You should be able to have a ListenHTTP processor (or
HandleHttpRequest/Response) that connects to a Notify processor which
would release the flow files sitting in front of Wait.
Thanks,
Bryan
On Sun, Sep 17, 2017 at 12:19 PM, joe harvyy wrote:
> Hi,
>
> I have a
Andrew,
Yes, we are doing the same for the Oracle DB, which is quite old and does not provide this information.
Anyway, I was just curious whether somebody has a smarter solution. The NiFi and Kafka blogs have really good samples of extracting data, but none of them touches the topic of
Thanks a lot for the quick response. Looking forward to the PR and the
release :)
Would this, for example, still make the 1.4.0 release?
It would also be very interesting to log client ports in debug mode;
I don't know how easy that is with NIO.
There is Keep Alive Timeout = 2min specified on
Uwe,
Is there anything in the V$ARCHIVED_LOG table [1] in your source
database? If so you may be able to get some of that information from
there. Also is LogMiner [2] enabled on the database? That's another
way to be able to query the logs to get things like deletes.
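To make the suggestion above concrete, here is a rough sketch of the kind of queries involved, wrapped in Java only so the example is self-contained. The view and package names (V$ARCHIVED_LOG, V$LOGMNR_CONTENTS, DBMS_LOGMNR) are standard Oracle, but column availability and required privileges vary by version, so treat this as a starting point rather than a ready-made query; the owner/table names are hypothetical:

```java
public class LogMinerQueries {

    // Which archived logs exist, and what time range does each cover?
    static String archivedLogs() {
        return "SELECT NAME, SEQUENCE#, FIRST_TIME, NEXT_TIME "
             + "FROM V$ARCHIVED_LOG ORDER BY SEQUENCE#";
    }

    // Start a LogMiner session using the online catalog as the dictionary.
    static String startLogMiner() {
        return "BEGIN DBMS_LOGMNR.START_LOGMNR("
             + "OPTIONS => DBMS_LOGMNR.DICT_FROM_ONLINE_CATALOG); END;";
    }

    // Pull the redo for deletes on a given table out of the mined logs.
    static String deletesFor(String owner, String table) {
        return "SELECT SCN, TIMESTAMP, SQL_REDO FROM V$LOGMNR_CONTENTS "
             + "WHERE OPERATION = 'DELETE' AND SEG_OWNER = '" + owner
             + "' AND SEG_NAME = '" + table + "'";
    }

    public static void main(String[] args) {
        System.out.println(archivedLogs());
        System.out.println(startLogMiner());
        System.out.println(deletesFor("APP", "ORDERS"));
    }
}
```

In a real flow you would run these over JDBC against the source database; the SQL_REDO column is what gives you the delete statements that a plain SELECT-based extraction cannot see.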
In general, there has to be
Davy,
I just pushed a second commit to the PR that will log the port from
the local address of the socket being used by the sender, which I
think is what you mean by the client port.
If you turn on debug for PutTCP you will see something like...
o.apache.nifi.processors.standard.PutTCP
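In case it helps to see what "the local address of the socket" means in plain NIO terms, here is a tiny self-contained sketch (the class and method names are mine, not NiFi's). The client channel's local ephemeral port is the "client port" you would correlate with netstat output when hunting a leaked connection:

```java
import java.net.InetSocketAddress;
import java.nio.channels.ServerSocketChannel;
import java.nio.channels.SocketChannel;

public class LocalPortDemo {

    // Connect to a throwaway local server and return the client's
    // local (ephemeral) port, i.e. the value you would log in debug mode.
    static int demoLocalPort() throws Exception {
        try (ServerSocketChannel server = ServerSocketChannel.open()) {
            server.bind(new InetSocketAddress("127.0.0.1", 0));
            int serverPort =
                ((InetSocketAddress) server.getLocalAddress()).getPort();
            try (SocketChannel client = SocketChannel.open(
                    new InetSocketAddress("127.0.0.1", serverPort))) {
                InetSocketAddress local =
                    (InetSocketAddress) client.getLocalAddress();
                return local.getPort();
            }
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println("local port=" + demoLocalPort());
    }
}
```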
Davy
If you could give the PR a try and see if it helps I'd be happy to
help get it reviewed and in for 1.4 if timing works out.
Thanks
On Mon, Sep 18, 2017 at 4:39 PM, Bryan Bende wrote:
> Davy,
>
> I just pushed a second commit to the PR that will log the port from
> the
Hi Koji,
Thanks for the response and helpful links!
NiFi version : 1.1.0.2.1.2.0-10
I am trying to move data from an operational system (Oracle DB) to an analytical
system (Postgres DB). The Postgres table has been modeled/designed by us (and we can
add a primary key). Data from Oracle looks like below (I need to
I could not find 'PutDatabaseRecord' in the NiFi version I am using
(1.1.0.2.1.2.0-10). Please suggest?
On Tue, Sep 19, 2017 at 12:10 AM, Vikram More
wrote:
> Hi Koji,
> Thanks for response and helpful links !
>
> NiFi version : 1.1.0.2.1.2.0-10
>
> I am trying to
Hi
I need to consume a flow from a NiFi port with Spark Streaming. As we know,
Spark supports spark.streaming.receiver.maxRate and
spark.streaming.receiver.backpressure.
It seems that org.apache.nifi.spark.NiFiReceiver does not support this at all.
Hi Everyone,
I am new to NiFi and community :)
I am trying to build a NiFi flow which will pull from an Oracle table and load
into a Postgres table. My select query has two columns, and I need to remove
duplicates based on these two columns. Can I remove duplicates in NiFi
based on two-column data?