Hi,
I assume by endpoint you mean exposing the broker side of the Kafka
API? As far as I know, that is not possible.
However, you could use the site-to-site API or another supported reliable
protocol (HTTP, Beats, RELP, etc.) to feed data into NiFi and from there
into Kafka.
How this fits into your
I have not been able to find anything in the logs that is useful, but I'm
not sure I enabled all of the logging that I need to.
But I did find some more information: if I restart the NiFi service, the
InvokeHTTP call starts working, and we have two other flow patterns that
have the same issue. In
Mike,
Just out of curiosity, what would the original data for your example
look like that produced that JSON?
Is it a CSV with two lines, like:
ABC, XYZ
DEF, LMN
and then ExecuteScript is turning that into the JSON array?
As far as reading the JSON, I created a simple flow of GenerateFlowFile
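If ExecuteScript is doing that conversion, a minimal sketch of the transformation (the function name and the two-column assumption are mine, not from the thread) could look like:

```python
import json

def csv_to_json(text):
    """Turn two-column CSV lines like 'ABC, XYZ' into the key/value JSON array above."""
    records = []
    for line in text.strip().splitlines():
        # Assumes exactly two comma-separated fields per line.
        key, value = [part.strip() for part in line.split(",")]
        records.append({"key": key, "value": value})
    return json.dumps(records)
```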
Bryan,
I have the processor somewhat operational now, but I'm running into a
problem with the record readers. What I've done is basically this:
Ex. JSON:
[
  { "key": "ABC", "value": "XYZ" },
  { "key": "DEF", "value": "LMN" }
]
Avro schema:
{
"type": "record",
"name":
Yeah, I just screwed up and didn't reference one.
On Thu, Jun 8, 2017 at 1:26 PM, Mike Thomsen wrote:
> I'll have to look again, but I scanned through the XML and didn't see
> either my avro schema registry or the jsonpath reader.
>
> Thanks,
>
> Mike
>
> On Thu, Jun 8,
Hi
I have a 3-node NiFi cluster (installed via Hortonworks DataFlow - HDF) in a
Kerberized environment. As part of the installation, Ambari created a nifi
service keytab.
Can I use this nifi.service.keytab for configuring processors like PutHDFS
that talk to Hadoop services?
The
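As a rough sketch, the relevant PutHDFS properties would typically look something like this (the principal and paths below are assumptions; exact property names can vary by NiFi version):

```
Hadoop Configuration Resources : /etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml
Kerberos Principal             : nifi/host.example.com@EXAMPLE.COM
Kerberos Keytab                : /etc/security/keytabs/nifi.service.keytab
```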
I'll have to look again, but I scanned through the XML and didn't see
either my avro schema registry or the jsonpath reader.
Thanks,
Mike
On Thu, Jun 8, 2017 at 1:10 PM, Matt Gilman wrote:
> Mike,
>
> Currently, the services are saved if they are referenced by
Raymond,
If you enable debug-level logging, I believe that InvokeHTTP will log the
request and response, which may be helpful in diagnosing this issue. I think
you could just set the bulletin level to DEBUG to see these messages as
bulletins. Additionally, you can update your conf/logback.xml to
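A minimal sketch of the kind of logback.xml addition meant here (the logger name below assumes InvokeHTTP lives in the standard processors package):

```
<logger name="org.apache.nifi.processors.standard.InvokeHTTP" level="DEBUG"/>
```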
Mike,
Currently, the services are saved if they are referenced by processors in
your data flow. There is an existing JIRA [1] to always include them.
Thanks
Matt
[1] https://issues.apache.org/jira/browse/NIFI-2895
On Thu, Jun 8, 2017 at 12:59 PM, Mike Thomsen
wrote:
Mike,
I believe templates include controller services by default, as long as one
or more of the processors in the template references the controller
service. Did that not happen for you?
Thanks,
James
On Thu, Jun 8, 2017 at 9:59 AM, Mike Thomsen wrote:
> Is it
No bulletins on any of the processors. All of the output flow-files have 0
bytes and the error 401 in the attributes.
All of the properties look correct and I can copy the values from the
non-working to the manually created processor and it works fine.
When you export the SSL context service and
Is it possible to save the controller services w/ a template?
Thanks,
Mike
You won't need/want NiFi for that part; instead you would need to
login to the machine running SQL Server, install an FTP daemon (such
as ftpd), then in the PutFTP processor in NiFi you can point to the
FTP server using the Hostname, Port, Username, Password, etc.
On Thu, Jun 8, 2017 at 12:18 PM,
Jim,
This might be related and coincidentally today we were talking with a
coworker about the "advanced" button of UpdateAttribute and its ability to
set attributes based on conditions.
It's pretty powerful. [1]
It might come in useful for your efforts.
[1]
Raymond,
When it's in a state that is not working are there any bulletins on the
second processor? When it's in that state and you view the configuration
details for that processor, do the properties look correct and the same as
when you manually re-add the processor through the UI? Specifically,
Matt,
Thanks for your wonderful response.
I think creating an FTP server is the best way for me to move the input file
into SQL Server and run a query.
Can you please suggest a way
to create an FTP server on the machine where SQL Server is installed, using NiFi?
Many thanks,
Prabhu
On 08-Jun-2017 6:27 PM, "Matt Burgess"
I do not believe NiFi has any specific features for Zeppelin yet, but it is
possible to write custom Zeppelin code paragraphs that communicate with the
NiFi API to pull data or inspect flow status. For an example, I recommend
Pierre Villard's US Presidential Election: tweet analysis using
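As a rough sketch of what such a Zeppelin paragraph could do (the endpoint path is NiFi's standard REST API; the helper names, base URL, and use of urllib are assumptions):

```python
import json
from urllib.request import urlopen

def flow_status_url(base_url):
    """Build the URL of NiFi's controller flow-status endpoint."""
    return base_url.rstrip("/") + "/nifi-api/flow/status"

def get_flow_status(base_url):
    # Fetch cluster-wide flow status (active threads, queued flow files, etc.)
    # from a running NiFi instance.
    with urlopen(flow_status_url(base_url), timeout=10) as resp:
        return json.load(resp)
```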
We have a Node.js service that automatically creates & manages NiFi groups
using the REST API, which works great in NiFi 1.1.1. We are upgrading our
NiFi instances to 1.2.0, and I have found that some of the processors are
exhibiting odd behavior.
We have a flow that connects to the Outlook 365 OWA
Hi All!
I am new here. I find NiFi to be a great project for data routing, and the
extensibility of NiFi gives it great potential to expand its applications.
Recently I looked at Kylo, a piece of software for managing a data
lake (based on NiFi and its templates). In this composition of NiFi
I do understand now. Thank you very much Mark. -Jim
On Thu, Jun 8, 2017 at 9:34 AM, Mark Payne wrote:
> Jim,
>
> The first expression will return false. None of the expressions below will
> ever throw an Exception.
>
> You could even chain them together like
>
Jim,
The first expression will return false. None of the expressions below will ever
throw an Exception.
You could even chain them together like ${myAttribute:toLower():length():gt(4)}
and if myAttribute does not
exist, it will return false, rather than throwing an Exception.
Thanks
-Mark
Jim,
You can use the expression:
${myAttribute:isNull()}
Or, alternatively, depending on how you want to setup the route:
${myAttribute:notNull()}
If you want to check if the attribute contains 'True' somewhere within its
value,
then you can use:
${myAttribute:contains('True')}
Thanks
Hi Pierre,
After converting those date-time values into integers (using expression
language), I was able to process the files as required
by setting those integer values on the priority attribute and processing
the files based on that priority.
Thanks for your guidance
Regards,
Manoj
Good morning. I receive HTTP POSTs of various types of files. Some have a
particular attribute, myAttribute, and some do not. I want to route the
flowfiles to different workflow paths depending on the presence of this
attribute. Can I use RouteOnAttribute and the expression language to do
that, something
I have a NiFi instance running on one machine and SQL Server on another
machine.
I am trying to perform a bulk insert operation with a BULK INSERT query in
SQL Server, but I am not able to insert data from one machine and move it
into SQL Server on the other machine.
If I run NiFi and SQL Server in
Koji,
One could convert the date to epoch format, which is incremental in nature.
Would that help?
On 8 Jun 2017 19:33, "Koji Kawamura" wrote:
> Hi Manoj,
>
> I think EnforceOrder would not be useful in your case, as it expects
> the order to increase one by one (without
Hi Manoj,
I think EnforceOrder would not be useful in your case, as it expects
the order to increase one by one (without skips).
As Pierre suggested, I'd suggest using PriorityAttributePrioritizer.
Thanks,
Koji
On Thu, Jun 8, 2017 at 3:50 PM, Pierre Villard
wrote:
Hi Manoj,
You may want to have a look at the EnforceOrder processor [1] or simply at
the prioritizers [2] on the connections (it depends on how your workflow
works). The idea would be to extract the date as an attribute of your
flow file, convert it into an integer (using expression language) and use
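For the conversion step, an expression along these lines should work (the attribute name file.date is an assumption; the format string matches the timestamps shown in the original question):

```
${file.date:toDate('yyyy/MM/dd HH:mm:ss'):toNumber()}
```

toDate() parses the string into a date, and toNumber() then yields epoch milliseconds, which sort in chronological order.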
Hi All,
I need to process the files based on the date-time value stored in an
attribute.
*For example:*
If the incoming files contains the following date time attribute values
*2017/06/07 16:57:02*
*2017/06/06 12:49:49*
*2017/06/06 11:09:28*
*2017/06/06 06:37:45*
I need to process the