Hi all,
I am trying to import the structure below.
While importing, I need to merge multiple records from the same file (a
kind of self-join) to produce something like this.
Here, all records with the same ID are merged together into a single
row. To separate the records, I have used
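As a standalone illustration of the merge being described (the original message's separator was cut off, so "|" below is just a stand-in, and the column names are hypothetical), a minimal sketch in Python:

```python
import csv
from collections import defaultdict
from io import StringIO

# Hypothetical input: several records share the same ID.
raw = """id,value
1,a
1,b
2,c
"""

# Group values by ID (the "self-join" on the same file).
merged = defaultdict(list)
for row in csv.DictReader(StringIO(raw)):
    merged[row["id"]].append(row["value"])

# Collapse each group into one row, joining values with a separator.
rows = {k: "|".join(v) for k, v in merged.items()}
print(rows)  # {'1': 'a|b', '2': 'c'}
```

In NiFi itself this kind of grouping is usually done with a record-oriented processor or a script, but the grouping logic is the same.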
Ravi,
I assume the MapR client package is installed and operational, and that
you can log in as the uid running NiFi and issue the following successfully:
$ maprlogin authtest
$ maprlogin print
$ hdfs dfs -ls /
If those fail, fix them before proceeding.
If those work, I would say the issue is
Thank you for taking the time to try it!
Please let me know if there are other features you would find useful.
On Tue, 27 Mar 2018, 04:10 scott, wrote:
> Daniel,
>
> That worked perfectly. Thank you for the help, and for creating this great
> tool.
>
> Scott
>
> On 03/25/2018
Hello. I have installed NiFi via Docker on a Google Cloud Debian 9 server.
The server has a public IP of PUBLIC_IP (placeholder for the real IP).
I ran docker pull apache/nifi and it installed the image successfully.
Then, I ran:
sudo docker run --name nifi -p 8080:8080 -d -e
Thanks Mike, you've articulated my concerns as well.
On Mon, Mar 26, 2018 at 5:57 PM, Mike Thomsen
wrote:
> I think you would hit two big barriers in design:
>
> 1. NiFi just isn't designed to be an app server for additional service
> layer components a la Tomcat.
> 2.
If you know one of the supported scripting languages, you can probably do
some of that with ExecuteScript. For example, if you wanted to drop every
other flowfile in a block of 100, it'd be like this:
def flowfiles = session.get(100) // get up to 100 flowfiles
int index = 1
flowfiles?.each { flowFile ->
    // drop every other flowfile; transfer the rest to success
    if (index++ % 2 == 0) session.remove(flowFile)
    else session.transfer(flowFile, REL_SUCCESS)
}
How is it going to work with, e.g., 20GB of events in the queue? I'd be
careful, as the requirements quickly blow up into a full database with
indexes, search, and a UI on top. If one wanted to filter events,
wouldn't a standard processor do the job better?
Andrew
On Tue, Mar 27, 2018, 12:11 AM Joe Witt
Thanks Pierre,
It is working now.
Mohit
From: Pierre Villard
Sent: 27 March 2018 13:15
To: users@nifi.apache.org
Subject: Re: Unable to create HiveConnectionPool with kerberos.
Hi,
It needs to be the principal of the Hive server, not yours. Can you
When I try using my user principal instead of hive's, it gives the following error:
SelectHiveQL[id=633d54ed-0162-1000--6fa47d56]
org.apache.nifi.processors.hive.SelectHiveQL$$Lambda$523/1312347477@1c34a7fa
failed to process due to java.lang.IllegalArgumentException: Kerberos principal
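For context, this error usually means the JDBC URL does not reference the Hive server's service principal, as Pierre notes above. A sketch of what the HiveConnectionPool's Database Connection URL might look like (host, port, database, and realm here are placeholders, not values from this thread):

```
jdbc:hive2://hiveserver.example.com:10000/default;principal=hive/_HOST@EXAMPLE.COM
```

The `principal=` part names the Hive service principal (`hive/...`), not the user principal you authenticate with via kinit.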