Hi Ryan,
With the following settings:
# LookupAttribute
(+dynamic) lookedUp=${someAttribute}
# SimpleCsvFileLookupService
CSV File=data.csv
Lookup Key Column=id
Lookup Value Column=value
# data.csv
id,value,desc
one,1,the first number
two,2,the 2nd number
If a FlowFile having 'someAttribute'
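Outside NiFi, the lookup those settings describe can be sketched as follows. This is a hypothetical stand-in for SimpleCsvFileLookupService (class and method names here are made up for illustration), showing how the configured key and value columns map a FlowFile attribute like someAttribute=one to lookedUp=1:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class CsvLookupSketch {

    // Parse the CSV once and keep only the configured key and value
    // columns, mirroring Lookup Key Column=id / Lookup Value Column=value.
    static Map<String, String> buildLookup(String csv, String keyCol, String valueCol) {
        String[] lines = csv.split("\n");
        String[] headers = lines[0].split(",");
        int keyIdx = -1, valIdx = -1;
        for (int i = 0; i < headers.length; i++) {
            if (headers[i].equals(keyCol)) keyIdx = i;
            if (headers[i].equals(valueCol)) valIdx = i;
        }
        Map<String, String> table = new LinkedHashMap<>();
        for (int i = 1; i < lines.length; i++) {
            String[] cols = lines[i].split(",");
            table.put(cols[keyIdx], cols[valIdx]);
        }
        return table;
    }

    public static void main(String[] args) {
        String csv = "id,value,desc\none,1,the first number\ntwo,2,the 2nd number";
        Map<String, String> lookup = buildLookup(csv, "id", "value");
        // A FlowFile with someAttribute=one would get lookedUp=1
        System.out.println(lookup.get("one")); // prints 1
    }
}
```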
Hi Vijay,
Apache NiFi 1.x doesn’t have “roles”, so the “administrators” group doesn’t
carry any special significance [1], and connections do not have policies
assigned to them. You’ll need to assign the “View the data” and “Modify the
data” policies to yourself on the specified resources (the
Hi
I have a secure NiFi instance setup.
I put myself in the Administrator group and created policies for the
administrator.
When I right click on a connection and click on the List Queue, I get the
following error:
Insufficient Permissions:
No Applicable policies could be
I know you mentioned staying schema agnostic, but if you went with the
record approach then this sounds like a good fit for the HBase lookup
service.
Steps 3-5 would be using LookupRecord with an HBaseLookupService where
you lookup by row id, and put the results into the current record.
I'm not
James,
The easiest option would be to merge the JSON in a custom processor. Not easy
as in no work at all, but given your limitations with the NiFi version it
could perhaps be done sooner.
Andrew
On Mon, Dec 17, 2018, 9:53 AM James Srinivasan
wrote:
> Hi all,
>
> I'm trying to enrich a data stream using
Hi all,
I'm trying to enrich a data stream using NiFi. So far I have the following:
1) Stream of vehicle data in JSON format containing (id, make, model)
2) This vehicle data goes into HBase, using id as the row key and the
json data as the cell value (cf:json)
3) Stream of position data in JSON
Ok, I did so.
https://issues.apache.org/jira/browse/NIFI-5901
Thanks,
Flo
On Mon, Dec 17, 2018 at 2:29 PM Matt Burgess wrote:
> No, that case was only for reading from RDBMS or Hive, because the type is
> OTHER we assume the object can be represented as a String so we just get
> the Object
No, that case was only for reading from RDBMS or Hive; because the type is
OTHER, we assume the object can be represented as a String, so we just get the
Object and call toString() on it, basically a “best-effort” interpretation
which happens to work for JSON and JSONB fields (at least for the
Does NIFI-5845 allow using an Avro schema with a type of OTHER for
json/jsonb and then using it in PutDatabaseRecord?
"fields": [
{"name": "data", "type": "other"},
On Sat, Dec 15, 2018 at 9:48 PM Matt Burgess wrote:
> Does NIFI-5845 [1] help? If the drivers return OTHER for JSON or JSONB
My bad, if I explicitly set the type to OTHER, it works perfectly.
Object data = "{...}";
...
ps.setObject(2, data.toString(), java.sql.Types.OTHER);
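For context, here is a minimal self-contained sketch of that working call. The table and column names (events, data) are hypothetical; the point is that passing java.sql.Types.OTHER lets the PostgreSQL driver send the string as an untyped parameter the server can cast to jsonb, instead of binding it as varchar:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.Types;

public class JsonbInsertSketch {

    // Hypothetical insert into a jsonb column. Without Types.OTHER the
    // driver binds a varchar and PostgreSQL rejects it with:
    // ERROR: column "data" is of type jsonb but expression is of type
    // character varying
    static void insertJson(Connection conn, int id, String json) throws Exception {
        try (PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO events (id, data) VALUES (?, ?)")) {
            ps.setInt(1, id);
            ps.setObject(2, json, Types.OTHER);
            ps.executeUpdate();
        }
    }

    public static void main(String[] args) {
        // Types.OTHER is the generic JDBC type constant (value 1111) that
        // the PostgreSQL driver also reports for json/jsonb columns.
        System.out.println(Types.OTHER); // prints 1111
    }
}
```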
On Mon, Dec 17, 2018 at 10:30 AM Flo Rance wrote:
> What I mean, if I try to get an object and try to store a "stringify"
> version, I get
What I mean is, if I get an object and try to store a "stringified"
version, I get the following error:
Object data = "{...}";
...
ps.setObject(2, data.toString());
...
ERROR: column "data" is of type jsonb but expression is of type character
varying
On Mon, Dec 17, 2018 at 9:28 AM Flo Rance
This may work with a JSON field, but I'm not sure it will with JSONB because
it's a binary format.
On Sat, Dec 15, 2018 at 9:48 PM Matt Burgess wrote:
> Does NIFI-5845 [1] help? If the drivers return OTHER for JSON or JSONB
> types (like PostgreSQL does) then this improvement should do the