Hi,
Thanks for this.
I'm struggling a little with types etc., specifically with specifying arrays
in the schema. I'm missing something simple, and I'm not sure I've located
the best examples to work with. Currently my error is:
ConvertRecord[id=01861024-3979-1a14-d64d-90e6deefd13b] Failed to process
FlowFile[filena
I think the hard part here is taking a "raw" file like PDF bytes and
creating a record in a certain format. For now I think ScriptedReader
is your best bet: you can read the entire input stream in as a byte
array, then return a Record that contains a "bytes" field containing
that data. You can creat
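For reference, here is a rough, untested sketch of the kind of Groovy script I mean for the ScriptedReader controller service. The field name "content", and the exact createRecordReader signature, are from memory and may differ across NiFi versions, so treat this as a starting point rather than working code:

```groovy
import org.apache.nifi.controller.AbstractControllerService
import org.apache.nifi.logging.ComponentLog
import org.apache.nifi.serialization.RecordReader
import org.apache.nifi.serialization.RecordReaderFactory
import org.apache.nifi.serialization.SimpleRecordSchema
import org.apache.nifi.serialization.record.MapRecord
import org.apache.nifi.serialization.record.Record
import org.apache.nifi.serialization.record.RecordField
import org.apache.nifi.serialization.record.RecordFieldType
import org.apache.nifi.serialization.record.RecordSchema

// Reader that emits a single record whose "content" field holds the raw bytes
class BytesRecordReader implements RecordReader {
    InputStream input
    boolean consumed = false
    static RecordSchema recordSchema = new SimpleRecordSchema(
        [new RecordField('content',
            RecordFieldType.ARRAY.getArrayDataType(RecordFieldType.BYTE.dataType))])

    BytesRecordReader(InputStream input) { this.input = input }

    Record nextRecord(boolean coerceTypes, boolean dropUnknownFields) {
        if (consumed) return null   // only one record per flow file
        consumed = true
        byte[] data = input.bytes   // slurp the whole flow file content
        new MapRecord(recordSchema, [content: data])
    }

    RecordSchema getSchema() { recordSchema }
    void close() { input.close() }
}

class BytesRecordReaderFactory extends AbstractControllerService implements RecordReaderFactory {
    RecordReader createRecordReader(Map<String, String> variables, InputStream inStream,
                                    long inputLength, ComponentLog logger) {
        new BytesRecordReader(inStream)
    }
}

// ScriptedReader picks the factory up from this variable
reader = new BytesRecordReaderFactory()
```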
Any insights on this question post break? I think my problem can be
summarised as looking for the right way to place binary data, stored as an
on-disk file, into a field of an Avro record.
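For what it's worth, on the schema side Avro models raw binary as a `bytes` field rather than an array of byte values, so the record schema can stay simple. Something like the following (field names are just illustrative):

```json
{
  "type": "record",
  "name": "document",
  "fields": [
    { "name": "filename", "type": "string" },
    { "name": "content", "type": "bytes" }
  ]
}
```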
On Wed, Dec 20, 2023 at 5:06 PM Richard Beare
wrote:
I think I've made some progress with this, but I'm now having trouble with
PDF files. The approach that seems to partly solve the problem is to have a
ConvertRecord processor with a scripted reader to place the on-disk file
content (as delivered by the GetFile processor) into a record field. I can
then use an
U
Hi,
I've gotten rusty, not having done much NiFi work for a while.
I want to run some tests of the following scenario. I have a workflow that
takes documents from a DB and feeds them through Tika. I want to test with
a different document set that is currently living on disk. The Tika
(groovy) proc