Sorry for the late reply.

As of today, the issue is still present.

The NiFi Web UI just shows the message "Error while reading data, error
message: JSON table encountered too many errors, giving up. Rows: 1;
errors: 1. Please look into the errors[] collection for more details."

But the log is clearer:


--------------------------------------------------
Standard FlowFile Attributes
Key: 'entryDate'
  Value: 'Thu Sep 26 09:53:49 UTC 2019'
Key: 'lineageStartDate'
  Value: 'Thu Sep 26 09:53:49 UTC 2019'
Key: 'fileSize'
  Value: '999'
FlowFile Attribute Map Content
Key: 'avro.schema'
  Value: '{"type":"record","name":"nifiRecord","namespace":"org.apache.nifi","fields":[{"name":"ExplicitConsent","type":["null",{"type":"record","name":"ExplicitConsentType","fields":[{"name":"id","type":["null","string"]},{"name":"identity","type":["null",{"type":"record","name":"identityType","fields":[{"name":"id","type":["null","string"]},{"name":"type","type":["null","string"]},{"name":"businessUnit","type":["null","string"]}]}]},{"name":"finality","type":["null","string"]},{"name":"expired","type":["null","boolean"]},{"name":"createdBy","type":["null","string"]},{"name":"createdDate","type":["null","string"]},{"name":"sender","type":["null",{"type":"record","name":"senderType","fields":[{"name":"id","type":["null","string"]},{"name":"application","type":["null","string"]},{"name":"type","type":["null","string"]}]}]},{"name":"state","type":["null","string"]}]}]}]}'
Key: 'bq.error.message'
  Value: 'Error while reading data, error message: JSON table encountered too many errors, giving up. Rows: 1; errors: 1. Please look into the errors[] collection for more details.'
Key: 'bq.error.reason'
  Value: 'invalid'
Key: 'bq.job.link'
  Value: 'https://www.googleapis.com/bigquery/v2/projects/psh-datacompliance/jobs/9e790299-dc77-46f4-8978-476f284fe5b5?location=EU'
Key: 'bq.job.stat.creation_time'
  Value: '1569491661818'
Key: 'bq.job.stat.end_time'
  Value: '1569491662935'
Key: 'bq.job.stat.start_time'
  Value: '1569491662366'
Key: 'filename'
  Value: 'e6d604d7-b517-4a87-a398-e4a5df342ce6'
Key: 'kafka.key'
  Value: '--'
Key: 'kafka.partition'
  Value: '0'
Key: 'kafka.topic'
  Value: 'dc.consent-life-cycle.kpi-from-dev-nifi-json'
Key: 'merge.bin.age'
  Value: '1'
Key: 'merge.count'
  Value: '3'
Key: 'mime.type'
  Value: 'application/json'
Key: 'path'
  Value: './'
Key: 'record.count'
  Value: '3'
Key: 'uuid'
  Value: 'e6d604d7-b517-4a87-a398-e4a5df342ce6'
2019-09-26 10:09:39,633 INFO [Timer-Driven Process Thread-4] o.a.n.processors.standard.LogAttribute LogAttribute[id=ce9c171f-0c8f-3cab-e0f2-16156faf15b8] logging for flow file StandardFlowFileRecord[uuid=e6d604d7-b517-4a87-a398-e4a5df342ce6,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1569490848560-6, container=default, section=6], offset=569098, length=999],offset=0,name=e6d604d7-b517-4a87-a398-e4a5df342ce6,size=999]
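
Since the message says to look into the errors[] collection, here is a
minimal sketch (not part of the flow, just something that can be run on
the side with the google-cloud-bigquery Java client) of how the full
error list for the job referenced by bq.job.link could be fetched. The
project, job id and location are simply copied from the attributes
above; it relies on Application Default Credentials for brevity, the
class name is arbitrary:

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryError;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.Job;
import com.google.cloud.bigquery.JobId;

public class BigQueryJobErrors {
    public static void main(String[] args) {
        // Application Default Credentials for brevity; the service account
        // JSON from the controller service could be loaded explicitly instead.
        BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

        // Project, job id and location copied from the bq.job.link attribute.
        JobId jobId = JobId.newBuilder()
                .setProject("psh-datacompliance")
                .setJob("9e790299-dc77-46f4-8978-476f284fe5b5")
                .setLocation("EU")
                .build();

        Job job = bigquery.getJob(jobId);
        if (job != null && job.getStatus() != null
                && job.getStatus().getExecutionErrors() != null) {
            // Print the full errors[] collection the UI message refers to.
            for (BigQueryError error : job.getStatus().getExecutionErrors()) {
                System.out.println(error.getReason() + " / " + error.getLocation()
                        + " : " + error.getMessage());
            }
        }
    }
}

That should at least show which row or field BigQuery is actually
complaining about.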


I don't exactly understand why I would have to set up authentication,
because I've already put the service.json content into the GCP
Credentials Provider controller service used by my PutBigQueryBatch
processor ...
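
For what it's worth, here is the kind of standalone check that could be
run to confirm that the same service account JSON authenticates against
BigQuery outside NiFi (a minimal sketch assuming the
google-cloud-bigquery Java client; the key path is just a placeholder):

import com.google.auth.oauth2.GoogleCredentials;
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.Dataset;

import java.io.FileInputStream;

public class ServiceAccountCheck {
    public static void main(String[] args) throws Exception {
        // Load the same JSON key that is pasted into the GCP Credentials
        // Provider controller service (the path is a placeholder).
        GoogleCredentials credentials;
        try (FileInputStream keyStream = new FileInputStream("/path/to/service.json")) {
            credentials = GoogleCredentials.fromStream(keyStream);
        }

        // Build a BigQuery client with those credentials; the project id is
        // taken from the bq.job.link attribute above.
        BigQuery bigquery = BigQueryOptions.newBuilder()
                .setCredentials(credentials)
                .setProjectId("psh-datacompliance")
                .build()
                .getService();

        // If this prints the datasets, the key itself is accepted by BigQuery.
        for (Dataset dataset : bigquery.listDatasets().iterateAll()) {
            System.out.println(dataset.getDatasetId().getDataset());
        }
    }
}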

Is there anything I'm missing? Or is there a simple way to make sure
everything works as expected?

Thanks


On 24/09/2019 at 16:12, Pierre Villard wrote:
Hey Nicolas,

Did you manage to solve your issue? Happy to help on this one.

Thanks,
Pierre

On Fri, Sep 20, 2019 at 16:42, Nicolas Delsaux <nicolas.dels...@gmx.fr>
wrote:

    Hello

    I'm using PutBigQueryBatch and having weird auth issues.

    I have set the GCP Credentials Controller Service to use Service
    Account JSON which I have copied from the value given in Google
    Cloud Console.

    But when I run my flow, I get the error message "Error while
    reading data, error message: JSON table encountered too many
    errors, giving up. Rows: 1; errors: 1. Please look into the
    errors[] collection for more details."


    What is stranger is that when I log all the attributes, there is a
    bq.job.link whose page indicates "Request is missing required
    authentication credential. Expected OAuth 2 access token, login
    cookie or other valid authentication credential. See
    https://developers.google.com/identity/sign-in/web/devconsole-project."
    ...

    But NiFi can access the BigQuery workspace and dataset (I've
    checked that by deleting the table schema that I had already
    written).

    So, is there something I'm doing wrong?

    Thanks!
