Re: HTTP and OAuth

2016-04-12 Thread Andy LoPresto
I’d encourage people interested in how Pierre solved this to read his explanation blog:

> New post! OAuth 1.0A with @apachenifi using ExecuteScript and #Groovy on a
> @twitterapi… https://t.co/T1dXFchIFE
> https://t.co/BbjBXAjJyt

Andy LoPresto
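For the curious, the signing step can be sketched in a few lines of Groovy. The following is a minimal illustration with placeholder credentials and endpoint, not Pierre's actual script; see his post for the real flow:

    // Minimal OAuth 1.0a signing sketch for an ExecuteScript (Groovy) body.
    // Credentials and endpoint below are placeholders.
    import javax.crypto.Mac
    import javax.crypto.spec.SecretKeySpec

    // RFC 3986 percent-encoding on top of URLEncoder
    def enc = { String s -> URLEncoder.encode(s, 'UTF-8').replace('+', '%20').replace('*', '%2A').replace('%7E', '~') }

    def consumerKey    = 'CONSUMER_KEY'      // placeholder
    def consumerSecret = 'CONSUMER_SECRET'   // placeholder
    def token          = 'ACCESS_TOKEN'      // placeholder
    def tokenSecret    = 'TOKEN_SECRET'      // placeholder
    def url    = 'https://api.twitter.com/1.1/statuses/user_timeline.json'
    def method = 'GET'

    def oauth = [
        oauth_consumer_key    : consumerKey,
        oauth_nonce           : UUID.randomUUID().toString().replace('-', ''),
        oauth_signature_method: 'HMAC-SHA1',
        oauth_timestamp       : System.currentTimeMillis().intdiv(1000).toString(),
        oauth_token           : token,
        oauth_version         : '1.0'
    ]

    // Signature base string: METHOD & encoded URL & encoded, sorted params (RFC 5849)
    def params = oauth.sort().collect { k, v -> "${enc(k)}=${enc(v)}" }.join('&')
    def base   = [method, enc(url), enc(params)].join('&')

    // Sign with HMAC-SHA1, keyed by consumerSecret&tokenSecret
    def mac = Mac.getInstance('HmacSHA1')
    def key = "${enc(consumerSecret)}&${enc(tokenSecret)}".toString()
    mac.init(new SecretKeySpec(key.getBytes('UTF-8'), 'HmacSHA1'))
    oauth.oauth_signature = mac.doFinal(base.getBytes('UTF-8')).encodeBase64().toString()

    // Stash the finished header on the flow file for a downstream InvokeHTTP
    def header = 'OAuth ' + oauth.sort().collect { k, v -> "${enc(k)}=\"${enc(v)}\"" }.join(', ')
    def flowFile = session.get() ?: session.create()
    flowFile = session.putAttribute(flowFile, 'oauth.authorization', header)
    session.transfer(flowFile, REL_SUCCESS)

Downstream, a dynamic property named Authorization on InvokeHTTP, set to ${oauth.authorization}, would send the header with the request.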

Re: Large dataset on hbase

2016-04-12 Thread Bryan Bende
Is the output of your Pig script a single file that contains all of the JSON documents corresponding to your CSV, or does it create a separate JSON document for each row of the CSV? Also, are there any errors in logs/nifi-app.log (or on the processor in the UI) when this happens? -Bryan
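For context, PutHBaseJSON reads the content of each flow file as a single flat JSON document, which is why the two output shapes behave very differently. Illustratively (field names from the thread, values made up):

    One document per flow file, usable by PutHBaseJSON directly:

        {"n1":"1001","n2":"v2","n3":"v3"}

    All documents concatenated into one file, one per line:

        {"n1":"1001","n2":"v2","n3":"v3"}
        {"n1":"1002","n2":"w2","n3":"w3"}

The second shape needs a split step (e.g. SplitText with a Line Split Count of 1) ahead of PutHBaseJSON.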

Re: Large dataset on hbase

2016-04-12 Thread prabhu Mahendran
Hi, I just used a Pig script, run via ExecuteProcess, to convert the CSV into JSON. In my case I use n1 from the JSON document as the row key in the HBase table, so n2-n22 are stored as columns in HBase. Some of the rows (n1's) are stored in the table, but the remaining ones are read well but not
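To make the mapping concrete (values here are illustrative, not from prabhu's data), a CSV row such as

    1001,v2,v3,...,v22

becomes the flat document

    {"n1":"1001","n2":"v2","n3":"v3",...,"n22":"v22"}

so that PutHBaseJSON, with its Row Identifier Field Name property set to n1, writes row key 1001 with n2 through n22 as the columns.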

Re: Large dataset on hbase

2016-04-12 Thread Bryan Bende
Hi Prabhu, How did you end up converting your CSV into JSON? PutHBaseJSON creates a single row from a JSON document. In your example above, using n1 as the rowId, it would create a row with columns n2-n22. Are you seeing columns missing, or are you missing whole rows from your original CSV?
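If it would help to take Pig out of the equation, one alternative sketch (an illustration, not a setup from this thread) is SplitText with a Line Split Count of 1 to get one CSV line per flow file, followed by an ExecuteScript (Groovy) body like the one below, assuming a fixed n1-n22 layout with no embedded commas or quotes:

    import groovy.json.JsonOutput
    import org.apache.nifi.processor.io.StreamCallback
    import java.nio.charset.StandardCharsets

    def flowFile = session.get()
    if (flowFile == null) return   // nothing queued on this trigger

    flowFile = session.write(flowFile, { inputStream, outputStream ->
        // One CSV line in, one flat JSON document out
        def values = inputStream.getText('UTF-8').trim().split(',', -1)
        def doc = [:]
        values.eachWithIndex { v, i -> doc["n${i + 1}".toString()] = v }
        outputStream.write(JsonOutput.toJson(doc).getBytes(StandardCharsets.UTF_8))
    } as StreamCallback)
    session.transfer(flowFile, REL_SUCCESS)

Splitting first also isolates any bad row in its own flow file, so a parse failure shows up on the processor instead of silently affecting the whole batch.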