Re: Error with lines ended with backslash when Bulk Data Loading

2016-12-08 Thread Gabriel Reid
Hi, Indeed, the -e parameter isn't documented for the bulk loader on the documentation page. This is definitely not intentional. It also does indeed appear that there are some functional differences in how the escape character is defined between psql (where there is no default

Different behavior for escape character backslash when bulk loading data

2016-12-08 Thread rubysina
There seems to be different behavior for the escape character between the MapReduce JsonBulkLoadTool and psql.py. If lines end with a backslash (\), psql.py can load them without any error, but the MapReduce JsonBulkLoadTool fails with an error

Re: Error with lines ended with backslash when Bulk Data Loading

2016-12-08 Thread rubysina
OK, thank you. But there's no -e parameter on the page http://phoenix.apache.org/bulk_dataload.html. And why doesn't the -g/--ignore-errors parameter work? If some lines end with a backslash, why fail instead of just ignoring them? There's always something wrong in txt files. Why not ignore it? How?

Re: Can I reuse parameter values in phoenix query?

2016-12-08 Thread Cheyenne Forbes
thank you

Re: Can I reuse parameter values in phoenix query?

2016-12-08 Thread James Taylor
Yes - use :1 instead of ?1

On Thu, Dec 8, 2016 at 12:11 PM Cheyenne Forbes <cheyenne.osanu.for...@gmail.com> wrote:
> So instead of doing:
>
> query("select * from table where c1 = ? or c2 = ?", [my_id, my_id])
>
> i would do:
>
> query("select * from table where c1 = ?1 or c2 = ?1",
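Client-side, the effect of reusing one bound value can be sketched with a small shim that expands :1-style numbered placeholders into plain positional '?' binds. This is an illustration only: the `query()` helper and table names are the thread's own placeholders, and this shim is hypothetical, not part of Phoenix.

```python
import re

def expand_numbered_params(sql, values):
    """Rewrite :1-style numbered placeholders into positional '?' markers,
    duplicating bound values as needed. Hypothetical client-side shim for
    drivers that only accept plain positional parameters."""
    out_values = []

    def repl(match):
        # :1 refers to the first supplied value, :2 to the second, etc.
        out_values.append(values[int(match.group(1)) - 1])
        return '?'

    return re.sub(r':(\d+)', repl, sql), out_values

sql, vals = expand_numbered_params(
    "select * from t where c1 = :1 or c2 = :1", ["my_id"])
# sql  -> "select * from t where c1 = ? or c2 = ?"
# vals -> ["my_id", "my_id"]
```

The single `:1` value is bound once by the caller but expanded to both positional slots, which is exactly the duplication the question was trying to avoid writing by hand.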

Can I reuse parameter values in phoenix query?

2016-12-08 Thread Cheyenne Forbes
So instead of doing:

query("select * from table where c1 = ? or c2 = ?", [my_id, my_id])

I would do:

query("select * from table where c1 = ?1 or c2 = ?1", [my_id])

Re: Error with lines ended with backslash when Bulk Data Loading

2016-12-08 Thread Gabriel Reid
Hi, Backslash is the default escape character used for parsing CSV data when running a bulk import, so it has a special meaning. You can supply a different (custom) escape character with the -e or --escape flag on the command line, so that parsing your CSV files that include backslashes
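The effect Gabriel describes can be sketched with Python's csv module (an illustration of escape-character parsing in general, not of Phoenix's actual parser; the '~' escape character is an arbitrary choice):

```python
import csv
import io

# Sample line in which a backslash precedes a comma.
line = 'a,b\\,c\n'

# With backslash as the escape character (the bulk loader's default),
# the comma loses its delimiter meaning and joins the second field.
rows = list(csv.reader(io.StringIO(line), escapechar='\\'))
# rows  -> [['a', 'b,c']]

# With a different escape character, the backslash is plain data and
# the comma splits fields as usual.
rows2 = list(csv.reader(io.StringIO(line), escapechar='~'))
# rows2 -> [['a', 'b\\', 'c']]
```

Choosing an escape character that never occurs in the data is what lets files full of literal backslashes load cleanly.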

Re: Inconsistent null behavior

2016-12-08 Thread Mark Heppner
Increasing it from the default of 2 MB to 8 MB seemed to fix it for me, thanks.

On Wed, Dec 7, 2016 at 12:38 AM, Ankit Singhal wrote:
> @James, is this similar to https://issues.apache.org/jira/browse/PHOENIX-3112?
> @Mac, can you try if increasing

Error with lines ended with backslash when Bulk Data Loading

2016-12-08 Thread rubysina
Hi, I'm new to Phoenix SQL and here's a little problem. I'm following this page: http://phoenix.apache.org/bulk_dataload.html. I just found that the MapReduce importer could not load a file whose lines end with a backslash, even with the -g parameter, i.e. ignore-errors: "java.io.IOException: EOF
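The symptom can be reproduced in miniature with Python's csv module (an illustration of escape-character parsing in general, not of Phoenix's actual parser): when a line ends with the escape character, the newline itself is escaped, so the record runs on into the next line, and a trailing backslash at the very end of the input leaves the parser still expecting more data.

```python
import csv
import io

# Two physical input lines; the first ends with the escape character,
# so the newline after it is treated as escaped data, not a row break.
data = 'a,b\\\nc,d\n'

rows = list(csv.reader(io.StringIO(data), escapechar='\\'))
# The two physical lines collapse into a single logical record whose
# middle field contains the escaped newline.
```

This is why the failure happens during parsing rather than as a per-record load error: the malformed record spans line boundaries, so there is no cleanly delimited bad row for an ignore-errors option to skip.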