Hi,

I was going to suggest the above (i.e. using bind variables). There's
currently no way to specify a binary constant on the command line. You
could enhance the ENCODE built-in function to take a VARCHAR and return
a VARBINARY, or you could modify the grammar to interpret escaped
characters correctly.

Thanks,
James
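For reference, a minimal sketch of the bind-variable approach over JDBC. The `decode` helper (which turns hbase-shell style `\xNN` escapes into raw bytes), the `jdbc:phoenix:` URL, and the table DDL are illustrative assumptions, not part of Phoenix; note that the standard JDBC method for binary bind parameters is `setBytes(int, byte[])`. Executing the DDL of course requires a live Phoenix cluster, so the sketch only connects when a quorum address is passed in:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class SplitOnBindExample {

    // Decode an hbase-shell style escaped string such as "\x80\x00\x012\x80ZO'"
    // into raw bytes: "\xNN" becomes one byte, everything else is taken
    // literally. Helper written for this example; not a Phoenix API.
    static byte[] decode(String s) {
        java.io.ByteArrayOutputStream out = new java.io.ByteArrayOutputStream();
        for (int i = 0; i < s.length(); i++) {
            char c = s.charAt(i);
            if (c == '\\' && i + 3 < s.length() && s.charAt(i + 1) == 'x') {
                out.write(Integer.parseInt(s.substring(i + 2, i + 4), 16));
                i += 3; // consumed "\xNN"
            } else {
                out.write((byte) c); // literal single-byte character
            }
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws Exception {
        // Split points from the thread ("\t" written as "\x09" here so the
        // decoder only has to understand one escape form).
        byte[] p1 = decode("\\x80\\x00\\x012\\x80ZO'");
        byte[] p2 = decode("\\x80\\x00\\x01\\xe6\\x81\\xce\\xbe]");
        byte[] p3 = decode("\\x80\\x00\\x03\\x12\\x81+<\\x09");

        if (args.length == 0) {
            return; // no cluster given; decoding only
        }

        // Supply the split points as bind parameters, per the suggestion above.
        try (Connection conn =
                 DriverManager.getConnection("jdbc:phoenix:" + args[0]);
             PreparedStatement stmt = conn.prepareStatement(
                 "CREATE TABLE IF NOT EXISTS \"my_table\" ("
                 + " \"id\" INTEGER NOT NULL, \"id2\" INTEGER NOT NULL,"
                 + " \"S\".\"f1\" VARCHAR, \"S\".\"f2\" INTEGER,"
                 + " CONSTRAINT pk PRIMARY KEY (\"id\", \"id2\"))"
                 + " SPLIT ON (?, ?, ?)")) {
            stmt.setBytes(1, p1);
            stmt.setBytes(2, p2);
            stmt.setBytes(3, p3);
            stmt.execute();
        }
    }
}
```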
On Fri, Jun 20, 2014 at 4:11 AM, Abhilash L L <[email protected]> wrote:
> Hello,
>
> Need help as I was looking for a command line counterpart for the
> following:
>
> CREATE TABLE IF NOT EXISTS "my_case_sensitive_table"
>   ( "id" char(10) not null primary key, "value" integer)
>   DATA_BLOCK_ENCODING='NONE', MAX_FILESIZE=2000000 split on (?, ?, ?)
>
> Defines a split point for a table. Use a bind parameter with
> preparedStatement.setBinary(int,byte[]) to supply arbitrary bytes.
>
> Regards,
> Abhilash L L
> Capillary Technologies
> M:919886208262
> [email protected] | www.capillarytech.com
>
> Email from people at capillarytech.com may not represent official policy of
> Capillary Technologies unless explicitly stated. Please see our
> Corporate-Email-Policy for details. Contents of this email are confidential.
> Please contact the Sender if you have received this email in error.
>
> On Tue, Jun 17, 2014 at 3:03 PM, Abhilash L L <[email protected]> wrote:
>> Hello,
>>
>> I'm trying to construct a corresponding CREATE TABLE statement in
>> Phoenix for the following HBase create table statement, and I'm facing an
>> issue while specifying the splits.
>>
>> create 'my_table', {NAME=>'S', COMPRESSION=>'LZO', VERSIONS=>1},
>>   {SPLITS=>["\x80\x00\x012\x80ZO'", "\x80\x00\x01\xe6\x81\xce\xbe]",
>>   "\x80\x00\x03\x12\x81+<\t"]}, {METHOD => 'table_att', MAX_FILESIZE =>
>>   '16106127360', CONFIG => {'SPLIT_POLICY' =>
>>   'org.apache.hadoop.hbase.regionserver.ConstantSizeRegionSplitPolicy'}}
>>
>> This creates the splits as expected.
>> The below Phoenix statement,
>>
>> CREATE TABLE "my_table" (
>>   "id" INTEGER NOT NULL,
>>   "id2" INTEGER NOT NULL,
>>   "S"."f1" VARCHAR,
>>   "S"."f2" INTEGER
>>   CONSTRAINT pk PRIMARY KEY ("id", "id2") )
>>   MAX_FILESIZE=16106127360,
>>   COMPRESSION='LZO',
>>   VERSIONS=1,
>>   SPLIT_POLICY='org.apache.hadoop.hbase.regionserver.ConstantSizeRegionSplitPolicy'
>>   SPLIT ON ('\x80\x00\x01\xe6\x81\xce\xbe]', '\x80\x00\x03\x12\x81+<\\t');
>>
>> takes the splits as literal string constants.
>>
>> I checked the Phoenix SQL grammar, which allows only string, boolean, and
>> numeric values.
>>
>> How do we specify splits for composite row keys, which could be a
>> concatenation of bytes with any necessary lengths/delimiters?
>>
>> Regards,
>> Abhilash L L
>> Capillary Technologies
>> M:919886208262
>> [email protected] | www.capillarytech.com
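As an aside on why those hbase-shell split points all start with `\x80\x00...`: to my understanding, Phoenix's sortable encoding for signed integral types (see `PLong`/`PInteger`, or `PDataType` in older releases) stores the value big-endian with the sign bit flipped, so that unsigned byte-wise comparison matches numeric order; a positive LONG therefore serializes with a leading `0x80` byte. A minimal sketch under that assumption (the class and method names here are mine, not Phoenix's):

```java
import java.nio.ByteBuffer;

public class PhoenixLongEncoding {

    // Big-endian encoding with the sign bit flipped: XOR-ing with
    // Long.MIN_VALUE toggles only the top bit, so positive values get a
    // leading 0x80 byte and negative values sort below them byte-wise.
    // Sketch of my understanding of Phoenix's sortable LONG encoding;
    // verify against org.apache.phoenix.schema.types.PLong before relying
    // on it.
    static byte[] encodeLong(long v) {
        return ByteBuffer.allocate(8).putLong(v ^ Long.MIN_VALUE).array();
    }

    public static void main(String[] args) {
        // A millisecond timestamp from late 2011, similar in magnitude to
        // the split points in the thread: encodes as 80 00 01 ...
        byte[] b = encodeLong(1318000000000L);
        for (byte x : b) {
            System.out.printf("%02x ", x);
        }
        System.out.println();
    }
}
```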
