Re: Hive UDF for creating row key in HBASE

2017-12-18 Thread James Taylor
Hi Chethan,
As Ethan mentioned, take a look first at the Phoenix/Hive integration. If
that doesn't work for you, the best way to get the row key for a Phoenix
table is to execute an UPSERT VALUES statement against the primary key
columns without committing it. We have a utility function that returns the
Cells that would be submitted to the server, which you can use to get the
row key. You can do this through a "connectionless" JDBC connection, so no
RPCs are needed (you still execute the CREATE TABLE statement locally so
that Phoenix knows the table metadata).

Take a look at ConnectionlessTest.testConnectionlessUpsert() for an example.
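The approach above can be sketched roughly as follows. This is a hedged
illustration, not the test's actual code: the table `T` and its columns are
made up, the connectionless JDBC URL (`jdbc:phoenix:none`) and the exact
generic type returned by `PhoenixRuntime.getUncommittedDataIterator` have
varied across Phoenix versions (`KeyValue` vs. `Cell`), and phoenix-core
must be on the classpath.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Iterator;
import java.util.List;

import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.util.Pair;
import org.apache.phoenix.util.PhoenixRuntime;

public class RowKeySketch {
    public static void main(String[] args) throws Exception {
        // "none" as the quorum gives a connectionless connection: no RPCs.
        Connection conn = DriverManager.getConnection("jdbc:phoenix:none");

        // Phoenix still needs the table metadata, so run the DDL locally.
        conn.createStatement().execute(
            "CREATE TABLE T (PK1 VARCHAR NOT NULL, PK2 VARCHAR NOT NULL, V VARCHAR"
            + " CONSTRAINT PK PRIMARY KEY (PK1, PK2))");

        // Upsert but do NOT commit; the mutation stays client-side.
        conn.createStatement().execute(
            "UPSERT INTO T (PK1, PK2) VALUES ('a', 'b')");

        // Walk the uncommitted cells; the first element of each pair is the
        // row key exactly as Phoenix would write it to HBase.
        Iterator<Pair<byte[], List<Cell>>> it =
            PhoenixRuntime.getUncommittedDataIterator(conn);
        while (it.hasNext()) {
            byte[] rowKey = it.next().getFirst();
            // use rowKey ...
        }
    }
}
```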

Thanks,
James

On Sun, Dec 17, 2017 at 1:19 PM, Ethan  wrote:

>
> Hi Chethan,
>
> When you write data from HDFS, are you planning to use Hive to do the ETL?
> Can we do something like reading from HDFS and using Phoenix to write into
> HBase?
>
> There is https://phoenix.apache.org/hive_storage_handler.html, but I think
> that enables Hive to read from a Phoenix table, not the other way around.
>
> Thanks,
>
> On December 16, 2017 at 8:09:10 PM, Chethan Bhawarlal (
> cbhawar...@collectivei.com) wrote:
>
> Hi Dev,
>
> Currently I am planning to write data from HDFS to HBase, and to read the
> data I am using Phoenix.
>
> Phoenix concatenates its primary key columns, separated by a zero byte
> ("\x00"), and stores the result in HBase as the row key.
>
> I want to write a custom UDF in Hive to create the HBase row key value such
> that Phoenix will be able to split it back into multiple columns.
>
> Following is the custom UDF code I am trying to write:
>
> import org.apache.hadoop.hive.ql.exec.Description;
> import org.apache.hadoop.hive.ql.exec.UDF;
> import org.apache.hadoop.hive.ql.udf.UDFType;
>
> @UDFType(stateful = true)
> @Description(name = "hbasekeygenerator",
>         value = "_FUNC_(existing) - Returns a unique rowkey value for hbase")
> public class CIHbaseKeyGenerator extends UDF {
>
>     public String evaluate(String[] args) {
>         // Append the NUL character itself; Byte.toString((byte) 0x00)
>         // would append the string "0", not a zero byte.
>         char zeroByte = '\u0000';
>         StringBuilder sb = new StringBuilder();
>         for (int i = 0; i < args.length - 1; ++i) {
>             sb.append(args[i]);
>             sb.append(zeroByte);
>         }
>         sb.append(args[args.length - 1]);
>         return sb.toString();
>     }
> }
>
>
> Following are my questions:
>
> 1. Is it possible to emulate Phoenix's row-key encoding (so that Phoenix
> can decode it) with a custom Hive UDF?
>
> 2. If it is possible, what is the best approach? It would be great if
> someone could share some pointers on this.
>
>
> Thanks,
>
> Chethan.
>
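On question 1 above: for a key built only from VARCHAR columns, the decoding
direction is just a split on the zero-byte separator. A minimal stdlib-only
sketch (class and method names are mine, not from the thread; note that
Phoenix only uses separator bytes after variable-length columns, so this
does not hold for fixed-width types):

```java
import java.util.Arrays;

public class RowKeySplitter {
    // Split a VARCHAR-only composite row key on the zero-byte separator
    // that Phoenix places between variable-length PK columns.
    static String[] splitRowKey(String rowKey) {
        return rowKey.split("\u0000");
    }

    public static void main(String[] args) {
        String key = "us" + '\u0000' + "2017" + '\u0000' + "chethan";
        System.out.println(Arrays.toString(splitRowKey(key)));
        // prints: [us, 2017, chethan]
    }
}
```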


RE: problem to run phoenix client 4.13.1 for CDH5.11.2 on windows

2017-12-18 Thread Pedro Boado
Hi Noam,

Thanks for your feedback. PHOENIX-4454 and PHOENIX-4453 were opened to track
these issues, and a fix for both has already been applied to the git branch.

I'll publish a new dev release of the parcel in the next couple of days in
the same repo as the previous one.

Cheers.

On 6 Dec 2017 06:54, "Bulvik, Noam"  wrote:

> One more note on this: when using the client from the regular release
> (4.13 for HBase 1.3) it works fine on the same PC.
>
>
>
> *From:* Bulvik, Noam
> *Sent:* Thursday, November 30, 2017 11:22 AM
> *To:* user@phoenix.apache.org
> *Subject:* problem to run phoenix client 4.13.1 for CDH5.11.2 on windows
>
>
>
> Hi
>
> I am using a JDBC UI client on Windows. After upgrading to the latest
> parcel of Phoenix I get the following error (it did not happen on older
> parcels, either the ones I compiled myself or the ones supplied by
> Cloudera [4.7]):
>
>
>
> SEVERE: Failed to locate the winutils binary in the hadoop binary path
>
> java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
>         at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:404)
>         at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:419)
>         at org.apache.hadoop.util.Shell.(Shell.java:412)
>         at org.apache.hadoop.util.StringUtils.(StringUtils.java:79)
>         at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:168)
>         at org.apache.hadoop.security.Groups.(Groups.java:132)
>         at org.apache.hadoop.security.Groups.(Groups.java:100)
>         at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:435)
>         at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:337)
>         at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:304)
>         at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:891)
>         at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:857)
>         at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:724)
>         at org.apache.hadoop.hbase.security.User$SecureHadoopUser.(User.java:293)
>         at org.apache.hadoop.hbase.security.User.getCurrent(User.java:191)
>         at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver$ConnectionInfo.(PhoenixEmbeddedDriver.java:504)
>         at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver$ConnectionInfo.create(PhoenixEmbeddedDriver.java:312)
>         at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:232)
>         at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:150)
>         at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:221)
>         at workbench.db.DbDriver.connect(DbDriver.java:513)
>         at workbench.db.ConnectionMgr.connect(ConnectionMgr.java:255)
>         at workbench.db.ConnectionMgr.getConnection(ConnectionMgr.java:182)
>         at workbench.gui.profiles.ConnectionGuiHelper$1.run(ConnectionGuiHelper.java:142)
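A common client-side workaround for this class of error: Hadoop's Shell
class reads the `hadoop.home.dir` system property (falling back to the
`HADOOP_HOME` environment variable) during static initialization, so
pointing it at a directory that contains `bin\winutils.exe` before any
Hadoop or Phoenix class is loaded avoids the `null\bin\winutils.exe` path.
A minimal sketch (the path `C:\hadoop` is a placeholder, not from the
thread):

```java
public class WinutilsWorkaround {
    public static void main(String[] args) {
        // Must run before the first Hadoop/Phoenix class is touched, since
        // org.apache.hadoop.util.Shell caches the value in a static block.
        // Placeholder path: the directory must contain bin\winutils.exe.
        System.setProperty("hadoop.home.dir", "C:\\hadoop");
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```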
> *Noam *