Re: Replacing Solr with Phoenix for XML data

2016-04-14 Thread Randy Gelhausen
Hi Vikram, This question comes up somewhat often, so I'd be very interested in comments from other users of Solr and Phoenix as well. You'd need to either pre-parse fields from XML and save them as column values to use in queries, or use a UDF to dynamically parse XML stored within a single
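A minimal sketch of the pre-parse approach Randy describes: extract queryable fields from each XML document with the stdlib parser, then bind them as column values in an UPSERT. The record schema, table name, and column names here are hypothetical, purely for illustration.

```python
import xml.etree.ElementTree as ET

def extract_fields(xml_doc: str) -> dict:
    """Pull queryable fields out of an XML document (hypothetical schema)."""
    root = ET.fromstring(xml_doc)
    return {
        "ID": root.findtext("id"),
        "TITLE": root.findtext("title"),
        "AUTHOR": root.findtext("author"),
    }

doc = "<record><id>42</id><title>Phoenix vs Solr</title><author>vikram</author></record>"
row = extract_fields(doc)
# Each extracted value would become a bind parameter in an UPSERT, e.g.:
# UPSERT INTO DOCS (ID, TITLE, AUTHOR, RAW_XML) VALUES (?, ?, ?, ?)
```

The raw XML can still be stored alongside the parsed columns so nothing is lost, while queries filter on the extracted fields.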

Re: Secondary indexes on dynamic columns

2016-04-14 Thread James Taylor
No, it's currently not possible to have a secondary index on dynamic columns. You can, however, create a view with new, ad hoc columns and add a secondary index on the view. On Thu, Apr 14, 2016 at 2:38 PM, wrote: > Hi, > > > > Is there a way to make phoenix
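A sketch of the DDL James suggests, assembled as strings (table DOCS, view DOCS_V, and the column names are all hypothetical): declare the ad hoc columns on the view, then index the view.

```python
# Hypothetical names; the new columns (HOST, STATUS) are declared on the view.
create_view = (
    "CREATE VIEW DOCS_V (HOST VARCHAR, STATUS INTEGER) "
    "AS SELECT * FROM DOCS"
)
create_index = "CREATE INDEX DOCS_V_IDX ON DOCS_V (STATUS)"
# Both statements would be executed over JDBC or the query server.
```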

Secondary indexes on dynamic columns

2016-04-14 Thread vikram.kondadasula
Hi, Is there a way to make Phoenix build indexes on secondary columns? vikram ___ This message is for information purposes only, it is not a recommendation, advice, offer or solicitation to buy or sell a product or service nor an official

Re: Column Cardinality and Stats table as an "interface"

2016-04-14 Thread James Taylor
Thanks for the clarifications, Nick. That's a cool idea for cube building - I'm not aware of any JIRAs for that. FYI, for approximate count, we have PHOENIX-418 which Ravi is working on. I think he was looking at using a HyperLogLog library, but perhaps BlinkDB is an alternative. On Thu, Apr 14,
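To illustrate the kind of approximate-distinct sketch PHOENIX-418 is after, here is a toy K-minimum-values estimator — a simpler cousin of HyperLogLog, stdlib only, not a proposal for the actual implementation.

```python
import hashlib

def kmv_estimate(values, k=64):
    """Approximate distinct count via the K-minimum-values sketch."""
    hashes = set()
    for v in values:
        h = hashlib.md5(str(v).encode()).digest()
        hashes.add(int.from_bytes(h[:8], "big") / 2**64)  # map to [0, 1)
    smallest = sorted(hashes)[:k]
    if len(smallest) < k:
        return len(smallest)            # fewer than k distincts seen: exact
    return int((k - 1) / smallest[-1])  # k-th smallest value ~ k / cardinality

# 10,000 rows but only 1,000 distinct values:
est = kmv_estimate(f"user-{i % 1000}" for i in range(10_000))
```

With k = 64 the relative error is roughly 1/sqrt(k), around 12%, using constant memory regardless of table size.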

Re: Column Cardinality and Stats table as an "interface"

2016-04-14 Thread Nick Dimiduk
> > The stats table would purely be used to drive optimizer decisions in > Phoenix. The data in the table is only collected during major compaction > (or when an update stats is run manually), so it's not really meant for > satisfying queries. > > For Kylin integration, we'd rely on Kylin to

Map the hbase column qualifier which is in byte type to phoenix table view

2016-04-14 Thread Viswanathan J
Hi, How to map the HBase column qualifier which is in byte type (highlighted below) to the view in Phoenix? e.g., \x00\x00\x00\x0Bcolumn=fact:\x05, timestamp=1460666736042, value=\x02\x9E.\x8A Please help. -- Regards, Viswa.J
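The rowkey prefix in the example scan output, \x00\x00\x00\x0B, looks like a 4-byte big-endian integer — an assumption, since the actual schema isn't shown. A quick sketch of decoding it:

```python
import struct

rowkey_prefix = b"\x00\x00\x00\x0b"   # from the example scan output
(n,) = struct.unpack(">i", rowkey_prefix)  # ">i" = big-endian signed 4-byte int
print(n)  # 11
```

If that interpretation holds, the Phoenix view would declare the corresponding column with a matching fixed-width type (e.g. INTEGER or UNSIGNED_INT, depending on how the bytes were written).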

Re: Column Cardinality and Stats table as an "interface"

2016-04-14 Thread James Taylor
The stats table would purely be used to drive optimizer decisions in Phoenix. The data in the table is only collected during major compaction (or when an update stats is run manually), so it's not really meant for satisfying queries. For Kylin integration, we'd rely on Kylin to maintain the cubes

Re: Column Cardinality and Stats table as an "interface"

2016-04-14 Thread James Taylor
FYI, Lars H. is looking at PHOENIX-258 for improving performance of DISTINCT. We don't yet keep any cardinality info in our stats (see PHOENIX-1178). Thanks, James On Thu, Apr 14, 2016 at 11:22 AM, Nick Dimiduk wrote: > Hello, > > I'm curious if there are any tricks for

Column Cardinality and Stats table as an "interface"

2016-04-14 Thread Nick Dimiduk
Hello, I'm curious if there are any tricks for estimating the cardinality of the values in a Phoenix column. Even for a leading rowkey column, a select distinct query on a large table requires a full scan (PHOENIX-258). Maybe one could reach into the stats table and derive some knowledge? How much

Replacing Solr with Phoenix for XML data

2016-04-14 Thread vikram.kondadasula
Hi, Are there any use cases, resources or experience substituting Solr with Phoenix for semi-structured data like XML? Thanks vikram

Re: prepareAndExecute with UPSERT not working

2016-04-14 Thread Steve Terrell
I found it much easier and more reliable to make my own Phoenix HTTP server with my own JSON API. It was too confusing for me to send multiple requests for what would normally be just one SQL statement. And I had problems getting upserts working, to boot (even with the thin server). Now I can make
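The kind of one-request wrapper Steve describes can be sketched with the stdlib HTTP server. The SQL executor is a stub here, since wiring up an actual Phoenix connection (via a JDBC bridge or a Python driver) is outside this snippet; endpoint and payload shape are invented.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def run_sql(sql: str):
    """Stub: a real wrapper would hand the SQL to a Phoenix connection."""
    return [{"echo": sql}]

class SqlHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        rows = run_sql(payload.get("sql", ""))
        body = json.dumps({"rows": rows}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# HTTPServer(("localhost", 8765), SqlHandler).serve_forever()
```

One POST with {"sql": "..."} replaces the multi-request openConnection / createStatement / prepareAndExecute dance.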

prepareAndExecute with UPSERT not working

2016-04-14 Thread Plamen Paskov
Hey folks, I'm trying to UPSERT some data via the JSON API but no luck for now. My requests look like: { "request": "openConnection", "connectionId": "6" } { "request": "createStatement", "connectionId": "6" } { "request": "prepareAndExecute", "connectionId": "6",
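For comparison, here is the request sequence from the thread built as Python dicts and serialized to the wire format. The statementId value and the exact field set are assumptions — in practice the statement id comes back in the createStatement response, and field names should be checked against the Avatica JSON reference for your server version.

```python
import json

conn_id = "6"
stmt_id = 1  # assumed; really taken from the createStatement response

requests = [
    {"request": "openConnection", "connectionId": conn_id},
    {"request": "createStatement", "connectionId": conn_id},
    {"request": "prepareAndExecute", "connectionId": conn_id,
     "statementId": stmt_id,
     "sql": "UPSERT INTO USERS (ID, NAME) VALUES (1, 'plamen')",
     "maxRowCount": -1},
]
wire = [json.dumps(r) for r in requests]
```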

create table like syntax

2016-04-14 Thread luohui20001
Hi guys, As I know, the create table a like b syntax should have been supported since long ago (refer to https://issues.apache.org/jira/browse/PHOENIX-734); however, when I am using Phoenix 4.5 I got the exception below: 0: jdbc:phoenix:cnzk0,cnzk1,cnzk2> create table debug_visit like visit; Error:

Re: apache phoenix json api

2016-04-14 Thread Plamen Paskov
Now another error appears for the prepare and execute batch request: HTTP ERROR 500. Problem accessing /. Reason: com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException:

Re: apache phoenix json api

2016-04-14 Thread Plamen Paskov
Ah, I found the error. It should be "sqlCommands": instead of "sqlCommands", The documentation syntax is wrong for this request type: http://calcite.apache.org/avatica/docs/json_reference.html#prepareandexecutebatchrequest On 14.04.2016 11:09, Plamen Paskov wrote: @Josh: thanks for your answer.
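The corrected body shape Plamen's fix implies — "sqlCommands" followed by a colon and a JSON array — sketched by building the request in Python and letting json.dumps emit valid JSON. The table, statementId, and SQL are illustrative only; check field names against the Avatica reference for your version.

```python
import json

batch_request = {
    "request": "prepareAndExecuteBatch",
    "connectionId": "2",
    "statementId": 1,                 # assumed, from createStatement
    "sqlCommands": [                  # note: a colon, then a JSON array
        "UPSERT INTO USERS (ID, NAME) VALUES (1, 'a')",
        "UPSERT INTO USERS (ID, NAME) VALUES (2, 'b')",
    ],
}
wire = json.dumps(batch_request)
```

Building the payload with a JSON library instead of by hand avoids exactly this class of punctuation error.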

Re: apache phoenix json api

2016-04-14 Thread Plamen Paskov
@Josh: thanks for your answer. Folks, I'm trying to prepare and execute a batch request with no luck. These are the requests I send: { "request": "openConnection", "connectionId": "2" } { "request": "createStatement", "connectionId": "2" } { "request": "prepareAndExecuteBatch",