RE: CsvBulkUpload not working after upgrade to 4.6

2015-12-09 Thread Riesland, Zack
This morning I tried running the same operation from a data node as well as a name node, where phoenix 4.2 is completely gone, and I get the exact same error.

Re: [EXTERNAL] Re: Confusion Installing Phoenix Spark Plugin / Various Errors

2015-12-09 Thread Josh Mahonin
Hi Jonathan, Thanks for the information. If you're able, could you also try the 'SPARK_CLASSPATH' environment variable instead of the spark-defaults.conf setting, and let us know if that works? The exact Spark package you're using would also be helpful (from source, prebuilt for 2.6+,

Re: [EXTERNAL] Re: Confusion Installing Phoenix Spark Plugin / Various Errors

2015-12-09 Thread James Taylor
Would it make sense to tweak the Spark installation instructions slightly with this information, Josh?
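
The two approaches discussed in this thread could be captured in the installation docs roughly as follows. This is a sketch only; the jar path is a placeholder, and exact property names should be checked against the Spark version in use:

```
# Option 1: environment variable (deprecated in Spark 1.x, but still honored),
# e.g. in conf/spark-env.sh
export SPARK_CLASSPATH=/path/to/phoenix-4.6.0-client.jar

# Option 2: conf/spark-defaults.conf
spark.driver.extraClassPath    /path/to/phoenix-4.6.0-client.jar
spark.executor.extraClassPath  /path/to/phoenix-4.6.0-client.jar
```

As the thread shows, the two mechanisms are not always interchangeable in practice, which is why Josh asked to test both.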

Re: CsvBulkUpload not working after upgrade to 4.6

2015-12-09 Thread James Taylor
Zack, Have you asked Hortonworks through your support channel? This sounds like an issue related to the HDP version you have - you need to confirm with them that upgrading to Phoenix 4.6.0 will work (and if there are any extra steps you need to take). Thanks, James

Re: Confusion Installing Phoenix Spark Plugin / Various Errors

2015-12-09 Thread Cox, Jonathan A
Josh, So using the user-provided Hadoop 2.6 build solved the immediate Phoenix/Spark integration problem I was having. However, I now have another problem, which seems to be similar to: https://issues.apache.org/jira/browse/SPARK-8332 java.lang.NoSuchMethodError:

JDBC adapter connection issue - JRor Phoenix project.

2015-12-09 Thread Josh Harrison
Hi all, I’m trying to integrate a JRor app with Phoenix and HBase. When running rake db:create I’m getting the following error: Couldn't create database for {"adapter"=>"", "driver"=>"org.apache.phoenix.jdbc.PhoenixDriver", "url"=>"jdbc:phoenix://localhost:2181/default",
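
One thing worth checking in a setup like this: the conventional Phoenix JDBC URL uses colons rather than the `//host:port/db` form shown in the error. A hypothetical `database.yml` sketch (adapter name and gem wiring will vary with the JDBC adapter in use):

```yaml
# Hypothetical sketch -- note the Phoenix JDBC URL convention:
# "jdbc:phoenix:<zk-quorum>:<port>", with colons, not "jdbc:phoenix://host:port/db".
development:
  adapter: jdbc
  driver: org.apache.phoenix.jdbc.PhoenixDriver
  url: jdbc:phoenix:localhost:2181
```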

Re: CsvBulkUpload not working after upgrade to 4.6

2015-12-09 Thread Samarth Jain
Zack, What version of HBase are you running? And which version of Phoenix (specifically 4.6-0.98 version or 4.6-1.x version)? FWIW, I don't see the MetaRegionTracker.java file in HBase branches 1.x and master. Maybe you don't have the right hbase-client jar in place? - Samarth

RE: CsvBulkUpload not working after upgrade to 4.6

2015-12-09 Thread Riesland, Zack
Thanks Samarth, I’m running hbase 0.98.4.2.2.8.0-3150 and phoenix 4.6.0-HBase-0.98 The hbase stuff is there via the HDP 2.2.8 install. It worked before upgrading to 4.6.

Re: Help tuning for bursts of high traffic?

2015-12-09 Thread Samarth Jain
Zack, These stats are collected continuously and at the global client level. So collecting them only when the query takes more than 1 second won't work. A better alternative for you would be to report stats at a request level. You could then conditionally report the metrics for queries that
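
The gating pattern Samarth describes (report request-level stats only for slow queries) can be sketched language-agnostically. Phoenix itself exposes request-level metrics through its Java client; the helper below is only an illustration of the conditional-reporting logic, with all names hypothetical:

```python
import time

def run_with_metrics(run_query, collect_metrics, report, threshold_s=1.0):
    """Run a query; report its request-level metrics only if it was slow."""
    start = time.monotonic()
    result = run_query()
    elapsed = time.monotonic() - start
    if elapsed > threshold_s:
        # Only queries over the threshold pay the reporting cost,
        # so the common (fast) path stays cheap.
        report({"elapsed_s": elapsed, "metrics": collect_metrics()})
    return result
```

The key point from the thread is that global client metrics accumulate continuously, so the per-request scope is what makes this conditional reporting possible.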

Re: [EXTERNAL] Re: Confusion Installing Phoenix Spark Plugin / Various Errors

2015-12-09 Thread Josh Mahonin
Thanks Jonathan, I'll follow up on the issue there. In the meantime, you may have some luck just submitting a fat (assembly) JAR to a Spark cluster. If you really want to dive into the nitty-gritty, I'm decomposing the client JAR down to the required components that allow for the Spark

Re: [EXTERNAL] Re: Confusion Installing Phoenix Spark Plugin / Various Errors

2015-12-09 Thread Josh Mahonin
Hi Jonathan, Thanks, I'm digging into this as we speak. That SPARK-8332 issue looks like the same issue, and to quote one of the comments in that issue 'Classpath hell is hell'. What is interesting is that the unit tests in Phoenix 4.6.0 successfully run against Spark 1.5.2 [1], so I wonder if

RE: [EXTERNAL] Re: Confusion Installing Phoenix Spark Plugin / Various Errors

2015-12-09 Thread Cox, Jonathan A
Thanks, Josh. I submitted the issue, which can be found at: https://issues.apache.org/jira/browse/PHOENIX-2503 Multiple Java NoClass/Method Errors with Spark and Phoenix

phoenix-spark and pyspark

2015-12-09 Thread Nick Dimiduk
Heya, Has anyone had any experience using the phoenix-spark integration from pyspark instead of Scala? Folks prefer python around here... I did find this example [0] of using HBaseOutputFormat from pyspark, haven't tried extending it for phoenix. Maybe someone with more experience in pyspark knows
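
Because the phoenix-spark plugin registers itself through Spark's generic data-source interface, pyspark can reach it without any Scala. A minimal sketch for Spark 1.x, assuming the Phoenix client jar is on the driver/executor classpath; the table name and zkUrl are placeholders:

```python
def load_phoenix_table(sql_context, table, zk_url):
    """Load a Phoenix table as a Spark DataFrame via the generic
    data-source API (no Scala required)."""
    return (sql_context.read
            .format("org.apache.phoenix.spark")
            .option("table", table)
            .option("zkUrl", zk_url)
            .load())

# Usage (needs a live SQLContext and a reachable cluster):
# df = load_phoenix_table(sqlContext, "TABLE1", "localhost:2181")
```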

Re: Phoenix JDBC connection to secure HBase fails

2015-12-09 Thread Biju N
Thanks Akhilesh/Mujtaba for your suggestions. Adding core-site.xml from the target cluster to the class path resolved the issue. We initially only had hbase and hdfs site xmls in the class path. Is there a way to set the hbase/core site properties in the code instead of copying the config xmls to

Re: [EXTERNAL] Re: Confusion Installing Phoenix Spark Plugin / Various Errors

2015-12-09 Thread Cox, Jonathan A
Josh, I added all of those JARs separately to Spark's class paths, and it seems to be working fine now. Thanks a lot for your help!

Re: Phoenix JDBC connection to secure HBase fails

2015-12-09 Thread anil gupta
Hi Akhilesh, You can add the hbase/hadoop config directories to the application classpath. You don't need to copy conf files into your app lib folder. Thanks, Anil Gupta
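
Concretely, Anil's suggestion of putting the config *directories* (rather than copies of the XML files) on the classpath might look like the following. The paths are typical defaults and the main class is hypothetical; adjust both to your install:

```
java -cp "/etc/hbase/conf:/etc/hadoop/conf:/path/to/phoenix-4.6.0-client.jar:myapp.jar" \
     com.example.MyPhoenixApp
```

Java resolves `hbase-site.xml` and `core-site.xml` as classpath resources, so listing the directories that contain them is enough.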

RE: [EXTERNAL] Re: Confusion Installing Phoenix Spark Plugin / Various Errors

2015-12-09 Thread Cox, Jonathan A
Josh, Previously, I was using the SPARK_CLASSPATH, but then read that it was deprecated and switched to the spark-defaults.conf file. The result was the same. Also, I was using ‘spark-1.5.2-bin-hadoop2.6.tgz’, which includes some Hadoop 2.6 JARs. This caused the trouble. However, by