Github user jmahonin commented on the issue:
https://github.com/apache/phoenix/pull/221
Thanks @nico-pappagianis, I think I've got it squared away. Your changes are
contained in the following commits:
https://github.com/apache/phoenix/c
Github user jmahonin commented on a diff in the pull request:
https://github.com/apache/phoenix/pull/221#discussion_r87471147
--- Diff:
phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala
---
@@ -16,19 +16,20 @@ package org.apache.phoenix.spark
Github user jmahonin commented on a diff in the pull request:
https://github.com/apache/phoenix/pull/221#discussion_r87470893
--- Diff:
phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala
---
@@ -16,19 +16,20 @@ package org.apache.phoenix.spark
Github user jmahonin commented on a diff in the pull request:
https://github.com/apache/phoenix/pull/221#discussion_r87460429
--- Diff:
phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala
---
@@ -16,19 +16,20 @@ package org.apache.phoenix.spark
Github user jmahonin commented on a diff in the pull request:
https://github.com/apache/phoenix/pull/221#discussion_r87457154
--- Diff:
phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala
---
@@ -16,19 +16,20 @@ package org.apache.phoenix.spark
Github user jmahonin commented on a diff in the pull request:
https://github.com/apache/phoenix/pull/221#discussion_r87435881
--- Diff:
phoenix-spark/src/main/scala/org/apache/phoenix/spark/ProductRDDFunctions.scala
---
@@ -16,19 +16,20 @@ package org.apache.phoenix.spark
Github user jmahonin closed the pull request at:
https://github.com/apache/phoenix/pull/64
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is
Github user jmahonin closed the pull request at:
https://github.com/apache/phoenix/pull/59
---
Github user jmahonin closed the pull request at:
https://github.com/apache/phoenix/pull/63
---
Github user jmahonin closed the pull request at:
https://github.com/apache/phoenix/pull/65
---
Github user jmahonin commented on the pull request:
https://github.com/apache/phoenix/pull/142#issuecomment-162538013
Unless it's already on your radar @ndimiduk, I can take a stab some time
this week at updating the Scala version to extend the new Java version.
---
Github user jmahonin commented on the pull request:
https://github.com/apache/phoenix/pull/142#issuecomment-162212491
This is a good idea. There are actually a few implementations of DBWritable
in the codebase:
- PhoenixRecordWritable
- PhoenixPigDBWritable
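For reference, a DBWritable in the Hadoop API boils down to two methods, one binding fields to a statement and one reading them back. A minimal sketch (the Record class and its ID/NAME columns are hypothetical, not one of the implementations named above):

```scala
import java.sql.{PreparedStatement, ResultSet}
import org.apache.hadoop.mapreduce.lib.db.DBWritable

// Minimal DBWritable mapping one row with an ID and a NAME column.
class Record(var id: Long = 0L, var name: String = "") extends DBWritable {
  // Bind this record's fields to an INSERT/UPSERT statement.
  override def write(ps: PreparedStatement): Unit = {
    ps.setLong(1, id)
    ps.setString(2, name)
  }
  // Populate this record's fields from a query result row.
  override def readFields(rs: ResultSet): Unit = {
    id = rs.getLong("ID")
    name = rs.getString("NAME")
  }
}
```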
Github user jmahonin commented on the pull request:
https://github.com/apache/phoenix/pull/137#issuecomment-161512711
Thanks @gliptak for looking into this.
I'll spend a bit of time seeing if I can work something out here. I think
the main goal is just to try to be
Github user jmahonin commented on the pull request:
https://github.com/apache/phoenix/pull/140#issuecomment-161313636
Got this merged in here:
https://github.com/apache/phoenix/commit/b8faae52c6bee91393678e74de09ab8a215da856
Feel free to close this PR. Thanks again
Github user jmahonin commented on the pull request:
https://github.com/apache/phoenix/pull/137#issuecomment-160831941
Definitely closer, it does run now, but it takes about 4x longer to run
than before. The 'doSetup' and 'doTeardown' used to be run only at the ve
Github user jmahonin commented on the pull request:
https://github.com/apache/phoenix/pull/137#issuecomment-160761094
Hi @gliptak
I tried to run the tests with the patch applied, and got a number of
failures. I think that 'beforeAll' and 'afterAll' are
Github user jmahonin commented on the pull request:
https://github.com/apache/phoenix/pull/124#issuecomment-151491230
How's this look, @JamesRTaylor / @ravimagham ?
---
Github user jmahonin commented on the pull request:
https://github.com/apache/phoenix/pull/124#issuecomment-15027
This looks great @navis
The Spark portion looks fine. I'll leave the updates to ColumnInfo for
@ravimagham @JamesRTaylor et al. to review
---
Github user jmahonin commented on the pull request:
https://github.com/apache/phoenix/pull/114#issuecomment-134629729
@randerzander Not sure if you saw, but I've got a new version of your patch
up at https://issues.apache.org/jira/browse/PHOENIX-2196
If you could take a
Github user jmahonin commented on the pull request:
https://github.com/apache/phoenix/pull/114#issuecomment-133939400
Sure thing, on a quick glance on mobile this looks good, but I'll try to
spend some time with it tomorrow.
---
Github user jmahonin commented on the pull request:
https://github.com/apache/phoenix/pull/100#issuecomment-121979444
Yup, we've got it in for the next release. Thanks!
---
Github user jmahonin commented on the pull request:
https://github.com/apache/phoenix/pull/100#issuecomment-121215183
@tianyi Fantastic. Your patch has already been committed here, so don't
worry about adjusting this PR:
https://github.com/apache/phoenix/c
Github user jmahonin commented on the pull request:
https://github.com/apache/phoenix/pull/100#issuecomment-12071
@tianyi Could you try out the revised patch on the ticket [1], and let me
know if it works for you?
[1] https://issues.apache.org/jira/browse/PHOENIX-2112
Github user jmahonin commented on the pull request:
https://github.com/apache/phoenix/pull/100#issuecomment-120925695
Thanks for the patch. I'll look into getting this in ASAP.
---
Github user jmahonin commented on the pull request:
https://github.com/apache/phoenix/pull/63#issuecomment-92981336
I misread your review at first and had thought you asked to update
all-common-dependencies, rather than all-common JARs.
a), b) and c) are all addressed in the
Github user jmahonin commented on the pull request:
https://github.com/apache/phoenix/pull/63#issuecomment-92903008
Re: step c, I would lean towards not including either the Spark or Scala
library JARs. They are provided by the Spark runtime itself, so I'm not sure it
makes sen
Github user jmahonin commented on the pull request:
https://github.com/apache/phoenix/pull/63#issuecomment-92343659
Thanks for the review @mravi
That HBaseConfiguration.create() step is a great idea, I'll make that
change ASAP.
Re: naming scheme, I'd at
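For context, the HBaseConfiguration.create() suggestion refers to seeding the job configuration from the standard HBase config files rather than hand-building one. A minimal sketch (the property fallback value is hypothetical):

```scala
import org.apache.hadoop.hbase.HBaseConfiguration

// Loads hbase-default.xml and hbase-site.xml from the classpath, so
// cluster settings don't need to be repeated in application code.
val conf = HBaseConfiguration.create()
val quorum = conf.get("hbase.zookeeper.quorum", "localhost")
```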
Github user jmahonin commented on the pull request:
https://github.com/apache/phoenix/pull/63#issuecomment-91239667
Merged in 'master' to update with new integration tests
---
GitHub user jmahonin opened a pull request:
https://github.com/apache/phoenix/pull/65
PHOENIX-1071 Get the phoenix-spark integration tests running.
Uses the BaseHBaseManagedTimeIT framework now for creating the
test cluster and setup/teardown.
Tested with Java 7u75 i386
GitHub user jmahonin opened a pull request:
https://github.com/apache/phoenix/pull/64
Update phoenix-spark README.md for website documentation
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/FileTrek/phoenix PHOENIX-1816
GitHub user jmahonin opened a pull request:
https://github.com/apache/phoenix/pull/63
PHOENIX-1815 Use Spark Data Source API in phoenix-spark module
This allows using the SQLContext.load() functionality to create
a Phoenix data frame, which also supports push-down on column
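The Data Source API usage this PR describes can be sketched as follows. This is a rough sketch against the Spark 1.3-era API; the provider name and the "table"/"zkUrl" option keys follow the phoenix-spark README of that period and may differ by version:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

val sc = new SparkContext(new SparkConf().setAppName("phoenix-load"))
val sqlContext = new SQLContext(sc)

// Create a DataFrame backed by a Phoenix table; column pruning and
// filter push-down are handled by the relation provider.
val df = sqlContext.load(
  "org.apache.phoenix.spark",
  Map("table" -> "TABLE1", "zkUrl" -> "localhost:2181")
)

df.select("ID", "COL1").show()
```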
Github user jmahonin commented on the pull request:
https://github.com/apache/phoenix/pull/59#issuecomment-89020951
@mravi Sounds good. FYI, I was able to run the PhoenixRDDTests in IntelliJ
on Linux with no modifications using the 'ScalaTest' configuration.
In
Github user jmahonin commented on the pull request:
https://github.com/apache/phoenix/pull/59#issuecomment-88569122
I was able to spend a bit more time on the RelationProvider work. The DDL
for custom providers doesn't work through the 'sql()' method on
SparkSQLCont
Github user jmahonin commented on the pull request:
https://github.com/apache/phoenix/pull/59#issuecomment-88285575
Can confirm the memory settings needed adjustment on 7u76 on Linux.
Special thanks to @robdaemon who had an excellent library to work with, and
a pre-emptive
Github user jmahonin commented on the pull request:
https://github.com/apache/phoenix/pull/59#issuecomment-88226194
JDK 1.7 and the ProductRDDFunctions package location have been fixed up.
I tried to make some headway on getting the unit tests to run in the IDE.
If you
Github user jmahonin commented on the pull request:
https://github.com/apache/phoenix/pull/59#issuecomment-88179901
@mravi Great. I can get 1 and 2 taken care of today.
Are new commits on this PR fine, or would you prefer a new PR with a
rebased commit?
---
Github user jmahonin commented on the pull request:
https://github.com/apache/phoenix/pull/59#issuecomment-88176830
Thanks for the feedback @mravi, point-by-point comments below:
1: Right, I'll try to get that sorted out. The original phoenix-spark library
would not work with 1.
GitHub user jmahonin opened a pull request:
https://github.com/apache/phoenix/pull/59
PHOENIX-1071 Add phoenix-spark for Spark integration
This adds a new module which can be used to load and save Phoenix tables
through the Spark 1.3.0+ DataFrame and RDD APIs.
It's
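The RDD-level load/save described in this PR can be sketched roughly as below. Method names and the zkUrl parameter follow the phoenix-spark README for this era; table and column names are hypothetical:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.phoenix.spark._ // adds phoenixTableAsRDD / saveToPhoenix

val sc = new SparkContext(new SparkConf().setAppName("phoenix-rdd"))

// Load a Phoenix table as an RDD of column-name -> value maps.
val rdd = sc.phoenixTableAsRDD(
  "INPUT_TABLE", Seq("ID", "COL1"), zkUrl = Some("localhost:2181"))

// Save an RDD of tuples back to a Phoenix table.
sc.parallelize(Seq((1L, "foo"), (2L, "bar")))
  .saveToPhoenix("OUTPUT_TABLE", Seq("ID", "COL1"),
    zkUrl = Some("localhost:2181"))
```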