Github user mravi commented on the pull request:

    https://github.com/apache/phoenix/pull/59#issuecomment-88157968
  
    Nice work @jmahonin. A couple of minor changes:
    
    1. The JDK version in the pom.xml is 1.8. You would need to downgrade it to 1.7.
    2. For the ProductRDDFunctions.scala file, I noticed a mismatch in the package declaration.
    3. It would be ideal to have a Scala file, say PhoenixSparkContext, that merges the functionality you have written in ProductRDDFunctions and SparkContextFunctions (see the sketch after this list).
    4. Renaming SparkSqlContextFunctions to just PhoenixSparkSqlContext would make it easier for end users.
    5. The build goes through fine, but when I try to run PhoenixRDDTest from Scala IDE I keep getting errors. It could be more of an IDE issue, which I will fix, and then I will get back with the test results.
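    
    For point 3, a minimal sketch of the kind of consolidation I have in mind, assuming method names and signatures roughly like the ones in this PR (phoenixTableAsRDD / saveToPhoenix below are illustrative, not necessarily the final API):
    
    ```scala
    package org.apache.phoenix.spark  // assuming this is the module's package (cf. point 2)
    
    import org.apache.spark.SparkContext
    import org.apache.spark.rdd.RDD
    
    // Hypothetical single entry point that bundles the implicits currently
    // split across SparkContextFunctions and ProductRDDFunctions.
    object PhoenixSparkContext {
    
      implicit class PhoenixSparkContextOps(val sc: SparkContext) extends AnyVal {
        // Illustrative signature: load a Phoenix table as an RDD of column -> value maps.
        def phoenixTableAsRDD(table: String,
                              columns: Seq[String],
                              zkUrl: Option[String] = None): RDD[Map[String, AnyRef]] = ???
      }
    
      implicit class ProductRDDOps[A <: Product](val rdd: RDD[A]) extends AnyVal {
        // Illustrative signature: save an RDD of tuples or case classes to a Phoenix table.
        def saveToPhoenix(table: String,
                          columns: Seq[String],
                          zkUrl: Option[String] = None): Unit = ???
      }
    }
    ```
    
    Callers would then only need a single import of PhoenixSparkContext._ to pick up both the loading and saving syntax.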
    
    Good to haves:
    1. A Java-friendly version of the PhoenixSparkContext and PhoenixSparkSqlContext classes, for easier adoption by Java folks (like me :) ).
    2. Extend org.apache.spark.sql.sources.RelationProvider and provide a PhoenixDatasource (sketched below).
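    
    For the second good-to-have, a rough sketch against the Spark SQL data sources API; the option names and the relation implementation are placeholders rather than anything that exists in the PR yet:
    
    ```scala
    package org.apache.phoenix.spark
    
    import org.apache.spark.sql.SQLContext
    import org.apache.spark.sql.sources.{BaseRelation, RelationProvider}
    
    // Hypothetical data source entry point, so users could do something like
    //   sqlContext.load("org.apache.phoenix.spark", Map("table" -> "MYTABLE", "zkUrl" -> "host:2181"))
    class PhoenixDatasource extends RelationProvider {
      override def createRelation(sqlContext: SQLContext,
                                  parameters: Map[String, String]): BaseRelation = {
        val table = parameters.getOrElse("table",
          sys.error("'table' must be specified for the Phoenix data source"))
        val zkUrl = parameters.get("zkUrl")
        // table and zkUrl would be passed down to a BaseRelation + TableScan
        // implementation backed by PhoenixRDD.
        ???
      }
    }
    ```
    
    That would let Spark SQL users load Phoenix tables without touching the Scala implicits at all.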

