[ https://issues.apache.org/jira/browse/PHOENIX-1815?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14494241#comment-14494241 ]

ASF GitHub Bot commented on PHOENIX-1815:
-----------------------------------------

Github user jmahonin commented on the pull request:

    https://github.com/apache/phoenix/pull/63#issuecomment-92903008
  
    Re: step c, I would lean towards not including either the Spark or Scala 
library JARs. They are provided by the Spark runtime itself, so I'm not sure it 
makes sense to bundle them within the phoenix assembly JAR. Does that make 
sense to you?
    
    ref:
    https://spark.apache.org/docs/latest/submitting-applications.html
    https://github.com/sbt/sbt-assembly#-provided-configuration
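    As a sketch of what that would look like in the phoenix-spark module's 
pom.xml (coordinates and versions below are illustrative, not taken from the 
actual pom), the Spark and Scala dependencies would get "provided" scope so 
they are available at compile time but left out of the assembly JAR:

    ```xml
    <!-- Hypothetical phoenix-spark pom.xml fragment. "provided" scope keeps
         these JARs off the assembly's runtime classpath; the Spark runtime
         supplies them when the job is submitted. -->
    <dependencies>
      <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.3.0</version>
        <scope>provided</scope>
      </dependency>
      <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>2.10.4</version>
        <scope>provided</scope>
      </dependency>
    </dependencies>
    ```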
    
    After those, I think the only other runtime dependency not already in the 
all-common-jars file is snappy-java, which I'm not sure is still explicitly 
needed now that we're part of a multi-module build. I will double-check.


> Use Spark Data Source API in phoenix-spark module
> -------------------------------------------------
>
>                 Key: PHOENIX-1815
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-1815
>             Project: Phoenix
>          Issue Type: New Feature
>            Reporter: Josh Mahonin
>
> Spark 1.3.0 introduces a new 'Data Source' API to standardize load and save 
> methods for different types of data sources.
> The phoenix-spark module should implement the same API for use as a pluggable 
> data store in Spark.
> ref:
> https://spark.apache.org/docs/latest/sql-programming-guide.html#data-sources
> https://databricks.com/blog/2015/01/09/spark-sql-data-sources-api-unified-data-access-for-the-spark-platform.html



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
