[ 
https://issues.apache.org/jira/browse/HBASE-28213?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Istvan Toth updated HBASE-28213:
--------------------------------
    Description: 
Since version 3.2, Spark uses hadoop-client-api and hadoop-client-runtime.
While we don't actually specify which HBase libraries are needed on the Spark 
client side for the connector, the Cloudera docs at least specify the classpath 
provided by "hbase mapredcp",
which includes the full unshaded Hadoop JAR set.

Investigate whether *hbase-shaded-client-byo-hadoop* together with 
*hadoop-client-api* and *hadoop-client-runtime* is enough for the connector, 
and if so, document how to set the Spark classpath.
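If the shaded artifacts turn out to be sufficient, the documented setup might look roughly like the sketch below. This is an assumption, not a verified recipe: the installation path, jar location, and version number are hypothetical, and Spark is assumed to supply the Hadoop classes itself via its bundled hadoop-client-api / hadoop-client-runtime jars.

```shell
# Hypothetical sketch: put only the shaded byo-hadoop HBase client on the
# Spark classpath, instead of the full "hbase mapredcp" output, and rely on
# Spark's own hadoop-client-api / hadoop-client-runtime jars for Hadoop.
# HBASE_HOME, the jar path, and the version below are assumptions.
HBASE_HOME=/opt/hbase
SHADED_JAR="${HBASE_HOME}/lib/shaded-clients/hbase-shaded-client-byo-hadoop-2.5.5.jar"

spark-shell \
  --jars "${SHADED_JAR}" \
  --conf spark.driver.extraClassPath="${SHADED_JAR}" \
  --conf spark.executor.extraClassPath="${SHADED_JAR}"
```

`--jars` ships the jar to the executors, while the two `extraClassPath` settings prepend it to the driver and executor classpaths; `spark.driver.extraClassPath` and `spark.executor.extraClassPath` are standard Spark configuration keys.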

  was:
Since version 3.2, Spark uses hadoop-client-api and hadoop-client-runtime.
While we don't actually specify which HBase libraries are needed on the Spark 
client side for the connector, the Cloudera docs at least specify the classpath 
provided by "hbase mapredcp",
which includes the full unshaded Hadoop JAR set.

Investigate whether *hbase-shaded-client-byo-hadoop* together with 
*hadoop-client-api* and *hadoop-client-runtime* is enough for the connector, 
and if so, document how to set the Spark classpath.


> Evaluate using hbase-shaded-client-byo-hadoop for Spark connector
> -----------------------------------------------------------------
>
>                 Key: HBASE-28213
>                 URL: https://issues.apache.org/jira/browse/HBASE-28213
>             Project: HBase
>          Issue Type: Improvement
>          Components: spark
>            Reporter: Istvan Toth
>            Priority: Major
>
> Since version 3.2, Spark uses hadoop-client-api and hadoop-client-runtime.
> While we don't actually specify which HBase libraries are needed on the Spark 
> client side for the connector, the Cloudera docs at least specify the classpath 
> provided by "hbase mapredcp",
> which includes the full unshaded Hadoop JAR set.
> Investigate whether *hbase-shaded-client-byo-hadoop* together with 
> *hadoop-client-api* and *hadoop-client-runtime* is enough for the connector, 
> and if so, document how to set the Spark classpath.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
