Hi Jordan!

What build system are you using? From that snippet it looks like sbt?

The most expedient change on our end would be to publish a Spark 3
compatible version with a Maven classifier to distinguish it from the
Spark 2 version. Before proceeding with that, do you know offhand
whether your build tooling can consume Maven dependencies with a
classifier?
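For reference, sbt can attach a classifier to a dependency with the
`classifier` method. A sketch of what the dependency block might look
like, assuming we publish under a classifier — the `spark3` classifier
name and the version number here are placeholders, not actual published
coordinates:

```scala
// Hypothetical sbt dependency on a Spark 3 build of hbase-spark
// published under a Maven classifier. The "spark3" classifier and the
// 1.0.1 version are assumptions for illustration only.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "3.1.2",
  "org.apache.spark" %% "spark-sql"  % "3.1.2",
  "org.apache.hbase.connectors.spark" % "hbase-spark" % "1.0.1" classifier "spark3"
)
```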

On Fri, Oct 8, 2021 at 3:53 PM Jordan Hambleton
<[email protected]> wrote:
>
> Hi Sean,
>
> Thanks for the reply on this! We are looking to build Spark applications
> with a dependency such as the one below. I've seen builds packaging deps in
> different flavors, i.e. provided, assembly, and shaded as well. The Scala
> version we build against has typically been the latest v2.12.x release.
>
> Publishing it to maven central would work.
>
> basic example (note this will fail w/ v1.0.0) -
>
> libraryDependencies ++= Seq(
>   "org.apache.spark" %% "spark-core" % "3.1.2",
>   "org.apache.spark" %% "spark-sql" % "3.1.2",
>   "org.apache.hbase.connectors.spark" % "hbase-spark" % "1.0.0"
> )
>
> Appreciate the help in making it easier to use the hadoop-connectors for 
> Spark 3. Keep me posted if there's any additional information needed.
>
> regards,
> Jordan
>
> On 2021/10/08 16:09:34, Sean Busbey <[email protected]> wrote:
> > Hi Jordan!
> >
> > How do you currently pull in the dependency? Do you need us to publish an
> > artifact to maven central? Would a convenience binary built against spark-3
> > on downloads.apache.org suffice?
> >
> > On Thu, Oct 7, 2021 at 7:55 PM Jordan Hambleton
> > <[email protected]> wrote:
> >
> > > Hi Peter,
> > >
> > > We're seeing an uptick in usage with Spark 3. While hadoop-connectors have
> > > started supporting Spark 3 (HBASE-25326
> > > <https://issues.apache.org/jira/browse/HBASE-25326>) and while it's fine
> > > pulling down the repo and building from source, it would be a lot easier 
> > > if
> > > we had a release for supporting Spark 3.
> > >
> > > Is it possible to get a next release of the hadoop-connectors out or is
> > > there another way others are using for Spark3 integration with HBase?
> > >
> > > Below is what I'm referring to in building.
> > >
> > > To build the connector with Spark 3.0, compile it with scala 2.12.
> > > Additional configurations that you can customize are the Spark version,
> > > HBase version, and Hadoop version. Example:
> > >
> > > $ mvn -Dspark.version=3.0.1 -Dscala.version=2.12.10
> > > -Dscala.binary.version=2.12 -Dhbase.version=2.2.4 -Dhadoop.profile=3.0
> > > -Dhadoop-three.version=3.2.0 -DskipTests clean package
> > >
> > > best regards,
> > > Jordan Hambleton
> > >
> >