Re: Spark 3.0.1 and spark 3.2 compatibility
> Do I need to recompile my application with 3.2 dependencies or will an
> application compiled with 3.0.1 work fine on 3.2?

Yes. And here is how to compile conditionally for Apache Spark 3.1.x and
Apache Spark >= 3.2.x:

object XYZ {
  @enableIf(classpathMatches(".*spark-catalyst_2\\.\\d+-3\\.2\\..*".r))
  private def getFuncName(f: UnresolvedFunction): String = {
    // For Spark 3.2.x
    f.nameParts.last
  }

  @enableIf(classpathMatches(".*spark-catalyst_2\\.\\d+-3\\.1\\..*".r))
  private def getFuncName(f: UnresolvedFunction): String = {
    // For Spark 3.1.x
    f.name.funcName
  }
}

For more details, see
https://github.com/ThoughtWorksInc/enableIf.scala#enable-different-code-for-apache-spark-31x-and-32x

On Fri, 2022-04-08 01:27:42 Pralabh Kumar wrote:
> Hi spark community
>
> I have a quick question. I am planning to migrate from Spark 3.0.1 to
> Spark 3.2.
>
> Do I need to recompile my application with 3.2 dependencies or will an
> application compiled with 3.0.1 work fine on 3.2?
>
> Regards
> Pralabh Kumar
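The enableIf macro above picks an implementation at compile time based on which Spark jars are on the classpath. As a rough alternative sketch (not from this thread), an application can also branch at runtime by parsing the Spark version string; in a real application that string would come from org.apache.spark.SPARK_VERSION. The object and method names here are hypothetical:

```scala
// Sketch (assumption, not from the thread): branch at runtime on the
// Spark version string instead of selecting code at compile time.
// In a real app the version string comes from org.apache.spark.SPARK_VERSION.
object SparkVersionCheck {
  // True for Spark 3.2 and later (e.g. "3.2.1"), false for older releases.
  def isAtLeast32(sparkVersion: String): Boolean = {
    val parts = sparkVersion.split("\\.").take(2).map(_.toInt)
    val (major, minor) = (parts(0), parts(1))
    major > 3 || (major == 3 && minor >= 2)
  }
}
```

Note that this only helps for API differences reachable through reflection or version-neutral code paths; a call that does not link against the running Spark version still fails at class-loading time, which is why the compile-time approach above exists.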
Re: Spark 3.0.1 and spark 3.2 compatibility
Hi,

Absolutely agree with Sean. Besides that, please see the release notes for the Spark versions as well; they mention any issues around compatibility.

Regards,
Gourav

On Thu, Apr 7, 2022 at 6:32 PM Sean Owen wrote:
> (Don't cross post please)
>
> Generally you definitely want to compile and test vs what you're running
> on. There shouldn't be many binary or source incompatibilities -- these are
> avoided in a major release where possible. So it may need no code change.
> But I would certainly recompile just on principle!
>
> On Thu, Apr 7, 2022 at 12:28 PM Pralabh Kumar wrote:
>> Hi spark community
>>
>> I have a quick question. I am planning to migrate from Spark 3.0.1 to
>> Spark 3.2.
>>
>> Do I need to recompile my application with 3.2 dependencies or will an
>> application compiled with 3.0.1 work fine on 3.2?
>>
>> Regards
>> Pralabh Kumar
Re: Spark 3.0.1 and spark 3.2 compatibility
(Don't cross post please)

Generally you definitely want to compile and test vs what you're running on. There shouldn't be many binary or source incompatibilities -- these are avoided in a major release where possible. So it may need no code change. But I would certainly recompile just on principle!

On Thu, Apr 7, 2022 at 12:28 PM Pralabh Kumar wrote:
> Hi spark community
>
> I have a quick question. I am planning to migrate from Spark 3.0.1 to
> Spark 3.2.
>
> Do I need to recompile my application with 3.2 dependencies or will an
> application compiled with 3.0.1 work fine on 3.2?
>
> Regards
> Pralabh Kumar
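A common way to follow this advice ("compile and test vs what you're running on") is to build against the target Spark version but scope it "provided", so the cluster's own jars are used at runtime. A minimal build.sbt sketch; the exact version numbers here are illustrative assumptions, not from the thread:

```scala
// build.sbt sketch -- illustrative assumption, not from the thread.
// Compile against the Spark version the cluster runs (3.2.x assumed here),
// and mark it "provided" so the cluster's jars are used at runtime
// instead of being bundled into the application jar.
ThisBuild / scalaVersion := "2.12.15"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql" % "3.2.1" % "provided"
)
```

With this setup, re-targeting a new Spark release is a one-line version bump followed by a recompile and test run, which is exactly the "recompile on principle" step above.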
Spark 3.0.1 and spark 3.2 compatibility
Hi spark community

I have a quick question. I am planning to migrate from Spark 3.0.1 to Spark 3.2.

Do I need to recompile my application with 3.2 dependencies, or will an application compiled with 3.0.1 work fine on 3.2?

Regards
Pralabh Kumar