You can try SequenceFileRDDFunctions.saveAsSequenceFile, or RDD.saveAsObjectFile, which serializes the data to (NullWritable, BytesWritable) pairs.
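A minimal sketch of both fallbacks, assuming the source data is already in a DataFrame named `df` and the output paths are placeholders (the `(NullWritable, Text)` key/value mapping shown for `saveAsSequenceFile` is one illustrative choice, not the only one):

```scala
import org.apache.hadoop.io.NullWritable
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("seqfile-sketch").getOrCreate()
val df = spark.table("hivetable1") // placeholder source

// Option 1: saveAsSequenceFile, available on pair RDDs via the
// SequenceFileRDDFunctions implicit conversion. Keys/values must be
// Writable or implicitly convertible to Writable (String -> Text here).
df.rdd
  .map(row => (NullWritable.get(), row.mkString(",")))
  .saveAsSequenceFile("/tmp/out_seq") // placeholder path

// Option 2: saveAsObjectFile, which Java-serializes each record and
// writes a SequenceFile of (NullWritable, BytesWritable) under the hood.
df.rdd.saveAsObjectFile("/tmp/out_obj") // placeholder path
```

Note that neither path goes through `df.write.format(...)`, so writing directly into a Hive table (as the original `insertInto` did) would need a separate step, e.g. pointing an external table at the output directory.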
> On 14 Nov 2022, at 21:07, Shrikant Prasad <shrikant....@gmail.com> wrote:
>
> I have tried with that also. It gives the same exception:
> ClassNotFoundException: sequencefile.DefaultSource
>
> Regards,
> Shrikant
>
> On Mon, 14 Nov 2022 at 6:35 PM, Jie Han <tunyu...@gmail.com> wrote:
>
> It seems that the name is "sequencefile".
>
> > On 14 Nov 2022, at 20:59, Shrikant Prasad <shrikant....@gmail.com> wrote:
> >
> > Hi,
> >
> > I have an application which writes a dataframe into a sequence file using
> > df.write.format("sequence").insertInto("hivetable1")
> >
> > This was working fine with Spark 2.7.
> > Now I am trying to migrate to Spark 3.2. I am getting a ClassNotFoundException:
> > sequence.DefaultSource error with Spark 3.2.
> >
> > Is there any change in sequence file support in 3.2, or is any code change
> > required to make it work?
> >
> > Thanks and regards,
> > Shrikant
> >
> > --
> > Regards,
> > Shrikant Prasad
>
> --
> Regards,
> Shrikant Prasad