Are you referring to the data frame after the map? You can name the columns by calling .toDF("name1", "name2", ...) after the map. A rough sketch of this is below.
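For reference, a minimal sketch of that suggestion, assuming Spark 1.6 in the spark-shell (or a local SparkContext/SQLContext as created below); the input path, column names, and types are hypothetical, and only three columns are shown to keep it short. Note that Scala tuples, like case classes in Scala 2.10, are themselves capped at 22 elements, so this only illustrates the column-naming pattern, not a full 180-column schema.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    // sc and sqlContext already exist in the spark-shell; they are created here
    // only so the sketch is self-contained.
    val sc = new SparkContext(new SparkConf().setAppName("toDF-sketch").setMaster("local[*]"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._   // brings .toDF into scope for RDDs of tuples

    // Hypothetical input and column names -- replace with your own fields.
    val df = sc.textFile("/path/to/input.csv")
      .map(_.split(","))
      .map(a => (a(0), a(1), a(2).toInt))   // map each line to a tuple
      .toDF("name", "city", "age")          // name the columns after the map

    df.printSchema()
    df.show(5)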
.toDF("name1", "name2"....) after the map. Dr Mich Talebzadeh LinkedIn * https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw <https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw>* http://talebzadehmich.wordpress.com On 7 April 2016 at 00:34, <mdkhajaasm...@gmail.com> wrote: > Hi, > > I am new to spark and trying to implement the solution without using hive. > We are migrating to new environment where hive is not present intead I need > to use spark to output files. > > I look at case class and maximum number of columns I can use is 22 but I > have 180 columns . In this scenario what is best approach to use spark sql > or data frame without hive. > > Thanks, > Azmath > > Sent from my iPhone > --------------------------------------------------------------------- > To unsubscribe, e-mail: user-unsubscr...@spark.apache.org > For additional commands, e-mail: user-h...@spark.apache.org > >