This can help:

import org.apache.spark.sql.DataFrame

def prefixDf(dataFrame: DataFrame, prefix: String): DataFrame = {
  // Fold over the column names, renaming each one with the given prefix.
  val colNames = dataFrame.columns
  colNames.foldLeft(dataFrame) { (df, colName) =>
    df.withColumnRenamed(colName, s"${prefix}_${colName}")
  }
}
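
For very wide tables (like the 800-column case you mention), it may be
cheaper to build all the new names up front and rename them in a single
toDF call instead of chaining hundreds of withColumnRenamed steps. A rough
sketch, assuming the same underscore convention (prefixDfToDF is just an
illustrative name):

def prefixDfToDF(dataFrame: DataFrame, prefix: String): DataFrame =
  // toDF(colNames: String*) rebuilds the DataFrame with every column renamed in one pass
  dataFrame.toDF(dataFrame.columns.map(c => s"${prefix}_${c}"): _*)

For example, prefixDfToDF(df, "left") would turn a column "id" into "left_id".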

cheers,
Ardo


On Tue, Apr 19, 2016 at 10:53 AM, nihed mbarek <nihe...@gmail.com> wrote:

> Hi,
>
> I want to prefix the columns of a set of DataFrames, and I have tried two
> solutions:
> * a for loop calling withColumnRenamed based on columns()
> * transforming my DataFrame to an RDD, updating the old schema, and
> recreating the DataFrame.
>
>
> Both work for me; the second one is faster with tables that contain
> 800 columns, but it adds an extra transformation stage (the toRDD step).
>
> Is there any other solution?
>
> Thank you
>
> --
>
> M'BAREK Med Nihed,
> Fedora Ambassador, TUNISIA, Northern Africa
> http://www.nihed.com
>
> <http://tn.linkedin.com/in/nihed>
>
>
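
A rough sketch of the RDD + schema variant described in the quoted message,
assuming a Spark 1.x SQLContext (the name prefixViaSchema is just for
illustration): copy each field of the schema with a prefixed name, then
rebuild the DataFrame over the unchanged rows.

import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.types.StructType

def prefixViaSchema(sqlContext: SQLContext, dataFrame: DataFrame, prefix: String): DataFrame = {
  // Copy every StructField with a prefixed name; the row data is untouched.
  val prefixed = StructType(
    dataFrame.schema.fields.map(f => f.copy(name = s"${prefix}_${f.name}")))
  sqlContext.createDataFrame(dataFrame.rdd, prefixed)
}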
