Hi,

I am new to using Spark and Parquet files.

Below is what I am trying to do in spark-shell:

val df = sqlContext.parquetFile("/data/LM/Parquet/Segment/pages/part-m-00000.gz.parquet")
I have also tried the command below:

val df = sqlContext.read.format("parquet").load("/data/LM/Parquet/Segment/pages/part-m-00000.gz.parquet")
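Just to sanity-check the load, I assume I can inspect the resulting DataFrame like this (printSchema and show are only my own checks, not part of the problem):

df.printSchema()
df.show(5)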

Now I have another existing Parquet file to which I want to append the data of this DataFrame df.

So I use:

df.save("/data/LM/Parquet/Segment/pages2/part-m-00000.gz.parquet", "parquet", SaveMode.Append)

I also tried the command below:

df.save("/data/LM/Parquet/Segment/pages2/part-m-00000.gz.parquet", SaveMode.Append)


and it throws the error below:

<console>:26: error: not found: value SaveMode

df.save("/data/LM/Parquet/Segment/pages2/part-m-00000.gz.parquet","parquet", SaveMode.Append )
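My guess is that the shell simply does not know about SaveMode, so below is a minimal sketch of the whole sequence I am trying, assuming the Spark 1.4+ DataFrameReader/DataFrameWriter API and assuming the missing piece is the import of org.apache.spark.sql.SaveMode:

import org.apache.spark.sql.SaveMode

// load the source Parquet part file into a DataFrame
val df = sqlContext.read.format("parquet").load("/data/LM/Parquet/Segment/pages/part-m-00000.gz.parquet")

// append its rows to the other Parquet location
df.write.mode(SaveMode.Append).parquet("/data/LM/Parquet/Segment/pages2/part-m-00000.gz.parquet")

Does this look like the right way to append, or should I keep using df.save?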

Please let me know in case I am doing something wrong here.

Regards,
Satyajit.
