Two different errors there. The "value :: is not a member of StructField" error happens because :: is a method on List, so a hand-built cons chain must end in Nil. The second error ("overloaded method value apply ... cannot be applied") happens because StructType.apply takes a Seq, Array, or java.util.List of StructFields, not the fields as separate arguments. Separately, you define the "blank" field twice, which will give you a duplicate column once the schema does compile.
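A minimal sketch of the `::`/`Nil` rule, runnable without Spark on the classpath (the `Int` list is just a stand-in; the corrected StructType expression is shown in the comment and assumes the same imports as in your session):

```scala
object SchemaFix {
  def main(args: Array[String]): Unit = {
    // :: is defined on List, so a cons chain must terminate in Nil.
    // Without Nil the compiler looks for :: on the element type itself,
    // which is exactly the "value :: is not a member of ... StructField" error.
    val ok = 1 :: 2 :: 3 :: Nil // builds List(1, 2, 3)
    // val bad = 1 :: 2 :: 3    // does not compile: Int has no :: method

    // The same rule applied to the schema (needs spark-sql on the
    // classpath, so shown as a comment here):
    //   val customSchema = StructType(
    //     StructField("year", IntegerType, true) ::
    //     StructField("make", StringType, true) ::
    //     StructField("model", StringType, true) ::
    //     StructField("comment", StringType, true) ::
    //     StructField("blank", StringType, true) :: Nil)
    println(ok.length)
  }
}
```

Your Seq-based version (the one that already worked in your session) avoids the Nil terminator entirely, which is why many people prefer `StructType(Seq(...))` or the builder-style `(new StructType).add(...)` over the cons syntax.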

On Tue, Dec 22, 2015 at 12:03 AM, Divya Gehlot <divya.htco...@gmail.com>
wrote:

> Hi,
> I am a newbie to Apache Spark, using the CDH 5.5 QuickStart VM with Spark
> 1.5.0.
> I am working on a custom schema and getting an error:
>
> import org.apache.spark.sql.hive.HiveContext
>>>
>>> scala> import org.apache.spark.sql.hive.orc._
>>> import org.apache.spark.sql.hive.orc._
>>>
>>> scala> import org.apache.spark.sql.types.{StructType, StructField,
>>> StringType, IntegerType};
>>> import org.apache.spark.sql.types.{StructType, StructField, StringType,
>>> IntegerType}
>>>
>>> scala> val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
>>> 15/12/21 23:41:53 INFO hive.HiveContext: Initializing execution hive,
>>> version 1.1.0
>>> 15/12/21 23:41:53 INFO client.ClientWrapper: Inspected Hadoop version:
>>> 2.6.0-cdh5.5.0
>>> 15/12/21 23:41:53 INFO client.ClientWrapper: Loaded
>>> org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.6.0-cdh5.5.0
>>> hiveContext: org.apache.spark.sql.hive.HiveContext =
>>> org.apache.spark.sql.hive.HiveContext@214bd538
>>>
>>> scala> val customSchema = StructType(Seq(StructField("year",
>>> IntegerType, true),StructField("make", StringType,
>>> true),StructField("model", StringType, true),StructField("comment",
>>> StringType, true),StructField("blank", StringType, true)))
>>> customSchema: org.apache.spark.sql.types.StructType =
>>> StructType(StructField(year,IntegerType,true),
>>> StructField(make,StringType,true), StructField(model,StringType,true),
>>> StructField(comment,StringType,true), StructField(blank,StringType,true))
>>>
>>> scala> val customSchema = (new StructType).add("year", IntegerType,
>>> true).add("make", StringType, true).add("model", StringType,
>>> true).add("comment", StringType, true).add("blank", StringType, true)
>>> customSchema: org.apache.spark.sql.types.StructType =
>>> StructType(StructField(year,IntegerType,true),
>>> StructField(make,StringType,true), StructField(model,StringType,true),
>>> StructField(comment,StringType,true), StructField(blank,StringType,true))
>>>
>>> scala> val customSchema = StructType( StructField("year", IntegerType,
>>> true) :: StructField("make", StringType, true) :: StructField("model",
>>> StringType, true) :: StructField("comment", StringType, true) ::
>>> StructField("blank", StringType, true)::StructField("blank", StringType,
>>> true))
>>> <console>:24: error: value :: is not a member of
>>> org.apache.spark.sql.types.StructField
>>>        val customSchema = StructType( StructField("year", IntegerType,
>>> true) :: StructField("make", StringType, true) :: StructField("model",
>>> StringType, true) :: StructField("comment", StringType, true) ::
>>> StructField("blank", StringType, true)::StructField("blank", StringType,
>>> true))
>>>
>>
> I also tried it like below:
>
> scala> val customSchema = StructType( StructField("year", IntegerType,
> true), StructField("make", StringType, true) ,StructField("model",
> StringType, true) , StructField("comment", StringType, true) ,
> StructField("blank", StringType, true),StructField("blank", StringType,
> true))
> <console>:24: error: overloaded method value apply with alternatives:
>   (fields:
> Array[org.apache.spark.sql.types.StructField])org.apache.spark.sql.types.StructType
> <and>
>   (fields:
> java.util.List[org.apache.spark.sql.types.StructField])org.apache.spark.sql.types.StructType
> <and>
>   (fields:
> Seq[org.apache.spark.sql.types.StructField])org.apache.spark.sql.types.StructType
>  cannot be applied to (org.apache.spark.sql.types.StructField,
> org.apache.spark.sql.types.StructField,
> org.apache.spark.sql.types.StructField,
> org.apache.spark.sql.types.StructField,
> org.apache.spark.sql.types.StructField,
> org.apache.spark.sql.types.StructField)
>        val customSchema = StructType( StructField("year", IntegerType,
> true), StructField("make", StringType, true) ,StructField("model",
> StringType, true) , StructField("comment", StringType, true) ,
> StructField("blank", StringType, true),StructField("blank", StringType,
> true))
>                           ^
>    Would really appreciate it if somebody could share an example that works
> with Spark 1.4 or Spark 1.5.0.
>
> Thanks,
> Divya
>
>
