user@spark.apache.org
Subject: Re: How to specify default value for StructField?
I agree with Yong Zhang;
perhaps Spark SQL with Hive could solve the problem:
http://spark.apache.org/docs/latest/sql-programming-guide.html#hive-tables
On Thu, Feb 16, 2017 at 12:42 AM, Yong Zhang wrote:
> I don't think it is available, unless someone tells
> me I am wrong.
>
>
> You can create a JIRA to request this feature, but we all know that
> Parquet is the first-class citizen format.
>
>
>
> Yong
>
>
> ----------
>
> *From:* Begar, Veena
> *Subject:* RE: How to specify default value for StructField?
Thanks Yong.
I know about the schema-merge option.
Using Hive we can read Avro files having different schemas, and we can do the same in Spark.
Similarly, we can read ORC files having different schemas in Hive. But can we do it using a DataFrame?
Thanks.
From: Yong Zhang [mailto:java8...@hotmail.com]
Sent: Tuesday, February 14, 2017 8:31 PM
To: Begar, Veena <veena.be...@hpe.com>; smartzjp <zjp_j...@163.com>;
user@spark.apache.org
Subject: Re: How to specify default value for StructField?
Maybe you are looking for the schema-merge feature.
Yong
From: Begar, Veena <veena.be...@hpe.com>
Sent: Tuesday, February 14, 2017 10:37 AM
To: smartzjp; user@spark.apache.org
Subject: RE: How to specify default value for StructField?
Thanks, it didn't work, because the folder has files from 2 different schemas.
From: smartzjp
Sent: Tuesday, February 14, 2017 10:32 AM
To: Begar, Veena <veena.be...@hpe.com>; user@spark.apache.org
Subject: Re: How to specify default value for StructField?
You can try the below code.
val df = spark.read.format("orc").load("/user/hos/orc_files_test_together")
df.select("f1", "f2").show
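The snippet above assumes every file in the folder carries both columns; Veena's folder mixes files written with two different schemas, so the select fails for files missing f2. As a plain-Python sketch (not Spark; the rows and the empty-string default are made up to mirror the Avro schema quoted later in the thread), this is the default-filling behavior she is after:

```python
# Plain-Python sketch of the workaround the thread circles around:
# rows from an "old" file lack f2, so fill it with a default before
# combining them with rows from the "new" file.
OLD_FILE_ROWS = [{"f1": "a"}, {"f1": "b"}]   # written with the old schema (no f2)
NEW_FILE_ROWS = [{"f1": "c", "f2": "x"}]     # written with the new schema

def with_default(rows, field, default=""):
    """Return rows with `field` always present, using `default` when missing."""
    return [{**row, field: row.get(field, default)} for row in rows]

combined = with_default(OLD_FILE_ROWS, "f2") + with_default(NEW_FILE_ROWS, "f2")
# → [{'f1': 'a', 'f2': ''}, {'f1': 'b', 'f2': ''}, {'f1': 'c', 'f2': 'x'}]
```

In Spark terms this would amount to adding the missing column with a literal default to each per-schema DataFrame before unioning them, rather than loading the whole folder in one read.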
On 2017/2/14,
      "type": "string",
      "default": ""
    },
    {
      "name": "f2",
      "type": "string",
      "default": ""
    }
  ]
}
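For reference, here is how those per-field `"default": ""` entries behave at read time. This is a plain-Python sketch, not Avro's actual reader; the schema is reconstructed from the fragment above, and the record name "rec" and the sample record are placeholders:

```python
import json

# Minimal Avro-style schema reconstructed from the fragment quoted above
# (only f1/f2 and their empty-string defaults come from the thread;
# "rec" is a placeholder record name).
SCHEMA = json.loads("""
{
  "type": "record",
  "name": "rec",
  "fields": [
    {"name": "f1", "type": "string", "default": ""},
    {"name": "f2", "type": "string", "default": ""}
  ]
}
""")

def apply_defaults(record, schema):
    """Fill fields missing from `record` with the schema's per-field default,
    which is what an Avro reader does for records written under an older schema."""
    return {f["name"]: record.get(f["name"], f["default"]) for f in schema["fields"]}

old_record = {"f1": "hello"}               # written before f2 existed
print(apply_defaults(old_record, SCHEMA))  # → {'f1': 'hello', 'f2': ''}
```

This per-field default resolution is what Avro's schema evolution provides and what the thread is asking ORC/StructField to match.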
Wondering why it doesn't work with ORC files.
Thanks.