Would it make sense to add an optional validate parameter to applySchema(),
defaulting to false? That would both give users a way to check the data
against the schema immediately and make the default (unchecked) behavior
more explicit.
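
Something like the sketch below, where validate = true would eagerly scan
the rows and fail fast on the first mismatch. To be clear, both the
validate parameter and the helper are hypothetical, not part of the current
API (assuming the Spark 1.2 Scala API):

    import org.apache.spark.rdd.RDD
    import org.apache.spark.sql._

    // Hypothetical check that applySchema(rowRDD, schema, validate = true)
    // could run before returning: compare each row's shape against the
    // schema and fail on the first mismatch.
    def validateRows(rowRDD: RDD[Row], schema: StructType): Unit = {
      rowRDD.foreach { row =>
        require(row.length == schema.fields.length,
          s"Row has ${row.length} fields, schema expects ${schema.fields.length}")
        // A fuller check would also compare each value's runtime class
        // against the corresponding field's declared data type.
      }
    }

Since this is a full pass over the data, it would have to stay opt-in for
exactly the cost reason given below.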

On Sat Feb 14 2015 at 9:18:59 AM Michael Armbrust <mich...@databricks.com>
wrote:

> Doing runtime type checking is very expensive, so we only do it when
> necessary (i.e., when you perform an operation like adding two columns
> together).
>
> On Sat, Feb 14, 2015 at 2:19 AM, nitin <nitin2go...@gmail.com> wrote:
>
>> AFAIK, this is the expected behavior. You have to make sure yourself
>> that the schema matches the rows; applying the schema won't raise any
>> error, because it doesn't validate the data against the schema.
>>
>
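
For anyone else hitting this, a minimal reproduction of the behavior
described above (Spark 1.2 Scala API; assumes an existing SparkContext sc):

    import org.apache.spark.sql._

    val sqlContext = new SQLContext(sc)
    val schema = StructType(Seq(StructField("value", IntegerType, nullable = false)))

    // The field holds a String, not an Int, but applySchema does not check.
    val rowRDD = sc.parallelize(Seq(Row("oops")))
    val schemaRDD = sqlContext.applySchema(rowRDD, schema)
    schemaRDD.registerTempTable("t")

    // No error so far. The mismatch only surfaces once an operation forces
    // the declared type to be used, e.g. arithmetic on the column:
    sqlContext.sql("SELECT value + 1 FROM t").collect()  // fails at runtime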
