Arrays need a single element type; I think you're looking for a Struct
column instead. See:
https://medium.com/@mrpowers/adding-structtype-columns-to-spark-dataframes-b44125409803
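
As a minimal sketch of that idea (assuming a local SparkSession; the field
names "f1", "f2", "d1" are just placeholders, not from your schema), you can
nest a StructType inside the column so each field gets its own DataType:

    from decimal import Decimal

    from pyspark.sql import SparkSession
    from pyspark.sql.types import (
        StructType, StructField, FloatType, DecimalType
    )

    spark = SparkSession.builder.getOrCreate()

    # Instead of an ArrayType (whose elements must all share one type),
    # wrap the three values in a nested StructType, which allows a
    # different DataType per field.
    schema = StructType([
        StructField("Column_Name", StructType([
            StructField("f1", FloatType(), False),
            StructField("f2", FloatType(), False),
            StructField("d1", DecimalType(10, 2), False),
        ]), False)
    ])

    df = spark.createDataFrame(
        [((1.0, 2.0, Decimal("3.14")),)],
        schema=schema,
    )

    df.printSchema()
    # Nested fields can be pulled out with dot notation.
    df.select("Column_Name.f1", "Column_Name.d1").show()
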

On Wed, Jul 11, 2018 at 6:37 AM, dimitris plakas <dimitrisp...@gmail.com>
wrote:

> Hello everyone,
>
> I am new to PySpark and I would like to ask if there is any way to have a
> DataFrame column which is an ArrayType with a different DataType for each
> element of the ArrayType. For example, to have something like:
>
> StructType([StructField("Column_Name", ArrayType(ArrayType(FloatType(),
> FloatType(), DecimalType(), False),False), False)]).
>
> I want to have an ArrayType column with 2 elements of FloatType and 1
> element of DecimalType.
>
> Thank you in advance
>