Hi all,

this is my first message to the Spark mailing list, so please bear with me if I don't fully meet your communication standards. I wanted to discuss one aspect that I've stumbled across several times over the past few weeks. When working with Spark, I often run into the problem of having to merge two (or more) existing StructTypes into a new one to define a schema. Usually this looks something like the following simplified Python example:

        a = StructType([StructField("field_a", StringType())])
        b = StructType([StructField("field_b", IntegerType())])

        combined = StructType(a.fields + b.fields)

My idea, which I would like to discuss, is to support Python's add operator for StructTypes, so that the above example could be shortened to:

        combined = a + b
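
For illustration, here is a rough sketch of the semantics I have in mind. The monkey-patch is only there so the snippet can be tried locally; it is not the actual Spark implementation, and details such as handling of duplicate field names would of course be up for discussion:

        from pyspark.sql.types import StructType, StructField, StringType, IntegerType

        # Illustrative only -- not current Spark API. The proposed operator
        # would simply concatenate the field lists of the two schemas.
        def _struct_add(self, other):
            if not isinstance(other, StructType):
                return NotImplemented
            return StructType(self.fields + other.fields)

        StructType.__add__ = _struct_add

        a = StructType([StructField("field_a", StringType())])
        b = StructType([StructField("field_b", IntegerType())])
        combined = a + b   # schema containing both field_a and field_b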


What do you think of this idea? Are there any reasons why this is not yet part of StructType's functionality? If there is support for it, I could create a first PR as a basis for further discussion.

Best
Tim
