Hello,
I am wondering — if you do so, then all your executor pods would have to run on the
same Kubernetes worker node, since you mount a single volume with a
ReadWriteOnce access mode. By design this does not seem like a good idea, I assume.
You may need a ReadWriteMany access mode associated with the volume. The
Hi Manjunath,
Can you share the data example?
From the information shared above, it seems that you will need to apply a
mapping with custom logic to the rows in your RDD to make them consistent
before you can apply the schema.
I recommend reading about the mapping functionality here:
https://data-flair.tr
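The custom-mapping advice above could be sketched like this — plain Python standing in for an RDD `map`, with hypothetical column names, just to show the shape of the normalization step:

```python
# Target schema the downstream step expects (hypothetical columns).
TARGET_COLUMNS = ["id", "name", "amount"]

def normalize_row(row):
    """Map one raw row (a dict) onto the target schema:
    missing columns become None, extra columns are dropped."""
    return tuple(row.get(col) for col in TARGET_COLUMNS)

# Stand-in for rdd.map(normalize_row) on a real RDD.
raw_rows = [
    {"id": 1, "name": "a", "amount": 10.0, "extra": "x"},  # has an extra column
    {"id": 2, "name": "b"},                                # missing a column
]
normalized = [normalize_row(r) for r in raw_rows]
print(normalized)  # [(1, 'a', 10.0), (2, 'b', None)]
```

Once every row has the same arity and column order, applying a fixed schema (e.g. via `createDataFrame(rdd, schema)` in Spark) becomes straightforward.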
From: Basavaraj
Sent: Friday, May 15, 2020 9:12:01 PM
To: spark users
Subject: unsubscribe
Hi,
I have a dataframe with some columns and data fetched from JDBC. As I have to
keep the schema consistent in the ORC file, I have to apply a different schema
to that dataframe. The column names will be the same, but the data or schema
may contain some extra columns.
Is there any way I can app
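For what it's worth, the situation described above (same column names, but possibly extra or missing columns) is usually handled by projecting onto the target columns explicitly. A minimal sketch in plain Python with hypothetical names — on a real Spark DataFrame this would be a `select` using `lit(None)` for the missing columns:

```python
# Columns the ORC schema requires (hypothetical).
ORC_COLUMNS = ["id", "name", "amount", "created_at"]

def align_to_schema(record, target_columns):
    """Return a record containing exactly the target columns:
    keep matching values, fill missing ones with None, drop extras."""
    return {col: record.get(col) for col in target_columns}

jdbc_record = {"id": 7, "name": "widget", "amount": 3.5, "debug_flag": True}
print(align_to_schema(jdbc_record, ORC_COLUMNS))
# {'id': 7, 'name': 'widget', 'amount': 3.5, 'created_at': None}
```

The same projection applied to every record yields rows that all match the ORC schema, regardless of which extra columns the JDBC source happened to return.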