Hi,

What version of Spark are you using, and how are you writing to the GBQ table?

Does the source column in the ETL have type NUMERIC(38), say coming from Oracle?
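
As background: BigQuery's NUMERIC type is a fixed DECIMAL(38, 9), while BIGNUMERIC allows a scale of up to 38, so a column holding (38, 20) values cannot survive a silent downgrade to NUMERIC without losing fractional digits. A minimal sketch of that loss with Python's decimal module (the value below is made up for illustration):

```python
from decimal import Decimal, ROUND_HALF_EVEN

# A hypothetical value with 20 fractional digits, i.e. DECIMAL(38, 20)
value = Decimal("1.12345678901234567890")

# BigQuery NUMERIC is DECIMAL(38, 9): quantize to 9 fractional digits
as_numeric = value.quantize(Decimal("1e-9"), rounding=ROUND_HALF_EVEN)

print(as_numeric)          # only 9 fractional digits survive
print(as_numeric == value) # False: the last 11 digits are gone
```

So if the connector is mapping the column to NUMERIC, the fix has to happen at the schema level (keeping the target column BIGNUMERIC and a Spark DecimalType with scale > 9), not after the write.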



   View my LinkedIn profile
<https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>


 https://en.everybodywiki.com/Mich_Talebzadeh



*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.




On Wed, 22 Feb 2023 at 16:01, nidhi kher <kherni...@gmail.com> wrote:

> Hello,
>
> I am facing the below issue in Spark code:
> We are running Spark code as a Dataproc Serverless batch in Google Cloud
> Platform. The Spark code is causing an issue while writing the data to a
> BigQuery table. In the BigQuery table, a few of the columns have datatype
> BIGNUMERIC, and the Spark code is changing the datatype from BIGNUMERIC to
> NUMERIC while writing the data. We need the datatype to be kept as
> BIGNUMERIC, as we need data of (38,20) precision.
>
> Please suggest.
>
> Regards,
> Nidhi Kher
>

Reply via email to