Reg: read JSON inference schema

2023-08-31 Thread Manoj Babu
Hi Team, I am getting the below error when reading a column whose values are JSON strings. json_schema_ctx_rdd = record_df.rdd.map(lambda row: row.contexts_parsed) spark.read.option("mode", "PERMISSIVE").option("inferSchema", "true").option("inferTimestamp", "false").json(json_schema_ctx_rdd) Th
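The preview above is cut off before the actual error, so the cause is unknown; one plausible first step is to rule out malformed or null values in the column before schema inference. Below is a minimal local sketch in plain Python (no Spark session needed) that pre-checks JSON strings the way the `contexts_parsed` column would be fed to spark.read.json. The sample values are hypothetical.

```python
import json

# Hypothetical sample of the values a row.contexts_parsed column might hold.
rows = [
    '{"user": "a", "count": 1}',   # valid JSON
    '{"user": "b", "count": }',    # malformed: breaks schema inference
    None,                          # null value in the column
]

def bad_json(value):
    """Return True if the value is null or cannot be parsed as JSON."""
    if value is None:
        return True
    try:
        json.loads(value)
        return False
    except ValueError:  # json.JSONDecodeError subclasses ValueError
        return True

# Collect the rows that would need investigation before inference.
invalid = [v for v in rows if bad_json(v)]
print(len(invalid))  # prints 2
```

With PERMISSIVE mode Spark would route such records to the `_corrupt_record` column rather than fail outright, so inspecting that column after the read is another way to locate the offending values.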

Re:

2023-08-31 Thread leibnitz
me too. ayan guha wrote on Thu, Aug 24, 2023 at 09:02: > Unsubscribe -- > Best Regards, > Ayan Guha >

Re: Okio Vulnerability in Spark 3.4.1

2023-08-31 Thread Bjørn Jørgensen
I have tried to upgrade it. It comes from kubernetes-client: [SPARK-43990][BUILD] Upgrade kubernetes-client to 6.7.2. On Thu, Aug 31, 2023 at 14:47, Agrawal, Sanket wrote: > I don't see an entry in pom.xml while building Spark. I think

Re: Okio Vulnerability in Spark 3.4.1

2023-08-31 Thread Sean Owen
It's a dependency of some other HTTP library. Use mvn dependency:tree to see where it comes from. It may be more straightforward to upgrade the library that brings it in, assuming a later version brings in a later okio. You can also manage up the version directly with a new entry in However, does
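Sean's suggestion of pinning the version with a managed dependency entry might look like the following pom.xml fragment. This is a sketch, not Spark's actual build configuration: the coordinates are okio's real Maven coordinates, but the version shown (a later 1.x release) is illustrative and should be chosen against the actual advisory; okio 3.x is not a drop-in replacement for 1.x, as noted elsewhere in the thread.

```xml
<!-- Illustrative sketch: force a specific okio version via
     dependencyManagement so transitive consumers resolve to it.
     Pin a later 1.x here, since jumping to okio 3.x is a breaking change. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.squareup.okio</groupId>
      <artifactId>okio</artifactId>
      <version>1.17.6</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

After adding such an entry, running mvn dependency:tree again confirms which version the build now resolves and which library was pulling in the old one.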

RE: Okio Vulnerability in Spark 3.4.1

2023-08-31 Thread Agrawal, Sanket
I don’t see an entry in pom.xml while building spark. I think it is being downloaded as part of some other dependency. From: Sean Owen Sent: Thursday, August 31, 2023 5:10 PM To: Agrawal, Sanket Cc: user@spark.apache.org Subject: [EXT] Re: Okio Vulnerability in Spark 3.4.1 Does the vulnerabili

Re: Okio Vulnerability in Spark 3.4.1

2023-08-31 Thread Sean Owen
Does the vulnerability affect Spark? In any event, have you tried updating Okio in the Spark build? I don't believe you could just replace the JAR, as other libraries probably rely on it and were compiled against the current version. On Thu, Aug 31, 2023 at 6:02 AM Agrawal, Sanket wrote: > Hi All, >

Okio Vulnerability in Spark 3.4.1

2023-08-31 Thread Agrawal, Sanket
Hi All, Amazon Inspector has detected a vulnerability in the okio-1.15.0.jar JAR in Spark 3.4.1. It suggests upgrading the JAR to version 3.4.0. But when we try that version of the JAR, the Spark application fails with the below error: py4j.protocol.Py4JJavaError: An error occurred while calling