GitHub user jankogasic added a comment to the discussion: Connecting PySpark with Hive tables
After digging deeper, here is what I found:

- Successfully connected with DBeaver to Kyuubi and can view the ARRAY column as expected.
- Still can't pull the ARRAY column with PySpark (a minimal sketch of the read path I'm testing is below). I found the Hive dialect jar definition [here](https://github.com/apache/kyuubi/blob/8a67796984b54abaebe0f0cbeddb7c059f51e531/extensions/spark/kyuubi-extension-spark-jdbc-dialect/src/main/scala/org/apache/spark/sql/dialect/KyuubiHiveDialect.scala); it doesn't seem to override ARRAY types. Does that mean ARRAY is not supported?

@bowenliang123 I see that you worked on this feature. Do you possibly know the solution?

GitHub link: https://github.com/apache/kyuubi/discussions/7240#discussioncomment-15063801
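For reference, this is roughly the read path in question — a minimal sketch, assuming a Kyuubi frontend at a placeholder URL (`jdbc:hive2://localhost:10009`), placeholder table/column names, and the Hive JDBC driver plus the Kyuubi JDBC dialect jar already on the classpath and registered per the Kyuubi docs:

```python
from pyspark.sql import SparkSession

# Minimal sketch, with placeholder URL, table, and column names.
spark = (
    SparkSession.builder
    .appName("kyuubi-jdbc-array-test")
    .getOrCreate()
)

df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:hive2://localhost:10009/default")
    .option("driver", "org.apache.hive.jdbc.HiveDriver")
    # The ARRAY column is where the read fails for me.
    .option("query", "SELECT id, array_col FROM example_table")
    .load()
)

df.printSchema()
df.show()
```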
