Spark on k8s: collecting logs with a DaemonSet
Hi all, Spark runs on k8s, and we use a DaemonSet Filebeat to collect the logs and write them to Elasticsearch. The Docker logs are in JSON format, with each line being one JSON string. How can multi-line exceptions (stack traces) be merged into a single log event?
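The thread does not include a reply, but a minimal sketch of how this is typically handled on the Filebeat side is shown below, assuming Filebeat 7.x with the container input (which decodes the Docker JSON wrapper before multiline runs); the paths, multiline pattern, and Elasticsearch host are placeholder assumptions, and the pattern only matches typical Java stack-trace continuation lines:

# Hypothetical filebeat.yml fragment for the DaemonSet; values below are assumptions.
filebeat.inputs:
  - type: container            # decodes the Docker json-file wrapper, then applies multiline
    paths:
      - /var/log/containers/*spark*.log
    # Append Java stack-trace continuation lines ("  at ...", "Caused by: ...")
    # to the preceding log event instead of emitting them as separate events.
    multiline.pattern: '^[[:space:]]+(at|\.{3})[[:space:]]+\b|^Caused by:'
    multiline.negate: false
    multiline.match: after

output.elasticsearch:
  hosts: ["elasticsearch:9200"]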
Re: How to explode array columns of a dataframe having the same length
sql:

select inline(arrays_zip(col1, col2, col3)) as (c1, c2, c3) from t1

Replied Message
From: Enrico Minack
Date: 02/16/2023 16:06
To: sam smith
Subject: Re: How to explode array columns of a dataframe having the same length

> You have to take each row and z…
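For reference, a minimal runnable sketch of that query, assuming a Spark 3.x spark-sql shell; the table t1 and columns col1, col2, col3 from the thread are stood in by an inline one-row view:

-- Build a one-row view with three array columns of equal length.
CREATE OR REPLACE TEMPORARY VIEW t1 AS
SELECT array(1, 2, 3)           AS col1,
       array('a', 'b', 'c')     AS col2,
       array(1.0, 2.0, 3.0)     AS col3;

-- arrays_zip pairs the i-th elements of the arrays into an array of structs;
-- inline explodes that array into one row per element, one column per struct field.
SELECT inline(arrays_zip(col1, col2, col3)) AS (c1, c2, c3) FROM t1;
-- Expected result: three rows, e.g. (1, 'a', 1.0), (2, 'b', 2.0), (3, 'c', 3.0).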