Greetings everyone! We need to ship Spark driver and executor logs (not Spark event logs) from Kubernetes (K8s) to a cloud bucket (ADLS/S3). Using Fluent Bit we are able to ship the log files, but only to one single path, container/logs/. That puts a huge number of files into one folder, which will hurt the performance of list and search operations. What we would like instead is to create the output folder dynamically, e.g. named after the Spark application name.
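For reference, here is a rough, untested sketch of the tag-rewriting pattern we are considering. It assumes the Fluent Bit kubernetes filter is enriching records with pod metadata, and that the Spark pods carry a label like spark-app-name (substitute whatever label your pods actually have, e.g. spark-app-selector); the bucket name and region are placeholders:

    # Enrich records with pod metadata (labels, namespace, etc.).
    [FILTER]
        Name        kubernetes
        Match       kube.*
        Merge_Log   On

    # Embed the app name from the pod label into the tag itself.
    [FILTER]
        Name        rewrite_tag
        Match       kube.*
        Rule        $kubernetes['labels']['spark-app-name'] ^(.+)$ spark.$kubernetes['labels']['spark-app-name'] false

    # The tag is split on '.', so $TAG[1] resolves to the app name
    # and each application's logs land in their own folder.
    [OUTPUT]
        Name                           s3
        Match                          spark.*
        bucket                         my-spark-logs
        region                         us-east-1
        s3_key_format                  /logs/$TAG[1]/%Y/%m/%d/$UUID.log
        s3_key_format_tag_delimiters   .

As far as we can tell, the same rewritten tag could help on the ADLS side too, since Fluent Bit's azure_blob output derives blob names from the tag, but we have not verified that.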
If anyone has done this, please share the details. Regards, Jay