phani482 commented on issue #7800: URL: https://github.com/apache/hudi/issues/7800#issuecomment-1410740119
Not slowness; our jobs are failing with the above error during the Hudi write. Is it an issue if we remove archive files from the `.hoodie` folder?

1. Does Hudi ignore archive files in the `.hoodie` folder, or will the timeline server read them?
2. For a long-running streaming job, what are the best practices for managing the metadata folder (`.hoodie`) to avoid out-of-memory errors?
3. Are there any Spark heap settings that need to be tuned?

The Hudi documentation is not clear enough on this.
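For question 2, timeline growth is usually bounded via Hudi's archival and cleaner settings rather than by deleting files from `.hoodie` by hand. A minimal sketch of writer options one might tune, assuming the config keys from Hudi's archival/cleaning documentation (exact defaults vary by Hudi version, so verify against the version in use):

```python
# Sketch: writer options that bound active-timeline growth for a
# long-running streaming job. Keys are from Hudi's archival/cleaning
# configuration docs; values here are illustrative, not recommendations.
hudi_options = {
    # Bound the number of completed commits kept on the active timeline;
    # older commits are moved into the archived timeline under .hoodie.
    "hoodie.keep.min.commits": "20",
    "hoodie.keep.max.commits": "30",
    # Cleaner retention must stay below hoodie.keep.min.commits,
    # otherwise archival cannot make progress.
    "hoodie.cleaner.commits.retained": "10",
}

# Illustrative PySpark usage (base_path is a hypothetical table path):
# df.write.format("hudi").options(**hudi_options).mode("append").save(base_path)
```

The key invariant is `cleaner.commits.retained < keep.min.commits <= keep.max.commits`; keeping these small limits how much timeline metadata the driver must hold in memory.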
