Spark uses log4j for logging. There is a log4j properties template file in the conf directory. Just remove the ".template" extension and edit the contents of log4j.properties to meet your needs. More info on log4j can be found at logging.apache.org.
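
For example, to turn up verbosity cluster-wide, you can raise the root level in that file. This is a sketch based on the log4j 1.x template that Spark 3.2.x ships; the appender names follow the template, and you would copy the edited file to the conf directory on each worker as well:

```properties
# conf/log4j.properties (copied from conf/log4j.properties.template)
# Raise the root level from INFO to DEBUG to see internal Spark activity,
# including launcher/driver startup on the workers.
log4j.rootCategory=DEBUG, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```

DEBUG at the root is very noisy, so you may want to drop it back to INFO once you have captured the failure.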

On 2/21/22 9:15 AM, Michael Williams (SSI) wrote:

Hello,

We have a POC using Spark 3.2.1, and none of us have any prior Spark experience. Our setup uses the native Spark REST API (http://localhost:6066/v1/submissions/create) on the master node (not Livy, not Spark Job Server). We have been successful at submitting Python jobs via this endpoint, but since implementing .NET for Spark and attempting to trigger those jobs through the API, the driver (on the worker) simply reports "failed", and there aren't any log files created because it is failing before the application starts.
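
For context, a create request to that endpoint looks roughly like the following. This is a sketch of the standalone REST submission protocol; the jar path, publish directory, app name, and master URL are placeholders for whatever the actual .NET job uses, not values from our setup:

```json
{
  "action": "CreateSubmissionRequest",
  "clientSparkVersion": "3.2.1",
  "appResource": "file:/path/to/microsoft-spark-jar",
  "mainClass": "org.apache.spark.deploy.dotnet.DotnetRunner",
  "appArgs": ["/path/to/publish", "MyDotnetApp"],
  "environmentVariables": {"SPARK_ENV_LOADED": "1"},
  "sparkProperties": {
    "spark.app.name": "dotnet-poc",
    "spark.master": "spark://localhost:7077",
    "spark.submit.deployMode": "cluster"
  }
}
```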

Is there a logging configuration that would increase the logging detail on the worker for internal Spark processes and possibly tell us specifically what error is occurring?

Thank you,

Mike

