[ 
https://issues.apache.org/jira/browse/SPARK-48217?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Noam Shemesh updated SPARK-48217:
---------------------------------
    Description: 
Hello,

We are running a Spark job triggered from a Cloudera Hue workflow,

and Spark prints stdout and stderr logs during execution as expected,

e.g.: !workflow_running_logs_printed.png!

*But the stdout and stderr logs get cleaned once the workflow finishes with status Succeeded.*

The workflow triggers the following spark-submit command:

{code}
/usr/bin/spark-submit \
  --master yarn-client \
  --driver-memory 4g \
  --executor-memory 16g \
  --executor-cores 4 \
  --class tst \
  --files `ls -m *.conf | tr -d '\n '` \
  --conf "spark.dynamicAllocation.maxExecutors=4" \
  --conf "spark.kryoserializer.buffer.max=1024" \
  tst.jar $*
{code}
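As an aside, the backtick expression passed to {{--files}} expands the local {{*.conf}} files into the comma-separated list that spark-submit expects. A minimal sketch of what it produces (the file names {{a.conf}}/{{b.conf}} are hypothetical, for illustration only):

```shell
# Demo of the --files expansion used above: `ls -m` prints a
# comma-separated list ("a.conf, b.conf"), and `tr -d '\n '`
# strips the newline and the spaces, yielding "a.conf,b.conf".
cd "$(mktemp -d)"
touch a.conf b.conf
ls -m *.conf | tr -d '\n '
# → a.conf,b.conf
```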


Is anyone familiar with this Spark job behavior, or can anyone suggest ideas to fix it?

Thanks in advance.



> Spark stdout and stderr getting removed at end of spark job triggered from 
> cloudera hue workflow
> ------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-48217
>                 URL: https://issues.apache.org/jira/browse/SPARK-48217
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Submit
>    Affects Versions: 1.6.0
>            Reporter: Noam Shemesh
>            Priority: Major
>         Attachments: workflow_running_logs_printed.png, 
> workflow_succeeded_logs_cleaned.png
>



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
