So I discovered this issue with the Flink Operator producing incorrect
Log4J files:
https://github.com/GoogleCloudPlatform/flink-on-k8s-operator/issues/472
I had based my config on those generated files.

I updated my configs based on the files I found in the Flink repository.
Unfortunately, that didn't fix the issue. Here's an updated gist:

https://gist.github.com/jlewi/759505b754ea0e84716afd58d59aedc0
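
For reference, the kind of logger entries I'm trying to get picked up in
log4j-console.properties look roughly like this (Log4j 2 properties syntax,
which is what recent Flink versions ship with; the Kafka logger name below is
my guess at the package emitting the offset-reset messages):

    # Log container starts from the Beam environment factory
    logger.beam_environment.name = org.apache.beam.runners.fnexecution.environment
    logger.beam_environment.level = DEBUG

    # Quiet the offset-reset spam from the Kafka consumer (package name is a guess)
    logger.kafka.name = org.apache.kafka.clients.consumer
    logger.kafka.level = WARN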
J

On Tue, Oct 19, 2021 at 11:06 AM Jeremy Lewi <[email protected]> wrote:

> Hi Folks,
>
> I'm running Python Beam on Flink and I'm trying to change the logging
> levels. Specifically:
>
>    - Set the log level to DEBUG
>    for org.apache.beam.runners.fnexecution.environment (so that container
>    starts are logged)
>    - Set the log level for the Kafka client (called from the Java KafkaIO)
>    to WARN (I'm getting log spam from Kafka resetting the offsets).
>
> I followed the Flink logging guide
> <https://ci.apache.org/projects/flink/flink-docs-master/docs/deployment/advanced/logging/>
> to configure Log4J via log4j.properties and log4j-console.properties (here's
> a gist <https://gist.github.com/jlewi/4702330e009bec3d0a8d2e77fb536db9>
> with my values). That didn't seem to have any effect.
>
> Is setting log4j.properties and log4j-console.properties for Flink the
> right way to control logging for the Java workers, or do those files
> only affect Flink itself and not the applications running on Flink?
>
> Thanks
> J
>
>
>
