Hi Folks,

So I tried putting the Beam job server jar on the Flink JobManager and
TaskManager containers and setting classloader.resolve-order
<https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/deployment/config/#classloader-resolve-order>
to parent-first.
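
For reference, the change was roughly this (same jar copied into both
images; the jar name matches what shows up on the classpath in the
attached logs):

    # /opt/flink/conf/flink-conf.yaml
    classloader.resolve-order: parent-first

    # baked into /opt/flink/lib/ in both the jobmanager and taskmanager images:
    #   beam-runners-flink-1.13-job-server-2.31.0-SNAPSHOT.jar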

My taskmanagers are now crashing because it looks like the KafkaIO
transform is trying to run docker and it can't because it's running in a K8s
pod. Logs attached.

Why would KafkaIO try to launch docker? Does this have something to do with
the expansion service? The docs
<https://beam.apache.org/documentation/programming-guide/#multi-language-pipelines>
make it seem like the expansion service only runs at job submission time and
only needs to be accessible from the machine where you are running your
Python program to submit the job.
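
For context, the read is set up roughly like this (broker and topic names
are placeholders, and this is the gist rather than the exact program):

    # Minimal sketch of the Kafka read submitted with the PortableRunner.
    # "kafka:9092" and "my-topic" are placeholders, not the real config.
    import apache_beam as beam
    from apache_beam.io.kafka import ReadFromKafka
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        "--runner=PortableRunner",
        "--job_endpoint=localhost:8099",  # Beam job server endpoint
        "--streaming",
    ])

    with beam.Pipeline(options=options) as pipeline:
        (pipeline
         | "ReadFromKafka" >> ReadFromKafka(
               consumer_config={"bootstrap.servers": "kafka:9092"},
               topics=["my-topic"])
         | "Log" >> beam.Map(print))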

Thanks
J

On Wed, Aug 25, 2021 at 12:09 PM Jeremy Lewi <[email protected]> wrote:

> Hi Luke,
>
> Thanks. I've attached the full stack trace. When I reran it, it gave me an
> error about a different class.
>
> I checked the Beam job server jar and as far as I can tell the classes are
> present. So it seems like a potential issue with the classpath or staging of
> JARs on the task managers.
>
> Does anyone happen to know how jars get staged onto Flink taskmanagers? On
> the jobmanager I was able to locate the jar in a /tmp directory but I
> couldn't figure out how it was getting staged on taskmanagers.
>
> I tried baking the job server jar into the Flink containers. That gave me
> an IllegalAccessError. I assume, per the Flink docs
> <https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/ops/debugging/debugging_classloading/#inverted-class-loading-and-classloader-resolution-order>,
> this indicates a dependency conflict between the system JARs and the
> application JARs.
>
> With the portable runner, is there any way to disable uploading of the JAR
> and instead rely on the JARs being baked into the docker container?
>
> Thanks
> J
>
> On Wed, Aug 25, 2021 at 9:20 AM Luke Cwik <[email protected]> wrote:
>
>> Both those classes exist in beam-vendor-grpc-1_36_0-0.1.jar:
>>
>> lcwik@lcwik:~/Downloads$ jar tf beam-vendor-grpc-1_36_0-0.1.jar | grep
>> Hpack
>>
>> org/apache/beam/vendor/grpc/v1p36p0/io/netty/handler/codec/http2/HpackDecoder$1.class
>>
>> org/apache/beam/vendor/grpc/v1p36p0/io/netty/handler/codec/http2/HpackEncoder.class
>>
>> org/apache/beam/vendor/grpc/v1p36p0/io/netty/handler/codec/http2/HpackDecoder$Http2HeadersSink.class
>>
>> org/apache/beam/vendor/grpc/v1p36p0/io/netty/handler/codec/http2/HpackUtil$IndexType.class
>>
>> org/apache/beam/vendor/grpc/v1p36p0/io/netty/handler/codec/http2/HpackUtil.class
>>
>> org/apache/beam/vendor/grpc/v1p36p0/io/netty/handler/codec/http2/HpackDecoder$HeaderType.class
>>
>> org/apache/beam/vendor/grpc/v1p36p0/io/netty/handler/codec/http2/HpackDynamicTable.class
>>
>> org/apache/beam/vendor/grpc/v1p36p0/io/netty/handler/codec/http2/HpackHuffmanDecoder.class
>>
>> org/apache/beam/vendor/grpc/v1p36p0/io/netty/handler/codec/http2/HpackEncoder$1.class
>>
>> org/apache/beam/vendor/grpc/v1p36p0/io/netty/handler/codec/http2/HpackStaticTable.class
>>
>> org/apache/beam/vendor/grpc/v1p36p0/io/netty/handler/codec/http2/HpackHuffmanEncoder.class
>>
>> org/apache/beam/vendor/grpc/v1p36p0/io/netty/handler/codec/http2/HpackHeaderField.class
>>
>> org/apache/beam/vendor/grpc/v1p36p0/io/netty/handler/codec/http2/HpackEncoder$HeaderEntry.class
>>
>> org/apache/beam/vendor/grpc/v1p36p0/io/netty/handler/codec/http2/HpackHuffmanEncoder$1.class
>>
>> org/apache/beam/vendor/grpc/v1p36p0/io/netty/handler/codec/http2/HpackHuffmanEncoder$EncodeProcessor.class
>>
>> org/apache/beam/vendor/grpc/v1p36p0/io/netty/handler/codec/http2/HpackHuffmanEncoder$EncodedLengthProcessor.class
>>
>> org/apache/beam/vendor/grpc/v1p36p0/io/netty/handler/codec/http2/HpackDecoder$Sink.class
>>
>> org/apache/beam/vendor/grpc/v1p36p0/io/netty/handler/codec/http2/HpackDecoder.class
>> lcwik@lcwik:~/Downloads$ jar tf beam-vendor-grpc-1_36_0-0.1.jar | grep
>> DnsNameResolver
>>
>> org/apache/beam/vendor/grpc/v1p36p0/io/grpc/internal/DnsNameResolver$ResourceResolverFactory.class
>>
>> org/apache/beam/vendor/grpc/v1p36p0/io/grpc/internal/DnsNameResolver$Resolve$1.class
>>
>> org/apache/beam/vendor/grpc/v1p36p0/io/grpc/internal/DnsNameResolver$ResourceResolver.class
>>
>> org/apache/beam/vendor/grpc/v1p36p0/io/grpc/internal/DnsNameResolver$AddressResolver.class
>>
>> org/apache/beam/vendor/grpc/v1p36p0/io/grpc/internal/DnsNameResolver$InternalResolutionResult.class
>>
>> org/apache/beam/vendor/grpc/v1p36p0/io/grpc/internal/DnsNameResolver$1.class
>>
>> org/apache/beam/vendor/grpc/v1p36p0/io/grpc/internal/DnsNameResolver$Resolve.class
>>
>> org/apache/beam/vendor/grpc/v1p36p0/io/grpc/internal/DnsNameResolver$SrvRecord.class
>>
>> org/apache/beam/vendor/grpc/v1p36p0/io/grpc/internal/DnsNameResolver$JdkAddressResolver.class
>>
>> org/apache/beam/vendor/grpc/v1p36p0/io/grpc/internal/DnsNameResolverProvider.class
>> org/apache/beam/vendor/grpc/v1p36p0/io/grpc/internal/DnsNameResolver.class
>>
>> Java has a tendency to report the full cause only on the first failure of
>> this kind, with all subsequent failures reporting only the
>> ClassNotFoundException. This happens because the ClassLoader remembers
>> which classes failed and doesn't try loading them again.
>>
>> Is there more of the stack trace pointing out the actual cause associated
>> with the first time this exception occurred?
>>
>>
>> On Tue, Aug 24, 2021 at 4:32 PM Jeremy Lewi <[email protected]>
>> wrote:
>>
>>> Hi Folks,
>>>
>>> I'm trying to run Beam Python 2.31 on Flink 1.13.
>>>
>>> I've created a simple streaming program to count Kafka messages. Running
>>> on the DirectRunner, this works fine. But when I submit it to my Flink
>>> cluster, I get the exception below in my taskmanager.
>>>
>>> I'm using the PortableRunner. Any suggestions on how to fix or debug
>>> this?
>>>
>>> Running programs that don't use Kafka works.
>>>
>>> Thanks
>>> J
>>>
>>> WARNING: An illegal reflective access operation has occurred
>>>
>>> WARNING: Illegal reflective access by
>>> org.apache.flink.shaded.akka.org.jboss.netty.util.internal.ByteBufferUtil
>>> (file:/opt/flink/lib/flink-dist_2.12-1.13.1.jar) to method
>>> java.nio.DirectByteBuffer.cleaner()
>>>
>>> WARNING: Please consider reporting this to the maintainers of
>>> org.apache.flink.shaded.akka.org.jboss.netty.util.internal.ByteBufferUtil
>>>
>>> WARNING: Use --illegal-access=warn to enable warnings of further illegal
>>> reflective access operations
>>>
>>> WARNING: All illegal access operations will be denied in a future release
>>>
>>> Aug 24, 2021 11:17:12 PM
>>> org.apache.beam.vendor.grpc.v1p36p0.io.grpc.internal.ManagedChannelImpl$2
>>> uncaughtException
>>>
>>> SEVERE: [Channel<55>: (localhost:50000)] Uncaught exception in the
>>> SynchronizationContext. Panic!
>>>
>>> java.lang.NoClassDefFoundError:
>>> org/apache/beam/vendor/grpc/v1p36p0/io/netty/handler/codec/http2/HpackDecoder
>>>
>>> at
>>> org.apache.beam.vendor.grpc.v1p36p0.io.netty.handler.codec.http2.DefaultHttp2HeadersDecoder.<init>(DefaultHttp2HeadersDecoder.java:73)
>>>
>>> at
>>> org.apache.beam.vendor.grpc.v1p36p0.io.netty.handler.codec.http2.DefaultHttp2HeadersDecoder.<init>(DefaultHttp2HeadersDecoder.java:59)
>>>
>>> at
>>> org.apache.beam.vendor.grpc.v1p36p0.io.grpc.netty.GrpcHttp2HeadersUtils$GrpcHttp2ClientHeadersDecoder.<init>(GrpcHttp2HeadersUtils.java:70)
>>>
>>> at
>>> org.apache.beam.vendor.grpc.v1p36p0.io.grpc.netty.NettyClientHandler.newHandler(NettyClientHandler.java:147)
>>>
>>> at
>>> org.apache.beam.vendor.grpc.v1p36p0.io.grpc.netty.NettyClientTransport.start(NettyClientTransport.java:230)
>>>
>>> at
>>> org.apache.beam.vendor.grpc.v1p36p0.io.grpc.internal.ForwardingConnectionClientTransport.start(ForwardingConnectionClientTransport.java:33)
>>>
>>> at
>>> org.apache.beam.vendor.grpc.v1p36p0.io.grpc.internal.ForwardingConnectionClientTransport.start(ForwardingConnectionClientTransport.java:33)
>>>
>>> at
>>> org.apache.beam.vendor.grpc.v1p36p0.io.grpc.internal.InternalSubchannel.startNewTransport(InternalSubchannel.java:258)
>>>
>>> at
>>> org.apache.beam.vendor.grpc.v1p36p0.io.grpc.internal.InternalSubchannel.access$400(InternalSubchannel.java:65)
>>>
>>> at
>>> org.apache.beam.vendor.grpc.v1p36p0.io.grpc.internal.InternalSubchannel$2.run(InternalSubchannel.java:200)
>>>
>>> at
>>> org.apache.beam.vendor.grpc.v1p36p0.io.grpc.SynchronizationContext.drain(SynchronizationContext.java:95)
>>>
>>> at
>>> org.apache.beam.vendor.grpc.v1p36p0.io.grpc.SynchronizationContext.execute(SynchronizationContext.java:127)
>>>
>>> at
>>> org.apache.beam.vendor.grpc.v1p36p0.io.grpc.internal.ManagedChannelImpl$NameResolverListener.onResult(ManagedChannelImpl.java:1815)
>>>
>>> at
>>> org.apache.beam.vendor.grpc.v1p36p0.io.grpc.internal.DnsNameResolver$Resolve.run(DnsNameResolver.java:333)
>>>
>>> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown
>>> Source)
>>>
>>> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown
>>> Source)
>>>
>>> at java.base/java.lang.Thread.run(Unknown Source)
>>>
>>> Caused by: java.lang.ClassNotFoundException:
>>> org.apache.beam.vendor.grpc.v1p36p0.io.netty.handler.codec.http2.HpackDecoder
>>>
>>> at java.base/java.net.URLClassLoader.findClass(Unknown Source)
>>>
>>> at java.base/java.lang.ClassLoader.loadClass(Unknown Source)
>>>
>>> at
>>> org.apache.flink.util.FlinkUserCodeClassLoader.loadClassWithoutExceptionHandling(FlinkUserCodeClassLoader.java:64)
>>>
>>> at
>>> org.apache.flink.util.ChildFirstClassLoader.loadClassWithoutExceptionHandling(ChildFirstClassLoader.java:74)
>>>
>>> at
>>> org.apache.flink.util.FlinkUserCodeClassLoader.loadClass(FlinkUserCodeClassLoader.java:48)
>>>
>>> at java.base/java.lang.ClassLoader.loadClass(Unknown Source)
>>>
>>> ... 17 more
>>>
>>>
>>> Exception in thread "grpc-default-executor-0"
>>> java.lang.NoClassDefFoundError:
>>> org/apache/beam/vendor/grpc/v1p36p0/io/grpc/internal/DnsNameResolver$Resolve$1
>>>
>>> at
>>> org.apache.beam.vendor.grpc.v1p36p0.io.grpc.internal.DnsNameResolver$Resolve.run(DnsNameResolver.java:339)
>>>
>>> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown
>>> Source)
>>>
>>> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown
>>> Source)
>>>
>>> at java.base/java.lang.Thread.run(Unknown Source)
>>>
>>> Caused by: java.lang.ClassNotFoundException:
>>> org.apache.beam.vendor.grpc.v1p36p0.io.grpc.internal.DnsNameResolver$Resolve$1
>>>
>>> at java.base/java.net.URLClassLoader.findClass(Unknown Source)
>>>
>>> at java.base/java.lang.ClassLoader.loadClass(Unknown Source)
>>>
>>> at
>>> org.apache.flink.util.FlinkUserCodeClassLoader.loadClassWithoutExceptionHandling(FlinkUserCodeClassLoader.java:64)
>>>
>>> at
>>> org.apache.flink.util.ChildFirstClassLoader.loadClassWithoutExceptionHandling(ChildFirstClassLoader.java:74)
>>>
>>> at
>>> org.apache.flink.util.FlinkUserCodeClassLoader.loadClass(FlinkUserCodeClassLoader.java:48)
>>>
>>> at java.base/java.lang.ClassLoader.loadClass(Unknown Source)
>>>
>>> ... 4 more
>>>
>>
sed: couldn't open temporary file /opt/flink/conf/sedNm8q1B: Read-only file 
system
sed: couldn't open temporary file /opt/flink/conf/sedwupjrB: Read-only file 
system
sed: couldn't open temporary file /opt/flink/conf/sedRnDuNA: Read-only file 
system
sed: couldn't open temporary file /opt/flink/conf/sedCQZNkB: Read-only file 
system
/docker-entrypoint.sh: line 88: /opt/flink/conf/flink-conf.yaml.tmp: Read-only 
file system
Starting Task Manager
Starting taskexecutor as a console application on host 
primer-kb-cluster-taskmanager-0.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in 
[jar:file:/opt/flink/lib/beam-runners-flink-1.13-job-server-2.31.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in 
[jar:file:/opt/flink/lib/log4j-slf4j-impl-2.12.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO: 
--------------------------------------------------------------------------------
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:  Preconfiguration: 
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO: 


RESOURCE_PARAMS extraction logs:
jvm_params: -Xmx677380085 -Xms677380085 -XX:MaxDirectMemorySize=296537295 
-XX:MaxMetaspaceSize=268435456
dynamic_configs: -D taskmanager.memory.network.min=162319567b -D 
taskmanager.cpu.cores=1.0 -D taskmanager.memory.task.off-heap.size=0b -D 
taskmanager.memory.jvm-metaspace.size=268435456b -D external-resources=none -D 
taskmanager.memory.jvm-overhead.min=210181237b -D 
taskmanager.memory.framework.off-heap.size=134217728b -D 
taskmanager.memory.network.max=162319567b -D 
taskmanager.memory.framework.heap.size=134217728b -D 
taskmanager.memory.managed.size=649278268b -D 
taskmanager.memory.task.heap.size=543162357b -D taskmanager.numberOfTaskSlots=1 
-D taskmanager.memory.jvm-overhead.max=210181237b
logs: WARNING: sun.reflect.Reflection.getCallerClass is not supported. This 
will impact performance.
INFO  [] - Loading configuration property: blob.server.port, 6124
INFO  [] - Loading configuration property: classloader.resolve-order, 
parent-first
INFO  [] - Loading configuration property: jobmanager.heap.size, 474m
INFO  [] - Loading configuration property: jobmanager.rpc.address, 
primer-kb-cluster-jobmanager
INFO  [] - Loading configuration property: jobmanager.rpc.port, 6123
INFO  [] - Loading configuration property: query.server.port, 6125
INFO  [] - Loading configuration property: rest.port, 8081
INFO  [] - Loading configuration property: taskmanager.heap.size, 1548m
INFO  [] - Loading configuration property: taskmanager.numberOfTaskSlots, 1
INFO  [] - Loading configuration property: taskmanager.rpc.port, 6122
INFO  [] - 'taskmanager.memory.flink.size' is not specified, use the configured 
deprecated task manager heap value (1.512gb (1623195648 bytes)) for it.
INFO  [] - Final TaskExecutor Memory configuration:
INFO  [] -   Total Process Memory:          1.957gb (2101812341 bytes)
INFO  [] -     Total Flink Memory:          1.512gb (1623195648 bytes)
INFO  [] -       Total JVM Heap Memory:     646.000mb (677380085 bytes)
INFO  [] -         Framework:               128.000mb (134217728 bytes)
INFO  [] -         Task:                    518.000mb (543162357 bytes)
INFO  [] -       Total Off-heap Memory:     902.000mb (945815563 bytes)
INFO  [] -         Managed:                 619.200mb (649278268 bytes)
INFO  [] -         Total JVM Direct Memory: 282.800mb (296537295 bytes)
INFO  [] -           Framework:             128.000mb (134217728 bytes)
INFO  [] -           Task:                  0 bytes
INFO  [] -           Network:               154.800mb (162319567 bytes)
INFO  [] -     JVM Metaspace:               256.000mb (268435456 bytes)
INFO  [] -     JVM Overhead:                200.444mb (210181237 bytes)

Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO: 
--------------------------------------------------------------------------------
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:  Starting TaskManager (Version: 1.13.1, Scala: 2.11, Rev:a7f3192, 
Date:2021-05-25T12:02:11+02:00)
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:  OS current user: flink
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:  Current Hadoop/Kerberos user: <no hadoop dependency found>
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:  JVM: OpenJDK 64-Bit Server VM - Oracle Corporation - 11/11.0.12+7
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:  Maximum heap size: 646 MiBytes
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:  JAVA_HOME: /usr/local/openjdk-11
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:  No Hadoop Dependency available
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:  JVM Options:
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     -XX:+UseG1GC
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     -Xmx677380085
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     -Xms677380085
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     -XX:MaxDirectMemorySize=296537295
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     -XX:MaxMetaspaceSize=268435456
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     
-Dlog.file=/opt/flink/log/flink--taskexecutor-0-primer-kb-cluster-taskmanager-0.log
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     -Dlog4j.configuration=file:/opt/flink/conf/log4j-console.properties
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     
-Dlog4j.configurationFile=file:/opt/flink/conf/log4j-console.properties
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     -Dlogback.configurationFile=file:/opt/flink/conf/logback-console.xml
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:  Program Arguments:
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     --configDir
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     /opt/flink/conf
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     -D
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     taskmanager.memory.network.min=162319567b
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     -D
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     taskmanager.cpu.cores=1.0
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     -D
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     taskmanager.memory.task.off-heap.size=0b
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     -D
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     taskmanager.memory.jvm-metaspace.size=268435456b
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     -D
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     external-resources=none
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     -D
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     taskmanager.memory.jvm-overhead.min=210181237b
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     -D
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     taskmanager.memory.framework.off-heap.size=134217728b
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     -D
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     taskmanager.memory.network.max=162319567b
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     -D
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     taskmanager.memory.framework.heap.size=134217728b
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     -D
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     taskmanager.memory.managed.size=649278268b
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     -D
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     taskmanager.memory.task.heap.size=543162357b
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     -D
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     taskmanager.numberOfTaskSlots=1
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     -D
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:     taskmanager.memory.jvm-overhead.max=210181237b
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO:  Classpath: 
/opt/flink/lib/beam-runners-flink-1.13-job-server-2.31.0-SNAPSHOT.jar:/opt/flink/lib/flink-csv-1.13.1.jar:/opt/flink/lib/flink-json-1.13.1.jar:/opt/flink/lib/flink-shaded-zookeeper-3.4.14.jar:/opt/flink/lib/flink-table-blink_2.12-1.13.1.jar:/opt/flink/lib/flink-table_2.12-1.13.1.jar:/opt/flink/lib/log4j-1.2-api-2.12.1.jar:/opt/flink/lib/log4j-api-2.12.1.jar:/opt/flink/lib/log4j-core-2.12.1.jar:/opt/flink/lib/log4j-slf4j-impl-2.12.1.jar:/opt/flink/lib/flink-dist_2.12-1.13.1.jar:::
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.EnvironmentInformation 
logEnvironmentInfo
INFO: 
--------------------------------------------------------------------------------
Aug 25, 2021 10:56:19 PM org.apache.flink.runtime.util.SignalHandler register
INFO: Registered UNIX signal handlers for [TERM, HUP, INT]
Aug 25, 2021 10:56:19 PM 
org.apache.flink.runtime.taskexecutor.TaskManagerRunner main
INFO: Maximum number of open file descriptors is 1048576.
Aug 25, 2021 10:56:19 PM org.apache.flink.configuration.GlobalConfiguration 
loadYAMLResource
INFO: Loading configuration property: blob.server.port, 6124
Aug 25, 2021 10:56:19 PM org.apache.flink.configuration.GlobalConfiguration 
loadYAMLResource
INFO: Loading configuration property: classloader.resolve-order, parent-first
Aug 25, 2021 10:56:19 PM org.apache.flink.configuration.GlobalConfiguration 
loadYAMLResource
INFO: Loading configuration property: jobmanager.heap.size, 474m
Aug 25, 2021 10:56:19 PM org.apache.flink.configuration.GlobalConfiguration 
loadYAMLResource
INFO: Loading configuration property: jobmanager.rpc.address, 
primer-kb-cluster-jobmanager
Aug 25, 2021 10:56:19 PM org.apache.flink.configuration.GlobalConfiguration 
loadYAMLResource
INFO: Loading configuration property: jobmanager.rpc.port, 6123
Aug 25, 2021 10:56:19 PM org.apache.flink.configuration.GlobalConfiguration 
loadYAMLResource
INFO: Loading configuration property: query.server.port, 6125
Aug 25, 2021 10:56:19 PM org.apache.flink.configuration.GlobalConfiguration 
loadYAMLResource
INFO: Loading configuration property: rest.port, 8081
Aug 25, 2021 10:56:19 PM org.apache.flink.configuration.GlobalConfiguration 
loadYAMLResource
INFO: Loading configuration property: taskmanager.heap.size, 1548m
Aug 25, 2021 10:56:19 PM org.apache.flink.configuration.GlobalConfiguration 
loadYAMLResource
INFO: Loading configuration property: taskmanager.numberOfTaskSlots, 1
Aug 25, 2021 10:56:19 PM org.apache.flink.configuration.GlobalConfiguration 
loadYAMLResource
INFO: Loading configuration property: taskmanager.rpc.port, 6122
Aug 25, 2021 10:56:19 PM org.apache.flink.core.fs.FileSystem loadHadoopFsFactory
INFO: Hadoop is not in the classpath/dependencies. The extended set of 
supported File Systems via Hadoop is not available.
Aug 25, 2021 10:56:20 PM 
org.apache.flink.runtime.security.modules.HadoopModuleFactory createModule
INFO: Cannot create Hadoop Security Module because Hadoop cannot be found in 
the Classpath.
Aug 25, 2021 10:56:20 PM org.apache.flink.runtime.security.modules.JaasModule 
install
INFO: Jaas file will be created as /tmp/jaas-15633178046873740144.conf.
Aug 25, 2021 10:56:20 PM 
org.apache.flink.runtime.security.contexts.HadoopSecurityContextFactory 
isCompatibleWith
INFO: Cannot install HadoopSecurityContext because Hadoop cannot be found in 
the Classpath.
Aug 25, 2021 10:56:20 PM org.apache.flink.configuration.Configuration 
loggingFallback
INFO: Config uses fallback configuration key 'jobmanager.rpc.address' instead 
of key 'rest.address'
Aug 25, 2021 10:56:20 PM org.apache.flink.runtime.util.LeaderRetrievalUtils 
findConnectingAddress
INFO: Trying to select the network interface and address to use by connecting 
to the leading JobManager.
Aug 25, 2021 10:56:20 PM org.apache.flink.runtime.util.LeaderRetrievalUtils 
findConnectingAddress
INFO: TaskManager will try to connect for PT10S before falling back to 
heuristics
Aug 25, 2021 10:56:22 PM 
org.apache.flink.runtime.net.ConnectionUtils$LeaderConnectingAddressListener 
findConnectingAddress
INFO: Trying to connect to address 
primer-kb-cluster-jobmanager/10.100.112.249:6123
Aug 25, 2021 10:56:22 PM org.apache.flink.runtime.net.ConnectionUtils 
tryToConnect
INFO: Failed to connect from address 
'primer-kb-cluster-taskmanager-0/172.23.53.67': connect timed out
Aug 25, 2021 10:56:22 PM org.apache.flink.runtime.net.ConnectionUtils 
tryToConnect
INFO: Failed to connect from address '/172.23.53.67': connect timed out
Aug 25, 2021 10:56:22 PM org.apache.flink.runtime.net.ConnectionUtils 
tryToConnect
INFO: Failed to connect from address '/127.0.0.1': connect timed out
Aug 25, 2021 10:56:23 PM org.apache.flink.runtime.net.ConnectionUtils 
tryToConnect
INFO: Failed to connect from address '/172.23.53.67': connect timed out
Aug 25, 2021 10:56:24 PM org.apache.flink.runtime.net.ConnectionUtils 
tryToConnect
INFO: Failed to connect from address '/127.0.0.1': connect timed out
Aug 25, 2021 10:56:24 PM 
org.apache.flink.runtime.net.ConnectionUtils$LeaderConnectingAddressListener 
findConnectingAddress
INFO: Trying to connect to address 
primer-kb-cluster-jobmanager/10.100.112.249:6123
Aug 25, 2021 10:56:25 PM org.apache.flink.runtime.net.ConnectionUtils 
tryToConnect
INFO: Failed to connect from address 
'primer-kb-cluster-taskmanager-0/172.23.53.67': connect timed out
Aug 25, 2021 10:56:25 PM org.apache.flink.runtime.net.ConnectionUtils 
tryToConnect
INFO: Failed to connect from address '/172.23.53.67': connect timed out
Aug 25, 2021 10:56:25 PM org.apache.flink.runtime.net.ConnectionUtils 
tryToConnect
INFO: Failed to connect from address '/127.0.0.1': connect timed out
Aug 25, 2021 10:56:26 PM org.apache.flink.runtime.net.ConnectionUtils 
tryToConnect
INFO: Failed to connect from address '/172.23.53.67': connect timed out
Aug 25, 2021 10:56:27 PM org.apache.flink.runtime.net.ConnectionUtils 
tryToConnect
INFO: Failed to connect from address '/127.0.0.1': connect timed out
Aug 25, 2021 10:56:27 PM 
org.apache.flink.runtime.net.ConnectionUtils$LeaderConnectingAddressListener 
findConnectingAddress
INFO: Trying to connect to address 
primer-kb-cluster-jobmanager/10.100.112.249:6123
Aug 25, 2021 10:56:27 PM org.apache.flink.runtime.net.ConnectionUtils 
tryToConnect
INFO: Failed to connect from address 
'primer-kb-cluster-taskmanager-0/172.23.53.67': connect timed out
Aug 25, 2021 10:56:27 PM org.apache.flink.runtime.net.ConnectionUtils 
tryToConnect
INFO: Failed to connect from address '/172.23.53.67': connect timed out
Aug 25, 2021 10:56:27 PM org.apache.flink.runtime.net.ConnectionUtils 
tryToConnect
INFO: Failed to connect from address '/127.0.0.1': connect timed out
Aug 25, 2021 10:56:28 PM org.apache.flink.runtime.net.ConnectionUtils 
tryToConnect
INFO: Failed to connect from address '/172.23.53.67': connect timed out
Aug 25, 2021 10:56:29 PM org.apache.flink.runtime.net.ConnectionUtils 
tryToConnect
INFO: Failed to connect from address '/127.0.0.1': connect timed out
Aug 25, 2021 10:56:30 PM 
org.apache.flink.runtime.net.ConnectionUtils$LeaderConnectingAddressListener 
findConnectingAddress
WARNING: Could not connect to primer-kb-cluster-jobmanager/10.100.112.249:6123. 
Selecting a local address using heuristics.
Aug 25, 2021 10:56:30 PM 
org.apache.flink.runtime.taskexecutor.TaskManagerRunner 
determineTaskManagerBindAddressByConnectingToResourceManager
INFO: TaskManager will use hostname/address 'primer-kb-cluster-taskmanager-0' 
(172.23.53.67) for communication.
Aug 25, 2021 10:56:30 PM 
org.apache.flink.runtime.clusterframework.BootstrapTools startRemoteActorSystem
INFO: Trying to start actor system, external address 172.23.53.67:6122, bind 
address 0.0.0.0:6122.
Aug 25, 2021 10:56:30 PM akka.event.slf4j.Slf4jLogger$$anonfun$receive$1 
applyOrElse
INFO: Slf4jLogger started
Aug 25, 2021 10:56:30 PM 
akka.event.slf4j.Slf4jLogger$$anonfun$receive$1$$anonfun$applyOrElse$3 
apply$mcV$sp
INFO: Starting remoting
Aug 25, 2021 10:56:30 PM 
akka.event.slf4j.Slf4jLogger$$anonfun$receive$1$$anonfun$applyOrElse$3 
apply$mcV$sp
INFO: Remoting started; listening on addresses 
:[akka.tcp://[email protected]:6122]
Aug 25, 2021 10:56:30 PM 
org.apache.flink.runtime.clusterframework.BootstrapTools startActorSystem
INFO: Actor system started at akka.tcp://[email protected]:6122
Aug 25, 2021 10:56:30 PM org.apache.flink.runtime.metrics.MetricRegistryImpl 
<init>
INFO: No metrics reporter configured, no metrics will be exposed/reported.
Aug 25, 2021 10:56:30 PM 
org.apache.flink.runtime.clusterframework.BootstrapTools startRemoteActorSystem
INFO: Trying to start actor system, external address 172.23.53.67:0, bind 
address 0.0.0.0:0.
Aug 25, 2021 10:56:30 PM akka.event.slf4j.Slf4jLogger$$anonfun$receive$1 
applyOrElse
INFO: Slf4jLogger started
Aug 25, 2021 10:56:30 PM 
akka.event.slf4j.Slf4jLogger$$anonfun$receive$1$$anonfun$applyOrElse$3 
apply$mcV$sp
INFO: Starting remoting
Aug 25, 2021 10:56:30 PM 
akka.event.slf4j.Slf4jLogger$$anonfun$receive$1$$anonfun$applyOrElse$3 
apply$mcV$sp
INFO: Remoting started; listening on addresses 
:[akka.tcp://[email protected]:36029]
Aug 25, 2021 10:56:30 PM 
org.apache.flink.runtime.clusterframework.BootstrapTools startActorSystem
INFO: Actor system started at akka.tcp://[email protected]:36029
Aug 25, 2021 10:56:30 PM org.apache.flink.runtime.rpc.akka.AkkaRpcService 
startServer
INFO: Starting RPC endpoint for 
org.apache.flink.runtime.metrics.dump.MetricQueryService at 
akka://flink-metrics/user/rpc/MetricQueryService_172.23.53.67:6122-ec6d5d .
Aug 25, 2021 10:56:30 PM org.apache.flink.runtime.blob.AbstractBlobCache <init>
INFO: Created BLOB cache storage directory 
/tmp/blobStore-53f26fb9-7366-49cc-9116-f6d64772448b
Aug 25, 2021 10:56:30 PM org.apache.flink.runtime.blob.AbstractBlobCache <init>
INFO: Created BLOB cache storage directory 
/tmp/blobStore-d00cf48a-f833-4de3-8f64-0625fb2c0ce9
Aug 25, 2021 10:56:30 PM 
org.apache.flink.runtime.externalresource.ExternalResourceUtils 
createStaticExternalResourceInfoProviderFromConfig
INFO: Enabled external resources: []
Aug 25, 2021 10:56:30 PM 
org.apache.flink.runtime.taskexecutor.TaskManagerRunner startTaskManager
INFO: Starting TaskManager with ResourceID: 172.23.53.67:6122-ec6d5d
Aug 25, 2021 10:56:30 PM 
org.apache.flink.runtime.taskexecutor.TaskManagerServices checkTempDirs
INFO: Temporary file directory '/tmp': total 199 GB, usable 155 GB (77.89% 
usable)
Aug 25, 2021 10:56:30 PM 
org.apache.flink.runtime.io.disk.FileChannelManagerImpl createFiles
INFO: FileChannelManager uses directory 
/tmp/flink-io-57f0bd53-bd4a-426c-b234-eab95ed27c74 for spill files.
Aug 25, 2021 10:56:30 PM org.apache.flink.runtime.io.network.netty.NettyConfig 
<init>
INFO: NettyConfig [server address: /0.0.0.0, server port: 0, ssl enabled: 
false, memory segment size (bytes): 32768, transport type: AUTO, number of 
server threads: 1 (manual), number of client threads: 1 (manual), server 
connect backlog: 0 (use Netty's default), client connect timeout (sec): 120, 
send/receive buffer size (bytes): 0 (use Netty's default)]
Aug 25, 2021 10:56:30 PM 
org.apache.flink.runtime.io.disk.FileChannelManagerImpl createFiles
INFO: FileChannelManager uses directory 
/tmp/flink-netty-shuffle-e0e489dc-b5b2-474f-96b2-8203f1b04c8e for spill files.
Aug 25, 2021 10:56:30 PM 
org.apache.flink.runtime.io.network.buffer.NetworkBufferPool <init>
INFO: Allocated 154 MB for network buffer pool (number of memory segments: 
4953, bytes per segment: 32768).
Aug 25, 2021 10:56:30 PM 
org.apache.flink.runtime.io.network.NettyShuffleEnvironment start
INFO: Starting the network environment and its components.
Aug 25, 2021 10:56:30 PM org.apache.flink.runtime.io.network.netty.NettyClient 
init
INFO: Transport type 'auto': using EPOLL.
Aug 25, 2021 10:56:30 PM org.apache.flink.runtime.io.network.netty.NettyClient 
init
INFO: Successful initialization (took 39 ms).
Aug 25, 2021 10:56:30 PM org.apache.flink.runtime.io.network.netty.NettyServer 
init
INFO: Transport type 'auto': using EPOLL.
Aug 25, 2021 10:56:30 PM org.apache.flink.runtime.io.network.netty.NettyServer 
init
INFO: Successful initialization (took 27 ms). Listening on SocketAddress 
/0.0.0.0:38823.
Aug 25, 2021 10:56:30 PM org.apache.flink.runtime.taskexecutor.KvStateService 
start
INFO: Starting the kvState service and its components.
Aug 25, 2021 10:56:31 PM org.apache.flink.runtime.rpc.akka.AkkaRpcService 
startServer
INFO: Starting RPC endpoint for 
org.apache.flink.runtime.taskexecutor.TaskExecutor at 
akka://flink/user/rpc/taskmanager_0 .
Aug 25, 2021 10:56:31 PM 
org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService start
INFO: Start job leader service.
Aug 25, 2021 10:56:31 PM org.apache.flink.runtime.filecache.FileCache <init>
INFO: User file cache uses directory 
/tmp/flink-dist-cache-1e3bc514-6b46-4b22-8a6c-bc5da07fdd0d
Aug 25, 2021 10:56:31 PM org.apache.flink.runtime.taskexecutor.TaskExecutor 
connectToResourceManager
INFO: Connecting to ResourceManager 
akka.tcp://flink@primer-kb-cluster-jobmanager:6123/user/rpc/resourcemanager_*(00000000000000000000000000000000).
Aug 25, 2021 10:56:32 PM 
akka.event.slf4j.Slf4jLogger$$anonfun$receive$1$$anonfun$applyOrElse$2 
apply$mcV$sp
WARNING: Remote connection to [null] failed with java.net.ConnectException: 
Connection refused: primer-kb-cluster-jobmanager/10.100.112.249:6123
Aug 25, 2021 10:56:32 PM 
akka.event.slf4j.Slf4jLogger$$anonfun$receive$1$$anonfun$applyOrElse$2 
apply$mcV$sp
WARNING: Association with remote system 
[akka.tcp://flink@primer-kb-cluster-jobmanager:6123] has failed, address is now 
gated for [50] ms. Reason: [Association failed with 
[akka.tcp://flink@primer-kb-cluster-jobmanager:6123]] Caused by: 
[java.net.ConnectException: Connection refused: 
primer-kb-cluster-jobmanager/10.100.112.249:6123]
Aug 25, 2021 10:56:32 PM 
org.apache.flink.runtime.registration.RetryingRegistration 
lambda$startRegistration$1
INFO: Could not resolve ResourceManager address 
akka.tcp://flink@primer-kb-cluster-jobmanager:6123/user/rpc/resourcemanager_*, 
retrying in 10000 ms: Could not connect to rpc endpoint under address 
akka.tcp://flink@primer-kb-cluster-jobmanager:6123/user/rpc/resourcemanager_*.
Aug 25, 2021 10:56:42 PM 
org.apache.flink.runtime.registration.RetryingRegistration 
lambda$startRegistration$0
INFO: Resolved ResourceManager address, beginning registration
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by 
org.apache.flink.shaded.akka.org.jboss.netty.util.internal.ByteBufferUtil 
(file:/opt/flink/lib/beam-runners-flink-1.13-job-server-2.31.0-SNAPSHOT.jar) to 
method java.nio.DirectByteBuffer.cleaner()
WARNING: Please consider reporting this to the maintainers of 
org.apache.flink.shaded.akka.org.jboss.netty.util.internal.ByteBufferUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal 
reflective access operations
WARNING: All illegal access operations will be denied in a future release
Aug 25, 2021 10:56:42 PM 
org.apache.flink.runtime.taskexecutor.TaskExecutorToResourceManagerConnection 
onRegistrationSuccess
INFO: Successful registration at resource manager 
akka.tcp://flink@primer-kb-cluster-jobmanager:6123/user/rpc/resourcemanager_* 
under registration id 12d00de92c39cf7476757d0052509d3d.
Aug 25, 2021 11:00:09 PM org.apache.flink.runtime.taskexecutor.TaskExecutor 
requestSlot
INFO: Receive slot request 90b3bd029e5dd9bf9dbfbea2075f7cc8 for job 
78c1fd20366dfa914b5ed6728aedb87e from resource manager with leader id 
00000000000000000000000000000000.
Aug 25, 2021 11:00:09 PM org.apache.flink.runtime.taskexecutor.TaskExecutor 
allocateSlot
INFO: Allocated slot for 90b3bd029e5dd9bf9dbfbea2075f7cc8.
Aug 25, 2021 11:00:09 PM 
org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService addJob
INFO: Add job 78c1fd20366dfa914b5ed6728aedb87e for job leader monitoring.
Aug 25, 2021 11:00:10 PM 
org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService$JobManagerLeaderListener
 openRpcConnectionTo
INFO: Try to register at job manager 
akka.tcp://flink@primer-kb-cluster-jobmanager:6123/user/rpc/jobmanager_2 with 
leader id 00000000-0000-0000-0000-000000000000.
Aug 25, 2021 11:00:10 PM 
org.apache.flink.runtime.registration.RetryingRegistration 
lambda$startRegistration$0
INFO: Resolved JobManager address, beginning registration
Aug 25, 2021 11:00:10 PM 
org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService$JobManagerLeaderListener$JobManagerRegisteredRpcConnection
 lambda$onRegistrationSuccess$0
INFO: Successful registration at job manager 
akka.tcp://flink@primer-kb-cluster-jobmanager:6123/user/rpc/jobmanager_2 for 
job 78c1fd20366dfa914b5ed6728aedb87e.
Aug 25, 2021 11:00:10 PM org.apache.flink.runtime.taskexecutor.TaskExecutor 
establishJobManagerConnection
INFO: Establish JobManager connection for job 78c1fd20366dfa914b5ed6728aedb87e.
Aug 25, 2021 11:00:10 PM org.apache.flink.runtime.taskexecutor.TaskExecutor 
internalOfferSlotsToJobManager
INFO: Offer reserved slots to the leader of job 
78c1fd20366dfa914b5ed6728aedb87e.
Aug 25, 2021 11:00:10 PM 
org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl 
markExistingSlotActive
INFO: Activate slot 90b3bd029e5dd9bf9dbfbea2075f7cc8.
Aug 25, 2021 11:00:10 PM 
org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl 
markExistingSlotActive
INFO: Activate slot 90b3bd029e5dd9bf9dbfbea2075f7cc8.
Aug 25, 2021 11:00:10 PM org.apache.flink.runtime.taskexecutor.TaskExecutor 
submitTask
INFO: Received task Source: Impulse -> 
[3]ReadFromKafka(beam:external:java:kafka:read:v1)/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaSDF/{ParDo(GenerateKafkaSourceDescriptor),
 KafkaIO.ReadSourceDescriptors} (1/1)#0 (8e4611ecfd691859d36acddcc7930f71), 
deploy into slot with allocation id 90b3bd029e5dd9bf9dbfbea2075f7cc8.
Aug 25, 2021 11:00:10 PM org.apache.flink.runtime.taskmanager.Task 
transitionState
INFO: Source: Impulse -> 
[3]ReadFromKafka(beam:external:java:kafka:read:v1)/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaSDF/{ParDo(GenerateKafkaSourceDescriptor),
 KafkaIO.ReadSourceDescriptors} (1/1)#0 (8e4611ecfd691859d36acddcc7930f71) 
switched from CREATED to DEPLOYING.
Aug 25, 2021 11:00:10 PM org.apache.flink.runtime.taskmanager.Task doRun
INFO: Loading JAR files for task Source: Impulse -> 
[3]ReadFromKafka(beam:external:java:kafka:read:v1)/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaSDF/{ParDo(GenerateKafkaSourceDescriptor),
 KafkaIO.ReadSourceDescriptors} (1/1)#0 (8e4611ecfd691859d36acddcc7930f71) 
[DEPLOYING].
Aug 25, 2021 11:00:10 PM 
org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl 
markExistingSlotActive
INFO: Activate slot 90b3bd029e5dd9bf9dbfbea2075f7cc8.
Aug 25, 2021 11:00:10 PM org.apache.flink.runtime.blob.BlobClient 
downloadFromBlobServer
INFO: Downloading 
78c1fd20366dfa914b5ed6728aedb87e/p-3e9840de4106c9672eda7799de0881d160824a9c-3b26896409f8a86df1ace7bc4c4d3d27
 from primer-kb-cluster-jobmanager/10.100.112.249:6124
Aug 25, 2021 11:00:10 PM org.apache.flink.runtime.taskexecutor.TaskExecutor 
submitTask
INFO: Received task 
[3]ReadFromKafka(beam:external:java:kafka:read:v1)/{KafkaIO.Read, Remove Kafka 
Metadata} -> [3]{convert_to_dict, pair_with_one, window} -> ToKeyedWorkItem 
(1/1)#0 (770bbefcbaab8265297a2f1a185eb5b6), deploy into slot with allocation id 
90b3bd029e5dd9bf9dbfbea2075f7cc8.
Aug 25, 2021 11:00:10 PM org.apache.flink.runtime.taskmanager.Task 
transitionState
INFO: [3]ReadFromKafka(beam:external:java:kafka:read:v1)/{KafkaIO.Read, Remove 
Kafka Metadata} -> [3]{convert_to_dict, pair_with_one, window} -> 
ToKeyedWorkItem (1/1)#0 (770bbefcbaab8265297a2f1a185eb5b6) switched from 
CREATED to DEPLOYING.
Aug 25, 2021 11:00:10 PM org.apache.flink.runtime.taskmanager.Task doRun
INFO: Loading JAR files for task 
[3]ReadFromKafka(beam:external:java:kafka:read:v1)/{KafkaIO.Read, Remove Kafka 
Metadata} -> [3]{convert_to_dict, pair_with_one, window} -> ToKeyedWorkItem 
(1/1)#0 (770bbefcbaab8265297a2f1a185eb5b6) [DEPLOYING].
Aug 25, 2021 11:00:10 PM 
org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl 
markExistingSlotActive
INFO: Activate slot 90b3bd029e5dd9bf9dbfbea2075f7cc8.
Aug 25, 2021 11:00:10 PM org.apache.flink.runtime.taskexecutor.TaskExecutor 
submitTask
INFO: Received task group -> [2]{count, Map(<lambda at count_messages.py:147>)} 
-> [2]WriteToKafka(beam:external:java:kafka:write:v1)/{Kafka ProducerRecord, 
KafkaIO.WriteRecords} (1/1)#0 (7fd3c5eec6e56a441be18ee8e052acee), deploy into 
slot with allocation id 90b3bd029e5dd9bf9dbfbea2075f7cc8.
Aug 25, 2021 11:00:10 PM org.apache.flink.runtime.taskmanager.Task 
transitionState
INFO: group -> [2]{count, Map(<lambda at count_messages.py:147>)} -> 
[2]WriteToKafka(beam:external:java:kafka:write:v1)/{Kafka ProducerRecord, 
KafkaIO.WriteRecords} (1/1)#0 (7fd3c5eec6e56a441be18ee8e052acee) switched from 
CREATED to DEPLOYING.
Aug 25, 2021 11:00:10 PM org.apache.flink.runtime.taskmanager.Task doRun
INFO: Loading JAR files for task group -> [2]{count, Map(<lambda at 
count_messages.py:147>)} -> 
[2]WriteToKafka(beam:external:java:kafka:write:v1)/{Kafka ProducerRecord, 
KafkaIO.WriteRecords} (1/1)#0 (7fd3c5eec6e56a441be18ee8e052acee) [DEPLOYING].
Aug 25, 2021 11:00:10 PM org.apache.flink.runtime.blob.BlobClient 
downloadFromBlobServer
INFO: Downloading 
78c1fd20366dfa914b5ed6728aedb87e/p-0ee3882709a8031a1302f7e6553cd551988dee20-d5be9efa92c5a3f66da40056080f2fd8
 from primer-kb-cluster-jobmanager/10.100.112.249:6124
Aug 25, 2021 11:00:10 PM org.apache.flink.runtime.blob.BlobClient 
downloadFromBlobServer
INFO: Downloading 
78c1fd20366dfa914b5ed6728aedb87e/p-9d0198c63e9a164bb19f38a01f1234706d007328-1475d40649d58f4d713959c1edc27923
 from primer-kb-cluster-jobmanager/10.100.112.249:6124
Aug 25, 2021 11:00:10 PM org.apache.flink.runtime.blob.BlobClient 
downloadFromBlobServer
INFO: Downloading 
78c1fd20366dfa914b5ed6728aedb87e/p-1302ab4f5cc30a74dfccd0b7ec032b34c7d11a07-389d5a212fa7af165dcfee3f7a403383
 from primer-kb-cluster-jobmanager/10.100.112.249:6124
Aug 25, 2021 11:00:10 PM org.apache.flink.runtime.blob.BlobClient 
downloadFromBlobServer
INFO: Downloading 
78c1fd20366dfa914b5ed6728aedb87e/p-524e82ba54cee01d3a55e3fffb6f87fdbf0f09b2-7af5e5536bd444b309775c2305788795
 from primer-kb-cluster-jobmanager/10.100.112.249:6124
Aug 25, 2021 11:00:10 PM org.apache.flink.runtime.blob.BlobClient 
downloadFromBlobServer
INFO: Downloading 
78c1fd20366dfa914b5ed6728aedb87e/p-573d84f6c754a89a4c82be5bbb107e1fd51fb7ee-721040722dad94d977d4aaddb64026ca
 from primer-kb-cluster-jobmanager/10.100.112.249:6124
Aug 25, 2021 11:00:11 PM org.apache.flink.runtime.state.StateBackendLoader 
loadFromApplicationOrConfigOrDefaultInternal
INFO: No state backend has been configured, using default (HashMap) 
org.apache.flink.runtime.state.hashmap.HashMapStateBackend@47da832e
Aug 25, 2021 11:00:11 PM org.apache.flink.runtime.state.CheckpointStorageLoader 
createJobManagerCheckpointStorage
INFO: Checkpoint storage is set to 'jobmanager'
Aug 25, 2021 11:00:11 PM org.apache.flink.runtime.taskmanager.Task 
transitionState
INFO: group -> [2]{count, Map(<lambda at count_messages.py:147>)} -> 
[2]WriteToKafka(beam:external:java:kafka:write:v1)/{Kafka ProducerRecord, 
KafkaIO.WriteRecords} (1/1)#0 (7fd3c5eec6e56a441be18ee8e052acee) switched from 
DEPLOYING to INITIALIZING.
Aug 25, 2021 11:00:12 PM 
org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory createBuilder
INFO: The AWS S3 Beam extension was included in this build, but the awsRegion 
flag was not specified. If you don't plan to use S3, then ignore this message.
Aug 25, 2021 11:00:12 PM 
org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory createBuilder
INFO: The AWS S3 Beam extension was included in this build, but the awsRegion 
flag was not specified. If you don't plan to use S3, then ignore this message.
Aug 25, 2021 11:00:12 PM 
org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory createBuilder
INFO: The AWS S3 Beam extension was included in this build, but the awsRegion 
flag was not specified. If you don't plan to use S3, then ignore this message.
Aug 25, 2021 11:00:12 PM 
org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory createBuilder
INFO: The AWS S3 Beam extension was included in this build, but the awsRegion 
flag was not specified. If you don't plan to use S3, then ignore this message.
Aug 25, 2021 11:00:12 PM 
org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory createBuilder
INFO: The AWS S3 Beam extension was included in this build, but the awsRegion 
flag was not specified. If you don't plan to use S3, then ignore this message.
Aug 25, 2021 11:00:12 PM 
org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory createBuilder
INFO: The AWS S3 Beam extension was included in this build, but the awsRegion 
flag was not specified. If you don't plan to use S3, then ignore this message.
Aug 25, 2021 11:00:12 PM org.apache.flink.runtime.state.StateBackendLoader 
loadFromApplicationOrConfigOrDefaultInternal
INFO: No state backend has been configured, using default (HashMap) 
org.apache.flink.runtime.state.hashmap.HashMapStateBackend@1198e1f9
Aug 25, 2021 11:00:12 PM org.apache.flink.runtime.state.StateBackendLoader 
loadFromApplicationOrConfigOrDefaultInternal
INFO: No state backend has been configured, using default (HashMap) 
org.apache.flink.runtime.state.hashmap.HashMapStateBackend@111ce51d
Aug 25, 2021 11:00:12 PM org.apache.flink.runtime.state.CheckpointStorageLoader 
createJobManagerCheckpointStorage
INFO: Checkpoint storage is set to 'jobmanager'
Aug 25, 2021 11:00:12 PM org.apache.flink.runtime.state.CheckpointStorageLoader 
createJobManagerCheckpointStorage
INFO: Checkpoint storage is set to 'jobmanager'
Aug 25, 2021 11:00:12 PM org.apache.flink.runtime.taskmanager.Task 
transitionState
INFO: [3]ReadFromKafka(beam:external:java:kafka:read:v1)/{KafkaIO.Read, Remove 
Kafka Metadata} -> [3]{convert_to_dict, pair_with_one, window} -> 
ToKeyedWorkItem (1/1)#0 (770bbefcbaab8265297a2f1a185eb5b6) switched from 
DEPLOYING to INITIALIZING.
Aug 25, 2021 11:00:12 PM org.apache.flink.runtime.taskmanager.Task 
transitionState
INFO: Source: Impulse -> 
[3]ReadFromKafka(beam:external:java:kafka:read:v1)/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaSDF/{ParDo(GenerateKafkaSourceDescriptor),
 KafkaIO.ReadSourceDescriptors} (1/1)#0 (8e4611ecfd691859d36acddcc7930f71) 
switched from DEPLOYING to INITIALIZING.
Aug 25, 2021 11:00:12 PM 
org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory createBuilder
INFO: The AWS S3 Beam extension was included in this build, but the awsRegion 
flag was not specified. If you don't plan to use S3, then ignore this message.
Aug 25, 2021 11:00:12 PM 
org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory createBuilder
INFO: The AWS S3 Beam extension was included in this build, but the awsRegion 
flag was not specified. If you don't plan to use S3, then ignore this message.
Aug 25, 2021 11:00:12 PM 
org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory createBuilder
INFO: The AWS S3 Beam extension was included in this build, but the awsRegion 
flag was not specified. If you don't plan to use S3, then ignore this message.
Aug 25, 2021 11:00:12 PM 
org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory createBuilder
INFO: The AWS S3 Beam extension was included in this build, but the awsRegion 
flag was not specified. If you don't plan to use S3, then ignore this message.
Aug 25, 2021 11:00:12 PM 
org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory createBuilder
INFO: The AWS S3 Beam extension was included in this build, but the awsRegion 
flag was not specified. If you don't plan to use S3, then ignore this message.
Aug 25, 2021 11:00:12 PM 
org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory createBuilder
INFO: The AWS S3 Beam extension was included in this build, but the awsRegion 
flag was not specified. If you don't plan to use S3, then ignore this message.
Aug 25, 2021 11:00:12 PM 
org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory createBuilder
INFO: The AWS S3 Beam extension was included in this build, but the awsRegion 
flag was not specified. If you don't plan to use S3, then ignore this message.
Aug 25, 2021 11:00:12 PM 
org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory createBuilder
INFO: The AWS S3 Beam extension was included in this build, but the awsRegion 
flag was not specified. If you don't plan to use S3, then ignore this message.
Aug 25, 2021 11:00:12 PM 
org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory createBuilder
INFO: The AWS S3 Beam extension was included in this build, but the awsRegion 
flag was not specified. If you don't plan to use S3, then ignore this message.
Aug 25, 2021 11:00:12 PM 
org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory createBuilder
INFO: The AWS S3 Beam extension was included in this build, but the awsRegion 
flag was not specified. If you don't plan to use S3, then ignore this message.
Aug 25, 2021 11:00:12 PM 
org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory createBuilder
INFO: The AWS S3 Beam extension was included in this build, but the awsRegion 
flag was not specified. If you don't plan to use S3, then ignore this message.
Aug 25, 2021 11:00:12 PM 
org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory createBuilder
INFO: The AWS S3 Beam extension was included in this build, but the awsRegion 
flag was not specified. If you don't plan to use S3, then ignore this message.
Aug 25, 2021 11:00:12 PM 
org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory createBuilder
INFO: The AWS S3 Beam extension was included in this build, but the awsRegion 
flag was not specified. If you don't plan to use S3, then ignore this message.
Aug 25, 2021 11:00:12 PM 
org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory createBuilder
INFO: The AWS S3 Beam extension was included in this build, but the awsRegion 
flag was not specified. If you don't plan to use S3, then ignore this message.
Aug 25, 2021 11:00:12 PM 
org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory createBuilder
INFO: The AWS S3 Beam extension was included in this build, but the awsRegion 
flag was not specified. If you don't plan to use S3, then ignore this message.
Aug 25, 2021 11:00:12 PM 
org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory createBuilder
INFO: The AWS S3 Beam extension was included in this build, but the awsRegion 
flag was not specified. If you don't plan to use S3, then ignore this message.
Aug 25, 2021 11:00:12 PM 
org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory createBuilder
INFO: The AWS S3 Beam extension was included in this build, but the awsRegion 
flag was not specified. If you don't plan to use S3, then ignore this message.
Aug 25, 2021 11:00:12 PM 
org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory createBuilder
INFO: The AWS S3 Beam extension was included in this build, but the awsRegion 
flag was not specified. If you don't plan to use S3, then ignore this message.
Aug 25, 2021 11:00:12 PM 
org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory createBuilder
INFO: The AWS S3 Beam extension was included in this build, but the awsRegion 
flag was not specified. If you don't plan to use S3, then ignore this message.
Aug 25, 2021 11:00:12 PM 
org.apache.flink.runtime.metrics.groups.TaskMetricGroup getOrAddOperator
WARNING: The operator name 
[2]WriteToKafka(beam:external:java:kafka:write:v1)/{Kafka ProducerRecord, 
KafkaIO.WriteRecords} exceeded the 80 characters length limit and was truncated.
Aug 25, 2021 11:00:12 PM 
org.apache.flink.runtime.metrics.groups.TaskMetricGroup getOrAddOperator
WARNING: The operator name 
[3]ReadFromKafka(beam:external:java:kafka:read:v1)/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaSDF/{ParDo(GenerateKafkaSourceDescriptor),
 KafkaIO.ReadSourceDescriptors} exceeded the 80 characters length limit and was 
truncated.
Aug 25, 2021 11:00:12 PM 
org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory createBuilder
INFO: The AWS S3 Beam extension was included in this build, but the awsRegion 
flag was not specified. If you don't plan to use S3, then ignore this message.
(The DefaultS3ClientBuilderFactory INFO message above is repeated 16 more times at 11:00:12 PM; duplicate log lines omitted.)
Aug 25, 2021 11:00:12 PM 
org.apache.flink.runtime.metrics.groups.TaskMetricGroup getOrAddOperator
WARNING: The operator name 
[3]ReadFromKafka(beam:external:java:kafka:read:v1)/{KafkaIO.Read, Remove Kafka 
Metadata} exceeded the 80 characters length limit and was truncated.
(The same DefaultS3ClientBuilderFactory INFO message is repeated a further 12 times at 11:00:12 PM; duplicate log lines omitted.)
Aug 25, 2021 11:00:13 PM 
org.apache.beam.runners.fnexecution.environment.DockerCommand runImage
WARNING: Unable to pull docker image apache/beam_java11_sdk:2.31.0, cause: 
Cannot run program "docker": error=2, No such file or directory
Aug 25, 2021 11:00:13 PM 
org.apache.beam.runners.fnexecution.environment.DockerCommand runImage
WARNING: Unable to pull docker image apache/beam_java11_sdk:2.31.0, cause: 
Cannot run program "docker": error=2, No such file or directory
Aug 25, 2021 11:00:13 PM org.apache.flink.runtime.taskmanager.Task 
transitionState
WARNING: group -> [2]{count, Map(<lambda at count_messages.py:147>)} -> 
[2]WriteToKafka(beam:external:java:kafka:write:v1)/{Kafka ProducerRecord, 
KafkaIO.WriteRecords} (1/1)#0 (7fd3c5eec6e56a441be18ee8e052acee) switched from 
INITIALIZING to FAILED with failure cause: 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException:
 java.io.IOException: Cannot run program "docker": error=2, No such file or 
directory
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4966)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:451)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:436)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.forStage(DefaultJobBundleFactory.java:303)
        at 
org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext.getStageBundleFactory(DefaultExecutableStageContext.java:38)
        at 
org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.getStageBundleFactory(ReferenceCountingExecutableStageContextFactory.java:202)
        at 
org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator.open(ExecutableStageDoFnOperator.java:243)
        at 
org.apache.flink.streaming.runtime.tasks.OperatorChain.initializeStateAndOpenOperators(OperatorChain.java:437)
        at 
org.apache.flink.streaming.runtime.tasks.StreamTask.restoreGates(StreamTask.java:582)
        at 
org.apache.flink.streaming.runtime.tasks.StreamTaskActionExecutor$1.call(StreamTaskActionExecutor.java:55)
        at 
org.apache.flink.streaming.runtime.tasks.StreamTask.executeRestore(StreamTask.java:562)
        at 
org.apache.flink.streaming.runtime.tasks.StreamTask.runWithCleanUpOnFail(StreamTask.java:647)
        at 
org.apache.flink.streaming.runtime.tasks.StreamTask.restore(StreamTask.java:537)
        at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:759)
        at org.apache.flink.runtime.taskmanager.Task.run(Task.java:566)
        at java.base/java.lang.Thread.run(Unknown Source)
Caused by: java.io.IOException: Cannot run program "docker": error=2, No such 
file or directory
        at java.base/java.lang.ProcessBuilder.start(Unknown Source)
        at java.base/java.lang.ProcessBuilder.start(Unknown Source)
        at 
org.apache.beam.runners.fnexecution.environment.DockerCommand.runShortCommand(DockerCommand.java:189)
        at 
org.apache.beam.runners.fnexecution.environment.DockerCommand.runShortCommand(DockerCommand.java:171)
        at 
org.apache.beam.runners.fnexecution.environment.DockerCommand.runImage(DockerCommand.java:95)
        at 
org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory.createEnvironment(DockerEnvironmentFactory.java:131)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:252)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:231)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4964)
        ... 15 more
Caused by: java.io.IOException: error=2, No such file or directory
        at java.base/java.lang.ProcessImpl.forkAndExec(Native Method)
        at java.base/java.lang.ProcessImpl.<init>(Unknown Source)
        at java.base/java.lang.ProcessImpl.start(Unknown Source)
        ... 31 more

Aug 25, 2021 11:00:13 PM org.apache.flink.runtime.taskmanager.Task 
transitionState
WARNING: Source: Impulse -> 
[3]ReadFromKafka(beam:external:java:kafka:read:v1)/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaSDF/{ParDo(GenerateKafkaSourceDescriptor),
 KafkaIO.ReadSourceDescriptors} (1/1)#0 (8e4611ecfd691859d36acddcc7930f71) 
switched from INITIALIZING to FAILED with failure cause: 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException:
 java.io.IOException: Cannot run program "docker": error=2, No such file or 
directory
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4966)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:451)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:436)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.forStage(DefaultJobBundleFactory.java:303)
        at 
org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext.getStageBundleFactory(DefaultExecutableStageContext.java:38)
        at 
org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.getStageBundleFactory(ReferenceCountingExecutableStageContextFactory.java:202)
        at 
org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator.open(ExecutableStageDoFnOperator.java:243)
        at 
org.apache.flink.streaming.runtime.tasks.OperatorChain.initializeStateAndOpenOperators(OperatorChain.java:437)
        at 
org.apache.flink.streaming.runtime.tasks.StreamTask.restoreGates(StreamTask.java:582)
        at 
org.apache.flink.streaming.runtime.tasks.StreamTaskActionExecutor$SynchronizedStreamTaskActionExecutor.call(StreamTaskActionExecutor.java:100)
        at 
org.apache.flink.streaming.runtime.tasks.StreamTask.executeRestore(StreamTask.java:562)
        at 
org.apache.flink.streaming.runtime.tasks.StreamTask.runWithCleanUpOnFail(StreamTask.java:647)
        at 
org.apache.flink.streaming.runtime.tasks.StreamTask.restore(StreamTask.java:537)
        at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:759)
        at org.apache.flink.runtime.taskmanager.Task.run(Task.java:566)
        at java.base/java.lang.Thread.run(Unknown Source)
Caused by: java.io.IOException: Cannot run program "docker": error=2, No such 
file or directory
        at java.base/java.lang.ProcessBuilder.start(Unknown Source)
        at java.base/java.lang.ProcessBuilder.start(Unknown Source)
        at 
org.apache.beam.runners.fnexecution.environment.DockerCommand.runShortCommand(DockerCommand.java:189)
        at 
org.apache.beam.runners.fnexecution.environment.DockerCommand.runShortCommand(DockerCommand.java:171)
        at 
org.apache.beam.runners.fnexecution.environment.DockerCommand.runImage(DockerCommand.java:95)
        at 
org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory.createEnvironment(DockerEnvironmentFactory.java:131)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:252)
        at 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:231)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4964)
        ... 15 more
Caused by: java.io.IOException: error=2, No such file or directory
        at java.base/java.lang.ProcessImpl.forkAndExec(Native Method)
        at java.base/java.lang.ProcessImpl.<init>(Unknown Source)
        at java.base/java.lang.ProcessImpl.start(Unknown Source)
        ... 31 more

Aug 25, 2021 11:00:13 PM org.apache.flink.runtime.taskmanager.Task doRun
INFO: Freeing task resources for group -> [2]{count, Map(<lambda at 
count_messages.py:147>)} -> 
[2]WriteToKafka(beam:external:java:kafka:write:v1)/{Kafka ProducerRecord, 
KafkaIO.WriteRecords} (1/1)#0 (7fd3c5eec6e56a441be18ee8e052acee).
Aug 25, 2021 11:00:13 PM org.apache.flink.runtime.taskmanager.Task doRun
INFO: Freeing task resources for Source: Impulse -> 
[3]ReadFromKafka(beam:external:java:kafka:read:v1)/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaSDF/{ParDo(GenerateKafkaSourceDescriptor),
 KafkaIO.ReadSourceDescriptors} (1/1)#0 (8e4611ecfd691859d36acddcc7930f71).
Aug 25, 2021 11:00:13 PM org.apache.flink.runtime.taskexecutor.TaskExecutor 
unregisterTaskAndNotifyFinalState
INFO: Un-registering task and sending final execution state FAILED to 
JobManager for task Source: Impulse -> 
[3]ReadFromKafka(beam:external:java:kafka:read:v1)/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaSDF/{ParDo(GenerateKafkaSourceDescriptor),
 KafkaIO.ReadSourceDescriptors} (1/1)#0 8e4611ecfd691859d36acddcc7930f71.
Aug 25, 2021 11:00:13 PM org.apache.flink.runtime.taskexecutor.TaskExecutor 
unregisterTaskAndNotifyFinalState
INFO: Un-registering task and sending final execution state FAILED to 
JobManager for task group -> [2]{count, Map(<lambda at count_messages.py:147>)} 
-> [2]WriteToKafka(beam:external:java:kafka:write:v1)/{Kafka ProducerRecord, 
KafkaIO.WriteRecords} (1/1)#0 7fd3c5eec6e56a441be18ee8e052acee.
Aug 25, 2021 11:00:13 PM org.apache.flink.runtime.taskmanager.Task 
cancelExecution
INFO: Attempting to cancel task 
[3]ReadFromKafka(beam:external:java:kafka:read:v1)/{KafkaIO.Read, Remove Kafka 
Metadata} -> [3]{convert_to_dict, pair_with_one, window} -> ToKeyedWorkItem 
(1/1)#0 (770bbefcbaab8265297a2f1a185eb5b6).
Aug 25, 2021 11:00:13 PM org.apache.flink.runtime.taskmanager.Task 
transitionState
INFO: [3]ReadFromKafka(beam:external:java:kafka:read:v1)/{KafkaIO.Read, Remove 
Kafka Metadata} -> [3]{convert_to_dict, pair_with_one, window} -> 
ToKeyedWorkItem (1/1)#0 (770bbefcbaab8265297a2f1a185eb5b6) switched from 
INITIALIZING to CANCELING.
Aug 25, 2021 11:00:13 PM org.apache.flink.runtime.taskmanager.Task 
cancelOrFailAndCancelInvokableInternal
INFO: Triggering cancellation of task code 
[3]ReadFromKafka(beam:external:java:kafka:read:v1)/{KafkaIO.Read, Remove Kafka 
Metadata} -> [3]{convert_to_dict, pair_with_one, window} -> ToKeyedWorkItem 
(1/1)#0 (770bbefcbaab8265297a2f1a185eb5b6).
Aug 25, 2021 11:00:13 PM org.apache.flink.runtime.taskmanager.Task 
transitionState
INFO: [3]ReadFromKafka(beam:external:java:kafka:read:v1)/{KafkaIO.Read, Remove 
Kafka Metadata} -> [3]{convert_to_dict, pair_with_one, window} -> 
ToKeyedWorkItem (1/1)#0 (770bbefcbaab8265297a2f1a185eb5b6) switched from 
CANCELING to CANCELED.
Aug 25, 2021 11:00:13 PM org.apache.flink.runtime.taskmanager.Task doRun
INFO: Freeing task resources for 
[3]ReadFromKafka(beam:external:java:kafka:read:v1)/{KafkaIO.Read, Remove Kafka 
Metadata} -> [3]{convert_to_dict, pair_with_one, window} -> ToKeyedWorkItem 
(1/1)#0 (770bbefcbaab8265297a2f1a185eb5b6).
Aug 25, 2021 11:00:13 PM org.apache.flink.runtime.taskexecutor.TaskExecutor 
unregisterTaskAndNotifyFinalState
INFO: Un-registering task and sending final execution state CANCELED to 
JobManager for task 
[3]ReadFromKafka(beam:external:java:kafka:read:v1)/{KafkaIO.Read, Remove Kafka 
Metadata} -> [3]{convert_to_dict, pair_with_one, window} -> ToKeyedWorkItem 
(1/1)#0 770bbefcbaab8265297a2f1a185eb5b6.
Aug 25, 2021 11:00:13 PM 
org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal
INFO: Free slot TaskSlot(index:0, state:ACTIVE, resource profile: 
ResourceProfile{cpuCores=1.0000000000000000, taskHeapMemory=518.000mb 
(543162357 bytes), taskOffHeapMemory=0 bytes, managedMemory=619.200mb 
(649278268 bytes), networkMemory=154.800mb (162319567 bytes)}, allocationId: 
90b3bd029e5dd9bf9dbfbea2075f7cc8, jobId: 78c1fd20366dfa914b5ed6728aedb87e).
Aug 25, 2021 11:00:13 PM 
org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService removeJob
INFO: Remove job 78c1fd20366dfa914b5ed6728aedb87e from job leader monitoring.
Aug 25, 2021 11:00:13 PM org.apache.flink.runtime.taskexecutor.TaskExecutor 
disconnectJobManagerConnection
INFO: Close JobManager connection for job 78c1fd20366dfa914b5ed6728aedb87e.
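
Note on the failures above: the stack traces show DockerEnvironmentFactory trying to pull and run apache/beam_java11_sdk:2.31.0 for the cross-language Kafka stages, i.e. the environment the expansion service embedded in the expanded transform defaults to DOCKER, and the TaskManager then shells out to docker when it opens those stages. Purely as a hedged, minimal sketch (not a fix confirmed in this thread), this is roughly how the Python side can be pointed at a separately running expansion service; the job-endpoint, expansion-service address, broker, and topic below are placeholders:

    # Minimal sketch, assuming a portable Flink job server and a long-running
    # Kafka expansion service; all addresses below are hypothetical.
    import apache_beam as beam
    from apache_beam.io.kafka import ReadFromKafka
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        "--runner=PortableRunner",
        "--job_endpoint=localhost:8099",       # assumed Beam job server address
        "--environment_type=EXTERNAL",         # Python SDK harness environment
        "--environment_config=localhost:50000",
    ])

    with beam.Pipeline(options=options) as p:
        messages = (
            p
            | "ReadFromKafka" >> ReadFromKafka(
                consumer_config={"bootstrap.servers": "kafka:9092"},  # placeholder broker
                topics=["input-topic"],                               # placeholder topic
                # Hypothetical: reuse an already-running expansion service instead
                # of letting the SDK auto-start one at submission time. Whatever
                # environment that service embeds (DOCKER by default) is what the
                # TaskManagers later try to start, as seen in the log above.
                expansion_service="localhost:8097",
            )
        )
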
