Hi,

It would be good if you could provide the JobManager and TaskManager log
files, so that others can help analyze the problem.
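
(A note on where these usually live, in case it helps: depending on how the
cluster is deployed, the JobManager and TaskManager .log files are typically
found under the log/ directory of the Flink installation; on YARN, the
container logs can usually be collected with something like

    yarn logs -applicationId <application-id>

where the application id is that of the Flink cluster. The exact file names
and paths may differ between setups.)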

Thank you~

Xintong Song



On Mon, Aug 12, 2019 at 10:12 AM pengcheng...@bonc.com.cn <
pengcheng...@bonc.com.cn> wrote:

> Hi all,
> Some slots are not available when the job is not running.
> I took a TaskManager (TM) heap dump while the job was not running and
> analyzed it with *Eclipse Memory Analyzer*. Here are some of the results
> that look useful:
>
>
>    - org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerThread
>    @ 0x7f94420000c8
>
> Kafka 0.10 Fetcher for Source: Custom Source -> Map -> Filter -> Map ->
> Timestamps/Watermarks -> from: (ORD_ID, O_ORDERKEY_INT, O_ORDERKEY_VARCHAR,
> O_ORDERKEY_LONG, O_ORDERKEY_DOUBLE, O_CUSTKEY, O_ORDERSTATUS, O_TOTALPRICE,
> O_ORDERDATE, O_ORDERPRIORITY, O_CLERK, O_SHIPPRIORITY, O_COMMENT,
> O_TIMESTAMP1, O_TIMESTAMP2, O_DATE, O_TIMESTAMP, O_DATE_EVENT,
> O_TIMESTAMP_EVENT, OVER_TIME) -> where: (AND(LIKE(O_ORDERPRIORITY,
> _UTF-16LE'%M%'), <=(O_CLERK, _UTF-16LE'Clerk#000000144'), >=(O_CLERK,
> _UTF-16LE'Clerk#000000048'), =(O_ORDERSTATUS, _UTF-16LE'O'))), select:
> (O_CLERK, O_ORDERDATE, CAST(_UTF-16LE'O') AS O_ORDERSTATUS,
> O_ORDERPRIORITY, OVER_TIME, EXTRACT(FLAG(MONTH), O_DATE) AS $f5, O_COMMENT)
> -> time attribute: (OVER_TIME) (2/8) 272 1,281,344 
> org.apache.flink.runtime.execution.librarycache.FlinkUserCodeClassLoaders$ChildFirstClassLoader
> @ 0x7f9442000000 true
>
>    - org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerThread
>    @ 0x7f9442f5a630
>
> Kafka 0.10 Fetcher for Source: Custom Source -> Map -> Filter -> Map ->
> from: (ORD_ID, O_ORDERKEY_INT, O_ORDERKEY_VARCHAR, O_ORDERKEY_LONG,
> O_ORDERKEY_DOUBLE, O_CUSTKEY, O_ORDERSTATUS, O_TOTALPRICE, O_ORDERDATE,
> O_ORDERPRIORITY, O_CLERK, O_SHIPPRIORITY, O_COMMENT, O_TIMESTAMP1,
> O_TIMESTAMP2, O_DATE, O_TIMESTAMP, O_DATE_EVENT, O_TIMESTAMP_EVENT,
> OVER_TIME) -> where: (AND(LIKE(O_ORDERPRIORITY, _UTF-16LE'3%'),
> =(O_ORDERSTATUS, _UTF-16LE'F'), <=(O_CLERK, _UTF-16LE'Clerk#000006144'),
> >=(O_CLERK, _UTF-16LE'Clerk#000000048'))), select: (O_CLERK, O_ORDERDATE,
> CAST(_UTF-16LE'F') AS O_ORDERSTATUS, OVER_TIME, EXTRACT(FLAG(MONTH),
> O_ORDERDATE) AS $f4, O_COMMENT, O_CUSTKEY) (2/8) 272 1,274,312 
> org.apache.flink.runtime.execution.librarycache.FlinkUserCodeClassLoaders$ChildFirstClassLoader
> @ 0x7f9442f5a268 true
>
>    - org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerThread
>    @ 0x7f94f4f048a8
>
> Kafka 0.10 Fetcher for Source: Custom Source -> Map -> Filter -> Map ->
> from: (ORD_ID, O_ORDERKEY_INT, O_ORDERKEY_VARCHAR, O_ORDERKEY_LONG,
> O_ORDERKEY_DOUBLE, O_CUSTKEY, O_ORDERSTATUS, O_TOTALPRICE, O_ORDERDATE,
> O_ORDERPRIORITY, O_CLERK, O_SHIPPRIORITY, O_COMMENT, O_TIMESTAMP1,
> O_TIMESTAMP2, O_DATE, O_TIMESTAMP, O_DATE_EVENT, O_TIMESTAMP_EVENT,
> OVER_TIME) -> where: (AND(IN(TRIM(FLAG(BOTH), _UTF-16LE' ',
> O_ORDERPRIORITY), _UTF-16LE'1-URGENT', _UTF-16LE'3-MEDIUM',
> _UTF-16LE'5-LOW'), >=(O_CLERK, _UTF-16LE'Clerk#000024400'),
> >(EXTRACT(FLAG(MONTH), O_ORDERDATE), 6), <>(O_ORDERSTATUS, _UTF-16LE'O'),
> <=(O_ORDERKEY_INT, 12889))), select: (O_ORDERKEY_INT, O_CUSTKEY,
> O_ORDERSTATUS, O_ORDERDATE, O_CLERK, O_DATE, OVER_TIME) (2/8) 272
> 1,274,184 
> org.apache.flink.runtime.execution.librarycache.FlinkUserCodeClassLoaders$ChildFirstClassLoader
> @ 0x7f94f4f04800 true
>
>    - org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerThread
>    @ 0x7f94441a1aa8
>
> Kafka 0.10 Fetcher for Source: Custom Source -> Map -> Filter -> Map ->
> from: (ORD_ID, O_ORDERKEY_INT, O_ORDERKEY_VARCHAR, O_ORDERKEY_LONG,
> O_ORDERKEY_DOUBLE, O_CUSTKEY, O_ORDERSTATUS, O_TOTALPRICE, O_ORDERDATE,
> O_ORDERPRIORITY, O_CLERK, O_SHIPPRIORITY, O_COMMENT, O_TIMESTAMP1,
> O_TIMESTAMP2, O_DATE, O_TIMESTAMP, O_DATE_EVENT, O_TIMESTAMP_EVENT,
> OVER_TIME) -> where: (AND(LIKE(O_ORDERPRIORITY, _UTF-16LE'%M%'),
> <=(O_CLERK, _UTF-16LE'Clerk#000000144'), >=(O_CLERK,
> _UTF-16LE'Clerk#000000048'), =(O_ORDERSTATUS, _UTF-16LE'O'))), select:
> (O_CLERK, O_ORDERDATE, CAST(_UTF-16LE'O') AS O_ORDERSTATUS,
> O_ORDERPRIORITY, CEIL(MOD(O_CUSTKEY, EXTRACT(FLAG(MONTH), O_ORDERDATE))) AS
> $f4, OVER_TIME, EXTRACT(FLAG(MONTH), O_DATE) AS $f6, O_COMMENT) (2/8) 272
> 1,263,416 
> org.apache.flink.runtime.execution.librarycache.FlinkUserCodeClassLoaders$ChildFirstClassLoader
> @ 0x7f94441a1a00 true
>
>    - org.apache.kafka.common.utils.KafkaThread @ 0x7f91de001b78 »
>
> kafka-producer-network-thread | producer-1 184 342,912 
> org.apache.flink.runtime.execution.librarycache.FlinkUserCodeClassLoaders$ChildFirstClassLoader
> @ 0x7f91de000040 true
>
>    - org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerThread
>    @ 0x7f947a57d290
>
> Kafka 0.10 Fetcher for Source: Custom Source -> Map -> Filter -> Map ->
> from: (ORD_ID, PS_PARTKEY, PS_SUPPKEY, PS_AVAILQTY, PS_SUPPLYCOST,
> PS_COMMENT, PS_INT, PS_LONG, PS_DOUBLE8, PS_DOUBLE14, PS_DOUBLE15,
> PS_NUMBER1, PS_NUMBER2, PS_NUMBER3, PS_NUMBER4, PS_DATE, PS_TIMESTAMP,
> PS_DATE_EVENT, PS_TIMESTAMP_EVENT, OVER_TIME) -> select: (PS_PARTKEY,
> PS_SUPPKEY, PS_AVAILQTY, PS_COMMENT, OVER_TIME) (1/8) 272 243,512 
> org.apache.flink.runtime.execution.librarycache.FlinkUserCodeClassLoaders$ChildFirstClassLoader
> @ 0x7f947a57c6d0 true
>
>    - akka.dispatch.MonitorableThreadFactory$AkkaForkJoinWorkerThread @
>    0x7f91e0da4c40 »
>
> flink-akka.remote.default-remote-dispatcher-23 192 143,160 
> sun.misc.Launcher$AppClassLoader
> @ 0x7f91bc253cb0 false
>
>    - akka.dispatch.MonitorableThreadFactory$AkkaForkJoinWorkerThread @
>    0x7f91bd044988 »
>
> flink-akka.remote.default-remote-dispatcher-38 192 142,712 
> sun.misc.Launcher$AppClassLoader
> @ 0x7f91bc253cb0 false
>
>    - akka.dispatch.MonitorableThreadFactory$AkkaForkJoinWorkerThread @
>    0x7f9233003d98 »
>
> flink-akka.remote.default-remote-dispatcher-19 192 133,624 
> sun.misc.Launcher$AppClassLoader
> @ 0x7f91bc253cb0 false
>
>    - org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerThread
>    @ 0x7f947a5c68c0
>
> Kafka 0.10 Fetcher for Source: Custom Source -> Map -> Filter -> Map ->
> from: (ORD_ID, O_ORDERKEY_INT, O_ORDERKEY_VARCHAR, O_ORDERKEY_LONG,
> O_ORDERKEY_DOUBLE, O_CUSTKEY, O_ORDERSTATUS, O_TOTALPRICE, O_ORDERDATE,
> O_ORDERPRIORITY, O_CLERK, O_SHIPPRIORITY, O_COMMENT, O_TIMESTAMP1,
> O_TIMESTAMP2, O_DATE, O_TIMESTAMP, O_DATE_EVENT, O_TIMESTAMP_EVENT,
> OVER_TIME) -> where: (<>(O_CLERK, _UTF-16LE'Clerk#000010377')), select:
> (O_CUSTKEY, O_ORDERSTATUS, O_ORDERPRIORITY, O_SHIPPRIORITY,
> CAST(O_ORDERKEY_INT) AS O_ORDERKEY_INT0) (6/8) 272 125,144 
> org.apache.flink.runtime.execution.librarycache.FlinkUserCodeClassLoaders$ChildFirstClassLoader
> @ 0x7f947a57c6d0 true
>
>    - akka.dispatch.MonitorableThreadFactory$AkkaForkJoinWorkerThread @
>    0x7f91bca9ae08 »
>
> flink-akka.remote.default-remote-dispatcher-6 192 124,856 
> sun.misc.Launcher$AppClassLoader
> @ 0x7f91bc253cb0 false
>
>    - org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerThread
>    @ 0x7f9442002288
>
> Kafka 0.10 Fetcher for Source: Custom Source -> Map -> Filter -> Map ->
> Timestamps/Watermarks -> from: (ORD_ID, O_ORDERKEY_INT, O_ORDERKEY_VARCHAR,
> O_ORDERKEY_LONG, O_ORDERKEY_DOUBLE, O_CUSTKEY, O_ORDERSTATUS, O_TOTALPRICE,
> O_ORDERDATE, O_ORDERPRIORITY, O_CLERK, O_SHIPPRIORITY, O_COMMENT,
> O_TIMESTAMP1, O_TIMESTAMP2, O_DATE, O_TIMESTAMP, O_DATE_EVENT,
> O_TIMESTAMP_EVENT, OVER_TIME) -> where: (AND(LIKE(O_ORDERPRIORITY,
> _UTF-16LE'%M%'), <=(O_CLERK, _UTF-16LE'Clerk#000000144'), >=(O_CLERK,
> _UTF-16LE'Clerk#000000048'), =(O_ORDERSTATUS, _UTF-16LE'O'))), select:
> (O_CLERK, O_ORDERDATE, CAST(_UTF-16LE'O') AS O_ORDERSTATUS,
> O_ORDERPRIORITY, OVER_TIME, EXTRACT(FLAG(MONTH), O_DATE) AS $f5, O_COMMENT)
> -> time attribute: (OVER_TIME) (3/8) 272 124,808 
> org.apache.flink.runtime.execution.librarycache.FlinkUserCodeClassLoaders$ChildFirstClassLoader
> @ 0x7f94420079d8 true
>
>    - org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerThread
>    @ 0x7f91ec800568
>
> Kafka 0.10 Fetcher for Source: Custom Source -> Map -> Filter -> Map ->
> Timestamps/Watermarks -> from: (ORD_ID, O_ORDERKEY_INT, O_ORDERKEY_VARCHAR,
> O_ORDERKEY_LONG, O_ORDERKEY_DOUBLE, O_CUSTKEY, O_ORDERSTATUS, O_TOTALPRICE,
> O_ORDERDATE, O_ORDERPRIORITY, O_CLERK, O_SHIPPRIORITY, O_COMMENT,
> O_TIMESTAMP1, O_TIMESTAMP2, O_DATE, O_TIMESTAMP, O_DATE_EVENT,
> O_TIMESTAMP_EVENT, OVER_TIME) -> where: (AND(LIKE(O_ORDERPRIORITY,
> _UTF-16LE'3%'), =(O_ORDERSTATUS, _UTF-16LE'F'), <=(O_CLERK,
> _UTF-16LE'Clerk#000006144'), >=(O_CLERK, _UTF-16LE'Clerk#000000048'))),
> select: (O_CLERK, O_ORDERDATE, CAST(_UTF-16LE'F') AS O_ORDERSTATUS,
> OVER_TIME, EXTRACT(FLAG(MONTH), O_ORDERDATE) AS $f4, O_COMMENT, O_CUSTKEY)
> -> time attribute: (OVER_TIME) (1/8) 272 124,800 
> org.apache.flink.runtime.execution.librarycache.FlinkUserCodeClassLoaders$ChildFirstClassLoader
> @ 0x7f94eb800008 true
>
>    - org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerThread
>    @ 0x7f91e47ffff8
>
> Kafka 0.10 Fetcher for Source: Custom Source -> Map -> Filter -> Map ->
> Timestamps/Watermarks -> from: (ORD_ID, O_ORDERKEY_INT, O_ORDERKEY_VARCHAR,
> O_ORDERKEY_LONG, O_ORDERKEY_DOUBLE, O_CUSTKEY, O_ORDERSTATUS, O_TOTALPRICE,
> O_ORDERDATE, O_ORDERPRIORITY, O_CLERK, O_SHIPPRIORITY, O_COMMENT,
> O_TIMESTAMP1, O_TIMESTAMP2, O_DATE, O_TIMESTAMP, O_DATE_EVENT,
> O_TIMESTAMP_EVENT, OVER_TIME) -> where: (AND(LIKE(O_ORDERPRIORITY,
> _UTF-16LE'3%'), =(O_ORDERSTATUS, _UTF-16LE'F'), <=(O_CLERK,
> _UTF-16LE'Clerk#000006144'), >=(O_CLERK, _UTF-16LE'Clerk#000000048'))),
> select: (O_CLERK, O_ORDERDATE, CAST(_UTF-16LE'F') AS O_ORDERSTATUS,
> OVER_TIME, EXTRACT(FLAG(MONTH), O_ORDERDATE) AS $f4, O_COMMENT, O_CUSTKEY)
> -> time attribute: (OVER_TIME) (4/8) 272 124,800 
> org.apache.flink.runtime.execution.librarycache.FlinkUserCodeClassLoaders$ChildFirstClassLoader
> @ 0x7f94eb800008 true
>
>    - org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerThread
>    @ 0x7f91e4000000
>
> Kafka 0.10 Fetcher for Source: Custom Source -> Map -> Filter -> Map ->
> Timestamps/Watermarks -> from: (ORD_ID, O_ORDERKEY_INT, O_ORDERKEY_VARCHAR,
> O_ORDERKEY_LONG, O_ORDERKEY_DOUBLE, O_CUSTKEY, O_ORDERSTATUS, O_TOTALPRICE,
> O_ORDERDATE, O_ORDERPRIORITY, O_CLERK, O_SHIPPRIORITY, O_COMMENT,
> O_TIMESTAMP1, O_TIMESTAMP2, O_DATE, O_TIMESTAMP, O_DATE_EVENT,
> O_TIMESTAMP_EVENT, OVER_TIME) -> where: (AND(LIKE(O_ORDERPRIORITY,
> _UTF-16LE'3%'), =(O_ORDERSTATUS, _UTF-16LE'F'), <=(O_CLERK,
> _UTF-16LE'Clerk#000006144'), >=(O_CLERK, _UTF-16LE'Clerk#000000048'))),
> select: (O_CLERK, O_ORDERDATE, CAST(_UTF-16LE'F') AS O_ORDERSTATUS,
> OVER_TIME, EXTRACT(FLAG(MONTH), O_ORDERDATE) AS $f4, O_COMMENT, O_CUSTKEY)
> -> time attribute: (OVER_TIME) (6/8) 272 124,800 
> org.apache.flink.runtime.execution.librarycache.FlinkUserCodeClassLoaders$ChildFirstClassLoader
> @ 0x7f94eb800008 true
>
>    - org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerThread
>    @ 0x7f91ec000158
>
> Kafka 0.10 Fetcher for Source: Custom Source -> Map -> Filter -> Map ->
> Timestamps/Watermarks -> from: (ORD_ID, O_ORDERKEY_INT, O_ORDERKEY_VARCHAR,
> O_ORDERKEY_LONG, O_ORDERKEY_DOUBLE, O_CUSTKEY, O_ORDERSTATUS, O_TOTALPRICE,
> O_ORDERDATE, O_ORDERPRIORITY, O_CLERK, O_SHIPPRIORITY, O_COMMENT,
> O_TIMESTAMP1, O_TIMESTAMP2, O_DATE, O_TIMESTAMP, O_DATE_EVENT,
> O_TIMESTAMP_EVENT, OVER_TIME) -> where: (AND(LIKE(O_ORDERPRIORITY,
> _UTF-16LE'3%'), =(O_ORDERSTATUS, _UTF-16LE'F'), <=(O_CLERK,
> _UTF-16LE'Clerk#000006144'), >=(O_CLERK, _UTF-16LE'Clerk#000000048'))),
> select: (O_CLERK, O_ORDERDATE, CAST(_UTF-16LE'F') AS O_ORDERSTATUS,
> OVER_TIME, EXTRACT(FLAG(MONTH), O_ORDERDATE) AS $f4, O_COMMENT, O_CUSTKEY)
> -> time attribute: (OVER_TIME) (5/8) 272 124,800 
> org.apache.flink.runtime.execution.librarycache.FlinkUserCodeClassLoaders$ChildFirstClassLoader
> @ 0x7f91ec000000 true
>
>    - org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerThread
>    @ 0x7f94f5147840
>
> Kafka 0.10 Fetcher for Source: Custom Source -> Map -> Filter -> Map ->
> from: (ORD_ID, O_ORDERKEY_INT, O_ORDERKEY_VARCHAR, O_ORDERKEY_LONG,
> O_ORDERKEY_DOUBLE, O_CUSTKEY, O_ORDERSTATUS, O_TOTALPRICE, O_ORDERDATE,
> O_ORDERPRIORITY, O_CLERK, O_SHIPPRIORITY, O_COMMENT, O_TIMESTAMP1,
> O_TIMESTAMP2, O_DATE, O_TIMESTAMP, O_DATE_EVENT, O_TIMESTAMP_EVENT,
> OVER_TIME) -> where: (AND(IN(TRIM(FLAG(BOTH), _UTF-16LE' ',
> O_ORDERPRIORITY), _UTF-16LE'1-URGENT', _UTF-16LE'3-MEDIUM',
> _UTF-16LE'5-LOW'), >=(O_CLERK, _UTF-16LE'Clerk#000024400'),
> >(EXTRACT(FLAG(MONTH), O_ORDERDATE), 6), <>(O_ORDERSTATUS, _UTF-16LE'O'),
> <=(O_ORDERKEY_INT, 12889))), select: (O_ORDERKEY_INT, O_CUSTKEY,
> O_ORDERSTATUS, O_ORDERDATE, O_CLERK, O_DATE, OVER_TIME) (8/8) 272 124,792 
> org.apache.flink.runtime.execution.librarycache.FlinkUserCodeClassLoaders$ChildFirstClassLoader
> @ 0x7f94f4f04800 true
>
>    - org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerThread
>    @ 0x7f94f5037268
>
> Kafka 0.10 Fetcher for Source: Custom Source -> Map -> Filter -> Map ->
> from: (ORD_ID, O_ORDERKEY_INT, O_ORDERKEY_VARCHAR, O_ORDERKEY_LONG,
> O_ORDERKEY_DOUBLE, O_CUSTKEY, O_ORDERSTATUS, O_TOTALPRICE, O_ORDERDATE,
> O_ORDERPRIORITY, O_CLERK, O_SHIPPRIORITY, O_COMMENT, O_TIMESTAMP1,
> O_TIMESTAMP2, O_DATE, O_TIMESTAMP, O_DATE_EVENT, O_TIMESTAMP_EVENT,
> OVER_TIME) -> where: (AND(IN(TRIM(FLAG(BOTH), _UTF-16LE' ',
> O_ORDERPRIORITY), _UTF-16LE'1-URGENT', _UTF-16LE'3-MEDIUM',
> _UTF-16LE'5-LOW'), >=(O_CLERK, _UTF-16LE'Clerk#000024400'),
> >(EXTRACT(FLAG(MONTH), O_ORDERDATE), 6), <>(O_ORDERSTATUS, _UTF-16LE'O'),
> <=(O_ORDERKEY_INT, 12889))), select: (O_ORDERKEY_INT, O_CUSTKEY,
> O_ORDERSTATUS, O_ORDERDATE, O_CLERK, O_DATE, OVER_TIME) (6/8) 272 124,792 
> org.apache.flink.runtime.execution.librarycache.FlinkUserCodeClassLoaders$ChildFirstClassLoader
> @ 0x7f94f4f04800 true
>
>    - org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerThread
>    @ 0x7f947a5fea38
>
> Kafka 0.10 Fetcher for Source: Custom Source -> Map -> Filter -> Map ->
> from: (ORD_ID, O_ORDERKEY_INT, O_ORDERKEY_VARCHAR, O_ORDERKEY_LONG,
> O_ORDERKEY_DOUBLE, O_CUSTKEY, O_ORDERSTATUS, O_TOTALPRICE, O_ORDERDATE,
> O_ORDERPRIORITY, O_CLERK, O_SHIPPRIORITY, O_COMMENT, O_TIMESTAMP1,
> O_TIMESTAMP2, O_DATE, O_TIMESTAMP, O_DATE_EVENT, O_TIMESTAMP_EVENT,
> OVER_TIME) -> where: (<=(O_ORDERKEY_INT, 188)), select: (O_ORDERKEY_INT AS
> A, O_ORDERKEY_LONG AS \u6c49\u5b57, O_CUSTKEY AS _AS, O_CLERK AS FRV,
> O_ORDERPRIORITY AS AHG, O_DATE AS
> SH567_7FGH\u54c8\u89e3\u6563_SH567_8FGH\u54c8\u89e3\u6563_SH567_9FGH\u54c8\u89e3\u6563_SH567_0FGH\u54c8\u89e3\u6563_SH567_1FGH\u54c8\u89e3\u6563_S)
> -> to: Row -> Map -> Sink: b26f1aec-47d2-4d4c-a9f2-fc1b10ae14be (4/8) 272
> 124,656 
> org.apache.flink.runtime.execution.librarycache.FlinkUserCodeClassLoaders$ChildFirstClassLoader
> @ 0x7f947a675c80 true
>
>    - org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerThread
>    @ 0x7f947a5c6f20
>
> Kafka 0.10 Fetcher for Source: Custom Source -> Map -> Filter -> Map ->
> from: (ORD_ID, O_ORDERKEY_INT, O_ORDERKEY_VARCHAR, O_ORDERKEY_LONG,
> O_ORDERKEY_DOUBLE, O_CUSTKEY, O_ORDERSTATUS, O_TOTALPRICE, O_ORDERDATE,
> O_ORDERPRIORITY, O_CLERK, O_SHIPPRIORITY, O_COMMENT, O_TIMESTAMP1,
> O_TIMESTAMP2, O_DATE, O_TIMESTAMP, O_DATE_EVENT, O_TIMESTAMP_EVENT,
> OVER_TIME) -> where: (<=(O_ORDERKEY_INT, 188)), select: (O_ORDERKEY_INT AS
> A, O_ORDERKEY_LONG AS \u6c49\u5b57, O_CUSTKEY AS _AS, O_CLERK AS FRV,
> O_ORDERPRIORITY AS AHG, O_DATE AS
> SH567_7FGH\u54c8\u89e3\u6563_SH567_8FGH\u54c8\u89e3\u6563_SH567_9FGH\u54c8\u89e3\u6563_SH567_0FGH\u54c8\u89e3\u6563_SH567_1FGH\u54c8\u89e3\u6563_S)
> -> to: Row -> Map -> Sink: b26f1aec-47d2-4d4c-a9f2-fc1b10ae14be (5/8) 272
> 124,656 
> org.apache.flink.runtime.execution.librarycache.FlinkUserCodeClassLoaders$ChildFirstClassLoader
> @ 0x7f947a675c80 true
>
>    - org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerThread
>    @ 0x7f947a57d4e0
>
> Kafka 0.10 Fetcher for Source: Custom Source -> Map -> Filter -> Map ->
> from: (ORD_ID, O_ORDERKEY_INT, O_ORDERKEY_VARCHAR, O_ORDERKEY_LONG,
> O_ORDERKEY_DOUBLE, O_CUSTKEY, O_ORDERSTATUS, O_TOTALPRICE, O_ORDERDATE,
> O_ORDERPRIORITY, O_CLERK, O_SHIPPRIORITY, O_COMMENT, O_TIMESTAMP1,
> O_TIMESTAMP2, O_DATE, O_TIMESTAMP, O_DATE_EVENT, O_TIMESTAMP_EVENT,
> OVER_TIME) -> where: (<=(O_ORDERKEY_INT, 188)), select: (O_ORDERKEY_INT AS
> A, O_ORDERKEY_LONG AS \u6c49\u5b57, O_CUSTKEY AS _AS, O_CLERK AS FRV,
> O_ORDERPRIORITY AS AHG, O_DATE AS
> SH567_7FGH\u54c8\u89e3\u6563_SH567_8FGH\u54c8\u89e3\u6563_SH567_9FGH\u54c8\u89e3\u6563_SH567_0FGH\u54c8\u89e3\u6563_SH567_1FGH\u54c8\u89e3\u6563_S)
> -> to: Row -> Map -> Sink: b26f1aec-47d2-4d4c-a9f2-fc1b10ae14be (6/8) 272
> 124,656 
> org.apache.flink.runtime.execution.librarycache.FlinkUserCodeClassLoaders$ChildFirstClassLoader
> @ 0x7f947a675c80 true
>
>    - org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerThread
>    @ 0x7f947a5f8ea8
>
> Kafka 0.10 Fetcher for Source: Custom Source -> Map -> Filter -> Map ->
> from: (ORD_ID, O_ORDERKEY_INT, O_ORDERKEY_VARCHAR, O_ORDERKEY_LONG,
> O_ORDERKEY_DOUBLE, O_CUSTKEY, O_ORDERSTATUS, O_TOTALPRICE, O_ORDERDATE,
> O_ORDERPRIORITY, O_CLERK, O_SHIPPRIORITY, O_COMMENT, O_TIMESTAMP1,
> O_TIMESTAMP2, O_DATE, O_TIMESTAMP, O_DATE_EVENT, O_TIMESTAMP_EVENT,
> OVER_TIME) -> where: (LIKE(O_ORDERPRIORITY, _UTF-16LE'%I%')), select:
> (O_ORDERKEY_DOUBLE, O_ORDERSTATUS, O_ORDERPRIORITY, O_CUSTKEY, OVER_TIME)
> (1/8) 272 124,328 
> org.apache.flink.runtime.execution.librarycache.FlinkUserCodeClassLoaders$ChildFirstClassLoader
> @ 0x7f947a571878 true
>
>    - org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerThread
>    @ 0x7f93fa000540
>
> Kafka 0.10 Fetcher for Source: Custom Source -> Map -> Filter -> Map ->
> Timestamps/Watermarks -> from: (ORD_ID, PS_PARTKEY, PS_SUPPKEY,
> PS_AVAILQTY, PS_SUPPLYCOST, PS_COMMENT, PS_INT, PS_LONG, PS_DOUBLE8,
> PS_DOUBLE14, PS_DOUBLE15, PS_NUMBER1, PS_NUMBER2, PS_NUMBER3, PS_NUMBER4,
> PS_DATE, PS_TIMESTAMP, PS_DATE_EVENT, PS_TIMESTAMP_EVENT, OVER_TIME) ->
> select: (PS_PARTKEY, PS_SUPPKEY, PS_COMMENT, CAST(PS_PARTKEY) AS
> PS_PARTKEY0) (4/8) 272 124,208 
> org.apache.flink.runtime.execution.librarycache.FlinkUserCodeClassLoaders$ChildFirstClassLoader
> @ 0x7f93f98210f8 true
>
>    - org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerThread
>    @ 0x7f93fa0002a0
>
> Kafka 0.10 Fetcher for Source: Custom Source -> Map -> Filter -> Map ->
> Timestamps/Watermarks -> from: (ORD_ID, PS_PARTKEY, PS_SUPPKEY,
> PS_AVAILQTY, PS_SUPPLYCOST, PS_COMMENT, PS_INT, PS_LONG, PS_DOUBLE8,
> PS_DOUBLE14, PS_DOUBLE15, PS_NUMBER1, PS_NUMBER2, PS_NUMBER3, PS_NUMBER4,
> PS_DATE, PS_TIMESTAMP, PS_DATE_EVENT, PS_TIMESTAMP_EVENT, OVER_TIME) ->
> select: (PS_PARTKEY, PS_SUPPKEY, PS_COMMENT, CAST(PS_PARTKEY) AS
> PS_PARTKEY0) (5/8) 272 124,208 
> org.apache.flink.runtime.execution.librarycache.FlinkUserCodeClassLoaders$ChildFirstClassLoader
> @ 0x7f93f98210f8 true
>
>    - org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerThread
>    @ 0x7f93fa000000
>
> Kafka 0.10 Fetcher for Source: Custom Source -> Map -> Filter -> Map ->
> Timestamps/Watermarks -> from: (ORD_ID, PS_PARTKEY, PS_SUPPKEY,
> PS_AVAILQTY, PS_SUPPLYCOST, PS_COMMENT, PS_INT, PS_LONG, PS_DOUBLE8,
> PS_DOUBLE14, PS_DOUBLE15, PS_NUMBER1, PS_NUMBER2, PS_NUMBER3, PS_NUMBER4,
> PS_DATE, PS_TIMESTAMP, PS_DATE_EVENT, PS_TIMESTAMP_EVENT, OVER_TIME) ->
> select: (PS_PARTKEY, PS_SUPPKEY, PS_COMMENT, CAST(PS_PARTKEY) AS
> PS_PARTKEY0) (2/8) 272 124,208 
> org.apache.flink.runtime.execution.librarycache.FlinkUserCodeClassLoaders$ChildFirstClassLoader
> @ 0x7f93f98210f8 true
>
>    - org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerThread
>    @ 0x7f91e0be5598
>
> Kafka 0.10 Fetcher for Source: Custom Source -> Map -> Filter -> Map ->
> Map -> where: (LIKE(C1, _UTF-16LE'22%')), select: (C1, C2, C3, C4, C5, C6,
> C7, C8, C9, C10) -> to: Row -> Map -> Sink: cirrostream_yy_job_38 (5/8)
> 272 121,232 
> org.apache.flink.runtime.execution.librarycache.FlinkUserCodeClassLoaders$ChildFirstClassLoader
> @ 0x7f987777d010 true
>
>    - akka.dispatch.MonitorableThreadFactory$AkkaForkJoinWorkerThread @
>    0x7f91bd02dd20 »
>
> flink-akka.remote.default-remote-dispatcher-50 192 115,720 
> sun.misc.Launcher$AppClassLoader
> @ 0x7f91bc253cb0 false
>
>    - org.apache.flink.streaming.runtime.io.StreamRecordWriter$OutputFlusher
>    @ 0x7f922c650cd8 »
>
> OutputFlusher for Map -> Filter -> Map -> Map -> where: (AND(>=(_C2, 50),
> <=(_C2, 60))), select: (_C1, _C2, _C3, _C4, _C5, _C6, _C7, _C8, _C9, _C10,
> _C11, _C12, _C13, _C14, _C15) -> to: Row 200 111,048 
> sun.misc.Launcher$AppClassLoader
> @ 0x7f91bc253cb0 true
>
>    - org.apache.flink.streaming.runtime.io.StreamRecordWriter$OutputFlusher
>    @ 0x7f922a650a20 »
>
> OutputFlusher for Map -> Filter -> Map -> Map -> where: (AND(>=(_C2, 50),
> <=(_C2, 60))), select: (_C1, _C2, _C3, _C4, _C5, _C6, _C7, _C8, _C9, _C10,
> _C11, _C12, _C13, _C14, _C15) -> to: Row 200 111,048 
> sun.misc.Launcher$AppClassLoader
> @ 0x7f91bc253cb0 true
>
>    - org.apache.flink.streaming.runtime.io.StreamRecordWriter$OutputFlusher
>    @ 0x7f9227000540 »
>
> OutputFlusher for Map -> Filter -> Map -> Map -> where: (AND(>=(_C2, 50),
> <=(_C2, 60))), select: (_C1, _C2, _C3, _C4, _C5, _C6, _C7, _C8, _C9, _C10,
> _C11, _C12, _C13, _C14, _C15) -> to: Row 200 111,048 
> sun.misc.Launcher$AppClassLoader
> @ 0x7f91bc253cb0 true
>
>    - org.apache.flink.streaming.runtime.io.StreamRecordWriter$OutputFlusher
>    @ 0x7f91c1001628 »
>
> OutputFlusher for Map -> Filter -> Map -> Map -> where: (AND(>=(_C2, 50),
> <=(_C2, 60))), select: (_C1, _C2, _C3, _C4, _C5, _C6, _C7, _C8, _C9, _C10,
> _C11, _C12, _C13, _C14, _C15) -> to: Row 200 111,048 
> sun.misc.Launcher$AppClassLoader
> @ 0x7f91bc253cb0 true
>
>    - org.apache.flink.streaming.runtime.io.StreamRecordWriter$OutputFlusher
>    @ 0x7f91bd04a4d8 »
>
> OutputFlusher for Map -> Filter -> Map -> Map -> where: (AND(>=(_C2, 50),
> <=(_C2, 60))), select: (_C1, _C2, _C3, _C4, _C5, _C6, _C7, _C8, _C9, _C10,
> _C11, _C12, _C13, _C14, _C15) -> to: Row 200 111,048 
> sun.misc.Launcher$AppClassLoader
> @ 0x7f91bc253cb0 true
>
>    - org.apache.flink.streaming.runtime.io.StreamRecordWriter$OutputFlusher
>    @ 0x7f91eb000200 »
>
> OutputFlusher for Map -> Filter -> Map -> Map -> where: (AND(>=(_C2, 50),
> <=(_C2, 60))), select: (_C1, _C2, _C3, _C4, _C5, _C6, _C7, _C8, _C9, _C10,
> _C11, _C12, _C13, _C14, _C15) -> to: Row 200 110,848 
> sun.misc.Launcher$AppClassLoader
> @ 0x7f91bc253cb0 true
>
>    - org.apache.flink.streaming.runtime.io.StreamRecordWriter$OutputFlusher
>    @ 0x7f91bd039360 »
>
> OutputFlusher for Map -> Filter -> Map -> Map -> where: (AND(>=(_C2, 50),
> <=(_C2, 60))), select: (_C1, _C2, _C3, _C4, _C5, _C6, _C7, _C8, _C9, _C10,
> _C11, _C12, _C13, _C14, _C15) -> to: Row 200 110,848 
> sun.misc.Launcher$AppClassLoader
> @ 0x7f91bc253cb0 true
>
>    - sourcesink.kafka.consumer09.KafkaConsumerThread @ 0x7f91f84e9938
>
> Kafka 0.9 Fetcher for Source: Custom Source -> Map -> Filter -> Map ->
> from: (C1, C2, C3, C4, C5, C6, C7, C8, C9, C10, C11, C12, C13, C14, C15,
> C16, C17, C18, C19, C20, CIRROSTREAM_YY_SINK_37) -> select: (C1, C2, C3,
> C4, C5, C6, C7, C8, C9, C10, C11, C12, C13, C14, C15, C16, C17, C18, C19,
> C20) -> to: Row -> Map -> Sink: cirrostream_yy_job_37_6 (8/8) 272 106,120 
> org.apache.flink.runtime.execution.librarycache.FlinkUserCodeClassLoaders$ChildFirstClassLoader
> @ 0x7f91f8518d70 true
>
>    - org.apache.flink.streaming.runtime.io.StreamRecordWriter$OutputFlusher
>    @ 0x7f922c650b10 »
>
> OutputFlusher for Source: Custom Source -> Map -> Filter -> Map -> from:
> (\u6bd4__DH\u5b8c\u8d5bBUGJIBHBKAA\u4f60\u597dISDUIAAASDHUAHSH\u4e0d\u662fFFR\u6211\u7684\u54271232AA___AAAASWID___HAHAHAHA\u8c01\u53eb\u6211,
> ABC) -> where:
> (LIKE(\u6bd4__DH\u5b8c\u8d5bBUGJIBHBKAA\u4f60\u597dISDUIAAASDHUAHSH\u4e0d\u662fFFR\u6211\u7684\u54271232AA___AAAASWID___HAHAHAHA\u8c01\u53eb\u6211,
> _UTF-16LE'%117111')), select:
> (\u6bd4__DH\u5b8c\u8d5bBUGJIBHBKAA\u4f60\u597dISDUIAAASDHUAHSH\u4e0d\u662fFFR\u6211\u7684\u54271232AA___AAAASWID___HAHAHAHA\u8c01\u53eb\u6211)
> -> to: Row 200 103,048 sun.misc.Launcher$AppClassLoader @ 0x7f91bc253cb0
> true
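>
> (For reference, a TaskManager heap dump like the one above can typically be
> captured with the JDK's jmap tool, for example:
>
>     jmap -dump:live,format=b,file=taskmanager.hprof <taskmanager-pid>
>
> The dump file name and PID here are only placeholders; the resulting .hprof
> file can then be opened in Eclipse Memory Analyzer as was done above.)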
>
>
>
>
> ------------------------------
> pengcheng...@bonc.com.cn
>
