This is an RPC failure. You can check the be.INFO log on backend 10002 to see whether there is a corresponding error message there.
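As a sketch of what that log check could look like (the log path, file names, and log lines below are assumptions for illustration only; BE logs use the glog format, where each line starts with a severity letter followed by MMDD and a timestamp):

```shell
# Hypothetical be.INFO fragment; on a real backend the log usually lives
# under the BE deployment's log directory (path is deployment-dependent).
cat > /tmp/be.INFO.sample <<'EOF'
I0822 14:36:59.000001 1234 routine_load_task_executor.cpp:101] submit task ok
W0822 14:37:07.123456 1234 internal_service.cpp:88] submit_routine_load_task slow
E0822 14:37:09.654321 1234 data_consumer.cpp:55] failed to get partition meta
EOF

# The FE reported the failure at 2019-08-22 14:37, so pull the BE lines
# from that same minute (glog prefix: <severity><MMDD> <HH:MM:SS.usec>):
grep -E '^[IWEF]0822 14:37' /tmp/be.INFO.sample
```

On a live cluster you would run the same kind of `grep` against the real be.INFO on backend 10002, and you can check the heartbeat and version prerequisites mentioned above from the MySQL client with `SHOW BACKENDS;` (the exact columns reported vary by Doris version).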
A prerequisite is that the FE and BE versions are consistent and the connection is healthy (heartbeat normal).

> On Aug 23, 2019, at 7:33 PM, Katte <[email protected]> wrote:
>
> Hello!
>
> I created a job that reads data from Kafka, but the log keeps reporting read timeouts. Please help, thank you!
>
> 2019-08-22 14:37:07,351 WARN 30 [RoutineLoadTaskScheduler.submitBatchTasksIfNotEmpty():201] task send error. backend[10002]
> org.apache.thrift.transport.TTransportException: java.net.SocketTimeoutException: Read timed out
>     at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:129) ~[libthrift-0.9.3.jar:0.9.3]
>     at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86) ~[libthrift-0.9.3.jar:0.9.3]
>     at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429) ~[libthrift-0.9.3.jar:0.9.3]
>     at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318) ~[libthrift-0.9.3.jar:0.9.3]
>     at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219) ~[libthrift-0.9.3.jar:0.9.3]
>     at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77) ~[libthrift-0.9.3.jar:0.9.3]
>     at org.apache.doris.thrift.BackendService$Client.recv_submit_routine_load_task(BackendService.java:627) ~[palo-fe.jar:?]
>     at org.apache.doris.thrift.BackendService$Client.submit_routine_load_task(BackendService.java:614) ~[palo-fe.jar:?]
>     at org.apache.doris.load.routineload.RoutineLoadTaskScheduler.submitBatchTasksIfNotEmpty(RoutineLoadTaskScheduler.java:194) [palo-fe.jar:?]
>     at org.apache.doris.load.routineload.RoutineLoadTaskScheduler.process(RoutineLoadTaskScheduler.java:96) [palo-fe.jar:?]
>     at org.apache.doris.load.routineload.RoutineLoadTaskScheduler.runOneCycle(RoutineLoadTaskScheduler.java:84) [palo-fe.jar:?]
>     at org.apache.doris.common.util.Daemon.run(Daemon.java:108) [palo-fe.jar:?]
> Caused by: java.net.SocketTimeoutException: Read timed out
>     at java.net.SocketInputStream.socketRead0(Native Method) ~[?:1.8.0_211]
>     at java.net.SocketInputStream.socketRead(SocketInputStream.java:116) ~[?:1.8.0_211]
>     at java.net.SocketInputStream.read(SocketInputStream.java:171) ~[?:1.8.0_211]
>     at java.net.SocketInputStream.read(SocketInputStream.java:141) ~[?:1.8.0_211]
>     at java.io.BufferedInputStream.fill(BufferedInputStream.java:246) ~[?:1.8.0_211]
>     at java.io.BufferedInputStream.read1(BufferedInputStream.java:286) ~[?:1.8.0_211]
>     at java.io.BufferedInputStream.read(BufferedInputStream.java:345) ~[?:1.8.0_211]
>     at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:127) ~[libthrift-0.9.3.jar:0.9.3]
>     ... 11 more
>
> My job is as follows:
>
> # create job
> CREATE ROUTINE LOAD futurmaster.job05 ON tb01
> COLUMNS TERMINATED BY ",",
> COLUMNS (prod,type,channel,finish_date,value_1,value_2,value_3,value_4,value_5,value_6,value_7,value_8,value_9,value_10,value_11,value_12,value_13,value_14,value_15,value_16,value_17,value_18,value_19,value_20,value_21,value_22,value_23,value_24,value_25,value_26,value_27,value_28,value_29,value_30,value_31,value_32,value_33,value_34,value_35,value_36,value_37,value_38,value_39,value_40,value_41,value_42,value_43,value_44,value_45,value_46,value_47,value_48,value_49,value_50,value_51,value_52,value_53,value_54,value_55,value_56,value_57,value_58,value_59,value_60,value_61,value_62,value_63,value_64,value_65,value_66,value_67,value_68,value_69,value_70,value_71,value_72,value_73,value_74,value_75,value_76,value_77,value_78,value_79,value_80,value_81,value_82,value_83,value_84,value_85,value_86,value_87,value_88,value_89,value_90,value_91,value_92,value_93,value_94,value_95,value_96,value_97,value_98,value_99,value_100,value_101,value_102,value_103,value_104,value_105,value_106,prod_name)
> PROPERTIES
> (
> "desired_concurrent_number"="2",
> "max_batch_interval" = "20",
> "max_batch_rows" = "300000",
> "max_batch_size" = "209715200"
> )
> FROM KAFKA
> (
> "kafka_broker_list" = "master:9092",
> "kafka_topic" = "T1_A3_doris"
> );
>
> Sent from Mail for Windows 10

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
