Actually, I was mistaken. Both server and client are using the same protos.
I have checked again and again, even diffing with Beyond Compare, and yes,
the protos are definitely identical. I have regenerated both the C++ and
Python code, but I still cannot solve this problem.
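For reference, the Python side was regenerated with grpcio-tools roughly
along these lines (the proto path and output directory below are
placeholders, not my actual layout):

    # Sketch of the Python regeneration step; "protos/myservice.proto"
    # and "gen" are placeholders for my real paths.
    from grpc_tools import protoc

    protoc.main([
        "grpc_tools.protoc",
        "-Iprotos",
        "--python_out=gen",
        "--grpc_python_out=gen",
        "protos/myservice.proto",
    ])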
I turned on tracing on the server, using gRPC's standard environment
variables, but the output doesn't really give any insight.
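A minimal sketch of how the tracing was enabled, assuming the server
process picks the variables up at startup (shown for Python, where they
must be set before grpc is first imported):

    import os

    # The C core reads these at initialization, so set them before
    # the grpc module is imported.
    os.environ["GRPC_VERBOSITY"] = "DEBUG"
    os.environ["GRPC_TRACE"] = "all"

    import grpc

The server trace: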
myserver_1 | E1110 03:29:44.440736118 45 lockfree_event.c:86] lfev_notify_on[read]: 0x7e3d4510 curr=(nil) closure=0x7e3d46ac
myserver_1 | E1110 03:29:50.048372285 47 lockfree_event.c:202] lfev_set_ready[read]: 0x7e3d4510 curr=0x7e3d46ac
myserver_1 | E1110 03:29:50.048576913 47 lockfree_event.c:86] lfev_notify_on[read]: 0x7e3d4510 curr=(nil) closure=0x7e3d46ac
myserver_1 | E1110 03:29:50.048854967 47 lockfree_event.c:86] lfev_notify_on[read]: 0x7bf007b0 curr=(nil) closure=0x7bf00960
myserver_1 | E1110 03:29:50.049348128 47 lockfree_event.c:202] lfev_set_ready[write]: 0x7bf007b4 curr=(nil)
myserver_1 | E1110 03:29:50.049500737 47 lockfree_event.c:202] lfev_set_ready[read]: 0x7bf007b0 curr=0x7bf00960
myserver_1 | E1110 03:29:50.049521129 47 lockfree_event.c:202] lfev_set_ready[write]: 0x7bf007b4 curr=0x2
myserver_1 | E1110 03:29:50.049655975 47 lockfree_event.c:86] lfev_notify_on[read]: 0x7bf007b0 curr=(nil) closure=0x7bf00960
myserver_1 | E1110 03:29:50.049876094 47 lockfree_event.c:202] lfev_set_ready[read]: 0x7bf007b0 curr=0x7bf00960
myserver_1 | E1110 03:29:50.049898436 47 lockfree_event.c:202] lfev_set_ready[write]: 0x7bf007b4 curr=0x2
myserver_1 | E1110 03:29:50.050109911 47 lockfree_event.c:86] lfev_notify_on[read]: 0x7bf007b0 curr=(nil) closure=0x7bf00960
myserver_1 | E1110 03:29:50.272122346 47 lockfree_event.c:202] lfev_set_ready[read]: 0x7bf007b0 curr=0x7bf00960
myserver_1 | E1110 03:29:50.272167534 47 lockfree_event.c:202] lfev_set_ready[write]: 0x7bf007b4 curr=0x2
myserver_1 | E1110 03:29:50.272284180 47 lockfree_event.c:152] lfev_set_shutdown: 0x7bf007b0 curr=(nil) err={"created":"@1510284590.272251275","description":"Endpoint read failed","file":"src/core/ext/transport/chttp2/transport/chttp2_transport.c","file_line":2320,"grpc_status":14,"occurred_during_write":0,"referenced_errors":[{"created":"@1510284590.272214992","description":"Socket closed","fd":15,"file":"src/core/lib/iomgr/tcp_posix.c","file_line":285,"target_address":"ipv4:10.5.0.1:44206"}]}
myserver_1 | E1110 03:29:50.272395493 47 lockfree_event.c:152] lfev_set_shutdown: 0x7bf007b4 curr=0x2 err={"created":"@1510284590.272251275","description":"Endpoint read failed","file":"src/core/ext/transport/chttp2/transport/chttp2_transport.c","file_line":2320,"grpc_status":14,"occurred_during_write":0,"referenced_errors":[{"created":"@1510284590.272214992","description":"Socket closed","fd":15,"file":"src/core/lib/iomgr/tcp_posix.c","file_line":285,"target_address":"ipv4:10.5.0.1:44206"}]}
On the client, the trace is vaguely more insightful, but not by much:
I1110 16:38:27.564608732 21622 parsing.c:496] HTTP:1:TRL:CLI: :status: 32 30 30 '200'
I1110 16:38:27.564625831 21622 parsing.c:496] HTTP:1:TRL:CLI: content-type: 61 70 70 6c 69 63 61 74 69 6f 6e 2f 67 72 70 63 'application/grpc'
D1110 16:38:27.564640069 21622 hpack_parser.c:656] Decode: 'grpc-status: 13', elem_interned=0 [2], k_interned=1, v_interned=0
I1110 16:38:27.564650384 21622 parsing.c:496] HTTP:1:TRL:CLI: grpc-status: 31 33 '13'
D1110 16:38:27.564662356 21622 hpack_parser.c:656] Decode: 'grpc-message: Did not read entire message', elem_interned=0 [2], k_interned=1, v_interned=0
I1110 16:38:27.564673985 21622 parsing.c:496] HTTP:1:TRL:CLI: grpc-message: 44 69 64 20 6e 6f 74 20 72 65 61 64 20 65 6e 74 69 72 65 20 6d 65 73 73 61 67 65 'Did not read entire message'
D1110 16:38:27.564732205 21622 chttp2_transport.c:1113] complete_closure_step: 0x2aa1040 refs=0 flags=0x0003 desc=recv_trailing_metadata_finished err="No Error"
D1110 16:38:27.564755512 21622 call.c:695] get_final_status CLI
D1110 16:38:27.564780355 21622 call.c:698] 1: {"created":"@1510285107.564752594","description":"Error received from peer","file":"src/core/lib/surface/call.c","file_line":989,"grpc_message":"Did not read entire message","grpc_status":13}
I1110 16:38:27.564796452 21622 completion_queue.c:620] cq_end_op_for_next(exec_ctx=0x7ffd25e94350, cq=0x2a972d0, tag=0x7fbb362c32b8, error="No Error", done=0x7fbb39e26c90, done_arg=0x2a98648, storage=0x2a98650)
D1110 16:38:27.596187511 21622 combiner.c:284] C:0x2a9def0 finish old_state=3
I1110 16:38:27.596231238 21622 completion_queue.c:937] RETURN_EVENT[0x2a972d0]: OP_COMPLETE: tag:0x7fbb362c32b8 OK
I1110 16:38:27.596247961 21622 metadata_array.c:27] grpc_metadata_array_init(array=0x7ffd25e94260)
I1110 16:38:27.596255930 21622 metadata_array.c:32] grpc_metadata_array_destroy(array=0x7fbb3731d348)
I1110 16:38:27.596262829 21622 metadata_array.c:27] grpc_metadata_array_init(array=0x7ffd25e94260)
I1110 16:38:27.596267641 21622 metadata_array.c:32] grpc_metadata_array_destroy(array=0x7fbb3731d378)
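For completeness, this is how the failure surfaces programmatically in the
Python client; a minimal sketch, where MyServiceStub, GetThing, and the
myservice_pb2* modules are placeholders for my actual generated code:

    import grpc

    # Placeholders for the real generated modules.
    import myservice_pb2
    import myservice_pb2_grpc

    channel = grpc.insecure_channel("localhost:50051")
    stub = myservice_pb2_grpc.MyServiceStub(channel)
    try:
        stub.GetThing(myservice_pb2.GetThingRequest())
    except grpc.RpcError as err:
        # Matches the trace above: StatusCode.INTERNAL (13) and
        # "Did not read entire message".
        print(err.code(), err.details())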