Right after responding to this, I figured out my issue. The URL I was using
had `tcp://` prepended to it and that was messing up the gRPC client.
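
For anyone who lands here with the same symptom: the gRPC C-core only recognizes a handful of target schemes (`dns:`, `unix:`, `ipv4:`, `ipv6:`); anything else, like a `tcp://` prefix, is passed to the resolver verbatim and never resolves. A minimal sketch of guarding against that before creating the channel (Python; the helper name and the exact list of prefixes to strip are my own, not from gRPC):

```python
def normalize_grpc_target(target: str) -> str:
    """Strip URI prefixes that the gRPC C-core resolver does not understand.

    gRPC expects either a plain host:port or one of its own schemes
    (dns:, unix:, ipv4:, ipv6:). A stray tcp:// (or http://) prefix is
    handed to getaddrinfo verbatim and fails to resolve.
    """
    for prefix in ("tcp://", "http://", "https://"):
        if target.startswith(prefix):
            return target[len(prefix):]
    return target


# e.g. normalize_grpc_target("tcp://myservice.default.svc.cluster.local:50051")
# returns "myservice.default.svc.cluster.local:50051"
```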

On Mon, Oct 23, 2017 at 1:27 PM, Mark Grimes <[email protected]> wrote:

> No, I moved on to other things and never came back to it.  I read
> somewhere since then that Alpine images have problems with DNS resolution
> inside Kubernetes clusters, which I suspect is the same issue.
>
> A quick Google search for a reference for that turned up
> https://blog.maio.me/alpine-kubernetes-dns/ which may help.
>
> On 23 October 2017 at 16:35, <[email protected]> wrote:
>
>> Did you ever find a resolution to this?
>>
>>
>> On Friday, April 14, 2017 at 10:34:06 AM UTC-5, Mark Grimes wrote:
>>
>>> I rebuilt everything and tried your suggestion, but still get the same
>>> problem.  The trace is below.  Note that since the first post I've
>>> implemented TLS credentials, so the output will be slightly different, but
>>> those differences should be irrelevant.
>>>
>>> Thanks,
>>>
>>> Mark.
>>>
>>> kubectl run -it --rm --image=myimage --env="GRPC_TRACE=all"
>>> --env="GRPC_VERBOSITY=DEBUG" testclient --command /bin/sh
>>>
>>> > export GRPC_DNS_RESOLVER=ares
>>>
>>> > ping myservice.default.svc.cluster.local
>>> PING myservice.default.svc.cluster.local (10.0.0.27): 56 data bytes
>>> 64 bytes from 10.0.0.27: seq=0 ttl=64 time=0.072 ms
>>> 64 bytes from 10.0.0.27: seq=1 ttl=64 time=0.051 ms
>>> 64 bytes from 10.0.0.27: seq=2 ttl=64 time=0.093 ms
>>> ^C
>>> --- myservice.default.svc.cluster.local ping statistics ---
>>> 3 packets transmitted, 3 packets received, 0% packet loss
>>> round-trip min/avg/max = 0.051/0.072/0.093 ms
>>>
>>> > myapp -c --verify certChain.pem myservice.default.svc.cluster.local:50051
>>> Attempting to connect to myservice.default.svc.cluster.local:50051 to
>>> run a listing
>>> I0414 13:58:16.368200643      53 ev_epoll_linux.c:93]        epoll
>>> engine will be using signal: 40
>>> D0414 13:58:16.368613539      53 ev_posix.c:105]             Using
>>> polling engine: epoll
>>> I0414 13:58:16.368988761      53 init.c:224]
>>> grpc_init(void)
>>> I0414 13:58:16.369249843      53 completion_queue.c:137]
>>> grpc_completion_queue_create(reserved=(nil))
>>> I0414 13:58:16.369850453      52 init.c:224]
>>> grpc_init(void)
>>> I0414 13:58:16.370177696      52 ssl_credentials.c:129]
>>>  grpc_ssl_credentials_create(pem_root_certs=-----BEGIN CERTIFICATE-----
>>> MIIDKDCCApGgAwIBAgIJAJuzbK6eGlXMMA0GCSqGSIb3DQEBBQUAMGwxCzAJBgNV
>>> BAYTAkFVMRMwEQYDVQQIEwpTb21lLVN0YXRlMSMwIQYDVQQKExpUZXN0IENlcnRp
>>> ZmljYXRlIEF1dGhvcml0eTEjMCEGA1UEAxMaVGVzdCBDZXJ0aWZpY2F0ZSBBdXRo
>>> b3JpdHkwHhcNMTUwMjIyMjM0NTI1WhcNMjUwMjE5MjM0NTI1WjBsMQswCQYDVQQG
>>> EwJBVTETMBEGA1UECBMKU29tZS1TdGF0ZTEjMCEGA1UEChMaVGVzdCBDZXJ0aWZp
>>> Y2F0ZSBBdXRob3JpdHkxIzAhBgNVBAMTGlRlc3QgQ2VydGlmaWNhdGUgQXV0aG9y
>>> aXR5MIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQDhPJZAxch5dyQBpOtAd+si
>>> VlRbPBSC2z7mUOmuF12+6/AMVT9iQjrIqNSiu5odz+alynNSHnO9N0b0GoBq3i9K
>>> qYMll4SFKbVi/Bw2bk8U/Sr+Eih8x6YtaPkQ1vA+xLv90HrzL1K/6ZDg3ECnUJpj
>>> jSprk6hYFYsPGJmRcfImXwIDAQABo4HRMIHOMB0GA1UdDgQWBBS8hNSp+aIX71C6
>>> hVuTNrcy8U9YFjCBngYDVR0jBIGWMIGTgBS8hNSp+aIX71C6hVuTNrcy8U9YFqFw
>>> pG4wbDELMAkGA1UEBhMCQVUxEzARBgNVBAgTClNvbWUtU3RhdGUxIzAhBgNVBAoT
>>> GlRlc3QgQ2VydGlmaWNhdGUgQXV0aG9yaXR5MSMwIQYDVQQDExpUZXN0IENlcnRp
>>> ZmljYXRlIEF1dGhvcml0eYIJAJuzbK6eGlXMMAwGA1UdEwQFMAMBAf8wDQYJKoZI
>>> hvcNAQEFBQADgYEAvmudqofZ9WbCxf4uMTFdcF4Ak/nGG9Ie0i2vgZ6GjbdQVfxN
>>> IR7gzfGXuzNhgZkVfNLhZkvcmYoj7a/VYJcyxbOESZqjk4ZrmINP+e3852Y4yfyf
>>> z7shjjgV0+Dl5tC3ZrAd9FphbJno84aCWlS4KTpMSsJwa25xLLpAn4dkOfM=
>>> -----END CERTIFICATE-----
>>> , pem_key_cert_pair=(nil), reserved=(nil))
>>> I0414 13:58:16.374920331      52 init.c:224]
>>> grpc_init(void)
>>> I0414 13:58:16.375177843      52 init.c:229]
>>> grpc_shutdown(void)
>>> I0414 13:58:16.375475302      52 secure_channel_create.c:100]
>>> grpc_secure_channel_create(creds=0x12a6360,
>>> target=myservice.default.svc.cluster.local:50051, args=0x7ffdac139a60,
>>> reserved=(nil))
>>> I0414 13:58:16.376254751      52 init.c:224]
>>> grpc_init(void)
>>> I0414 13:58:16.376616323      52 channel.c:277]
>>>  grpc_channel_register_call(channel=0x12b12e0,
>>> method=/grpcendpoint.Endpoint/endpointVersion, host=(null),
>>> reserved=(nil))
>>> I0414 13:58:16.376902331      52 channel.c:277]
>>>  grpc_channel_register_call(channel=0x12b12e0,
>>> method=/grpcendpoint.Endpoint/listCollections, host=(null),
>>> reserved=(nil))
>>> I0414 13:58:16.377145645      52 channel.c:277]
>>>  grpc_channel_register_call(channel=0x12b12e0,
>>> method=/grpcendpoint.Endpoint/getCollection, host=(null),
>>> reserved=(nil))
>>> I0414 13:58:16.377479987      52 credentials.c:94]
>>> grpc_channel_credentials_release(creds=0x12a6360)
>>> I0414 13:58:16.377739511      52 init.c:229]
>>> grpc_shutdown(void)
>>> I0414 13:58:16.378190183      52 channel.c:308]
>>>  grpc_channel_create_registered_call(channel=0x12b12e0,
>>> parent_call=(nil), propagation_mask=ffff, completion_queue=0x7f2404001ce0,
>>> registered_call_handle=0x12acca0, deadline=gpr_timespec { tv_sec:
>>> 9223372036854775807, tv_nsec: 0, clock_type: 1 }, reserved=(nil))
>>> I0414 13:58:16.378480207      52 grpc_context.c:41]
>>>  grpc_census_call_set_context(call=0x12b19a0, census_context=(nil))
>>> I0414 13:58:16.378985768      52 call.c:1708]
>>>  grpc_call_start_batch(call=0x12b19a0, ops=0x7ffdac1397e0, nops=3,
>>> tag=0x12a71f8, reserved=(nil))
>>> I0414 13:58:16.379254122      52 call.c:1364]                ops[0]:
>>> SEND_INITIAL_METADATA
>>> I0414 13:58:16.379508782      52 call.c:1364]                ops[1]:
>>> SEND_MESSAGE ptr=0x12a6dd0
>>> I0414 13:58:16.379743213      52 call.c:1364]                ops[2]:
>>> SEND_CLOSE_FROM_CLIENT
>>> I0414 13:58:16.380039441      52 client_channel.c:882]
>>> OP[client-channel:0x12b2458]: [COVERED] SEND_INITIAL_METADATA{key=3a 70 61
>>> 74 68 ':path' value=2f 67 72 70 63 65 6e 64 70 6f 69 6e 74 2e 45 6e 64 70
>>> 6f 69 6e 74 2f 6c 69 73 74 43 6f 6c 6c 65 63 74 69 6f 6e 73
>>> '/grpcendpoint.Endpoint/listCollections', key=3a 61 75 74 68 6f 72 69
>>> 74 79 ':authority' value=74 65 6c 65 68 69 73 74 2e 64 65 66 61 75 6c 74 2e
>>> 73 76 63 2e 63 6c 75 73 74 65 72 2e 6c 6f 63 61 6c 3a 35 30 30 35 31
>>> 'myservice.default.svc.cluster.local:50051', key=67 72 70 63 2d 65 6e
>>> 63 6f 64 69 6e 67 'grpc-encoding' value=69 64 65 6e 74 69 74 79 'identity',
>>> key=67 72 70 63 2d 61 63 63 65 70 74 2d 65 6e 63 6f 64 69 6e 67
>>> 'grpc-accept-encoding' value=69 64 65 6e 74 69 74 79 2c 64 65 66 6c 61 74
>>> 65 2c 67 7a 69 70 'identity,deflate,gzip'} 
>>> SEND_MESSAGE:flags=0x00000000:len=0
>>> SEND_TRAILING_METADATA{}
>>> I0414 13:58:16.380528137      52 call.c:1708]
>>>  grpc_call_start_batch(call=0x12b19a0, ops=0x7ffdac139820, nops=3,
>>> tag=0x12a7298, reserved=(nil))
>>> I0414 13:58:16.380786273      52 call.c:1364]                ops[0]:
>>> RECV_INITIAL_METADATA ptr=0x12a72c0
>>> I0414 13:58:16.381093046      52 call.c:1364]                ops[1]:
>>> RECV_MESSAGE ptr=0x12a72e8
>>> I0414 13:58:16.381368604      52 call.c:1364]                ops[2]:
>>> RECV_STATUS_ON_CLIENT metadata=0x12a7308 status=0x12a7320 details=0x12a7328
>>> I0414 13:58:16.381607886      52 client_channel.c:882]
>>> OP[client-channel:0x12b2458]: [COVERED] RECV_INITIAL_METADATA RECV_MESSAGE
>>> RECV_TRAILING_METADATA
>>> I0414 13:58:16.381897570      53 completion_queue.c:389]
>>> grpc_completion_queue_next(cc=0x7f2404001ce0, deadline=gpr_timespec {
>>> tv_sec: 9223372036854775807, tv_nsec: 0, clock_type: 1 }, reserved=(nil))
>>> I0414 13:58:16.382586568      54 dns_resolver.c:192]         dns
>>> resolution failed (will retry): 
>>> {"created":"@1492178296.382558164","description":"OS
>>> Error","errno":-2,"file":"src/core/lib/iomgr/resolve_address
>>> _posix.c","file_line":115,"os_error":"Name or service not
>>> known","syscall":"getaddrinfo","target_address":"myservice.d
>>> efault.svc.cluster.local:50051"}
>>> D0414 13:58:16.382915126      54 dns_resolver.c:198]         retrying in
>>> 1.000000000 seconds
>>> D0414 13:58:16.383721741      54 connectivity_state.c:163]   SET:
>>> 0x12b1448 client_channel: IDLE --> TRANSIENT_FAILURE [new_lb+resolver]
>>> error=0x7f23fc000ba0 {"created":"@1492178296.383708704","description":"No
>>> load balancing policy","file":"src/core/ext/c
>>> lient_channel/client_channel.c","file_line":273}
>>> I0414 13:58:17.383541404      55 dns_resolver.c:192]         dns
>>> resolution failed (will retry): 
>>> {"created":"@1492178297.383519265","description":"OS
>>> Error","errno":-2,"file":"src/core/lib/iomgr/resolve_address
>>> _posix.c","file_line":115,"os_error":"Name or service not
>>> known","syscall":"getaddrinfo","target_address":"myservice.d
>>> efault.svc.cluster.local:50051"}
>>> D0414 13:58:17.384161567      55 dns_resolver.c:198]         retrying in
>>> 1.000000000 seconds
>>> D0414 13:58:17.384474818      55 connectivity_state.c:163]   SET:
>>> 0x12b1448 client_channel: TRANSIENT_FAILURE --> TRANSIENT_FAILURE
>>> [new_lb+resolver] error=0x7f23fc000c00 
>>> {"created":"@1492178297.384464441","description":"No
>>> load balancing policy","file":"src/core/ext/c
>>> lient_channel/client_channel.c","file_line":273}
>>> I0414 13:58:18.384748114      56 dns_resolver.c:192]         dns
>>> resolution failed (will retry): 
>>> {"created":"@1492178298.384725600","description":"OS
>>> Error","errno":-2,"file":"src/core/lib/iomgr/resolve_address
>>> _posix.c","file_line":115,"os_error":"Name or service not
>>> known","syscall":"getaddrinfo","target_address":"myservice.d
>>> efault.svc.cluster.local:50051"}
>>> D0414 13:58:18.384965866      56 dns_resolver.c:198]         retrying in
>>> 1.000000000 seconds
>>> D0414 13:58:18.385023236      56 connectivity_state.c:163]   SET:
>>> 0x12b1448 client_channel: TRANSIENT_FAILURE --> TRANSIENT_FAILURE
>>> [new_lb+resolver] error=0x7f23fc000ba0 
>>> {"created":"@1492178298.385013911","description":"No
>>> load balancing policy","file":"src/core/ext/c
>>> lient_channel/client_channel.c","file_line":273}
>>> I0414 13:58:19.385829174      57 dns_resolver.c:192]         dns
>>> resolution failed (will retry): 
>>> {"created":"@1492178299.385807307","description":"OS
>>> Error","errno":-2,"file":"src/core/lib/iomgr/resolve_address
>>> _posix.c","file_line":115,"os_error":"Name or service not
>>> known","syscall":"getaddrinfo","target_address":"myservice.d
>>> efault.svc.cluster.local:50051"}
>>> D0414 13:58:19.386378656      57 dns_resolver.c:198]         retrying in
>>> 1.000000000 seconds
>>> D0414 13:58:19.386736068      57 connectivity_state.c:163]   SET:
>>> 0x12b1448 client_channel: TRANSIENT_FAILURE --> TRANSIENT_FAILURE
>>> [new_lb+resolver] error=0x7f23fc000d00 
>>> {"created":"@1492178299.386711433","description":"No
>>> load balancing policy","file":"src/core/ext/c
>>> lient_channel/client_channel.c","file_line":273}
>>> I0414 13:58:20.387789525      58 dns_resolver.c:192]         dns
>>> resolution failed (will retry): 
>>> {"created":"@1492178300.387767146","description":"OS
>>> Error","errno":-2,"file":"src/core/lib/iomgr/resolve_address
>>> _posix.c","file_line":115,"os_error":"Name or service not
>>> known","syscall":"getaddrinfo","target_address":"myservice.d
>>> efault.svc.cluster.local:50051"}
>>> D0414 13:58:20.388320888      58 dns_resolver.c:198]         retrying in
>>> 1.000000000 seconds
>>> D0414 13:58:20.388615388      58 connectivity_state.c:163]   SET:
>>> 0x12b1448 client_channel: TRANSIENT_FAILURE --> TRANSIENT_FAILURE
>>> [new_lb+resolver] error=0x7f23fc000d60 
>>> {"created":"@1492178300.388604683","description":"No
>>> load balancing policy","file":"src/core/ext/c
>>> lient_channel/client_channel.c","file_line":273}
>>> I0414 13:58:21.388326313      59 dns_resolver.c:192]         dns
>>> resolution failed (will retry): 
>>> {"created":"@1492178301.388301905","description":"OS
>>> Error","errno":-2,"file":"src/core/lib/iomgr/resolve_address
>>> _posix.c","file_line":115,"os_error":"Name or service not
>>> known","syscall":"getaddrinfo","target_address":"myservice.d
>>> efault.svc.cluster.local:50051"}
>>> D0414 13:58:21.389040345      59 dns_resolver.c:198]         retrying in
>>> 1.000000000 seconds
>>> D0414 13:58:21.389322099      59 connectivity_state.c:163]   SET:
>>> 0x12b1448 client_channel: TRANSIENT_FAILURE --> TRANSIENT_FAILURE
>>> [new_lb+resolver] error=0x7f23fc0011d0 
>>> {"created":"@1492178301.389311229","description":"No
>>> load balancing policy","file":"src/core/ext/c
>>> lient_channel/client_channel.c","file_line":273}
>>> I0414 13:58:22.388716129      60 dns_resolver.c:192]         dns
>>> resolution failed (will retry): 
>>> {"created":"@1492178302.388693261","description":"OS
>>> Error","errno":-2,"file":"src/core/lib/iomgr/resolve_address
>>> _posix.c","file_line":115,"os_error":"Name or service not
>>> known","syscall":"getaddrinfo","target_address":"myservice.d
>>> efault.svc.cluster.local:50051"}
>>> D0414 13:58:22.389463664      60 dns_resolver.c:198]         retrying in
>>> 1.000000000 seconds
>>> D0414 13:58:22.389888415      60 connectivity_state.c:163]   SET:
>>> 0x12b1448 client_channel: TRANSIENT_FAILURE --> TRANSIENT_FAILURE
>>> [new_lb+resolver] error=0x7f23fc000c00 
>>> {"created":"@1492178302.389876717","description":"No
>>> load balancing policy","file":"src/core/ext/c
>>> lient_channel/client_channel.c","file_line":273}
>>> I0414 13:58:23.389602658      61 dns_resolver.c:192]         dns
>>> resolution failed (will retry): 
>>> {"created":"@1492178303.389580245","description":"OS
>>> Error","errno":-2,"file":"src/core/lib/iomgr/resolve_address
>>> _posix.c","file_line":115,"os_error":"Name or service not
>>> known","syscall":"getaddrinfo","target_address":"myservice.d
>>> efault.svc.cluster.local:50051"}
>>> D0414 13:58:23.390165421      61 dns_resolver.c:198]         retrying in
>>> 1.000000000 seconds
>>> D0414 13:58:23.390466294      61 connectivity_state.c:163]   SET:
>>> 0x12b1448 client_channel: TRANSIENT_FAILURE --> TRANSIENT_FAILURE
>>> [new_lb+resolver] error=0x7f23fc000e50 
>>> {"created":"@1492178303.390455437","description":"No
>>> load balancing policy","file":"src/core/ext/c
>>> lient_channel/client_channel.c","file_line":273}
>>> I0414 13:58:24.389912635      62 dns_resolver.c:192]         dns
>>> resolution failed (will retry): 
>>> {"created":"@1492178304.389883555","description":"OS
>>> Error","errno":-2,"file":"src/core/lib/iomgr/resolve_address
>>> _posix.c","file_line":115,"os_error":"Name or service not
>>> known","syscall":"getaddrinfo","target_address":"myservice.d
>>> efault.svc.cluster.local:50051"}
>>> D0414 13:58:24.389948542      62 dns_resolver.c:198]         retrying in
>>> 1.000000000 seconds
>>> D0414 13:58:24.389964969      62 connectivity_state.c:163]   SET:
>>> 0x12b1448 client_channel: TRANSIENT_FAILURE --> TRANSIENT_FAILURE
>>> [new_lb+resolver] error=0x7f23fc000d60 
>>> {"created":"@1492178304.389957359","description":"No
>>> load balancing policy","file":"src/core/ext/c
>>> lient_channel/client_channel.c","file_line":273}
>>> ^C
>>>
>>> On 13 April 2017 at 22:25, Yuchen Zeng <[email protected]> wrote:
>>>
>>>> Currently gRPC uses getaddrinfo to do name resolution. I'm
>>>> investigating why ping can resolve the name of the service but getaddrinfo
>>>> cannot.
>>>> Could you please also try it with --env="GRPC_DNS_RESOLVER=ares" on
>>>> master? It will use a c-ares-based resolver instead of getaddrinfo.
>>>>
>>>> On Sunday, April 9, 2017 at 3:27:38 PM UTC-7, Mark Grimes wrote:
>>>>>
>>>>> Yes, I tried that and get the same problem.  Pinging the fully
>>>>> qualified name works fine (so I know it is the correct long name).
>>>>>
>>>>> On 7 April 2017 at 13:22, Jan Tattermusch <[email protected]>
>>>>> wrote:
>>>>>
>>>>>> Out of curiosity, have you tried with the long DNS name of the
>>>>>> service?
>>>>>> (I think that's something like myservice.default.svc.cluster.local)
>>>>>>
>>>>>> On Wednesday, March 15, 2017 at 6:37:30 PM UTC+1, [email protected]
>>>>>> wrote:
>>>>>>>
>>>>>>> Hi,
>>>>>>> This is about the interplay of gRPC and Kubernetes, but I'm pretty
>>>>>>> sure the problem is with the gRPC name resolution.
>>>>>>> I have a gRPC application running on a Kubernetes cluster, exposed
>>>>>>> using a headless service.  For those who don't know, this means (as I
>>>>>>> understand it) that Kubernetes sets up a DNS entry pointing to the
>>>>>>> application container.  If I then try to run the client in the same
>>>>>>> cluster, it hangs during name resolution of the Kubernetes service.
>>>>>>> Connecting directly to the IP address of the service works fine.
>>>>>>> E.g. with the server already running and listening on port 50051
>>>>>>>
>>>>>>> kubectl run -it --rm --image=myimage --env="GRPC_TRACE=all"
>>>>>>> --env="GRPC_VERBOSITY=DEBUG" testclient --command /bin/sh
>>>>>>>
>>>>>>> > ping myservice
>>>>>>> PING myservice (10.0.0.13): 56 data bytes
>>>>>>> 64 bytes from 10.0.0.13: seq=0 ttl=64 time=0.121 ms
>>>>>>> (etc; service name resolves fine)
>>>>>>>
>>>>>>> > myapp -c 10.0.0.13:50051  # connect as client to the given server
>>>>>>> and port using the ip address resolved by ping
>>>>>>> (loads of debug output, but I get the expected response and
>>>>>>> everything works fine)
>>>>>>>
>>>>>>> > myapp -c myservice:50051
>>>>>>> (hangs indefinitely trying to resolve the name, full output below)
>>>>>>>
>>>>>>> The fact that it hangs indefinitely appears to be related to
>>>>>>> https://github.com/grpc/grpc/issues/9481 (DNS resolver in C core
>>>>>>> never gives up resolving a nonexistent hostname), except in my case the
>>>>>>> hostname is valid, as shown by the ping.  The container image is busybox
>>>>>>> plus the C++ application and required libraries.  Is there anything gRPC
>>>>>>> needs from the OS (i.e. my container image) in order to perform name
>>>>>>> resolution?  Ping manages it fine, so why not gRPC?  Forgive my
>>>>>>> ignorance of how the underlying name resolution is provided.
>>>>>>>
>>>>>>> Thanks,
>>>>>>>
>>>>>>> Mark.
>>>>>>>
>>>>>>> Output when trying to resolve to the service:
>>>>>>>
>>>>>>> I0315 17:08:46.596502176      15 ev_epoll_linux.c:85]        epoll
>>>>>>> engine will be using signal: 36
>>>>>>> D0315 17:08:46.596889365      15 ev_posix.c:106]             Using
>>>>>>> polling engine: epoll
>>>>>>> I0315 17:08:46.597180247      15 init.c:193]
>>>>>>> grpc_init(void)
>>>>>>> I0315 17:08:46.597458188      15 completion_queue.c:139]
>>>>>>> grpc_completion_queue_create(reserved=(nil))
>>>>>>> I0315 17:08:46.597765727      14 init.c:193]
>>>>>>> grpc_init(void)
>>>>>>> I0315 17:08:46.598071841      14 channel_create.c:235]
>>>>>>> grpc_insecure_channel_create(target=0xaf2e48, args=0x7ffd594b2fd0,
>>>>>>> reserved=(nil))
>>>>>>> I0315 17:08:46.598358396      14 init.c:193]
>>>>>>> grpc_init(void)
>>>>>>> I0315 17:08:46.598675472      14 init.c:198]
>>>>>>> grpc_shutdown(void)
>>>>>>> I0315 17:08:46.598951207      14 channel.c:263]
>>>>>>>  grpc_channel_register_call(channel=0xaeba20,
>>>>>>> method=/grpcendpoint.Endpoint/endpointVersion, host=(null),
>>>>>>> reserved=(nil))
>>>>>>> I0315 17:08:46.599216112      14 channel.c:263]
>>>>>>>  grpc_channel_register_call(channel=0xaeba20,
>>>>>>> method=/grpcendpoint.Endpoint/listCollections, host=(null),
>>>>>>> reserved=(nil))
>>>>>>> I0315 17:08:46.599476727      14 channel.c:263]
>>>>>>>  grpc_channel_register_call(channel=0xaeba20,
>>>>>>> method=/grpcendpoint.Endpoint/getCollection, host=(null),
>>>>>>> reserved=(nil))
>>>>>>> I0315 17:08:46.599896046      14 channel.c:291]
>>>>>>>  grpc_channel_create_registered_call(channel=0xaeba20,
>>>>>>> parent_call=(nil), propagation_mask=ffff, 
>>>>>>> completion_queue=0x7fa7c8001b40,
>>>>>>> registered_call_handle=0xaf1910, deadline=gpr_timespec { tv_sec:
>>>>>>> 9223372036854775807, tv_nsec: 0, clock_type: 1 }, reserved=(nil))
>>>>>>> I0315 17:08:46.600314315      14 grpc_context.c:41]
>>>>>>>  grpc_census_call_set_context(call=0xaf4ec0, census_context=(nil))
>>>>>>> I0315 17:08:46.600804961      14 call.c:1690]
>>>>>>>  grpc_call_start_batch(call=0xaf4ec0, ops=0x7ffd594b2cf0, nops=3,
>>>>>>> tag=0xaf0318, reserved=(nil))
>>>>>>> I0315 17:08:46.601111821      14 call.c:1366]                ops[0]:
>>>>>>> SEND_INITIAL_METADATA
>>>>>>> I0315 17:08:46.601392365      14 call.c:1366]                ops[1]:
>>>>>>> SEND_MESSAGE ptr=0xaebc90
>>>>>>> I0315 17:08:46.601653874      14 call.c:1366]                ops[2]:
>>>>>>> SEND_CLOSE_FROM_CLIENT
>>>>>>> I0315 17:08:46.602006491      14 client_channel.c:109]
>>>>>>> OP[client-channel:0xaf54c8]: SEND_INITIAL_METADATA{key=3a 70 61 74 68
>>>>>>> ':path' value=2f 67 72 70 63 65 6e 64 70 6f 69 6e 74 2e 45 6e 64 70 6f 
>>>>>>> 69
>>>>>>> 6e 74 2f 65 6e 64 70 6f 69 6e 74 56 65 72 73 69 6f 6e
>>>>>>> '/grpcendpoint.Endpoint/endpointVersion', key=3a 61 75 74 68 6f 72
>>>>>>> 69 74 79 ':authority' value=74 65 6c 65 68 69 73 74 3a 35 30 30 35 31
>>>>>>> 'myservice:50051', key=67 72 70 63 2d 65 6e 63 6f 64 69 6e 67
>>>>>>> 'grpc-encoding' value=69 64 65 6e 74 69 74 79 'identity', key=67 72 70 
>>>>>>> 63
>>>>>>> 2d 61 63 63 65 70 74 2d 65 6e 63 6f 64 69 6e 67 'grpc-accept-encoding'
>>>>>>> value=69 64 65 6e 74 69 74 79 2c 64 65 66 6c 61 74 65 2c 67 7a 69 70
>>>>>>> 'identity,deflate,gzip'} SEND_MESSAGE:flags=0x00000000:len=0
>>>>>>> SEND_TRAILING_METADATA{}
>>>>>>> I0315 17:08:46.602499932      14 call.c:1690]
>>>>>>>  grpc_call_start_batch(call=0xaf4ec0, ops=0x7ffd594b2d30, nops=3,
>>>>>>> tag=0xaf03b8, reserved=(nil))
>>>>>>> I0315 17:08:46.602729139      14 call.c:1366]                ops[0]:
>>>>>>> RECV_INITIAL_METADATA ptr=0xaf03e0
>>>>>>> I0315 17:08:46.603054019      14 call.c:1366]                ops[1]:
>>>>>>> RECV_MESSAGE ptr=0xaf0408
>>>>>>> I0315 17:08:46.603314856      14 call.c:1366]                ops[2]:
>>>>>>> RECV_STATUS_ON_CLIENT metadata=0xaf0428 status=0xaf0440 details=0xaf0448
>>>>>>> I0315 17:08:46.603639848      14 client_channel.c:109]
>>>>>>> OP[client-channel:0xaf54c8]: RECV_INITIAL_METADATA RECV_MESSAGE
>>>>>>> RECV_TRAILING_METADATA
>>>>>>> I0315 17:08:46.603924761      15 completion_queue.c:333]
>>>>>>> grpc_completion_queue_next(cc=0x7fa7c8001b40, deadline=gpr_timespec
>>>>>>> { tv_sec: 9223372036854775807, tv_nsec: 0, clock_type: 1 }, 
>>>>>>> reserved=(nil))
>>>>>>> D0315 17:08:46.604521071      16 dns_resolver.c:192]         dns
>>>>>>> resolution failed: {"created":"@1489597726.604500252","description":"OS
>>>>>>> Error","errno":-2,"file":"src/core/lib/iomgr/resolve_address
>>>>>>> _posix.c","file_line":115,"os_error":"Name or service not
>>>>>>> known","syscall":"getaddrinfo","target_address":"myservice:50051"}
>>>>>>> D0315 17:08:46.604820308      16 dns_resolver.c:201]
>>>>>>> retrying immediately
>>>>>>> D0315 17:08:46.605088317      16 connectivity_state.c:156]   SET:
>>>>>>> 0xaebb68 client_channel: IDLE --> TRANSIENT_FAILURE [new_lb+resolver]
>>>>>>> error=0x7fa7c0000b90 
>>>>>>> {"created":"@1489597726.605077608","description":"No
>>>>>>> load balancing policy","file":"src/core/ext/c
>>>>>>> lient_config/client_channel.c","file_line":188}
>>>>>>> D0315 17:08:47.605552537      17 dns_resolver.c:192]         dns
>>>>>>> resolution failed: {"created":"@1489597727.605530416","description":"OS
>>>>>>> Error","errno":-2,"file":"src/core/lib/iomgr/resolve_address
>>>>>>> _posix.c","file_line":115,"os_error":"Name or service not
>>>>>>> known","syscall":"getaddrinfo","target_address":"myservice:50051"}
>>>>>>> D0315 17:08:47.606272692      17 dns_resolver.c:201]
>>>>>>> retrying immediately
>>>>>>> D0315 17:08:47.606719769      17 connectivity_state.c:156]   SET:
>>>>>>> 0xaebb68 client_channel: TRANSIENT_FAILURE --> TRANSIENT_FAILURE
>>>>>>> [new_lb+resolver] error=0x7fa7c0000bf0 
>>>>>>> {"created":"@1489597727.606709040","description":"No
>>>>>>> load balancing policy","file":"src/core/ext/c
>>>>>>> lient_config/client_channel.c","file_line":188}
>>>>>>> D0315 17:08:49.501768132      18 dns_resolver.c:192]         dns
>>>>>>> resolution failed: {"created":"@1489597729.501745552","description":"OS
>>>>>>> Error","errno":-2,"file":"src/core/lib/iomgr/resolve_address
>>>>>>> _posix.c","file_line":115,"os_error":"Name or service not
>>>>>>> known","syscall":"getaddrinfo","target_address":"myservice:50051"}
>>>>>>> D0315 17:08:49.502278763      18 dns_resolver.c:201]
>>>>>>> retrying immediately
>>>>>>> D0315 17:08:49.502683440      18 connectivity_state.c:156]   SET:
>>>>>>> 0xaebb68 client_channel: TRANSIENT_FAILURE --> TRANSIENT_FAILURE
>>>>>>> [new_lb+resolver] error=0x7fa7c0000b90 
>>>>>>> {"created":"@1489597729.502672595","description":"No
>>>>>>> load balancing policy","file":"src/core/ext/c
>>>>>>> lient_config/client_channel.c","file_line":188}
>>>>>>> D0315 17:08:52.737487943      19 dns_resolver.c:192]         dns
>>>>>>> resolution failed: {"created":"@1489597732.737457632","description":"OS
>>>>>>> Error","errno":-2,"file":"src/core/lib/iomgr/resolve_address
>>>>>>> _posix.c","file_line":115,"os_error":"Name or service not
>>>>>>> known","syscall":"getaddrinfo","target_address":"myservice:50051"}
>>>>>>> D0315 17:08:52.737695112      19 dns_resolver.c:201]
>>>>>>> retrying immediately
>>>>>>> D0315 17:08:52.737751585      19 connectivity_state.c:156]   SET:
>>>>>>> 0xaebb68 client_channel: TRANSIENT_FAILURE --> TRANSIENT_FAILURE
>>>>>>> [new_lb+resolver] error=0x7fa7c0000bf0 
>>>>>>> {"created":"@1489597732.737742549","description":"No
>>>>>>> load balancing policy","file":"src/core/ext/c
>>>>>>> lient_config/client_channel.c","file_line":188}
>>>>>>> D0315 17:08:56.890396846      20 dns_resolver.c:192]         dns
>>>>>>> resolution failed: {"created":"@1489597736.890373998","description":"OS
>>>>>>> Error","errno":-2,"file":"src/core/lib/iomgr/resolve_address
>>>>>>> _posix.c","file_line":115,"os_error":"Name or service not
>>>>>>> known","syscall":"getaddrinfo","target_address":"myservice:50051"}
>>>>>>> D0315 17:08:56.890592615      20 dns_resolver.c:201]
>>>>>>> retrying immediately
>>>>>>> D0315 17:08:56.890664375      20 connectivity_state.c:156]   SET:
>>>>>>> 0xaebb68 client_channel: TRANSIENT_FAILURE --> TRANSIENT_FAILURE
>>>>>>> [new_lb+resolver] error=0x7fa7c0000b90 
>>>>>>> {"created":"@1489597736.890640201","description":"No
>>>>>>> load balancing policy","file":"src/core/ext/c
>>>>>>> lient_config/client_channel.c","file_line":188}
>>>>>>> (etc in a seemingly infinite loop)
>>>>>>>
>>>>>>
>>>>>
>
>



-- 
You received this message because you are subscribed to the Google Groups 
"grpc.io" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To post to this group, send email to [email protected].
Visit this group at https://groups.google.com/group/grpc-io.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/grpc-io/CANfdt26u04%3Dk31kvE0x4f6nmN5jgaojzPJ2f5Xvd-kd3vUgfnw%40mail.gmail.com.
For more options, visit https://groups.google.com/d/optout.
