On Sat, 30 Sep 2017 at 1:59 am, Nathaniel Manista <[email protected]> wrote:
> On Fri, Sep 29, 2017 at 6:35 AM, Amit Saha <[email protected]> wrote:
>
>> Hi all,
>>
>> I am playing around with the interceptor implementation
>> (https://github.com/grpc/grpc/issues/8767) and wanted to start exporting
>> metrics such as per-request latency. To this end, is the servicer context
>> the right place to keep track of such data?
>
> I'm tempted to say no? I think we want to avoid supporting use of
> grpc.ServicerContext objects as places to store arbitrary data.
>
>> For example:
>>
>> class MetricInterceptor(UnaryServerInterceptor, StreamServerInterceptor):
>>     def intercept_unary(self, request, servicer_context, server_info, handler):
>>         response = None
>>         try:
>>             servicer_context.start_time = time.time()
>>             print('I was called')
>>             response = handler(request)
>>         except:
>>             e = sys.exc_info()[0]
>>             print(str(e))
>>             raise
>>         print('Request took {0} seconds'.format(time.time() - servicer_context.start_time))
>>         return response
>
> Why not just use a local field? Is there some subtlety I'm missing in your
> example?

I decided to use the servicer context because I assumed that it would be a
request-local object. When my server is handling concurrent requests, I
would not want to use a local variable to keep state (without any locking,
that is). Please correct me where I am going wrong.

> -N
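For what it's worth, here is a minimal sketch of the local-variable approach Nathaniel seems to be suggesting. It reuses the intercept_unary signature from the example above; the UnaryServerInterceptor / StreamServerInterceptor base classes from the proposal in the linked issue are left out so the sketch stands on its own, and the handler and class names here are made up for illustration. The point is that each call to intercept_unary runs in its own stack frame, so a local start_time is already per-request and needs no locking and no attribute on servicer_context.

import threading
import time


class LatencyInterceptor(object):
    # In the proposal referenced above this would also derive from
    # UnaryServerInterceptor / StreamServerInterceptor; those base classes
    # are omitted here so the sketch is self-contained.
    def intercept_unary(self, request, servicer_context, server_info, handler):
        # start_time is a local variable: every call to intercept_unary gets
        # its own stack frame, so concurrent requests each have their own copy.
        start_time = time.time()
        try:
            response = handler(request)
        finally:
            # Report latency whether the handler returned normally or raised,
            # mirroring the except/raise in the original example.
            print('Request took {0} seconds'.format(time.time() - start_time))
        return response


if __name__ == '__main__':
    interceptor = LatencyInterceptor()

    def fake_handler(request):
        time.sleep(request)  # stand-in for real per-request work
        return 'response'

    # Two "requests" handled concurrently; each invocation measures only its
    # own elapsed time, with no shared state between the threads.
    threads = [
        threading.Thread(
            target=interceptor.intercept_unary,
            args=(delay, None, None, fake_handler))
        for delay in (0.1, 0.3)
    ]
    for thread in threads:
        thread.start()
    for thread in threads:
        thread.join()

Running this prints one latency line of roughly 0.1 seconds and one of roughly 0.3 seconds, since each concurrent invocation only sees its own start_time. A request-scoped carrier would only become necessary if the value had to be shared across interceptors or with the handler itself, which seems to be the distinction being discussed here.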
