AyWa commented on a change in pull request #3337: [ZEPPELIN-4078] Ipython queue performance
URL: https://github.com/apache/zeppelin/pull/3337#discussion_r268481246
 
 

 ##########
 File path: python/src/main/resources/grpc/python/ipython_server.py
 ##########
 @@ -52,24 +52,19 @@ def execute(self, request, context):
         print("execute code:\n")
         print(request.code.encode('utf-8'))
         sys.stdout.flush()
-        stdout_queue = queue.Queue(maxsize = 10)
-        stderr_queue = queue.Queue(maxsize = 10)
-        image_queue = queue.Queue(maxsize = 5)
-
+        stream_reply_queue = queue.Queue(maxsize = 20)
 
 Review comment:
   When the queue is full, adding an element to the queue is a blocking call, so messages will just be delayed.
   It could also be related to how `execute_interactive` from `jupyter_client` is implemented, but there is no timeout so far:
https://github.com/jupyter/jupyter_client/blob/44980c13680f4e4226cf25f199ce4e4bb6e11296/jupyter_client/blocking/client.py#L338
   and they use a queue under the hood too. So even with many messages to handle, it will only cause some slowdown.
   
   (A relative slowdown, because `while not stream_reply_queue.empty():` is likely to never stop when there are a lot of messages to process.) So that's why there is at most a `0.05s` delay compared to before (but with less CPU usage, so likely to be faster in the end).
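   To illustrate the two behaviors discussed above, here is a minimal standalone sketch (not Zeppelin's actual code; the queue size and message count are arbitrary): a bounded `queue.Queue` blocks the producer when full, so messages are delayed rather than dropped, and the consumer keeps draining as long as the producer keeps the queue non-empty.

```python
import queue
import threading

# Bounded queue, analogous to stream_reply_queue in the patch
# (sizes here are arbitrary for the demo).
q = queue.Queue(maxsize=2)

def producer():
    for i in range(5):
        # put() blocks while the queue already holds 2 items,
        # so the producer is delayed, but no message is lost.
        q.put(i)

t = threading.Thread(target=producer)
t.start()

# Drain loop in the spirit of `while not stream_reply_queue.empty():`
# (here bounded by the known message count so the demo terminates).
drained = []
while len(drained) < 5:
    drained.append(q.get())  # get() blocks when the queue is empty

t.join()
print(drained)  # [0, 1, 2, 3, 4]
```

   This also shows why an `empty()`-based loop may never exit under sustained load: as long as the producer refills the queue before the consumer observes it empty, the loop keeps running.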
   
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
