Okay, sure. I will share that once I have some experimental results.

On Friday, 21 May 2021 at 02:11:31 UTC-4 [email protected] wrote:

> On Thu, May 20, 2021 at 5:35 PM Bill Li <[email protected]> wrote:
>
>> Cool thank you!
>>
>> One additional question, given the application and scenario I have above, 
>> will setting a higher value for MAX_CONCURRENT_STREAMS help in terms of the 
>> throughput from server to the client?
>>
>
> Hard to say. Increasing the number of concurrent streams (calls) on a 
> single connection can yield higher throughput if the connection has spare 
> capacity that the additional streams can use.
>
> On the other hand, it can hurt throughput if the streams are bottlenecked 
> on that single connection, either because there is no spare capacity or 
> because the connection has network capacity or connectivity issues.
>
> It would be great if you could share your results after experimenting 
> with MAX_CONCURRENT_STREAMS.
>  
>
>>
>> Thanks,
>> Bill
>>
>> On Thursday, 20 May 2021 at 16:34:55 UTC-4 [email protected] wrote:
>>
>>> On Wed, May 19, 2021 at 2:32 PM Bill Li <[email protected]> wrote:
>>>
>>>> ....
>>>>
>>>
>>>> Upon adding the block, I was able to have multiple threads execute 
>>>> onNext() concurrently.
>>>>
>>>> I am just curious whether this is the right way to synchronize. From a 
>>>> best-practices perspective, what is the recommended approach?
>>>>
>>>
>>> Your code should work, since you are synchronizing on 
>>> *serverCallStreamObserver*, which is used by multiple threads. I can't 
>>> think of anything better in this particular case. 
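A minimal, self-contained sketch of that pattern. The Observer interface below is a stand-in for gRPC's StreamObserver, not the real API, so the example runs without any gRPC dependency:

```java
import java.util.ArrayList;
import java.util.List;

public class SynchronizedObserverSketch {
    // Stand-in for io.grpc.stub.StreamObserver<T>; illustrative only.
    interface Observer<T> {
        void onNext(T value);
    }

    // Two producer threads share one observer; each wraps onNext() in a
    // synchronized block on the shared observer so calls never interleave.
    static int run() {
        List<String> sent = new ArrayList<>(); // not thread-safe by itself
        Observer<String> observer = sent::add;

        Runnable producer = () -> {
            for (int i = 0; i < 1000; i++) {
                synchronized (observer) {
                    observer.onNext(Thread.currentThread().getName() + "-" + i);
                }
            }
        };

        Thread t1 = new Thread(producer, "p1");
        Thread t2 = new Thread(producer, "p2");
        t1.start();
        t2.start();
        try {
            t1.join();
            t2.join();
        } catch (InterruptedException e) {
            throw new RuntimeException(e);
        }
        return sent.size();
    }

    public static void main(String[] args) {
        // 2000: no messages are lost despite two concurrent producers.
        System.out.println(run());
    }
}
```

Without the synchronized block, concurrent adds to the unsynchronized list could drop entries or corrupt it, which mirrors why concurrent unsynchronized onNext() calls are unsafe.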
>>>
>>>> Is multithreading a common thing to do, or recommended, when calling 
>>>> onNext()?
>>>>
>>>
>>> Most non-trivial applications use multiple threads, and they should be 
>>> able to use gRPC streams with appropriate synchronization in place (as 
>>> you have done above). Another, arguably better, way to do this would be 
>>> to use a Queue: have your producer threads feed the Queue, and a single 
>>> thread read from the Queue to feed the responseObserver. 
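A sketch of that queue-based design. As above, the Observer interface is a stand-in for the gRPC responseObserver so the example is runnable on its own; the sentinel value and sizes are illustrative:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.atomic.AtomicInteger;

public class QueueDrainSketch {
    // Stand-in for the gRPC responseObserver; illustrative only.
    interface Observer<T> {
        void onNext(T value);
    }

    static final String POISON = "POISON"; // sentinel to stop the drainer

    static int run() {
        BlockingQueue<String> queue = new ArrayBlockingQueue<>(64);
        AtomicInteger delivered = new AtomicInteger();
        Observer<String> responseObserver = msg -> delivered.incrementAndGet();

        // Single consumer: the only thread that ever touches the observer,
        // so no synchronization around onNext() is needed.
        Thread drainer = new Thread(() -> {
            try {
                String msg;
                while (!(msg = queue.take()).equals(POISON)) {
                    responseObserver.onNext(msg);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        drainer.start();

        // Multiple producers feed the queue concurrently; put() blocks when
        // the bounded queue is full, giving natural backpressure.
        Thread[] producers = new Thread[4];
        for (int p = 0; p < producers.length; p++) {
            producers[p] = new Thread(() -> {
                for (int i = 0; i < 500; i++) {
                    try {
                        queue.put("msg-" + i);
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                }
            });
            producers[p].start();
        }
        try {
            for (Thread t : producers) t.join();
            queue.put(POISON);
            drainer.join();
        } catch (InterruptedException e) {
            throw new RuntimeException(e);
        }
        return delivered.get();
    }

    public static void main(String[] args) {
        // All 4 * 500 messages reach the observer via one thread.
        System.out.println(run());
    }
}
```

The bounded queue also gives you a place to apply backpressure (e.g. checking isReady() on a ServerCallStreamObserver in the drainer) without touching the producer code.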
>>>  
>>>
>>>>
>>>> Thanks,
>>>> Bill
>>>>
>>>> On Wednesday, 19 May 2021 at 01:47:54 UTC-4 [email protected] wrote:
>>>>
>>>>> Please include a code snippet of what you want to do. Show how you 
>>>>> intend to share "one ResponseObserver".
>>>>>
>>>>> On Tue, May 18, 2021 at 6:56 PM Bill Li <[email protected]> wrote:
>>>>>
>>>>>> Got it, thanks!
>>>>>>
>>>>>> I am currently implementing a server-side streaming application. Can 
>>>>>> one ResponseObserver be shared by multiple threads sending the 
>>>>>> response stream back to the client through the onNext() method? I 
>>>>>> just want to confirm whether there is a race condition when calling 
>>>>>> onNext() concurrently.
>>>>>>
>>>>>> On Tuesday, 18 May 2021 at 19:28:43 UTC-4 [email protected] wrote:
>>>>>>
>>>>>>> With NettyServerBuilder you can use 
>>>>>>> maxConcurrentCallsPerConnection(int maxCalls) 
>>>>>>> <https://github.com/grpc/grpc-java/blob/master/netty/src/main/java/io/grpc/netty/NettyServerBuilder.java#L397>.
>>>>>>>
>>>>>>> This is the same as setting MAX_CONCURRENT_STREAMS per connection.
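For reference, a minimal configuration sketch of that builder call. It assumes grpc-netty is on the classpath; the port, the limit of 100, and MyServiceImpl are placeholders, not values from this thread:

```java
import io.grpc.Server;
import io.grpc.netty.NettyServerBuilder;

// Sketch: caps concurrent streams (calls) per client connection at 100.
// MyServiceImpl stands in for your generated service implementation.
Server server = NettyServerBuilder.forPort(50051)
        .maxConcurrentCallsPerConnection(100) // maps to MAX_CONCURRENT_STREAMS
        .addService(new MyServiceImpl())
        .build()
        .start();
```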
>>>>>>>
>>>>>>> On Tue, May 18, 2021 at 3:36 PM Bill Li <[email protected]> wrote:
>>>>>>>
>>>>>>>> Hi,
>>>>>>>>
>>>>>>>> Does anyone know of, or have, an example of configuring the 
>>>>>>>> MAX_CONCURRENT_STREAMS parameter for a gRPC server written in Java?
>>>>>>>>
>>>>>>>> Thanks,
>>>>>>>> Bill
>>>>>>>>
>>>>>>>> -- 
>>>>>>>> You received this message because you are subscribed to the Google 
>>>>>>>> Groups "grpc.io" group.
>>>>>>>> To unsubscribe from this group and stop receiving emails from it, 
>>>>>>>> send an email to [email protected].
>>>>>>>> To view this discussion on the web visit 
>>>>>>>> https://groups.google.com/d/msgid/grpc-io/cbb2fd35-a01a-4128-879d-08cbc91049b0n%40googlegroups.com
>>>>>>>>  
>>>>>>>> <https://groups.google.com/d/msgid/grpc-io/cbb2fd35-a01a-4128-879d-08cbc91049b0n%40googlegroups.com?utm_medium=email&utm_source=footer>
>>>>>>>> .
>>>>>>>>
>

