stillalex commented on PR #2101:
URL: https://github.com/apache/solr/pull/2101#issuecomment-1847834842

   > I played a bit with my idea but it wasn't working out; perhaps because of 
internal complexities around the searcher lifecycle.
   
   I hear you. As an aside, if (at least) three people are trying to make sense 
of this code and can't, that is a pretty strong signal it needs some proper 
refactoring.
   
   > I still don't understand why getOpenCount() <= 1; could be correct because 
a metrics request isn't going to increment the core count. Maybe we should 
change the initial value to 0 and increment at the end of the constructor; 
albeit it will then appear as "closed" during this time? A value of "-1" could 
be used to differentiate.
   
   I used this as an indirect indicator that the init phase has completed. 
Metrics do not increase the open count; they only expose some lazy 
computations, which can trigger the deadlocking flow. As far as I can read this 
code, 'open count > 1' (strictly greater) signals that the 'open' method has 
been called, which means we are past the init phase. I find the 'open count = 1 
by default' convention confusing too, but I avoided messing with it too much 
because it was not the core issue being addressed.
   
   The way I understand it:
    - init phase -- open count = 1
    - "open" / "close" calls increment and decrement the open count
    - the final 'close' sets the open count to 0 and actually performs resource 
cleanup (doClose)
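   The lifecycle above could be sketched roughly like this (hypothetical names, 
a minimal illustration of the counting scheme, not Solr's actual SolrCore 
code):

```java
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical sketch of the described lifecycle: the count starts at 1
// during init, open()/close() move it up and down, and only the final
// close() performs the real resource cleanup (doClose).
class RefCountedCore {
    private final AtomicInteger openCount = new AtomicInteger(1); // init phase
    private volatile boolean resourcesFreed = false;

    void open() {
        openCount.incrementAndGet(); // count > 1 implies init is done
    }

    void close() {
        if (openCount.decrementAndGet() == 0) {
            doClose(); // actual cleanup happens exactly once
        }
    }

    private void doClose() {
        resourcesFreed = true;
    }

    int getOpenCount() {
        return openCount.get();
    }

    boolean isClosed() {
        return resourcesFreed;
    }
}
```

   Note that in this scheme the "extra" initial reference is what makes the 
final 'close' (the one matching construction) drop the count to 0.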
   
   If the open count check is not to your liking (I am not married to it; it 
was just a proposal), I think you had some thoughts on alternatives. It's just 
a matter of getting the timing right for the concurrent calls to the metrics, 
where simply moving the code down a few lines is not sufficient.
   
   > There are no validations for, say, "segments" (exposed as INDEX.segments), 
but if you look at the response in a debugger, there is a null value. It should 
instead be the number of segments. Obviously we have a testing gap here.
   
   I think some nulls can creep in if the metrics are called before the values 
are made available in the metrics context. Otherwise I agree there may be gaps 
here, and we can look at improving the existing tests.
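   To illustrate how such a null could arise, here is a minimal sketch of a 
lazy gauge guarded by the open-count check (hypothetical names and a simplified 
guard, not the actual Solr metrics API): while the core is still initializing, 
the gauge reports null rather than triggering the expensive, potentially 
deadlock-prone computation.

```java
import java.util.concurrent.atomic.AtomicInteger;
import java.util.function.Supplier;

// Hypothetical sketch: a lazy metrics gauge that only runs its expensive
// computation once the open count shows init has finished (count > 1 means
// 'open' was called, i.e. we are past the init phase).
class LazyGauge<T> {
    private final AtomicInteger openCount;
    private final Supplier<T> expensive;

    LazyGauge(AtomicInteger openCount, Supplier<T> expensive) {
        this.openCount = openCount;
        this.expensive = expensive;
    }

    // Returns null instead of computing while the core is still initializing;
    // this is one way a metrics response can contain nulls during init.
    T value() {
        return openCount.get() > 1 ? expensive.get() : null;
    }
}
```

   A concurrent metrics request during init would then see null for such a 
gauge, which matches the behavior described above.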
   
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

