Describe an implementation of Random(a, b) that makes calls only to 
Random(0, 1).
Well, I am thinking this way:

   - Divide the range (a, b) into two parts, (a, mid) and (mid+1, b), where 
   mid = (a+b)/2.
   - Select one of the two halves with a call to Random(0, 1).
   - Keep dividing the current range and selecting a half based on a call 
   to Random(0, 1) until only two elements are left.
   - From those two elements, one can easily be selected by one last call 
   to Random(0, 1) (see the sketch below).
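
Here is a minimal C sketch of those steps, under some assumptions: the 
names Random01 and RandomAB are hypothetical, rand() & 1 is only a 
placeholder for the given Random(0, 1), and uniformity of the result 
depends on the range size, as noted in the comments.

   #include <stdlib.h>

   /* Stand-in for the given Random(0, 1); assumed to return
      0 or 1 uniformly at random. */
   int Random01(void) {
       return rand() & 1;
   }

   /* Returns an integer in [a, b] by repeatedly halving the range.
      Note: when the range size is odd the two halves are unequal,
      so the result is uniform only when (b - a + 1) is a power
      of two. */
   int RandomAB(int a, int b) {
       while (b - a > 1) {
           int mid = a + (b - a) / 2;
           if (Random01())
               a = mid + 1;   /* keep the upper half */
           else
               b = mid;       /* keep the lower half */
       }
       /* at most two elements remain; one last flip decides */
       return Random01() ? b : a;
   }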

Is this approach correct? Or can something else be done?
