@Dave
yes Dave, you got it right. I assumed that the problem is to find a missing
number among 300 million consecutive (but not necessarily ordered)
9-digit numbers, so my algorithm looks strictly within that given range.
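To make that concrete, here is a minimal Python sketch of the adjusted Solution 2 from my earlier mail (the function name is mine, and it assumes the file holds exactly ub - lb of the values lb..ub, i.e. exactly one value is missing):

```python
def find_missing(numbers, lb, ub):
    # numbers: the ub - lb values read from the file, in any order;
    # exactly one value in lb..ub is absent. Folding in lb or ub
    # depending on the sign of the running sum keeps the sum near
    # zero instead of letting it grow toward sum(lb..ub).
    s = 0
    for n in numbers:
        s -= n
        if s > 0:
            s += lb
            lb += 1
        else:
            s += ub
            ub -= 1
    # After the loop lb == ub: it is the one value never folded in,
    # so adding it back yields the missing number.
    return s + lb
```

With lb = 100,000,000 and ub = lb + 300,000,000 this is a single pass with O(1) memory.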


Thanks,
- Ravindra
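P.S. For anyone skimming the quotes below, Dave's two-pass bucket idea can be sketched as follows, adjusted for 300 million records by picking the least-loaded residue rather than testing for a count of 10000 (Python and the read_numbers callable are my assumptions; both tables fit well within the 2 MB budget):

```python
def find_absent_ssn(read_numbers):
    # read_numbers() yields the file's 9-digit numbers and may be
    # called twice (two passes over the file on disk).
    B = 100_000
    count = [0] * B
    for n in read_numbers():        # pass 1: count each residue mod B
        count[n % B] += 1
    # 300 million numbers over 100,000 residues average 3,000 per
    # residue, while each residue has 9,000 possible 9-digit quotients
    # (1000..9999), so the least-loaded residue k must miss a quotient.
    k = min(range(B), key=count.__getitem__)
    seen = [False] * 10_000
    for n in read_numbers():        # pass 2: mark quotients seen for k
        if n % B == k:
            seen[n // B] = True
    for j in range(1_000, 10_000):  # keep the result a 9-digit number
        if not seen[j]:
            return B * j + k
```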

On Sun, Oct 30, 2011 at 2:30 AM, Dave <[email protected]> wrote:

> @Ravindra: As given in the problem, the lower bound is 100,000,000 (my
> interpretation of "9 digit number") and the upper bound is
> 999,999,999. Suppose that the numbers in the file are the first 300
> million of these, i.e., 100,000,000 to 299,999,999. It is not obvious
> that your algorithm finds a number in the range 300,000,000 to
> 999,999,999. Does it?
>
> Dave
>
> On Oct 29, 12:30 pm, ravindra patel <[email protected]> wrote:
> > Assuming that we know the lower and upper bounds of the range of
> > numbers (if not, we can determine them in one extra pass).
> >
> > // Solution 1
> > let lb = lower bound, ub = upper bound, sum = 0;
> >
> > for each number read from file: sum = sum - number + ub--;
> >
> > at the end of the loop: sum += lb; // sum now equals the missing number
> >
> > // Problem with the above solution: the variable sum may overflow,
> > // since it grows whenever the numbers read from the file are small
> > // relative to ub. This can be avoided by changing the loop body to -
> >
> > for each number read from file -
> >        sum -= number;
> >        if(sum > 0)  sum += lb++;
> >        else sum += ub--;
> >
> > Feedback welcome :-).
> > - Ravindra
> >
> > On Sat, Oct 29, 2011 at 9:17 PM, bharat b <[email protected]> wrote:
> >
> >
> >
> > > @Dave
> > > Your solution works if the total number of records (SSNs) is 1,000
> > > million. But the question states that there are only 300 million
> > > numbers.
> >
> > > I think some modification is needed to your answer.
> > > Correct me if I am wrong.
> >
> > > On Fri, Oct 28, 2011 at 2:04 AM, Dave <[email protected]> wrote:
> >
> > >> @Shiva: Using an integer array a[100000], initialized to 0, read
> > >> through the file and for each number n, increment a[n % 100000]. At
> > >> the end of the file, find any k such that a[k] != 10000. Then read
> > >> through the file again. For any number n such that n % 100000 == k,
> > >> set a[n / 100000] = -1. At the end of the file, find any j < 10000
> > >> such that a[j] >= 0. Then 100000 * j + k is a number that is
> > >> missing from the file.
> >
> > >> Dave
> >
> > >> On Oct 27, 10:25 am, "shiva@Algo" <[email protected]> wrote:
> > >> > Given a file containing roughly 300 million social security
> > >> > numbers (9-digit numbers), find a 9-digit number that is not in
> > >> > the file. You have unlimited drive space but only 2 megabytes of
> > >> > RAM at your disposal.
> >
> > >> --
> > >> You received this message because you are subscribed to the Google
> Groups
> > >> "Algorithm Geeks" group.
> > >> To post to this group, send email to [email protected].
> > >> To unsubscribe from this group, send email to
> > >> [email protected].
> > >> For more options, visit this group at
> > >> http://groups.google.com/group/algogeeks?hl=en.
> >

