On Wednesday, 19 February 2014 at 09:46:04 UTC, Gopan wrote:
On Wednesday, 19 February 2014 at 09:09:25 UTC, Tobias Pankrath wrote:
It's O(k) with k = vals.length, which is the number of
arguments given to the function, which is a compile time
constant and not dependent on any input of the final program.
In O(n) the n always relates to the input somehow.
I agree that vals.length is known at compile time. But while
running the application, it iterates through every val in the
input list until a match is found.
So, how can that be considered O(1)?
...
Or, is my understanding of Big-O complexity notation wrong?
Your understanding is probably fine. It's just one of those
technicalities that can be a bit misleading.
The algorithm is O(vals.length), there's no arguing about that,
but since vals.length is a constant, it's also O(1) because O(1)
= O(2) = O(1,000,000) = O(k) for any constant k.
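To make the distinction concrete, here's a minimal sketch (in Python, since the point is language-agnostic; the tuple `VALS` stands in for the compile-time argument list, and the names are made up for illustration):

```python
# VALS plays the role of the values baked in at compile time:
# its length is fixed when the program is written, not supplied
# as input at run time.
VALS = (2, 3, 5, 7, 11)

def among(x):
    """Linear search over VALS.

    The loop body runs at most len(VALS) times, and len(VALS)
    is a constant of the program, so the worst case is O(k)
    comparisons for a fixed k -- which is O(1) by definition,
    even though the loop really does iterate at run time.
    """
    for v in VALS:
        if x == v:
            return True
    return False
```

Contrast that with searching a list whose length comes from the program's input: there, the number of iterations grows with the input size n, and the search is genuinely O(n).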
If you want to get even more anal about it, searching an array is
technically O(1) because an array cannot be bigger than
size_t.max, and since size_t.max is a constant, O(size_t.max) is
O(1). Again, completely misleading but technically correct in an
esoteric, academic way.