
The question is pretty much what the title says, with a slight variation. If I remember correctly, finding an entry in an array of size 'n' has an average-case complexity of O(n).

I assume that is also the case if there is a fixed number of matching elements in the vector, of which we want to find one.

But what if the number of matching entries, of which we still only try to find one, is in some way related to the size of the vector, i.e. grows with it? I have such a case at hand, but I don't know the exact relation between the array size and the number of searched-for entries. It might be linear, it might be logarithmic. Is the average case still O(n)?

I would be grateful for any insights.

edit: an example

array size: 100

array content: at each position, a number from 1 to 10, chosen completely at random.

what we seek: the first occurrence of "1"

From a naive point of view, we should on average find an entry after 10 lookups with any kind of linear search (which we have to do, as the content is not sorted).

As constant factors are usually omitted in big-O, does that mean that we still need O(n) time, even though on average only about 10 lookups should be needed?
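
To make that intuition concrete, here is a minimal simulation sketch of the example above (C++ is an illustrative choice; the seed and trial count are arbitrary):

#include <iostream>
#include <random>
#include <vector>

int main() {
    std::mt19937 rng(42);                           // fixed seed for reproducibility
    std::uniform_int_distribution<int> dist(1, 10); // each position: 1..10, uniform
    const int size = 100;
    const int trials = 100000;

    long long totalLookups = 0;
    for (int t = 0; t < trials; ++t) {
        std::vector<int> v(size);
        for (int& x : v) x = dist(rng);

        // Linear search for the first occurrence of 1.
        int lookups = size; // if no 1 is present, the whole array was scanned
        for (int i = 0; i < size; ++i) {
            if (v[i] == 1) { lookups = i + 1; break; }
        }
        totalLookups += lookups;
    }
    std::cout << "average lookups: "
              << static_cast<double>(totalLookups) / trials << '\n'; // ~10
}

With these parameters the printed average lands near 10, matching the naive estimate, even though the worst case still scans all 100 entries.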

5 Comments
  • If the array is sorted, then binary search finds an element in the array or vector in O(log n) worst-case time! Commented Aug 9, 2014 at 18:01
  • I think a specific example would make the question much clearer. Commented Aug 9, 2014 at 18:06
  • O(n) would be the worst case, with O(n/2) being the average, no? (assuming an unsorted array) Commented Aug 9, 2014 at 18:07
  • Isn't O(n/2) = O(n)? @500-InternalServerError Or am I missing something? Commented Aug 9, 2014 at 18:09
  • Ah, yes, I believe you are correct. Commented Aug 9, 2014 at 18:10

3 Answers


It is O(n) anyway.

Think about finding 1 here:

[9,9,9,9,9,1]
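
For illustration, a minimal linear-search sketch (C++; the function name is my own):

#include <vector>

// Returns the index of the first occurrence of `target`, or -1 if absent.
// On [9,9,9,9,9,1] with target 1 it inspects all n elements: the worst case.
int linearSearch(const std::vector<int>& v, int target) {
    for (int i = 0; i < static_cast<int>(v.size()); ++i)
        if (v[i] == target) return i;
    return -1;
}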

5 Comments

You are undoubtedly correct, but he is also asking about the effect on time complexity as the number of elements in the vector increases. Please see my answer too.
OP is also asking about the average complexity, not just the worst case.
@EmilLundberg There is not enough data to estimate the average complexity, e.g. the probability of a '1' appearing in the input set. If '1' appears only once among the other numbers, then it is O(n). If the distribution is uniform, then the average is (n/C)/2 lookups, which is still O(n), as C and 2 are constants.
@c-smile So with a vector of infinite size and a 99% chance of encountering the right solution, we still have O(n), i.e. an infinite time requirement on average? It is somewhat what I expected, but still counter-intuitive.
@ArneRecknagel yes, the worst case will be infinite time.

If you're doing a linear search through the array, then the average time complexity of finding one of M elements in an array with N elements will be O(I) where I is the average index of the first of the sought elements. If the array is randomly ordered, then I will be O(N/M) on average, and so the time complexity will also be O(N/M) on average and O(N-M) in the worst case.
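
As a quick empirical check of the O(N/M) average (a sketch under the stated assumptions; N, M, and the 0/1 marking scheme are illustrative choices):

#include <algorithm>
#include <iostream>
#include <random>
#include <vector>

int main() {
    const int N = 1000;       // array size
    const int M = 50;         // number of sought elements
    const int trials = 20000;
    std::mt19937 rng(1);

    long long total = 0;
    for (int t = 0; t < trials; ++t) {
        // N elements, M of them marked 1 ("sought"), the rest 0, randomly shuffled.
        std::vector<int> v(N, 0);
        std::fill(v.begin(), v.begin() + M, 1);
        std::shuffle(v.begin(), v.end(), rng);

        // 0-based index of the first sought element.
        total += std::find(v.begin(), v.end(), 1) - v.begin();
    }
    std::cout << "average first index: "
              << static_cast<double>(total) / trials // (N-M)/(M+1) in expectation
              << " (N/M = " << N / M << ")\n";
}

For a uniformly shuffled array the exact expectation is (N-M)/(M+1), which is close to N/M whenever M is small relative to N.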

1 Comment

So, if M = log_2(N), would it really be written as O(N/log_2(N))?

I am of two minds about this question.

First, if you consider an unsorted array (which seems to be the case here), the asymptotic complexity of the average case is surely O(n).

Let's take an example. We have n elements in the array, or rather the vector. In the average case a linear search proceeds element by element and needs about n/2 comparisons, which is O(n). If elements are added, the nature of the complexity does not change, but the effect is clear: the average stays at half of the (now larger) n. And if m of the elements are the sought ones, the worst case becomes O(n-m) comparisons, since the scan can stop once only sought elements remain, with the average shrinking correspondingly as m grows.

So we find that as the size of the array (or rather, the vector) grows, the nature of the complexity does not change, though the number of comparisons required increases, since it equals n/2 in the average case.

Second, if the array or vector is sorted, then a binary search has a worst case on the order of log(n+1) comparisons, again dependent on n. The average case also grows logarithmically with n, but the complexity order O(log n) does not change!
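
For reference, a minimal binary-search sketch over a sorted vector (C++; the helper name is my own, not from this answer):

#include <vector>

// Binary search on a sorted vector: O(log n) comparisons in the worst case.
// Returns the index of `target`, or -1 if it is not present.
int binarySearch(const std::vector<int>& v, int target) {
    int lo = 0;
    int hi = static_cast<int>(v.size()) - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2; // written this way to avoid overflow of lo + hi
        if (v[mid] == target) return mid;
        if (v[mid] < target) lo = mid + 1;
        else hi = mid - 1;
    }
    return -1;
}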

2 Comments

What if we have information on the probability of finding a solution? Say, the probability of finding a solution at any given position is 50%. In that case the problem would not depend on the size of the vector at all, as we can on average (which is what I am wondering about) expect to find a solution after two lookups. (A short calculation follows these comments.)
I'd like to point out that the probability is in fact calculated from the sample space, if you agree! So it would in any case depend on the number of elements, and hence it would be affected by the growth of the vector!
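
As a short check of the 50% example above (assuming each position matches independently with probability p): the number of lookups to the first match follows a geometric distribution, so the expected count is 1/p. For p = 0.5 that gives 1/0.5 = 2 lookups on average, independent of the vector's size; only the worst case grows with n.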
