
I'm trying to determine the time complexity of the following algorithm:

static int g(int[] a) {
  return g(a, 0, a.length - 1);
}

static int g(int[] a, int i, int j) {
  if (i == j) return a[i];          // base case: a single element
  int onefourth = (j + 1 - i) / 4;  // one quarter of the current range length
  return g(a, i, i + onefourth) + g(a, i + onefourth + 1, j);
}
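As a side note (my own observation, not part of the original question): every index lands in exactly one base case, so the method simply returns the sum of the array's elements. A quick check:

```java
public class Demo {
    static int g(int[] a) { return g(a, 0, a.length - 1); }

    static int g(int[] a, int i, int j) {
        if (i == j) return a[i];          // each index reaches exactly one base case
        int onefourth = (j + 1 - i) / 4;
        return g(a, i, i + onefourth) + g(a, i + onefourth + 1, j);
    }

    public static void main(String[] args) {
        System.out.println(g(new int[]{1, 2, 3, 4, 5})); // prints 15
    }
}
```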

This is my attempt:

The method g(int[] a, int i, int j) splits the range of the array a at roughly the one-quarter point and calls itself recursively (multiple recursion). I can write the following recurrence: T(n) = T(n/4) + T(3n/4) + c = .... = T(n/4^k) + T(3n/4^k) + kc. Here I have trouble choosing a value of k. Can anyone help me?

  • T(n/4) + T(3n/4) + c = .... = T(n/4^k) + T(3n/4^k) + kc - this is incorrect; try to think of why. Commented Sep 13, 2018 at 13:21
  • T(n) = T(n/4) + T(3n/4) + c is just the equation; higher powers appear when you substitute n/4 for n, which is why it is a recurrence equation in the first place. Now, it appears to me that if you could solve it using substitution (T(n) = O(n log n)) you could prove that it is O(n log n). Commented Sep 13, 2018 at 14:12
  • @SomeDude Substituting basically any function would satisfy the equation - O(n), O(n^2), O(sqrt(n)), etc. all work. Commented Sep 13, 2018 at 14:26
  • @meowgoesthedog Not every function. You should prove that there exist values that satisfy T(n) <= O(n log n). I suspect you can prove there exist values for every function. Commented Sep 13, 2018 at 14:29
  • @SomeDude Are you aware that the correct answer is O(n), and not O(n log n)? Commented Sep 13, 2018 at 17:02

1 Answer


I don't know what techniques you were taught, but I know how I would figure this out from scratch.

When you divide the problem, apportion the cost of each recursive call to the lower level proportionately to the sizes of the subranges. Then ask what the largest cost is that could be apportioned to any single element at the bottom.

Here is what I mean.

If you're looking at a range of length 1 you'll have some constant cost c.

If you're looking at a range of length 2 you'll have a constant recursion cost r divided evenly, for a cost per element of c + r/2.

If you're looking at a range of length 3, the first element gets a cost of c + r/3, but the latter pair first gets (2/3)r at the top level, which is then split in two with another recursion cost, for a total per-element cost of c + r/2 + r/3.

And so on.

Now here is the challenge. What is the largest recursive cost that can be attributed to a particular call? The worst case somewhere in the chain is r for its level, plus (3/4)r for the level above it, plus (3/4)^2 r for the level above that, and so on. Can you find an upper bound?
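As a worked hint (my own addition, using the r from above): the chain is dominated by a geometric series, so

```latex
r + \frac{3}{4}r + \left(\frac{3}{4}\right)^{2} r + \cdots
  \;\le\; r \sum_{k=0}^{\infty} \left(\frac{3}{4}\right)^{k}
  \;=\; \frac{r}{1 - 3/4}
  \;=\; 4r .
```

That is, no call is ever attributed more than 4r of recursion cost, independent of n.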

Can you turn that upper bound into an upper bound on the cost attributed to a single element at the bottom?

Multiply that bound by the number of elements and you'll have your O(n).
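The constant-cost-per-call view can also be checked directly. The sketch below (my own instrumentation, not part of the original answer) counts recursive calls; a range of n elements always produces exactly 2n - 1 calls, since there are n base-case leaves and n - 1 splits:

```java
public class CountCalls {
    static long calls;

    static int g(int[] a, int i, int j) {
        calls++;                          // count every recursive call
        if (i == j) return a[i];          // base case: one call per element
        int onefourth = (j + 1 - i) / 4;  // one quarter of the current range
        return g(a, i, i + onefourth) + g(a, i + onefourth + 1, j);
    }

    static long callsFor(int n) {
        calls = 0;
        g(new int[n], 0, n - 1);
        return calls;
    }

    public static void main(String[] args) {
        for (int n : new int[]{1, 2, 10, 1000}) {
            System.out.println(n + " elements -> " + callsFor(n) + " calls"); // 2n - 1
        }
    }
}
```

Each call does a constant amount of work outside the recursion, so 2n - 1 calls gives the O(n) total directly.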


1 Comment

Thank you very much. In a few words: you consider only the most expensive recursive call at each level, in this case g(a, i+onefourth+1, j).
