From a high-level perspective, your algorithm acts as if it is traversing a balanced binary tree, where each node covers a specific interval [i, j]. Its children divide that interval into two roughly equal parts, namely [i, (i+j)/2] and [(i+j)/2 + 1, j].
Let's assume that in this case they are exactly equal (in other words, for the sake of the proof, assume the length of the array n is a power of 2).
Think of it in the following way. There are n leaves in this balanced binary tree your algorithm is traversing, each responsible for an interval of length 1. Those n leaves have n/2 parents, those n/2 nodes have n/4 parents, and so on, all the way up to the root node of the tree, which covers the entire interval.
Now count how many nodes there are in this tree: n + (n/2) + (n/4) + (n/8) + ... + 2 + 1. Since we assumed that n = 2^k, this is a geometric series with a well-known closed form: the tree has 2^(k+1) - 1 = 2 * (2^k) - 1 = 2n - 1 nodes. So traversing all nodes of that tree takes O(n) time.
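To make the count concrete, here is a minimal sketch (Python, with an interval split chosen to match the description above, not your exact code) that counts the nodes such a traversal touches; for n = 2^k elements it reports exactly 2n - 1:

```python
def count_nodes(i, j):
    """Count nodes visited when [i, j] is split recursively into
    two halves, down to single-element intervals."""
    if i == j:
        return 1                      # a leaf covering one element
    mid = (i + j) // 2
    # the current node plus both subtrees -- neither half is discarded
    return 1 + count_nodes(i, mid) + count_nodes(mid + 1, j)

for k in range(1, 6):
    n = 2 ** k
    print(n, count_nodes(0, n - 1), 2 * n - 1)  # last two columns agree
```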
"The problem always gets divided into two new subproblems, so that would be O(log n)" — no. You are not discarding either half each time you divide. If you discard one half at every step, as in binary search, then it is O(log n); otherwise you end up visiting all the leaves of the recursion tree, which is on the order of O(n).
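For contrast, here is a minimal binary-search sketch (assuming a sorted array, which is not part of your problem) showing what "discarding a half" looks like: only one recursive call survives each division, which is why that pattern gives O(log n) rather than the O(n) of the full traversal counted above.

```python
def binary_search(arr, lo, hi, target):
    """Unlike the traversal above, one half is discarded at every step,
    so only about log2(n) nodes of the tree are ever visited."""
    if lo > hi:
        return -1                     # target not present
    mid = (lo + hi) // 2
    if arr[mid] == target:
        return mid
    if arr[mid] < target:
        return binary_search(arr, mid + 1, hi, target)   # keep right half only
    return binary_search(arr, lo, mid - 1, target)       # keep left half only
```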