How to calculate binary search complexity?

It doesn't halve the search time; that wouldn't make it log(n). It decreases it logarithmically. Think about this for a moment. If you had 128 entries in a table and had to search linearly for your value, it would take around 64 comparisons on average to find it. That's n/2, i.e. linear time. With a binary search, you eliminate half of the possible entries on each iteration, so at most it would take only 7 comparisons to find your value (log base 2 of 128 is 7, or equivalently, 2 to the 7th power is 128). This is the power of binary search.
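To make the contrast concrete, here is a minimal sketch in Python (the function names and the 128-entry `values` list are my own illustrative assumptions, not anything from the question) that counts how many probes each approach makes:

    # A minimal sketch: count probes for a linear scan vs. a binary search
    # on a sorted table of 128 entries, as in the example above.
    def linear_search(values, target):
        probes = 0
        for i, v in enumerate(values):
            probes += 1
            if v == target:
                return i, probes
        return -1, probes

    def binary_search(values, target):
        lo, hi = 0, len(values) - 1
        probes = 0
        while lo <= hi:
            probes += 1
            mid = (lo + hi) // 2          # probe the middle element
            if values[mid] == target:
                return mid, probes
            elif values[mid] < target:
                lo = mid + 1              # discard the lower half
            else:
                hi = mid - 1              # discard the upper half
        return -1, probes

    values = list(range(128))
    print(linear_search(values, 127))     # (127, 128): n probes in the worst case
    print(binary_search(values, 127))     # (127, 8): about log2(128) = 7 halvings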


Here's a more mathematical way of seeing it, though not really complicated. IMO it's much clearer than the informal explanations:

The question is: how many times can you divide N by 2 until you have 1? This is essentially saying: do a binary search (halve the elements) until you find it. As a formula:

1 = N / 2^x

multiply both sides by 2^x:

2^x = N

now take log2 of both sides:

log2(2^x)   = log2(N)
x * log2(2) = log2(N)
x * 1       = log2(N)

This means you can halve N log2(N) times until you reach 1, which means you have to repeat the binary search step ("halve the elements") log2(N) times until you find your element.
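You can verify this numerically; a quick sketch in Python (the helper name `halvings` is my own):

    # How many times can n be halved before reaching 1?
    def halvings(n):
        steps = 0
        while n > 1:
            n //= 2        # one binary search step: drop half the elements
            steps += 1
        return steps

    print(halvings(128))   # 7  == log2(128)
    print(halvings(1024))  # 10 == log2(1024)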


T(n) = T(n/2) + 1

T(n/2) = T(n/4) + 1

Substitute the value of T(n/2) into the first equation:

T(n) = T(n/4) + 1 + 1

Repeating this substitution k times:

T(n) = T(n/2^k) + 1 + 1 + ... + 1   (k ones)

Now take 2^k = n, so that n/2^k = 1:

T(n) = T(2^k/2^k) + k = T(1) + k

Since 2^k = n, we get k = log n.

So the time complexity is O(log n).
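The recurrence above is exactly what a recursive binary search does: one constant-time probe (the +1), then a recursive call on half the range (the T(n/2)). A minimal recursive sketch in Python, with names of my own choosing:

    # Recursive binary search mirroring T(n) = T(n/2) + 1.
    def binary_search(values, target, lo=0, hi=None):
        if hi is None:
            hi = len(values) - 1
        if lo > hi:
            return -1                     # base case T(1): nothing left to search
        mid = (lo + hi) // 2              # the O(1) work at this level
        if values[mid] == target:
            return mid
        elif values[mid] < target:
            return binary_search(values, target, mid + 1, hi)   # recurse: T(n/2)
        else:
            return binary_search(values, target, lo, mid - 1)   # recurse: T(n/2)

    print(binary_search(list(range(1024)), 700))   # 700, found within ~log2(1024) = 10 levels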


For Binary Search, T(N) = T(N/2) + O(1) // the recurrence relation

Apply the Master Theorem for computing the run-time complexity of recurrence relations of the form T(N) = aT(N/b) + f(N).

Here, a = 1, b = 2 => log (a base b) = log (1 base 2) = 0

Also, here f(N) = N^c log^k(N) // with k = 0 and c = 0 = log (a base b), i.e. the boundary case of the theorem

So, T(N) = O(N^c log^(k+1) N) = O(N^0 log^1 N) = O(log(N))

Source : http://en.wikipedia.org/wiki/Master_theorem
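As a sanity check (my own sketch, not part of the answer above), you can unroll the recurrence numerically and compare it with log2(N):

    # Unroll T(N) = T(N/2) + 1 with T(1) = 0 and compare against log2(N).
    import math

    def T(n):
        return 0 if n <= 1 else T(n // 2) + 1

    for n in [2, 16, 128, 1024, 1 << 20]:
        print(n, T(n), math.log2(n))   # T(n) matches log2(n) exactly for powers of 2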