O(log n) - Logarithmic Time Complexity
Description: An algorithm whose running time grows in proportion to the logarithm of the input size. This usually happens in algorithms that halve the input at each step.
Mathematical Expression of Logarithmic Time Complexity
Let's say you have a number x = 1, and at each step you double it.
For example:
1, 2, 4, 8, 16, 32, 64, 128, ..., which is 2<sup>0</sup>, 2<sup>1</sup>, 2<sup>2</sup>, 2<sup>3</sup>, ..., 2<sup>k</sup>
Alternatively, you can express this in reverse: start from n and halve it at each step until you reach 1:
n, n / 2, n / 4, n / 8, ..., 1 (Expression 1)
Expression 1 can also be written as:
n / 2<sup>0</sup>, n / 2<sup>1</sup>, n / 2<sup>2</sup>, n / 2<sup>3</sup>, ..., n / 2<sup>k</sup> = 1
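The doubling sequence above can be checked with a short sketch. This loop (with a hypothetical n = 256 as the target) counts how many doublings it takes for x = 1 to reach n, which is exactly the exponent k with 2<sup>k</sup> = n:

```python
def doubling_steps(n):
    """Count how many doublings take x = 1 up to n (n a power of two)."""
    x, k = 1, 0
    while x < n:
        x *= 2   # double at each step: 1, 2, 4, 8, 16, ...
        k += 1
    return k

# 2^8 = 256, so it takes 8 doubling steps
print(doubling_steps(256))  # → 8
```

Running it in reverse (halving n until it reaches 1) takes the same number of steps, which is the point of Expression 1.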
To find the value of k, we can rearrange the expression:
n / 2<sup>k</sup> = 1
Multiplying both sides by 2<sup>k</sup> gives us:
n = 1 * 2<sup>k</sup>
Simplifying this:
n = 2<sup>k</sup>
Taking the logarithm of both sides:
log(n) = log(2<sup>k</sup>)
Using the logarithmic identity log(a<sup>b</sup>) = b * log(a), we get:
log(n) = k * log(2)
Solving for k:
k = log(n) / log(2) = log<sub>2</sub>(n)
Since log(2) is just a constant factor, and Big-O notation ignores constant factors, we can simplify this to:
k = log(n)
This expression means that for a given input size n, an algorithm with logarithmic time complexity will take approximately k = log(n) operations to solve the problem. In other words, the number of operations k grows logarithmically with the input size.
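The derivation can be verified empirically: repeatedly halving n until it reaches 1 takes exactly log<sub>2</sub>(n) steps. A minimal sketch comparing the two:

```python
import math

def halving_steps(n):
    """Count how many times n can be halved before reaching 1."""
    k = 0
    while n > 1:
        n //= 2  # the problem size is cut in half on each operation
        k += 1
    return k

# For powers of two, the loop count matches log2(n) exactly
for n in [16, 1024, 1_048_576]:
    print(n, halving_steps(n), int(math.log2(n)))
```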
Logarithmic time complexity is typical in efficient algorithms such as binary search, where the problem size is halved with each step. This leads to significantly faster performance compared to linear time complexity as the input size increases.
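As a concrete illustration, here is a standard binary search: each comparison discards half of the remaining interval, so the loop body runs at most about log<sub>2</sub>(n) times.

```python
def binary_search(arr, target):
    """Return the index of target in the sorted list arr, or -1 if absent.

    Each comparison halves the search interval, so the loop
    executes O(log n) times.
    """
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1   # target is larger: discard the lower half
        else:
            hi = mid - 1   # target is smaller: discard the upper half
    return -1

data = list(range(0, 100, 2))     # 50 sorted even numbers: 0, 2, ..., 98
print(binary_search(data, 42))    # → 21 (42 sits at index 21)
print(binary_search(data, 43))    # → -1 (odd numbers are absent)
```

With 50 elements, at most about six comparisons are needed, whereas a linear scan could take up to 50.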