Big-O Time Complexity Calculations
What is Time Complexity?
Time complexity is a way to measure how the running time of an algorithm increases as the size of its input grows. We usually denote time complexity using Big-O notation, like O(n), where n represents the size of the input.
For example:
- If you have an array of n numbers, n would be the length of that array.
- If you have a string of n characters, n would be the length of the string.
Common Patterns in Time Complexity
Loops
Loops are a common cause of increasing time complexity. The more loops your algorithm has, especially if they are nested (one inside another), the slower your algorithm might be.
- Single Loop: Time Complexity O(n), because the loop runs n times.
- Nested Loops: Time Complexity O(n^2), because the outer loop runs n times, and for each iteration of the outer loop, the inner loop also runs n times.
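The two patterns above can be sketched in Python (the source names no language; Python is used here for illustration):

```python
def single_loop(arr):
    # O(n): the loop body runs once per element of arr.
    total = 0
    for x in arr:
        total += x
    return total

def nested_loops(arr):
    # O(n^2): for each of the n outer iterations, the inner loop
    # also runs n times, so the body executes n * n times in total.
    pairs = 0
    for i in range(len(arr)):
        for j in range(len(arr)):
            pairs += 1
    return pairs
```

For an input of length 3, `single_loop` performs 3 additions while `nested_loops` executes its body 9 times, matching the n and n^2 growth rates.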
Order of Magnitude
Time complexity focuses on the order of magnitude rather than the exact number of operations, so it captures the general growth trend of an algorithm as the input size increases. For example, a loop whose body performs three operations per iteration does 3n operations in total, but its time complexity is still O(n): constant factors are dropped in Big-O notation.
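A small sketch of the constant-factor idea, assuming a loop that does several operations per pass:

```python
def three_ops_per_iteration(n):
    # The loop body does three operations per pass, so the total
    # work is 3n. Big-O drops the constant factor 3, so this is
    # still O(n), the same class as a loop doing one operation.
    count = 0
    for _ in range(n):
        count += 1  # operation 1
        count += 1  # operation 2
        count += 1  # operation 3
    return count
```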
Combining Different Phases
Sometimes, an algorithm is divided into multiple phases, each with its own time complexity. The overall time complexity is determined by the slowest phase.
For example, suppose an algorithm consists of three phases with time complexities O(n), O(n^2), and O(n).
- Total Time Complexity: O(n^2), because the second phase dominates the overall time.
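A minimal Python sketch of such a three-phase algorithm (the phases here just count operations for illustration):

```python
def three_phase(arr):
    n = len(arr)
    ops = 0
    # Phase 1: O(n)
    for _ in range(n):
        ops += 1
    # Phase 2: O(n^2) -- this phase dominates the total
    for _ in range(n):
        for _ in range(n):
            ops += 1
    # Phase 3: O(n)
    for _ in range(n):
        ops += 1
    # Total operations: n + n^2 + n, which is O(n^2).
    return ops
```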
Time Complexity with Multiple Variables
Sometimes, the time complexity depends on more than one factor. In such cases, the complexity is expressed with multiple variables.
For example, an algorithm that processes every cell of a grid with n rows and m columns runs in time proportional to the number of cells:
- Time Complexity: O(n * m), where n and m represent different dimensions of the input.
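A sketch of the two-variable case, assuming the input is a grid given as a list of rows:

```python
def grid_sum(grid):
    # grid has n rows and m columns; every cell is visited
    # exactly once, so the time complexity is O(n * m).
    total = 0
    for row in grid:
        for cell in row:
            total += cell
    return total
```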
Recursion and Its Time Complexity
The time complexity of a recursive function depends on the number of times the function is called and the time complexity of each call. The total time complexity is the product of these values.
For example, consider a recursive function f(n) that does a constant amount of work and then calls f(n - 1), stopping when n = 1. The function f(n) makes n function calls, and each call has a time complexity of O(1). Therefore, the total time complexity is O(n).
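The exact code is not given in the text; a minimal Python version, written so that it returns its own call count for illustration:

```python
def f(n):
    # Each call does O(1) work and makes one recursive call,
    # so f is invoked n times in total -> O(n).
    if n == 1:
        return 1           # one call at the base case
    return 1 + f(n - 1)    # this call plus all deeper calls
```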
Now, consider a different recursive function g(n) that calls g(n - 1) twice. Here, each function call generates two additional calls, except for n = 1. The number of calls grows exponentially, leading to a time complexity of O(2^n).
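Again the exact code is not given; a minimal Python version that returns its own call count:

```python
def g(n):
    # Each call spawns two more until n == 1, so the number of
    # calls is 1 + 2 + 4 + ... + 2^(n-1) = 2^n - 1 -> O(2^n).
    if n == 1:
        return 1
    return 1 + g(n - 1) + g(n - 1)
```

For n = 4 the call count is 15, i.e. 2^4 - 1, showing the exponential growth described above.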
Common Complexity Classes
Here are some common time complexities of algorithms:
- O(1): Constant Time. The algorithm runs in the same amount of time regardless of the input size.
- O(log n): Logarithmic Time. The algorithm's running time grows logarithmically as the input size increases, often found in algorithms that divide the input in each step (e.g., binary search).
- O(n): Linear Time. The algorithm's running time grows linearly with the input size, meaning it processes each element once.
- O(n log n): Linearithmic Time. Common in efficient sorting algorithms like Merge Sort and, on average, Quick Sort.
- O(n^2): Quadratic Time. Typically seen in algorithms with nested loops, processing all pairs of input elements.
- O(n^3): Cubic Time. Seen in algorithms with three nested loops, processing all triplets of input elements.
- O(2^n): Exponential Time. Often found in algorithms that explore all subsets of the input (e.g., some recursive algorithms).
- O(n!): Factorial Time. Often associated with algorithms that generate all permutations of the input.
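As one concrete instance from the list above, binary search achieves O(log n) by halving the search range at each step. A standard Python sketch:

```python
def binary_search(sorted_list, target):
    # O(log n): each iteration halves the remaining range,
    # so at most about log2(n) iterations are needed.
    lo, hi = 0, len(sorted_list) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_list[mid] == target:
            return mid
        if sorted_list[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1  # target not present
```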
An algorithm is considered efficient if its time complexity is polynomial, i.e., O(n^k) for some constant k.
Estimating Efficiency
Understanding time complexity allows you to estimate whether an algorithm will run efficiently for large inputs. For instance:
- If an algorithm has a time complexity of O(n^2) and the input size is n = 100,000, it might take too long to run because it would require 10 billion operations.
- On the other hand, an O(n log n) or O(n) algorithm is likely to run efficiently, even for large inputs.
The table below gives a rough estimate of what time complexity is acceptable based on the input size:
| Input Size (n) | Required Time Complexity |
|---|---|
| n ≤ 10 | O(n!) |
| n ≤ 20 | O(2^n) |
| n ≤ 500 | O(n^3) |
| n ≤ 5,000 | O(n^2) |
| n ≤ 1,000,000 | O(n log n) or O(n) |
| n is large | O(1) or O(log n) |
Remember, time complexity is a theoretical measure, and actual performance may vary depending on the specific details of the algorithm and the hardware it runs on.