Introduction
In this lecture, we focus on how to analyze recursive algorithms, that is, algorithms that call themselves on smaller subproblems. Understanding how their running time grows helps us estimate efficiency and scalability. The key tools are recurrence relations and the methods for solving them: iteration (repeated expansion), the recursion tree, the master theorem, and the substitution method.
Table of Contents
- What is a Recursive Algorithm?
- Recurrence Relations in Algorithm Analysis
- Common Recurrence Examples
- Methods for Solving Recurrences
  - Iteration (Repeated Expansion)
  - Recursion Tree Method
  - Master Theorem
  - Substitution Method
- Examples of Recursive Algorithm Analysis
  - Sequential Search
  - Binary Search
  - Merge Sort
- Summary of Important Recurrences
- Conclusion
What is a Recursive Algorithm?
A recursive algorithm solves a problem by calling itself with smaller inputs until it reaches a base case.
Example:
```c
int factorial(int n) {
    if (n == 0)
        return 1;                       // base case
    else
        return n * factorial(n - 1);    // recursive case
}
```
Every recursive algorithm can be represented mathematically by a recurrence relation.
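For the factorial function above, each call does a constant amount of work c and then recurses on n − 1, so (treating the base case as a constant) its running time satisfies:
T(n) = T(n−1) + c, T(0) = c
which, as shown below, solves to Θ(n).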
Recurrence Relations in Algorithm Analysis
A recurrence relation expresses the running time T(n) of a recursive algorithm in terms of smaller inputs.
Examples:
- Sequential Search → T(n) = T(n−1) + c
- Binary Search → T(n) = T(n/2) + c
- Merge Sort → T(n) = 2T(n/2) + n
Common Recurrence Examples
| Algorithm | Recurrence Relation | Complexity |
|---|---|---|
| Sequential Search | T(n) = T(n−1) + c | O(n) |
| Binary Search | T(n) = T(n/2) + c | O(log n) |
| Merge Sort | T(n) = 2T(n/2) + n | O(n log n) |
Methods for Solving Recurrences
Iteration (Repeated Expansion)
- Expand the recurrence repeatedly until the base case is reached.
- Sum all the terms and simplify using series formulas.
Example:
T(n) = T(n−1) + c = T(n−2) + 2c = … = T(1) + c(n−1) = Θ(n)
Example 2:
T(n) = 2T(n/2) + n = 4T(n/4) + 2n = … = nT(1) + n log n = Θ(n log n)
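A quick way to sanity-check this result is to evaluate the recurrence directly and compare it with n log n. The sketch below is an illustration only: the base case T(1) = 1 and the function name iter_T are assumptions, not part of the lecture.

```c
#include <math.h>
#include <stdio.h>

/* Evaluate T(n) = 2T(n/2) + n directly, with an assumed base case T(1) = 1. */
static double iter_T(long n) {
    if (n <= 1) return 1.0;              /* assumed base case */
    return 2.0 * iter_T(n / 2) + (double)n;
}

int main(void) {
    /* For powers of two, T(n) = n + n*log2(n), so T(n) / (n log2 n) tends to 1. */
    for (long n = 2; n <= (1L << 20); n *= 2)
        printf("n=%8ld  T(n)=%12.0f  T(n)/(n log2 n)=%.3f\n",
               n, iter_T(n), iter_T(n) / ((double)n * log2((double)n)));
    return 0;
}
```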
Recursion Tree Method
Visualizes each recursive call as a node in a tree.
Each level represents the total work at that stage.
Example:
T(n) = T(n/2) + n
Levels:
- Level 0: n
- Level 1: n/2
- Level 2: n/4
- …
→ Total cost ≈ 2n = O(n)
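The total follows from a geometric series; a worked sum (assuming the input keeps halving until it reaches 1):
n + n/2 + n/4 + … + 1 < n(1 + 1/2 + 1/4 + …) = 2n, so the total work is O(n).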
Master Theorem
For recurrences of the form:
T(n) = aT(n/b) + f(n), where a ≥ 1 and b > 1
Compare f(n) with n^{log_b a}; equivalently, when f(n) = Θ(n^d), compare a with b^d:
| Case | Condition | Result |
|---|---|---|
| Case 1 | f(n) = O(n^d), a < b^d | Θ(n^d) |
| Case 2 | f(n) = Θ(n^d), a = b^d | Θ(n^d log n) |
| Case 3 | f(n) = Ω(n^d), a > b^d | Θ(n^{log_b a}) |
Example: Merge Sort
T(n) = 2T(n/2) + n ⇒ a = 2, b = 2, f(n) = n (d = 1); since a = b^d = 2, Case 2 applies ⇒ Θ(n log n)
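As a second worked application, take the recurrence that appears in the summary table below:
T(n) = 3T(n/4) + n² ⇒ a = 3, b = 4, f(n) = n² (d = 2); since a = 3 < b^d = 16, Case 1 applies ⇒ Θ(n²)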
Substitution Method
Used when the Master Theorem does not apply, or to verify a guessed solution.
Steps:
- Guess the form of the solution (e.g., T(n) = O(n²)).
- Use the induction hypothesis to prove the guess holds.
- Adjust constants to make the inequality true.
Example:
T(n) = 2T(n/2) + n
Assume T(n/2) ≤ c(n/2) log(n/2), taking logs to base 2.
Then
T(n) ≤ 2[c(n/2) log(n/2)] + n = cn log(n/2) + n = cn log n − cn + n ≤ cn log n for c ≥ 1, so T(n) = O(n log n).
Examples of Recursive Algorithm Analysis
Sequential Search
T(n) = T(n−1) + c = O(n)
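A minimal recursive sketch that gives rise to this recurrence; the function name seq_search and its parameters are illustrative assumptions, not part of the lecture:

```c
// Recursive sequential search: inspect one element (constant work c),
// then recurse on the remaining n - 1 elements, so T(n) = T(n-1) + c.
int seq_search(const int arr[], int n, int key) {
    if (n == 0) return -1;                 // base case: nothing left, not found
    if (arr[n - 1] == key) return n - 1;   // found at the last position
    return seq_search(arr, n - 1, key);    // search the first n - 1 elements
}
```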
Binary Search
T(n) = T(n/2) + c = O(log n)
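A minimal recursive sketch over a sorted array; the name binary_search and the lo/hi index parameters are illustrative assumptions:

```c
// Recursive binary search: one comparison (constant work c), then recurse
// on half of the remaining range, so T(n) = T(n/2) + c.
int binary_search(const int arr[], int lo, int hi, int key) {
    if (lo > hi) return -1;              // base case: empty range, not found
    int mid = lo + (hi - lo) / 2;        // midpoint (avoids overflow)
    if (arr[mid] == key) return mid;
    if (arr[mid] < key)
        return binary_search(arr, mid + 1, hi, key);  // search right half
    return binary_search(arr, lo, mid - 1, key);      // search left half
}
```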
Merge Sort
T(n) = 2T(n/2) + n = O(n log n)
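A minimal recursive sketch; the names merge_sort and merge, and the caller-supplied scratch array tmp, are illustrative assumptions:

```c
#include <string.h>

// Linear-time merge of two sorted halves arr[lo..mid] and arr[mid+1..hi].
static void merge(int arr[], int lo, int mid, int hi, int tmp[]) {
    int i = lo, j = mid + 1, k = lo;
    while (i <= mid && j <= hi)
        tmp[k++] = (arr[i] <= arr[j]) ? arr[i++] : arr[j++];
    while (i <= mid) tmp[k++] = arr[i++];
    while (j <= hi)  tmp[k++] = arr[j++];
    memcpy(arr + lo, tmp + lo, (size_t)(hi - lo + 1) * sizeof(int));
}

// Recursive merge sort: two recursive calls on halves plus a linear-time
// merge, so T(n) = 2T(n/2) + n.
void merge_sort(int arr[], int lo, int hi, int tmp[]) {
    if (lo >= hi) return;               // base case: 0 or 1 element
    int mid = lo + (hi - lo) / 2;
    merge_sort(arr, lo, mid, tmp);      // sort left half: T(n/2)
    merge_sort(arr, mid + 1, hi, tmp);  // sort right half: T(n/2)
    merge(arr, lo, mid, hi, tmp);       // merge: about n extra work
}
```

Here tmp must point to a scratch array at least as large as arr; for example, merge_sort(a, 0, n - 1, tmp) sorts a[0..n-1].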
Summary of Important Recurrences
| Recurrence | Method | Result | Example |
|---|---|---|---|
| T(n) = T(n−1) + c | Iteration | O(n) | Linear recursion |
| T(n) = T(n/2) + c | Iteration | O(log n) | Binary Search |
| T(n) = 2T(n/2) + n | Master | O(n log n) | Merge Sort |
| T(n) = 3T(n/4) + n² | Master | O(n²) | Advanced D&C |
| T(n) = T(n−1) + n | Iteration | O(n²) | Selection Sort |
Conclusion
Analyzing recursive algorithms involves translating the code into a recurrence relation and solving it using methods such as iteration, the recursion tree, the master theorem, or substitution. These techniques reveal whether the recursive calls or the non-recursive divide-and-combine work dominates the running time. Understanding them helps in designing efficient algorithms such as Merge Sort, Binary Search, and Quick Sort.