In this tutorial, you will learn what the master theorem is and how it is used to solve recurrence relations.
What is the Master Theorem?
When we want to solve a problem and several approaches are available to solve it (for instance, the matrix multiplication problem), we need analysis of algorithms to choose among them.
Analysis of algorithms estimates an algorithm's complexity in an asymptotic sense. The expression "analysis of algorithms" was coined by Donald Knuth. It refers to determining the amount of time and space resources required to execute an algorithm.
Many algorithms are recursive in nature and use the divide and conquer strategy: a recursive algorithm calls itself on inputs that are typically part of the original input but smaller in size (subproblems). The running time of such an algorithm is described by a recurrence relation, which can be solved in several ways:
- Master Theorem
- Recurrence tree method
- Substitution method
- Change of variable method
Among all these methods, the Master Theorem is the fastest way to find the time complexity of a recursive function, and it is straightforward and simple to apply.
The Master Theorem is a formula for solving recurrence relations of the form:
T(n) = aT(n/b) + f(n)

where,
- n = size of the input
- a = number of subproblems in the recursion
- n/b = size of each subproblem (all subproblems are assumed to have the same size)
- f(n) = cost of the work done outside the recursive calls, which includes the cost of dividing the problem and the cost of merging the solutions

Here, a ≥ 1 and b > 1 are constants, and f(n) is an asymptotically positive function.
An asymptotically positive function means that for a sufficiently large value of n, we have f(n) > 0.
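To make these parameters concrete, here is a minimal merge sort sketch in Python (an illustration, not taken from the original article). It makes a = 2 recursive calls, each on a subproblem of size n/b = n/2, and the merge step performs f(n) = Θ(n) work outside the recursive calls, giving the recurrence T(n) = 2T(n/2) + Θ(n).

```python
def merge_sort(arr):
    # Base case: a list of length 0 or 1 is already sorted
    if len(arr) <= 1:
        return arr

    # a = 2 recursive calls, each on a subproblem of size n/b = n/2
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])

    # f(n): merging the two sorted halves takes Theta(n) work
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]
```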
The master theorem is used to calculate the time complexity of recurrence relations arising from divide and conquer algorithms in a simple and fast way.
Master Theorem
If a ≥ 1 and b > 1 are constants and f(n) is an asymptotically positive function, then the time complexity of the recurrence relation is given by
T(n) = aT(n/b) + f(n)

where T(n) has the following asymptotic bounds:
1. If f(n) = O(n^(log_b a − ϵ)), then T(n) = Θ(n^(log_b a)).
2. If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) * log n).
3. If f(n) = Ω(n^(log_b a + ϵ)), then T(n) = Θ(f(n)).

Here, ϵ > 0 is a constant.
Each of the above cases can be interpreted as follows (a small code sketch after this list illustrates all three):
- If the cost of solving the subproblems at each level increases by a certain factor, f(n) becomes polynomially smaller than n^(log_b a). The time complexity is then dominated by the cost of the last level, i.e. n^(log_b a).
- If the cost of solving the subproblems at each level is nearly equal, f(n) is of the order n^(log_b a). The time complexity is then f(n) times the total number of levels, i.e. n^(log_b a) * log n.
- If the cost of solving the subproblems at each level decreases by a certain factor, f(n) becomes polynomially larger than n^(log_b a). The time complexity is then dominated by the cost of f(n), i.e. the root level.
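As a quick illustration of the three cases, here is a small Python helper; this is a hypothetical sketch that assumes f(n) is a plain polynomial Θ(n^d), and the function name master_theorem and parameter d are illustrative, not from the original article. It compares d against the critical exponent log_b a and reports which case applies:

```python
import math

def master_theorem(a, b, d):
    """Classify T(n) = a*T(n/b) + Theta(n^d), assuming f(n) = Theta(n^d)."""
    c = math.log(a, b)  # critical exponent log_b(a)
    if math.isclose(d, c):
        return f"Case 2: T(n) = Theta(n^{c:.2f} * log n)"
    if d < c:
        return f"Case 1: T(n) = Theta(n^{c:.2f})"
    return f"Case 3: T(n) = Theta(n^{d})"

print(master_theorem(2, 2, 1))  # merge sort, case 2: Theta(n^1.00 * log n)
print(master_theorem(8, 2, 2))  # case 1: Theta(n^3.00)
print(master_theorem(3, 2, 2))  # the solved example below, case 3: Theta(n^2)
```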
Solved Example of Master Theorem
T(n) = 3T(n/2) + n^2

Here,
a = 3
b = 2
f(n) = n^2

log_b a = log_2 3 ≈ 1.58 < 2

i.e. f(n) = Ω(n^(log_b a + ϵ)) for some constant ϵ > 0, so case 3 applies.

Thus, T(n) = Θ(f(n)) = Θ(n^2).
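One way to sanity-check this result is to evaluate the recurrence numerically and watch the ratio T(n)/n^2 settle toward a constant, as Θ(n^2) predicts. A minimal sketch follows; the base case T(1) = 1 is an assumption, since the recurrence above does not specify one.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 3*T(n/2) + n^2, with an assumed base case T(1) = 1
    if n <= 1:
        return 1
    return 3 * T(n // 2) + n * n

for n in (2**k for k in range(4, 16)):
    # Ratio approaches a constant (about 4), consistent with T(n) = Theta(n^2)
    print(n, T(n) / n**2)
```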
Master Theorem Limitations
The master theorem cannot be used if:
- T(n) is not monotone, e.g. T(n) = sin n
- f(n) is not a polynomial, e.g. f(n) = 2^n
- a is not a constant, e.g. a = 2n
- a < 1
Thanks for reading! We hope you found this tutorial helpful, and we would love to hear your feedback in the comments section below.