Asymptotic Notations for Analysis of Algorithms
• We have discussed asymptotic analysis and the worst, average,
and best cases of algorithms. The main idea of asymptotic
analysis is to measure the efficiency of algorithms in a way
that does not depend on machine-specific constants and does
not require implementing the algorithms and comparing the
running times of the programs. Asymptotic notations are
mathematical tools to represent the time complexity of
algorithms for asymptotic analysis.
Asymptotic Notations:
• Asymptotic notations are mathematical tools used to
analyze the performance of algorithms by understanding
how their efficiency changes as the input size grows.
• These notations provide a concise way to express the
behavior of an algorithm's time or space complexity as the
input size approaches infinity.
• Rather than comparing algorithms directly, asymptotic
analysis focuses on understanding the relative growth rates
of algorithms' complexities.
• Asymptotic analysis allows for the comparison of
algorithms' space and time complexities by examining their
performance characteristics as the input size varies.
• By using asymptotic notations, such as Big O, Big Omega,
and Big Theta, we can categorize algorithms based on their
worst-case, best-case, or average-case time or space
complexities, providing valuable insights into their
efficiency.
Why we use them:
• Machine independent – We don’t care if one computer is faster than
another. Growth rate matters more.
• Focus on large inputs – Small inputs run fast anyway. For big inputs,
efficiency really matters.
• Compare algorithms – They give us a fair way to compare two
algorithms solving the same problem.
Simple Example:
Imagine you want to search for a number in a list of size n.
1. Linear Search
• You check the items one by one.
• In the worst case, you check all n items.
• Time taken grows linearly → O(n).
2. Binary Search
• Works only on a sorted list.
• You cut the list in half each step.
• After about log₂(n) steps, you find the item.
• Time taken grows logarithmically → O(log n).
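The two searches above can be sketched in Python with a step counter, so the O(n) versus O(log n) difference is visible directly. The function names and the step-counting return values are illustrative choices, not part of the original notes.

```python
def linear_search(items, target):
    """Check items one by one; worst case examines all n items -> O(n).

    Returns (index, steps); index is -1 if target is absent.
    """
    steps = 0
    for i, value in enumerate(items):
        steps += 1
        if value == target:
            return i, steps
    return -1, steps


def binary_search(items, target):
    """Halve a sorted list each step -> about log2(n) steps -> O(log n).

    Returns (index, steps); index is -1 if target is absent.
    """
    lo, hi, steps = 0, len(items) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid, steps
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, steps
```

On a sorted list of 1,000 items, searching for the last element costs linear search 1,000 steps, while binary search needs about 10.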
Why not measure exact time?
• If the list has 1,000,000 numbers:
• Linear search may take 1,000,000 steps.
• Binary search may take only about 20 steps.
• Whether your computer is fast or slow, the difference in growth is
huge.
• That’s why we use asymptotic notations like O(n) or O(log n).
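The arithmetic behind the 1,000,000 versus ~20 comparison is just a logarithm; a two-line check (my own sketch, not from the notes):

```python
import math

n = 1_000_000
linear_steps = n                        # worst case for linear search: n checks
binary_steps = math.ceil(math.log2(n))  # number of halvings until one item remains
print(linear_steps, binary_steps)       # 1000000 20
```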
There are mainly three asymptotic
notations:
1. Big-O Notation (O-notation)
2. Omega Notation (Ω-notation)
3. Theta Notation (Θ-notation)
Big-O (O) → Upper Bound
• Tells us the maximum time (worst-case growth).
• “This algorithm will not take longer than this rate of growth.”
• We want to describe how fast (or slow) an algorithm grows as the
input size n gets very large.
• Suppose the actual running time of the algorithm is f(n).
• We compare it with some simpler function g(n) (like n, n², log n,
etc.) which represents a "growth rate."
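The standard Big-O condition behind this comparison is f(n) ≤ c·g(n) for all n ≥ n₀ (mirroring the Ω and Θ conditions later in these notes). A numeric check with illustrative f, g, c, and n₀ of my own choosing:

```python
def f(n):
    return 3 * n + 10   # pretend this is the measured running time


def g(n):
    return n            # candidate growth rate: f(n) is O(n)?


# 3n + 10 <= 4n holds exactly when n >= 10, so c = 4 and n0 = 10 work.
c, n0 = 4, 10
assert all(f(n) <= c * g(n) for n in range(n0, 10_000))
```

The constants c and n₀ are not unique; any c > 3 works with a large enough n₀, which is why only the growth rate g(n) matters.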
Big-Omega (Ω) → Lower Bound
• Tells us the minimum time (best-case growth).
• “This algorithm will take at least this much work.”
• Example: Linear Search again.
• If the element is the very first one, you need only 1 check.
• We say: Ω(1).
• So no matter what, the algorithm takes at least constant time.
Omega (Ω) notation
• Ω notation gives the lower bound of an algorithm’s running time.
• It tells us the best-case performance: the minimum time an
algorithm will take (when everything goes as smoothly as possible).
• If f(n) is Ω(g(n)), it means:
• For a large enough input size n, the algorithm always takes at least on the order of g(n) time.
• There exist a constant c > 0 and a number n₀ such that:
• f(n) ≥ c·g(n) for all n ≥ n₀
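The Ω condition can be checked numerically the same way as the upper bound; the particular f, g, c, and n₀ below are illustrative choices of mine.

```python
def f(n):
    return n * n + 5 * n   # suppose this is the running time


def g(n):
    return n * n           # claim: f(n) is Omega(n^2)


# n^2 + 5n >= 1 * n^2 for every n >= 1, so c = 1 and n0 = 1 work.
c, n0 = 1, 1
assert all(f(n) >= c * g(n) for n in range(n0, 5_000))
```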
Θ-notation (Theta notation)
• Theta notation (Θ) gives us a tight bound on the running time of an
algorithm.
• That means it describes the exact rate of growth of the algorithm’s
running time: not just an upper bound (like Big-O) or a lower bound
(like Ω), but both at once, so up to constant factors the running
time grows neither faster nor slower than g(n).
How it works
• If f(n) is Θ(g(n)), it means:
• There exist constants c₁ > 0 and c₂ > 0 and a point n₀ such that:
• c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀
• Theta notation tells us the true growth of an algorithm’s running
time, because it bounds it from above and below.
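The Θ "sandwich" can be verified numerically as well; f, g, c₁, c₂, and n₀ here are illustrative choices of mine, not from the notes.

```python
def f(n):
    return 2 * n * n + 3 * n + 1   # suppose this is the running time


def g(n):
    return n * n                   # claim: f(n) is Theta(n^2)


# Lower side: 2n^2 <= 2n^2 + 3n + 1 for all n >= 0.
# Upper side: 2n^2 + 3n + 1 <= 3n^2 once 3n + 1 <= n^2, i.e. n >= 4.
c1, c2, n0 = 2, 3, 4
assert all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 5_000))
```

Because the same g(n) works on both sides, f(n) is Θ(n²), which is stronger than saying it is O(n²) or Ω(n²) alone.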