Unit 2.1
Big O Notation
Intro
• When solving a computer science problem there is usually more than one possible solution.
• These solutions often take the form of different algorithms, and you will generally want to compare the algorithms to see which one is more efficient.
• This is where Big O analysis helps: it gives us a basis for measuring the efficiency of an algorithm.
• Big O measures the efficiency of an algorithm based on the time it takes to run as a function of the input size (a small comparison sketch follows below).
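As a minimal illustration (not from the slides), here are two ways to solve the same small problem, summing the integers 1 through n, whose running times grow very differently with the input size:

    # Illustration only: two algorithms for the same problem.
    # sum_loop does about n additions, so its time grows linearly with n;
    # sum_formula does a fixed amount of work no matter how large n is.

    def sum_loop(n):
        total = 0
        for i in range(1, n + 1):   # runs n times -> O(n)
            total += i
        return total

    def sum_formula(n):
        return n * (n + 1) // 2     # a few operations -> O(1)

    print(sum_loop(1000), sum_formula(1000))   # both print 500500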
Asymptotic notation
• For example, suppose that an algorithm, running on an input of size n, takes 6n² + 100n + 300 machine instructions.
• The 6n² term becomes larger than the remaining terms, 100n + 300, once n becomes large enough (20 in this case).
• A chart of 6n² and 100n + 300 for n from 0 to 100 makes this crossover easy to see; a quick numeric check appears below.
• We could say "f(n) grows on the order of n²" and write f(n) = O(n²).
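A quick numeric check of the crossover (illustration only, in Python):

    # Illustration only: compare 6n^2 against 100n + 300 around n = 20.
    for n in (10, 19, 20, 100):
        print(n, 6 * n * n, 100 * n + 300)
    # n = 19: 2166 < 2200, but n = 20: 2400 > 2300,
    # and by n = 100 the quadratic term dwarfs the rest (60000 vs 10300).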
Asymptotic notation
• By dropping the less significant terms and the constant coefficients, we can focus on the important part of an algorithm's running time.
• When we drop the constant coefficients and the less significant terms, we use asymptotic notation.
• We'll see three forms of it: big-Θ notation, big-O notation, and big-Ω notation.
Big-O Notation
• A function f(x) = O(g(x)) (read as "f(x) is big oh of g(x)") iff there exist two positive constants c and x0 such that
  f(x) <= c*g(x) for all x >= x0
• Consider an example: f(x) = 5x³ + 3x² + 4; find big O of f(x).
  Solution: the dominant term is 5x³, so take g(x) = x³.
  For x >= 1 we have 3x² <= 3x³ and 4 <= 4x³, so f(x) <= 12x³; that is, f(x) <= c*g(x) with c = 12 and x0 = 1.
  Thus, by the definition of big oh, f(x) = O(x³). (A numeric check of these constants appears below.)
• We use "big-O" notation for occasions such as "the running time grows at most this much, but it could grow more slowly."
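A quick numeric check of the constants chosen above (illustration only; the sampled range is an arbitrary choice):

    # Illustration only: with c = 12 and x0 = 1,
    # f(x) = 5x^3 + 3x^2 + 4 never exceeds c * x^3 on the sampled range.
    f = lambda x: 5 * x**3 + 3 * x**2 + 4
    assert all(f(x) <= 12 * x**3 for x in range(1, 1001))
    print("f(x) <= 12x^3 for all tested x >= 1, so f(x) = O(x^3)")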
• Consider a Fibonacci algorithm.
• We run through it operation by operation, ask how long each operation takes, and add up the counts; a sketch of one possible iterative version, annotated this way, appears below.
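The slides' own Fibonacci code is not reproduced here, so the sketch below assumes a simple iterative version; the per-line comments show the rough operation counts that this kind of analysis adds up.

    # Assumed iterative Fibonacci (not necessarily the version on the slide).
    def fib(n):
        a, b = 0, 1           # 2 assignments             -> constant work
        for _ in range(n):    # loop test/update, n times -> n * constant
            a, b = b, a + b   # 1 addition + 2 assignments per iteration
        return a              # 1 return                  -> constant work

    # Total work is roughly c1 + c2*n operations for some constants c1, c2,
    # so this version runs in O(n) time.
    print([fib(i) for i in range(10)])   # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]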
Big Omega (Ω) notation
• Sometimes we want to say that an algorithm takes at least a certain amount of time, without providing an upper bound.
• For this we use big-Ω notation; Ω is the Greek letter "omega".
• A function f(x) = Ω(g(x)) (read as "f(x) is big omega of g(x)") iff there exist two positive constants c and x0 such that
  0 <= c*g(x) <= f(x) for all x >= x0
• Therefore, f(x) >= c*g(x) for all x >= x0.
• This relation says that g(x) is an asymptotic lower bound for f(x).
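• For example (illustration, not from the slides): for f(x) = 5x³ + 3x² + 4, taking c = 5 and x0 = 1 gives 5x³ <= f(x) for all x >= 1, so f(x) = Ω(x³).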
Big Theta (Θ) notation
• When we need an asymptotically tight bound, we use big-Θ notation.
• A function f(x) = Θ(g(x)) (read as "f(x) is big theta of g(x)") iff there exist three positive constants c1, c2 and x0 such that
  c1*g(x) <= f(x) <= c2*g(x) for all x >= x0
  (a short check combining the O and Ω bounds from the earlier example appears below)
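A short check combining the earlier bounds (illustration only): f(x) = 5x³ + 3x² + 4 is both O(x³) (with c = 12) and Ω(x³) (with c = 5), so it is Θ(x³) with c1 = 5, c2 = 12 and x0 = 1.

    # Illustration only: c1 = 5, c2 = 12, x0 = 1 sandwich f(x) between
    # c1*x^3 and c2*x^3 on the sampled range, so f(x) = Theta(x^3).
    f = lambda x: 5 * x**3 + 3 * x**2 + 4
    assert all(5 * x**3 <= f(x) <= 12 * x**3 for x in range(1, 1001))
    print("5x^3 <= f(x) <= 12x^3 for all tested x >= 1")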
Assignment
• https://github.com/ashim888/dataStructureAndAlgorithm/tree/master/Assignments/assignment_4
NOTE: before submitting, please check
• https://github.com/ashim888/dataStructureAndAlgorithm/tree/master/Assignments
