Data Structures and Algorithms
Lecture 1: Introduction
Muhammad Hammad Waseem 
m.hammad.wasim@gmail.com
Course Outlines 
• Asymptotic and Algorithm Analysis 
• Complexity Analysis of Algorithms 
• Abstract Lists and Implementations 
• Arrays 
• Linked Lists, Doubly Linked Lists 
• Stacks 
• Queues, Priority Queues, Deques (double-ended queues)
• Trees, Binary Search Trees, B-Trees, Balanced Search Trees
• Graph and Directed Acyclic Graph Algorithms
• Sorting Algorithms: Bubble Sort, Quick Sort, Merge Sort, Heap Sort
• Minimum Spanning Trees, Shortest Path Problems
• Hashing, Chained Hash Tables, Double Hashing
MHW 2
Course Objective 
• Enable the students to understand various ways to organize data,
• Understand algorithms that manipulate data,
• Use analysis to compare various data structures and algorithms,
• Select, apply, and implement the relevant data structure and algorithm for a
specific problem.
Text/Reference Books 
• Introduction to Algorithms, 2nd Edition, Cormen, Leiserson, Rivest,
and Stein (CLRS), MIT Press, 2001.
• Design and Analysis of Algorithms by Sartaj Sahni and Ellis Horowitz.
• Data Structures and Algorithm Analysis in C++, 3rd Ed., Mark Allen
Weiss, Addison Wesley, 2006.
• Data Structures (Special Indian Edition) (Schaum's Outline Series),
Lipschutz & Pai (Paperback).
Algorithms
Data Structures and Algorithms
Outlines 
• Defining Algorithm 
• Understanding Algorithm 
• Analysis framework 
• Time and space complexity 
• Efficiency of Algorithm 
• Worst case 
• Best case 
• Average case 
• Asymptotic Notations 
• Big ‘Oh’ Notation (O) 
• Omega Notation (Ω) 
• Theta Notation (Θ) 
Algorithms 
• “Algorithmics is more than a branch of computer science. It is the
core of computer science, and, in all fairness, can be said to be
relevant to most of science, business and technology.”
• “In mathematics and computer science, an algorithm is a step-by-step
procedure for calculations.”
• An algorithm is an effective method expressed as a finite list of well-defined
instructions for calculating a function.
Understanding of Algorithm 
• An algorithm is a sequence of unambiguous instructions for solving a
problem, i.e., for obtaining a required output for any legitimate input in a
finite amount of time.
[Figure: the notion of an algorithm. A problem is solved by designing an algorithm; a computer applies the algorithm to an input to produce the output.]
ALGORITHM DESIGN AND ANALYSIS PROCESS
1. Understand the problem.
2. Decide on: computational means, exact vs. approximate solving, data structures, algorithm design techniques.
3. Design the algorithm.
4. Prove its correctness.
5. Analyze the algorithm.
6. Code the algorithm.
ANALYSIS FRAMEWORK 
• There are two kinds of efficiency 
• Time efficiency - indicates how fast an algorithm in question runs. 
• Space efficiency - deals with the extra space the algorithm requires. 
MEASURING AN INPUT SIZE 
• We measure an algorithm's efficiency as a function of some parameter n indicating the
algorithm's input size.
• In most cases, selecting such a parameter is quite straightforward. For example, it will be 
the size of the list for problems of sorting, searching, finding the list's smallest element, 
and most other problems dealing with lists. 
• There are situations, of course, where the choice of a parameter indicating an input size 
does matter. E.g. computing the product of two n-by-n matrices. 
• There are two natural measures of size for this problem. 
• The matrix order n. 
• The total number of elements N in the matrices being multiplied. 
• Since there is a simple formula relating these two measures, we can easily switch from 
one to the other, but the answer about an algorithm's efficiency will be qualitatively 
different depending on which of the two measures we use. 
• For algorithms whose input is a single number n (e.g., checking whether n is prime),
computer scientists prefer measuring size by the number b of bits in n's binary
representation: b = ⌊log2 n⌋ + 1.
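The bit-size formula can be checked with a short sketch (Python; the function name is mine, not from the slides):

```python
import math

def input_size_in_bits(n: int) -> int:
    """Size of the input number n measured in bits: b = floor(log2 n) + 1."""
    if n < 1:
        raise ValueError("n must be a positive integer")
    return math.floor(math.log2(n)) + 1

# A million is "large" as a value but a small input: only 20 bits.
print(input_size_in_bits(1), input_size_in_bits(8), input_size_in_bits(1_000_000))
```

For very large n, Python's built-in `n.bit_length()` computes the same quantity while avoiding floating-point rounding.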
UNITS FOR MEASURING RUN TIME 
• We could simply use some standard unit of time measurement (a second, a millisecond, and so
on) to measure the running time of a program implementing the algorithm. 
• There are obvious drawbacks to such an approach: 
• Dependence on the speed of a particular computer. 
• Dependence on the quality of the program implementing the algorithm. 
• Dependence on the compiler used in generating the machine code. 
• The difficulty of clocking the actual running time of the program. 
• Since we need to measure algorithm efficiency, we should have a metric that does not depend
on these extraneous factors. 
• One possible approach is to count the number of times each of the algorithm's operations is
executed. This approach is both difficult and unnecessary. 
• The main objective is to identify the most important operation of the algorithm, called the basic 
operation, the operation contributing the most to the total running time, and compute the 
number of times the basic operation is executed. 
• As a rule, it is not difficult to identify the basic operation of an algorithm. 
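As a concrete illustration, here is a small sketch (the function name is mine) that counts executions of the basic operation, the key comparison, in sequential search instead of clocking seconds:

```python
def sequential_search_count(lst, key):
    """Return (index, comparisons) for sequential search.

    The basic operation is the key comparison: it sits in the innermost
    loop and contributes the most to the total running time.
    """
    comparisons = 0
    for i, item in enumerate(lst):
        comparisons += 1
        if item == key:
            return i, comparisons
    return -1, comparisons

# The count depends only on the input, not on machine, compiler, or language.
print(sequential_search_count([5, 3, 9, 7], 9))   # (2, 3)
```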
WORST-CASE, BEST-CASE AND AVERAGE-CASE 
EFFICIENCIES 
• Worst-case efficiency: 
• The worst-case efficiency of an algorithm is its efficiency for the worst-case input of size n, 
which is an input (or inputs) of size n for which the algorithm runs the longest among all 
possible inputs of that size. Example: for sequential search, Cworst(n) = n. 
• Best-case efficiency: 
• The best-case efficiency of an algorithm is its efficiency for the best-case input of size n, 
which is an input (or inputs) of size n for which the algorithm runs the fastest among all 
possible inputs of that size. 
• We can analyze the best case efficiency as follows: 
• First, determine the kind of inputs for which the count C (n) will be the smallest among all possible 
inputs of size n. 
• Note: the best case does not mean the smallest input; it means the input of size n for which the algorithm runs the fastest. 
• Then determine the value of C (n) on these most convenient inputs. Example: for sequential 
search, best-case inputs will be lists of size n with their first elements equal to a search key; 
accordingly, Cbest(n) = 1. 
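Both extremes can be observed directly with a minimal counting sketch (names are mine):

```python
def comparisons(lst, key):
    """Number of key comparisons sequential search makes on this input."""
    count = 0
    for item in lst:
        count += 1
        if item == key:
            break
    return count

n = 8
data = list(range(n))
best = comparisons(data, 0)    # key in the first position: Cbest(n) = 1
worst = comparisons(data, n)   # key absent: Cworst(n) = n
print(best, worst)             # 1 8
```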
• Average-case efficiency: 
• It yields information about an algorithm's behaviour on a 
"typical" or "random" input. 
• To analyze the algorithm's average-case efficiency, we must make some 
assumptions about possible inputs of size n. 
• The investigation of average-case efficiency is considerably more difficult 
than that of the worst-case and best-case efficiencies. 
• It involves dividing all instances of size n into several classes so that for each 
instance of the class the number of times the algorithm's basic operation is 
executed is the same. 
• Then a probability distribution of inputs needs to be obtained or assumed so 
that the expected value of the basic operation's count can then be derived. 
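For sequential search, under the common assumption of a successful search with the key equally likely in each of the n positions, the expected comparison count works out to (n + 1)/2. A small sketch (assumptions and names mine) confirms this:

```python
def comparisons(lst, key):
    """Key comparisons made by sequential search on this input."""
    count = 0
    for item in lst:
        count += 1
        if item == key:
            break
    return count

n = 10
data = list(range(n))
# Assumption: the search succeeds and each position is equally likely,
# so the expected count is the plain average over all n positions.
average = sum(comparisons(data, key) for key in data) / n
print(average, (n + 1) / 2)   # 5.5 5.5
```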
Asymptotic Notations 
• Step counts are used to compare the time complexity of two programs that 
compute the same function, and to predict the growth in run time as the 
instance characteristics change. Determining the exact step count is difficult, 
and also unnecessary, because step counts are not exact quantities. We need 
only comparative statements like c1n² ≤ tp(n) ≤ c2n². 
• For example: 
• Consider two programs with complexities c1n² + c2n and c3n respectively. For small 
values of n, which is faster depends on the values of c1, c2 and c3. But there will also be 
an n beyond which the complexity c3n is better than c1n² + c2n. 
• This value of n is called the break-even point. If this point is zero, c3n is always faster (or 
at least as fast). 
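The break-even point can be found numerically; the coefficients below are hypothetical, chosen only to make the effect visible:

```python
# Hypothetical coefficients for the two programs compared above.
c1, c2, c3 = 1, 2, 100

def quadratic(n):
    return c1 * n * n + c2 * n   # complexity c1*n^2 + c2*n

def linear(n):
    return c3 * n                # complexity c3*n

# Smallest n at which the linear program is at least as fast.
break_even = next(n for n in range(1, 10_000) if linear(n) <= quadratic(n))
print(break_even)   # 98: beyond this point c3*n never loses again
```

Here 100n ≤ n² + 2n simplifies to n ≥ 98, matching the search.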
Common asymptotic functions 
• In increasing order of growth: 1 (constant), log n (logarithmic), n (linear), 
n log n, n² (quadratic), n³ (cubic), 2ⁿ (exponential), n! (factorial). 
Big ‘Oh’ Notation (O) 
• O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ 
f(n) ≤ cg(n) for all n ≥ n0 } 
• It is an upper bound on a function. Hence it is used to denote the worst-case 
complexity of an algorithm. Graphically, c·g(n) lies on or above f(n) for every n ≥ n0. 
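The definition can be exercised on a concrete function. For f(n) = 3n + 2, one valid choice of witnesses (among many) is c = 4 and n0 = 2:

```python
def f(n):
    return 3 * n + 2   # the function being bounded

def g(n):
    return n           # the comparison function

c, n0 = 4, 2           # witnesses: 3n + 2 <= 4n whenever n >= 2
assert all(0 <= f(n) <= c * g(n) for n in range(n0, 1000))
print("f(n) = 3n + 2 is O(n) with c = 4, n0 = 2")
```

Note that n0 matters: at n = 1 we have f(1) = 5 > 4, so the bound only holds from n0 onward.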
Omega Notation (Ω) 
• Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ 
c·g(n) ≤ f(n) for all n ≥ n0 } 
• It is a lower bound on a function. Hence it is used to denote the best-case 
complexity of an algorithm. 
• Example: f(n) = 3n + 2 
• 3n + 2 ≥ 3n for all n ≥ 1, so c = 3 and n0 = 1 are valid witnesses. 
• Hence f(n) = Ω(n). 
• Similarly, we can work through all the examples given under Big 'Oh'. 
Theta Notation (Θ) 
• Θ(g(n)) = { f(n) : there exist positive constants c1, c2 and n0 such that 
c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 } 
• If f(n) = Θ(g(n)), then for all n to the right of n0, f(n) lies on or above c1·g(n) 
and on or below c2·g(n). Hence Θ gives an asymptotically tight bound for f(n). 
• Example: f(n) = 3n + 2 
• f(n) = Θ(n), because 3n ≤ 3n + 2 ≤ 4n for all n ≥ 2 (take c1 = 3, c2 = 4, n0 = 2). 
• Similarly, we can work through all the examples given under Big 'Oh'. 
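Both Θ inequalities can be checked numerically with the constants from the example (c1 = 3, c2 = 4, n0 = 2):

```python
def f(n):
    return 3 * n + 2

def g(n):
    return n

c1, c2, n0 = 3, 4, 2
# Tight bound: c1*g(n) <= f(n) <= c2*g(n) for all n >= n0.
assert all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 1000))
print("f(n) = 3n + 2 is Theta(n) with c1 = 3, c2 = 4, n0 = 2")
```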
