Greedy Algorithms

- General principle of greedy algorithms
- Activity-selection problem
  - Optimal substructure
  - Recursive solution
  - Greedy-choice property
  - Recursive algorithm
- Minimum spanning trees
  - Generic algorithm
  - Definitions: cuts, light edges
  - Prim's algorithm

Overview

- Like dynamic programming (DP), greedy algorithms are used to solve optimization problems.
- Problems exhibit optimal substructure (as in DP).
- Problems also exhibit the greedy-choice property: when we have a choice to make, make the one that looks best right now. Make a locally optimal choice in the hope of getting a globally optimal solution.

Greedy Strategy

- The choice that seems best at the moment is the one we go with.
- Prove that when there is a choice to make, one of the optimal choices is the greedy choice. Therefore, it is always safe to make the greedy choice.
- Show that all but one of the subproblems resulting from the greedy choice are empty.

Activity-selection Problem

Input: a set S of n activities a_1, a_2, ..., a_n, where
- s_i = start time of activity i,
- f_i = finish time of activity i.

Output: a subset A of S consisting of a maximum number of compatible activities. Two activities are compatible if their intervals do not overlap.

(Figure: example instance; activities drawn on the same line are mutually compatible.)

Example:

  i   :  1   2   3   4   5   6   7   8   9  10  11
  s_i :  1   3   0   5   3   5   6   8   8   2  12
  f_i :  4   5   6   7   8   9  10  11  12  13  14

- {a_3, a_9, a_11} consists of mutually compatible activities, but it is not a maximum-size set.
- {a_1, a_4, a_8, a_11} is a largest subset of mutually compatible activities.
- Another largest subset is {a_2, a_4, a_9, a_11}.

Optimal Substructure

- Assume the activities are sorted by finish time: f_1 ≤ f_2 ≤ ... ≤ f_n.
- Suppose an optimal solution includes activity a_k. This generates two subproblems:
  - selecting from a_1, ..., a_{k-1} activities that are compatible with one another and that finish before a_k starts (hence compatible with a_k);
  - selecting from a_{k+1}, ..., a_n activities that are compatible with one another and that start after a_k finishes.
- The solutions to the two subproblems must themselves be optimal: otherwise we could substitute a better subproblem solution and improve the overall solution.

Recursive Solution

- Let S_ij = the subset of activities in S that start after a_i finishes and finish before a_j starts.
- Subproblems: select a maximum number of mutually compatible activities from S_ij.
- Let c[i, j] = the size of a maximum-size subset of mutually compatible activities in S_ij.
- Recursive solution:

  c[i, j] = 0                                            if S_ij = ∅
  c[i, j] = max { c[i, k] + c[k, j] + 1 : a_k ∈ S_ij }   if S_ij ≠ ∅

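As a concrete (brute-force) illustration of this recurrence, here is a minimal memoized Python sketch; it is not part of the text, and the function name max_compatible, the sentinel activities with infinite times, and the 0-indexed input lists are assumptions made for the example.

  from functools import lru_cache

  def max_compatible(s, f):
      # s[k], f[k] are start/finish times (0-indexed), already sorted by finish time.
      # Implements c[i, j] = max over a_k in S_ij of c[i, k] + c[k, j] + 1,
      # with c[i, j] = 0 when S_ij is empty, using sentinel activities
      # a_0 (finish = -inf) and a_{n+1} (start = +inf).
      n = len(s)
      start = [float("-inf")] + list(s) + [float("inf")]
      finish = [float("-inf")] + list(f) + [float("inf")]

      @lru_cache(maxsize=None)
      def c(i, j):
          best = 0
          for k in range(i + 1, j):
              # a_k is in S_ij if it starts after a_i finishes and finishes before a_j starts.
              if start[k] >= finish[i] and finish[k] <= start[j]:
                  best = max(best, c(i, k) + c(k, j) + 1)
          return best

      return c(0, n + 1)

  # The example from the previous slide: the answer is 4, e.g. {a_1, a_4, a_8, a_11}.
  s = [1, 3, 0, 5, 3, 5, 6, 8, 8, 2, 12]
  f = [4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14]
  print(max_compatible(s, f))   # -> 4
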
Greedy-choice Property

- The problem also exhibits the greedy-choice property: there is an optimal solution to the subproblem S_ij that includes the activity with the smallest finish time in S_ij. (This can be proved easily by an exchange argument.)
- Hence, there is an optimal solution to S that includes a_1.
- Therefore, make this greedy choice without solving the subproblems first and evaluating them.
- Solve the single subproblem that remains after making the greedy choice.
- Combine the greedy choice and the solution to that subproblem.

Recursive Algorithm

  Recursive-Activity-Selector(s, f, i, j)
      m := i + 1
      while m < j and s_m < f_i
          do m := m + 1
      if m < j
          then return {a_m} ∪ Recursive-Activity-Selector(s, f, m, j)
          else return ∅

Initial call: Recursive-Activity-Selector(s, f, 0, n+1).

Complexity: Θ(n). It is straightforward to convert the algorithm to an iterative one; see the text.

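Below is a direct Python transcription of the recursive selector, added as a sketch rather than taken from the text; the 1-based indexing with a dummy activity a_0 (f_0 = 0) is an assumption chosen to mirror the pseudocode.

  def recursive_activity_selector(s, f, i, j):
      # Select a maximum set of mutually compatible activities from those
      # indexed i+1 .. j-1, assuming f[1] <= f[2] <= ... <= f[n].
      m = i + 1
      # Skip activities that start before a_i finishes.
      while m < j and s[m] < f[i]:
          m += 1
      if m < j:
          # a_m is the first compatible activity; greedily take it and recurse.
          return [m] + recursive_activity_selector(s, f, m, j)
      return []

  # The example from the slides, 1-indexed with a dummy activity a_0 (f_0 = 0)
  # in position 0; the fictitious a_{n+1} is needed only as the index bound j = n + 1.
  s = [0, 1, 3, 0, 5, 3, 5, 6, 8, 8, 2, 12]
  f = [0, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14]
  n = 11
  print(recursive_activity_selector(s, f, 0, n + 1))   # -> [1, 4, 8, 11]
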
Typical Steps

- Cast the optimization problem as one in which we make a choice and are left with one subproblem to solve.
- Prove that there is always an optimal solution that makes the greedy choice, so that the greedy choice is always safe.
- Show that the greedy choice plus an optimal solution to the subproblem ⇒ an optimal solution to the problem.
- Make the greedy choice and solve top-down.
- May have to preprocess the input to put it into greedy order. Example: sorting activities by finish time.

Elements of Greedy Algorithms

- Greedy-choice property: a globally optimal solution can be arrived at by making a locally optimal (greedy) choice.
- Optimal substructure.

Minimum Spanning Trees

Given: a connected, undirected, weighted graph G.
Find: a minimum-weight spanning tree T, i.e., an acyclic subset of the edges E that connects all vertices of G and whose total weight w(T), the sum of w(u, v) over all edges (u, v) ∈ T, is minimized.

(Figure: the example graph on vertices a–f with edge weights 5, 11, 0, 3, 1, 7, -3, 2, and a minimum spanning tree of it.)

Generic Algorithm

- Invariant: A is a subset of some minimum spanning tree (MST).
- "Grow" A by adding "safe" edges one by one: an edge is safe if it can be added to A without destroying this invariant.

  A := ∅;
  while A is not a complete tree do
      find a safe edge (u, v);
      A := A ∪ {(u, v)}
  od

Definitions

- Cut: a cut (S, V − S) of an undirected graph G = (V, E) is a partition of V into disjoint sets S and V − S.
- A cut respects a set A of edges if no edge in A crosses the cut.
- An edge is a light edge crossing a cut if its weight is the minimum of any edge crossing the cut (there could be more than one).

(Figure: the example graph with a cut that respects the edge set A = {(a, b), (b, c)}, and a light edge crossing that cut.)

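To make the cut and light-edge definitions concrete, here is a small illustrative sketch (not from the slides); the edge-list representation and the function names light_edge and respects are assumptions made for the example.

  def light_edge(edges, S):
      # edges: list of (u, v, w) triples; S: the set of vertices on one side of the cut (S, V - S).
      # An edge crosses the cut if exactly one endpoint lies in S.
      crossing = [(u, v, w) for (u, v, w) in edges if (u in S) != (v in S)]
      # A light edge is a minimum-weight crossing edge (there may be several of equal weight).
      return min(crossing, key=lambda e: e[2], default=None)

  def respects(edges_A, S):
      # The cut (S, V - S) respects A if no edge of A crosses it.
      return all((u in S) == (v in S) for (u, v, _) in edges_A)
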
Theorem 23.1

Theorem 23.1: Let (S, V − S) be any cut that respects A, and let (u, v) be a light edge crossing (S, V − S). Then (u, v) is safe for A.

Proof: Let T be an MST that includes A.
- Case: (u, v) is in T. We're done.
- Case: (u, v) is not in T. Adding (u, v) to T creates a cycle, and since (u, v) crosses the cut, some other edge (x, y) of T on that cycle also crosses the cut; because the cut respects A, (x, y) is not in A. Let T′ = (T − {(x, y)}) ∪ {(u, v)}, which is again a spanning tree. Because (u, v) is light for the cut, w(u, v) ≤ w(x, y). Thus w(T′) = w(T) − w(x, y) + w(u, v) ≤ w(T). Hence T′ is also an MST, and it contains A ∪ {(u, v)}, so (u, v) is safe for A.

(Figure: the cut, the edges of T, the edge (x, y) of T crossing the cut, and the light edge (u, v).)

Corollary

In general, A will consist of several connected components (CCs). Corollary: if (u, v) is a light edge connecting one CC of G_A = (V, A) to another CC of G_A, then (u, v) is safe for A.

Kruskal's Algorithm

- Starts with each vertex in its own component.
- Repeatedly merges two components into one by choosing a light edge that connects them (i.e., a light edge crossing the cut between them).
- Scans the set of edges in monotonically increasing order by weight.
- Uses a disjoint-set data structure to determine whether an edge connects vertices in different components.

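As a rough illustration of these steps (not code from the text), here is a minimal Python sketch of Kruskal's algorithm with a simple union-by-size, path-compressing disjoint set; the names kruskal, find, and union are assumptions for the example.

  def kruskal(vertices, edges):
      # edges: list of (w, u, v) triples; returns the MST edge list, assuming G is connected.
      parent = {v: v for v in vertices}
      size = {v: 1 for v in vertices}

      def find(x):
          # Find the representative of x's component, with path compression.
          while parent[x] != x:
              parent[x] = parent[parent[x]]
              x = parent[x]
          return x

      def union(u, v):
          # Merge the components of u and v; return False if they are already one component.
          ru, rv = find(u), find(v)
          if ru == rv:
              return False
          if size[ru] < size[rv]:
              ru, rv = rv, ru
          parent[rv] = ru
          size[ru] += size[rv]
          return True

      mst = []
      for w, u, v in sorted(edges):        # scan edges in increasing order of weight
          if union(u, v):                   # (u, v) is a light edge between two components
              mst.append((u, v, w))
      return mst

With sorting dominating the work, this sketch runs in O(E lg E) = O(E lg V) time.
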
Prim's Algorithm

- Builds one tree, so A is always a tree.
- Starts from an arbitrary "root" r.
- At each step, adds a light edge crossing the cut (V_A, V − V_A) to A, where V_A is the set of vertices that A is incident on.

Prim's Algorithm

- Uses a priority queue Q to find a light edge quickly.
- Each object in Q is a vertex in V − V_A.
- Q is implemented as a min-heap.

(Figure: a heap stored as a binary tree, with array indices 1–10.)

Prim's Algorithm

- key(v), the key of v, is the minimum weight of any edge (u, v) with u ∈ V_A.
- Then the vertex returned by Extract-Min is a vertex v such that there exists u ∈ V_A and (u, v) is a light edge crossing (V_A, V − V_A).
- key(v) = ∞ if v is not adjacent to any vertex in V_A.

Prim's Algorithm

  Q := V[G];
  for each u ∈ Q do
      key[u] := ∞
  od;
  key[r] := 0;
  π[r] := NIL;
  while Q ≠ ∅ do
      u := Extract-Min(Q);
      for each v ∈ Adj[u] do
          if v ∈ Q ∧ w(u, v) < key[v] then
              π[v] := u;
              key[v] := w(u, v)        ▹ decrease-key operation
          fi
      od
  od

Note: A = {(v, π[v]) : v ∈ V − {r} − Q}.

Complexity:
- Using binary heaps: O(E lg V).
  - Initialization: O(V).
  - Building the initial queue: O(V).
  - V Extract-Mins: O(V lg V).
  - E Decrease-Keys: O(E lg V).
- Using Fibonacci heaps: O(E + V lg V) (see the book).

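For comparison with the pseudocode, here is a minimal Python sketch of Prim's algorithm (my own, not from the text). Python's heapq module has no decrease-key, so this version pushes duplicate queue entries and skips stale ones on extraction, a common workaround; the adjacency-list format and the example graph, reconstructed from the trace on the following slides, are assumptions.

  import heapq

  def prim(adj, r):
      # adj maps each vertex to a list of (weight, neighbor) pairs; r is the root.
      # Returns the MST as a dict v -> (pi[v], key[v]) for every v != r.
      in_tree = {r}
      tree = {}
      heap = [(w, r, v) for (w, v) in adj[r]]
      heapq.heapify(heap)
      while heap and len(in_tree) < len(adj):
          w, u, v = heapq.heappop(heap)      # lightest edge leaving the tree seen so far
          if v in in_tree:                    # stale entry: v was added already
              continue
          in_tree.add(v)
          tree[v] = (u, w)                    # (pi[v], key[v])
          for w2, x in adj[v]:
              if x not in in_tree:
                  heapq.heappush(heap, (w2, v, x))
      return tree

  # Graph reconstructed from the example on the following slides (assumed):
  adj = {
      "a": [(5, "b"), (11, "d")],
      "b": [(5, "a"), (3, "e"), (7, "c")],
      "c": [(7, "b"), (1, "e"), (-3, "f")],
      "d": [(11, "a"), (0, "e")],
      "e": [(3, "b"), (0, "d"), (1, "c"), (2, "f")],
      "f": [(2, "e"), (-3, "c")],
  }
  print(prim(adj, "a"))   # tree edges (a,b,5), (b,e,3), (e,d,0), (e,c,1), (c,f,-3)
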
Example of Prim's Algorithm

Running Prim's algorithm from root a on the example graph. Each vertex is shown as v/key; ∞ means the vertex is not yet adjacent to the tree.

- Initially: a/0, all other keys ∞.       Q = a b c d e f with keys 0 ∞ ∞ ∞ ∞ ∞
- After extracting a: b/5, d/11.          Q = b d c e f with keys 5 11 ∞ ∞ ∞
- After extracting b: e/3, c/7.           Q = e c d f with keys 3 7 11 ∞
- After extracting e: d/0, c/1, f/2.      Q = d c f with keys 0 1 2
- After extracting d: no keys change.     Q = c f with keys 1 2
- After extracting c: f/-3.               Q = f with key -3
- After extracting f:                     Q = ∅

Final result: the minimum spanning tree, with final keys a/0, b/5, c/1, d/0, e/3, f/-3 (tree edge weights 5, 3, 1, 0, -3).
