Problem Types
Decision Problem:
• Is the answer for the input YES or NO?
• Example: Given a graph G and nodes s, t, is there a path from s to t in G?
Search Problem:
• Find a solution if the input is a YES input.
• Example: Given a graph G and nodes s, t, find an s-t path (see the sketch below).
Optimization Problem:
• Find a best solution among all solutions for the input.
• Example: Given a graph G and nodes s, t, find a shortest s-t path.
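A minimal sketch of the decision and search versions of the s-t path example, assuming the graph is given as an adjacency-list dict (the graph and function names here are illustrative, not from the slides):

```python
from collections import deque

def find_path(graph, s, t):
    """Search version: return an s-t path (list of nodes) found by BFS,
    or None if no path exists. The decision version only asks whether
    the result is not None."""
    parent = {s: None}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        if u == t:                      # reconstruct the path back to s
            path = []
            while u is not None:
                path.append(u)
                u = parent[u]
            return path[::-1]
        for v in graph[u]:
            if v not in parent:         # visit each node at most once
                parent[v] = u
                queue.append(v)
    return None

G = {"s": ["a", "b"], "a": ["t"], "b": [], "t": []}
print(find_path(G, "s", "t") is not None)   # decision: True
print(find_path(G, "s", "t"))               # search: ['s', 'a', 't']
```

Because BFS explores nodes in order of distance, the returned path is also shortest in number of edges; with general edge weights, the optimization version would use Dijkstra's algorithm instead.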
Optimization Algorithms
Definition:
• Optimization algorithms are a class of algorithms used to find the best possible solution to a given problem.
• e.g., greedy algorithms, dynamic programming, gradient descent.
Objective:
• Minimize cost or maximize the result, i.e., find an optimal solution.
Exhaustive approach:
• Enumerate all possible solutions and select the one with minimum cost (a brute-force sketch follows below).
Optimization problems: problems that require minimization or maximization of a result.
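A minimal brute-force sketch of the exhaustive approach for the shortest s-t path example (assuming a weighted adjacency-dict graph; the names and toy graph are illustrative):

```python
def all_paths(graph, s, t, path=None):
    """Enumerate every simple s-t path (exhaustive search)."""
    path = [s] if path is None else path + [s]
    if s == t:
        yield path
        return
    for v in graph[s]:
        if v not in path:               # avoid revisiting nodes (no cycles)
            yield from all_paths(graph, v, t, path)

def shortest_path_exhaustive(graph, s, t):
    """Select the minimum-weight path among all candidate solutions."""
    def weight(p):
        return sum(graph[u][v] for u, v in zip(p, p[1:]))
    return min(all_paths(graph, s, t), key=weight)

# graph[u][v] = w(u, v)
G = {"s": {"a": 2, "b": 5}, "a": {"t": 6}, "b": {"t": 1}, "t": {}}
print(shortest_path_exhaustive(G, "s", "t"))   # ['s', 'b', 't'], total weight 6
```

The number of candidate solutions can grow exponentially with the input size, which is why the following slides look at cheaper strategies such as greedy algorithms.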
Greedy Algorithm
• Greedy choice property:
• A globally optimal solution can be obtained by making locally optimal choices.
• The algorithm makes the choice that looks best at the moment (see the activity-selection sketch below).
• It never reconsiders its choices.
• Each choice may depend on choices made so far, but not on future choices.
• Optimal substructure property:
• An optimal solution can be constructed from optimal solutions of its subproblems.
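A minimal sketch of a classic greedy algorithm with both properties, activity selection (this example is mine, not from the slides): among the activities compatible with what has already been chosen, always pick the one that finishes earliest.

```python
def activity_selection(activities):
    """Greedy activity selection on (start, finish) pairs:
    repeatedly take the earliest-finishing activity that does not
    overlap the ones already selected."""
    selected = []
    last_finish = float("-inf")
    # Greedy choice: consider activities in order of finish time.
    for start, finish in sorted(activities, key=lambda a: a[1]):
        if start >= last_finish:        # compatible with previous choices
            selected.append((start, finish))
            last_finish = finish        # the choice is never reconsidered
    return selected

acts = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9), (6, 10), (8, 11), (12, 16)]
print(activity_selection(acts))   # [(1, 4), (5, 7), (8, 11), (12, 16)]
```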
Greedy Choice
[Figure: tree diagrams with node values A/5, B/7, C/10, D/100, E/99, F/12, G/6, illustrating a sequence of greedy choices.]
A greedy algorithm always makes the choice that looks best at the moment. That
is, it makes a locally optimal choice in the hope that this choice leads to a globally
optimal solution.
Pros and Cons
Pros:
• Usually (too) easy to design greedy algorithms.
• Easy to implement and often run fast since they are simple.
• Several important cases where they are effective/optimal.
• Lead to first-cut heuristic when problem not well understood.
Cons:
• Very often greedy algorithms don’t work.
• Easy to fool oneself into believing they work.
• Many greedy algorithms possible for a problem and no structured way to find
effective ones.
Huffman Coding
Character                  A    B    C    D    E     F
Frequency (thousands)      45   13   12   16   9     5
Fixed-length codeword      000  001  010  011  100   101
Variable-length codeword   0    101  100  111  1101  1100
If you use a fixed-length code, you need ⌈lg n⌉ bits to represent n ≥ 2 characters.
Fixed-length code: 300,000 bits (3 bits × 100,000 characters)
Variable-length code: 224,000 bits (see the computation below)
Taken from “Cormen, T. H., Leiserson, C. E., Rivest, R. L., & Stein, C. (2022). Introduction to algorithms. MIT press”.
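A quick check of those totals, using the frequencies and codewords from the table above (a minimal sketch; the variable names are mine):

```python
freq = {"A": 45, "B": 13, "C": 12, "D": 16, "E": 9, "F": 5}   # in thousands
variable = {"A": "0", "B": "101", "C": "100", "D": "111", "E": "1101", "F": "1100"}

fixed_bits = sum(f * 3 for f in freq.values())                 # 3 bits per character
variable_bits = sum(freq[c] * len(code) for c, code in variable.items())
# 45*1 + 13*3 + 12*3 + 16*3 + 9*4 + 5*4 = 224
print(fixed_bits, variable_bits)   # 300 224  (thousands of bits)
```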
Character                  A    B    C    D    E     F
Frequency (thousands)      45   13   12   16   9     5
Variable-length codeword   0    101  100  111  1101  1100
Building the Huffman tree by repeatedly merging the two lowest frequencies (sketched in code below):
[5, 9, 12, 13, 16, 45]   → merge 5 + 9 = 14
[12, 13, 14, 16, 45]     → merge 12 + 13 = 25
[14, 16, 25, 45]         → merge 14 + 16 = 30
[25, 30, 45]             → merge 25 + 30 = 55
[45, 55]                 → merge 45 + 55 = 100
[100]
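A minimal Huffman sketch using Python's heapq as the min-priority queue (the node representation and names are my own; the slides follow the CLRS pseudocode):

```python
import heapq

def huffman(freq):
    """Build Huffman codewords from a {symbol: frequency} map.
    Heap entries are (frequency, tie_breaker, tree), where a tree is
    either a symbol (leaf) or a (left, right) pair (internal node)."""
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)                         # BUILD-MIN-HEAP, O(n)
    tie = len(heap)
    while len(heap) > 1:                        # exactly n - 1 merges
        f1, _, left = heapq.heappop(heap)       # two smallest frequencies
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, tie, (left, right)))
        tie += 1
    root = heap[0][2]

    codes = {}
    def walk(node, prefix):                     # read codewords off the tree
        if isinstance(node, tuple):
            walk(node[0], prefix + "0")         # left edge = bit 0
            walk(node[1], prefix + "1")         # right edge = bit 1
        else:
            codes[node] = prefix or "0"
    walk(root, "")
    return codes

print(huffman({"A": 45, "B": 13, "C": 12, "D": 16, "E": 9, "F": 5}))
```

The exact bit patterns can differ from the table above, since several optimal prefix codes exist, but the total encoded length is the same 224,000 bits.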
Time Complexity
• The running time of Huffman's algorithm depends on how the min-priority queue Q is implemented.
• Let's assume that it is implemented as a binary min-heap.
• For a set C of n characters, the BUILD-MIN-HEAP procedure can initialize Q in line 2 in O(n) time.
• The for loop in lines 3-10 executes exactly n - 1 times, and since each heap operation runs in O(lg n) time, the loop contributes O(n lg n) to the running time. Thus, the total running time of HUFFMAN on a set of n characters is O(n lg n).
Minimum Spanning Tree
Problem Definition:
Given a connected, undirected graph G=(V , E) where V is the set of vertices, E is
the set of edges, and for each edge (u , v) ∈ E, a weight w(u , v) specifies the cost to
connect u and v. The goal is to find an acyclic subset T ⊆ E that connects all of the
vertices and whose total weight is minimized.
Prim’s Algorithm
Vertex   Known?   Cost   Parent
A        yes      0      -
B        yes      4      A
C        yes      8      B
D        yes      7      C
E        yes      9      D
F        yes      4      C
G        yes      2      F
H        yes      1      G
I        yes      2      C
Prim’s Algorithm
The steps for implementing Prim's algorithm are as follows:
1. Initialize the minimum spanning tree with a vertex chosen at random.
2. Find all the edges that connect the tree to new vertices, pick the minimum-weight one, and add it to the tree.
3. Keep repeating step 2 until the tree spans all vertices (a Python sketch follows below).
Taken from https://www.programiz.com/dsa/prim-algorithm
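A minimal sketch of Prim's algorithm using Python's heapq (my own names and toy graph, not from the slides). heapq has no DECREASE-KEY, so stale heap entries are skipped instead of being updated; this gives O(E lg E) = O(E lg V), matching the bound derived below.

```python
import heapq

def prim(graph, root):
    """Prim's algorithm on an undirected graph given as an adjacency dict
    {u: {v: w(u, v), ...}}. Returns {vertex: parent} describing the MST."""
    parent = {}
    in_tree = set()
    heap = [(0, root, None)]                 # (edge weight, vertex, parent)
    while heap:
        cost, u, p = heapq.heappop(heap)     # EXTRACT-MIN, O(lg V)
        if u in in_tree:
            continue                         # stale entry: u was already added
        in_tree.add(u)
        parent[u] = p
        for v, w in graph[u].items():        # edges leaving the growing tree
            if v not in in_tree:
                heapq.heappush(heap, (w, v, u))
    return parent

# Toy graph (illustrative only)
G = {
    "A": {"B": 2, "C": 3},
    "B": {"A": 2, "C": 1, "D": 4},
    "C": {"A": 3, "B": 1, "D": 5},
    "D": {"B": 4, "C": 5},
}
print(prim(G, "A"))   # {'A': None, 'B': 'A', 'C': 'B', 'D': 'B'}
```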
Time Complexity Analysis
• The running time of Prim's algorithm depends on the specific implementation of the min-priority queue Q.
• Suppose we use a binary min-heap, i.e.,
• for every node i except the root, parent(i).value ≤ i.value.
• The BUILD-MIN-HEAP procedure can perform lines 5-7 in O(V) time. In fact, there is no need to call BUILD-MIN-HEAP: you can just put the key of r at the root of the min-heap, and because all other keys are ∞, they can go anywhere else in the min-heap.
• The body of the while loop executes O(V) times, and since each EXTRACT-MIN operation takes O(lg V) time, the total time for all calls to EXTRACT-MIN is O(V lg V).
• The for loop in lines 10-14 executes O(E) times altogether, since the sum of the lengths of all adjacency lists is 2|E|. Within the for loop, the test for membership in Q in line 11 can take constant time if you keep a bit for each vertex that indicates whether it belongs to Q and update the bit when the vertex is removed from Q.
• Each call to DECREASE-KEY in line 14 takes O(lg V) time. Thus, the total time for Prim's algorithm is O(V lg V + E lg V) = O(E lg V).
• DECREASE-KEY(S, x, k) decreases the value of element x's key to the new value k, which is assumed to be no greater than x's current key value (a sift-up sketch follows below).
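A minimal sketch of DECREASE-KEY on an array-backed binary min-heap (names are illustrative): the updated key sifts up toward the root one level per step, so the cost is bounded by the heap height, O(lg n).

```python
def decrease_key(heap, i, new_key):
    """Lower the key stored at index i of a list-based min-heap and restore
    the heap property by swapping with the parent while the parent is larger.
    Runs in O(lg n): at most one swap per level of the heap."""
    if new_key > heap[i]:
        raise ValueError("new key is larger than current key")
    heap[i] = new_key
    while i > 0:
        parent = (i - 1) // 2
        if heap[parent] <= heap[i]:
            break
        heap[parent], heap[i] = heap[i], heap[parent]   # sift up one level
        i = parent
    return i   # final index of the decreased key

h = [1, 3, 2, 7, 4, 5]       # a valid min-heap
decrease_key(h, 4, 0)         # lower the 4 at index 4 to 0
print(h)                      # [0, 1, 2, 7, 3, 5]
```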