Dynamic Programming Problem
A given problem is a dynamic programming problem if
- Optimal Substructure: The optimal solution to the problem is formed by optimal solutions to sub-problems.
- Overlapping Sub-problems: Some of the sub-problems are the same, so if we remember their solutions, each sub-problem only needs to be solved once (see the sketch below).
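The following is a minimal sketch of the overlapping sub-problems idea on a toy recurrence (Fibonacci numbers; this example and the function name fib are not from the source). Without memoization the same sub-problems are recomputed exponentially often; caching makes each one cost $O(1)$ after its first computation.

```python
from functools import lru_cache

@lru_cache(maxsize=None)           # remember every sub-problem already solved
def fib(n):
    # fib(n) is built from the solutions to the sub-problems fib(n-1) and fib(n-2),
    # and those two calls share almost all of their own sub-problems.
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(50))  # 12586269025, computed with only O(n) distinct calls
```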
Chain Matrix Multiplication
Chain Matrix Multiplication Problem
Given a chain of matrices $A_1 A_2 \cdots A_n$, where $A_i$ has dimensions $p_{i-1} \times p_i$, decide in which order we should multiply this chain of matrices so that the total number of arithmetic operations is minimized.
Proposition
The time complexity of computing the product of a $p \times q$ matrix and a $q \times r$ matrix, using the standard definition of matrix multiplication, is in $\Theta(pqr)$.
Algorithm
Consider a 2-dimensional table $C$, where $C[i][j]$ for $i \le j$ is the optimal cost of computing $A_i A_{i+1} \cdots A_j$. We do not care about the values for $i > j$. Then $C[i][i] = 0$ and, for $i < j$,
$$C[i][j] = \min_{i \le k < j} \left( C[i][k] + C[k+1][j] + p_{i-1}\, p_k\, p_j \right).$$
Proposition
$O(n^3)$. Proof We spend at most $O(n)$ time for each entry of the table (minimizing over at most $n-1$ choices of $k$) and iterate over half of the table entries, which is in $O(n^2)$. Thus the overall time complexity is in $O(n^3)$.
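The recurrence above can be filled in bottom-up by increasing sub-chain length. Below is a minimal Python sketch; the function name matrix_chain_cost and the representation of the chain as a dimension list p (matrix $A_i$ is p[i-1] x p[i]) are illustrative assumptions, not from the source.

```python
def matrix_chain_cost(p):
    """Minimum number of scalar multiplications to compute A_1 ... A_n,
    where matrix A_i has dimensions p[i-1] x p[i]."""
    n = len(p) - 1                       # number of matrices in the chain
    # C[i][j] = optimal cost of computing A_i ... A_j (1-indexed); C[i][i] = 0
    C = [[0] * (n + 1) for _ in range(n + 1)]
    for length in range(2, n + 1):       # length of the sub-chain being solved
        for i in range(1, n - length + 2):
            j = i + length - 1
            C[i][j] = min(
                C[i][k] + C[k + 1][j] + p[i - 1] * p[k] * p[j]
                for k in range(i, j)     # split point: (A_i..A_k)(A_{k+1}..A_j)
            )
    return C[1][n]

# Example: A_1 is 10x30, A_2 is 30x5, A_3 is 5x60; the optimal order (A_1 A_2) A_3
# costs 10*30*5 + 10*5*60 = 4500 scalar multiplications.
print(matrix_chain_cost([10, 30, 5, 60]))  # 4500
```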
Longest Increasing Subsequence(LIS)
Longest Increasing Subsequence Problem
Given an array $A$ of size $n$, the task is to find the length of the longest increasing subsequence, that is, the longest subsequence whose elements appear in increasing order. E.g., the LIS of the sequence $(3, 10, 2, 1, 20)$ is $(3, 10, 20)$, of length $3$.
Algorithm
Let $L[i]$ be the length of the LIS of $A$ that ends at (and includes) $A[i]$. Then $L[i]$ can be determined recursively:
$$L[i] = 1 + \max\{\, L[j] : 1 \le j < i,\; A[j] < A[i] \,\},$$
where the maximum is taken to be $0$ if no such $j$ exists. Then solve it by dynamic programming; the answer is $\max_{1 \le i \le n} L[i]$.
Proposition
The time complexity of the above algorithm is in $O(n^2)$.
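A direct implementation of this recurrence is sketched below; the function name lis_length is an illustrative assumption, not from the source.

```python
def lis_length(A):
    """Length of the longest increasing subsequence of A, in O(n^2) time."""
    n = len(A)
    if n == 0:
        return 0
    # L[i] = length of the LIS of A that ends at (and includes) A[i]
    L = [1] * n
    for i in range(1, n):
        for j in range(i):
            if A[j] < A[i]:              # A[i] can extend an LIS ending at A[j]
                L[i] = max(L[i], L[j] + 1)
    return max(L)

print(lis_length([3, 10, 2, 1, 20]))  # 3, e.g. the subsequence (3, 10, 20)
```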
Longest Common Subsequence
Longest Common Subsequence Problem
Given two strings, $X$ of length $m$ and $Y$ of length $n$, the task is to find the longest subsequence present in both strings.
Algorithm
Let $C[i][j]$ be the length of the LCS of $X[1..i]$ and $Y[1..j]$. Then $C[i][j]$ can be determined recursively:
$$C[i][j] = \begin{cases} C[i-1][j-1] + 1 & \text{if } X[i] = Y[j], \\ \max\big(C[i-1][j],\, C[i][j-1]\big) & \text{otherwise,} \end{cases}$$
where initially $C[i][0] = 0$ and $C[0][j] = 0$ for all indices $i$ and $j$. Then solve it by 2-dimensional dynamic programming.
Proposition
The table has $(m+1)(n+1)$ entries and each entry is computed in $O(1)$ time, thus the time complexity is in $O(mn)$.
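A bottom-up sketch of this table computation follows; the function name lcs_length is an illustrative assumption, not from the source.

```python
def lcs_length(X, Y):
    """Length of the longest common subsequence of X and Y, in O(mn) time."""
    m, n = len(X), len(Y)
    # C[i][j] = length of the LCS of X[:i] and Y[:j]; row 0 and column 0 stay 0
    C = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if X[i - 1] == Y[j - 1]:     # matching characters extend the LCS
                C[i][j] = C[i - 1][j - 1] + 1
            else:                        # otherwise drop one character
                C[i][j] = max(C[i - 1][j], C[i][j - 1])
    return C[m][n]

print(lcs_length("ABCBDAB", "BDCABA"))  # 4, e.g. the subsequence "BCBA"
```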

