CS 411 Fall 2025
Outline for October 24, 2025
Dynamic Programming [L Ch 8 intro]
- Dynamic Programming
  - What It Is
    - Our last time-space trade-off, but worth a whole chapter.
    - Comes out of operations research; not originally a computing term; “programming” refers to planning.
    - Some problems have overlapping subproblems, so solving the subproblems involves common work (sub-subproblems solved more than once).
    - To save time, solve each subproblem (or sub-subproblem, or ...) just once, and save the answer for later use.
    - Originally formulated only in the context of optimization.
 
  - Variations
    - Bottom-up. The classical dynamic-programming approach. Solve subproblems from smallest to largest, saving the result of each as we go.
    - Top-down. Do not solve subproblems before their results are needed; computation is driven by recursive calls.
      - Needs a memory function: a function that, if called a second time with the same arguments, does not need to recompute, since it can look up the saved result.
      - This technique is also called memoizing, particularly in the context of functional programming.
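As a minimal sketch of a memory function (not from the notes), here is a memoized Fibonacci computation, the classic example of overlapping subproblems:

```python
# Memory function sketch: the saved-results table lives outside the function,
# so repeated calls with the same argument become lookups, not recomputations.
_fib_memo = {}

def fib(n):
    """Top-down Fibonacci: recursive calls drive the computation,
    and each subproblem's answer is stored the first time it is solved."""
    if n not in _fib_memo:
        _fib_memo[n] = n if n < 2 else fib(n - 1) + fib(n - 2)
    return _fib_memo[n]
```

Without the memory function this recursion takes exponential time; with it, each of the \(n\) subproblems is solved once. (In Python, `functools.lru_cache` builds a memory function automatically.)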
 
  - We look at bottom-up methods now and top-down methods later.
 
  - Optimization
    - In an optimization problem, we seek the best (largest, smallest, cheapest, etc.) example of something.
    - Feasible solution: something that fits our constraints. We need to be able to measure the “goodness” of a feasible solution.
    - Objective function: a function that takes a feasible solution and returns a number indicating how good it is.
    - Optimal solution: a feasible solution with the best value of the objective function (largest, if we are doing maximization, or smallest, if we are doing minimization). The best value is the optimum.
    - Example: finding a shortest path between two vertices \(x\), \(y\) in a graph. Feasible solutions are paths from \(x\) to \(y\). The objective function is the length of a path. We are doing minimization. An optimal solution is a path of least length.
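To make this vocabulary concrete, a hypothetical sketch (function names and sample data are my own, not from the notes): feasible solutions are \(x,y\)-paths, and the objective function maps each path to its length.

```python
def path_length(weights, path):
    """Objective function: the total weight of a path.
    weights maps each directed edge (u, v) to its length."""
    return sum(weights[(u, v)] for u, v in zip(path, path[1:]))

def best_path(feasible_paths, weights):
    """Optimal solution: the feasible solution that minimizes the objective."""
    return min(feasible_paths, key=lambda p: path_length(weights, p))

# Tiny illustration: two feasible x,y-paths; minimization picks the shorter.
weights = {("x", "a"): 1, ("a", "y"): 1, ("x", "y"): 3}
paths = [["x", "a", "y"], ["x", "y"]]
```

Here `best_path(paths, weights)` returns `["x", "a", "y"]`, whose objective value 2 beats the direct edge's 3.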
 
  - Principle of Optimality
    - To use DP on an optimization problem, we need the Principle of Optimality to hold. It says that an optimal solution is composed of optimal solutions to subproblems.
    - Example 1. The principle holds for the shortest-path problem above. If a shortest \(x,y\)-path passes through vertex \(v\), then the portion of the path from \(x\) to \(v\) is a shortest \(x,v\)-path, and similarly for the portion from \(v\) to \(y\).
    - Example 2. The principle does not hold for the longest-path problem: a longest \(x,v\)-path and a longest \(v,y\)-path may share vertices, so joining them need not give a longest \(x,y\)-path.
 
 
Dynamic-Programming Examples [L 8.1]
- Coin Row
  - Problem: given a row of coins, find the greatest total value that can be formed by taking some of them, with the restriction that we can never take two consecutive coins.
  - Analysis: \(n\) is the number of coins. Basic operations are integer operations.
  - Brute-force solution: recursive. Exponential time.
  - Bottom-up dynamic-programming solution: find small solutions and work our way up. Greatly increases speed: linear time.
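A sketch of the bottom-up solution (the sample coin values below are illustrative, not from the notes): let \(F(i)\) be the best value obtainable from the first \(i\) coins; then \(F(i) = \max(c_i + F(i-2),\; F(i-1))\), since we either take coin \(i\) (and skip coin \(i-1\)) or we don't.

```python
def coin_row(coins):
    """Bottom-up DP for Coin Row.
    f[i] = best total value from the first i coins with no two
    consecutive coins taken:  f[i] = max(coins[i-1] + f[i-2], f[i-1]).
    Solves subproblems smallest-first, saving each result: linear time."""
    f = [0] * (len(coins) + 1)
    for i in range(1, len(coins) + 1):
        take = coins[i - 1] + (f[i - 2] if i >= 2 else 0)
        f[i] = max(take, f[i - 1])   # take coin i, or skip it
    return f[-1]
```

For example, `coin_row([5, 1, 2, 10, 6, 2])` returns 17 (taking the coins of value 5, 10, and 2). Each subproblem is solved exactly once, versus the exponential recomputation of the brute-force recursion.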