# Dynamic programming

Dynamic programming is an algorithm design technique for problems whose solution can be built from solutions to overlapping subproblems. A couple of definitions:

- *Optimal substructure* means that an optimal solution to a problem (instance) contains optimal solutions to its subproblems.
- *Overlapping subproblems* means that a recursive solution contains a "small" number of distinct subproblems, each repeated many times.

In general, when a problem has optimal substructure and overlapping subproblems (so that a naive recursion would redo previously done computations), that should hint at a dynamic programming solution.

## Memoization

To avoid re-computation we save the results of the subproblems in memory: when we face a subproblem we have already solved, we simply look up its solution. We can proceed bottom-up or top-down:

- The bottom-up approach computes the solutions to all subproblems before the computations that depend on them, so when we face a subproblem we know we have already solved all the subproblems that are 'previous' to it. It also avoids recursion overhead.
- The top-down approach solves subproblems on demand: when we face a subproblem, we first check whether its solution has already been computed (which, unlike in the bottom-up approach, is not guaranteed); if not, we solve it and save the result in memory. This way we solve only the subproblems that are actually needed.

[Climbing Stairs - Dynamic Programming - Leetcode 70 - Python - YouTube](https://www.youtube.com/watch?v=Y0lT9Fck7qI)
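As a sketch of both approaches, here is the climbing-stairs problem from the linked video (count the ways to reach step `n` taking 1 or 2 steps at a time), solved top-down with a memo dictionary and bottom-up with two rolling variables; the function names are illustrative:

```python
def climb_stairs_top_down(n: int) -> int:
    """Top-down: recurse, caching each subproblem's result on first solve."""
    memo = {}

    def ways(i: int) -> int:
        if i <= 1:
            return 1  # base cases: one way to be at step 0 or step 1
        if i not in memo:
            # solve the subproblem only if it has not been computed yet
            memo[i] = ways(i - 1) + ways(i - 2)
        return memo[i]

    return ways(n)


def climb_stairs_bottom_up(n: int) -> int:
    """Bottom-up: solve subproblems in increasing order, no recursion."""
    prev, curr = 1, 1  # ways to reach steps 0 and 1
    for _ in range(2, n + 1):
        # each step's count depends only on the two previous subproblems
        prev, curr = curr, prev + curr
    return curr
```

Both run in O(n) time instead of the exponential time of the naive recursion; the bottom-up version also needs only O(1) extra space, since each subproblem depends only on the two before it.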