How does our performance scale?
Smaller Big O is better (lower time complexity).
Simplify Products: If the function is a product of many terms, we drop the terms that don't depend on n.
Simplify Sums: If the function is a sum of many terms, we drop the non-dominant terms.
n: size of the input
T(f): unsimplified math function
O(f): simplified math function
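A hypothetical worked example: if counting steps gives T(f) = 5n^2 + 3n + 20, the product rule drops the constant factor 5 and the sum rule drops the non-dominant terms 3n and 20, leaving O(n^2).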
Putting it all together
O(1) Constant
The algorithm takes roughly the same number of steps for any input size.
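A quick sketch (Python, not from the original notes): reading one element of a list takes the same work no matter how long the list is.

```python
def get_first(items):
    # One step regardless of len(items): constant time, O(1).
    return items[0]
```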
O(log(n)) Logarithmic
In most cases the hidden base of logarithmic time is 2; log-complexity algorithms will typically "halve" the size of the input (like binary search!).
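A minimal sketch of that halving behavior (an assumed example, not from the notes):

```python
def count_halvings(n):
    # Repeatedly halve n until it reaches 1: roughly log2(n) iterations.
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps
```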
O(n) Linear
Linear algorithms will access each item of the input "once".
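For instance, summing a list touches each element exactly once (assumed illustration):

```python
def total(items):
    # One pass over the input: linear time, O(n).
    result = 0
    for item in items:
        result += item
    return result
```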
O(n log(n)) Log-Linear Time
A combination of linear and logarithmic behavior; we will see features from both classes.
Algorithms that are log-linear will typically use both recursion AND iteration, as in the sketch below.
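A minimal sketch of that pattern (assumed example, not from the notes): recursion halves the problem (about log(n) levels), and iteration does a linear pass at each level, giving roughly n * log(n) steps overall.

```python
def log_linear_work(n):
    # Satisfies T(n) = n + 2 * T(n / 2), i.e. O(n log n) total steps.
    if n <= 1:
        return 0
    steps = 0
    for _ in range(n):  # linear pass at this level (iteration)
        steps += 1
    return steps + 2 * log_linear_work(n // 2)  # halve the problem (recursion)
```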
O(n^c) Polynomial
c is a fixed constant.
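For example, comparing every pair of items takes n * n steps, so c = 2 here (assumed illustration):

```python
def count_pairs(items):
    # Two nested passes over the input: O(n^2), a polynomial with c = 2.
    pairs = 0
    for a in items:
        for b in items:
            pairs += 1
    return pairs
```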
O(c^n) Exponential
c is now the number of recursive calls made in each stack frame.
Algorithms with exponential time are VERY SLOW.
Use When:
Normal Recursive Fibonacci
Memoization Fibonacci 1
Memoization Fibonacci 2
Tabulated Fibonacci
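The notes list these variants without code; minimal Python sketches (assumed implementations, not from the original notes, with a single memoized version standing in for both memoization variants):

```python
def fib_naive(n):
    # Normal recursive Fibonacci: two recursive calls per frame,
    # so exponential time, O(2^n).
    if n <= 1:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

def fib_memo(n, memo=None):
    # Memoization: cache each result so every n is computed once.
    # O(n) time, O(n) space.
    if memo is None:
        memo = {}
    if n in memo:
        return memo[n]
    if n <= 1:
        return n
    memo[n] = fib_memo(n - 1, memo) + fib_memo(n - 2, memo)
    return memo[n]

def fib_tab(n):
    # Tabulation: build the answers bottom-up with iteration, no recursion.
    # O(n) time, O(n) space.
    table = [0, 1] + [0] * max(0, n - 1)
    for i in range(2, n + 1):
        table[i] = table[i - 1] + table[i - 2]
    return table[n]
```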
Time Complexity: Quadratic O(n^2)
Space Complexity: O(1)
It's almost never used in production code because its quadratic runtime makes it much slower than O(n log(n)) sorts on all but tiny inputs.
Bubbling Up: term implying that an item is in motion, moving in some direction toward a final resting destination.
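The complexity figures above and the "bubbling up" term describe bubble sort; a minimal sketch, assuming a standard in-place implementation (Python, not from the notes):

```python
def bubble_sort(items):
    # Larger values "bubble up" toward the end of the list on each pass.
    # O(n^2) time in the worst case, O(1) extra space (sorts in place).
    n = len(items)
    swapped = True
    while swapped:
        swapped = False
        for i in range(n - 1):
            if items[i] > items[i + 1]:
                items[i], items[i + 1] = items[i + 1], items[i]
                swapped = True
    return items
```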
Time Complexity: Quadratic O(n^2)
Space Complexity: O(1)
Summary of how Selection Sort should work:
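The summary itself isn't reproduced in these notes; a minimal Python sketch of the usual approach (an assumed implementation, not taken from the notes):

```python
def selection_sort(items):
    # Repeatedly select the smallest remaining value and swap it into place.
    # O(n^2) comparisons, O(1) extra space.
    n = len(items)
    for i in range(n - 1):
        min_index = i
        for j in range(i + 1, n):
            if items[j] < items[min_index]:
                min_index = j
        items[i], items[min_index] = items[min_index], items[i]
    return items
```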
Time Complexity: Quadratic O(n^2)
Space Complexity: O(n)
Time Complexity: Log-Linear O(n log(n))
Space Complexity: O(n)
Steps:
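The steps themselves aren't listed here. The O(n log(n)) time / O(n) space profile matches merge sort, so here is a minimal Python sketch under that assumption:

```python
def merge_sort(items):
    # Recursively split the list in half, then merge the sorted halves.
    # O(n log n) time; O(n) extra space for the merged lists.
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    return merge(left, right)

def merge(left, right):
    # Walk both sorted lists, always taking the smaller front element.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```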
Time Complexity: Quadratic O(n^2)
Space Complexity: O(n)
Time Complexity: Log Time O(log(n))
Space Complexity: O(1)
Recursive Solution
Min Max Solution
Steps:
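The steps aren't reproduced here. Assuming "Min Max Solution" refers to a binary search that tracks min and max (low/high) indices instead of slicing, which is what keeps the extra space at O(1), a sketch might look like this:

```python
def binary_search_min_max(items, target):
    # Track low/high index bounds rather than copying sublists: O(1) extra space.
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2
        if items[mid] == target:
            return True
        if target < items[mid]:
            high = mid - 1
        else:
            low = mid + 1
    return False
```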