# Notes on Big O Notation

Imagine we have two implementations of the same function. How do we know which one is *best*? That's what Big O helps with! Big O notation is a bit like the Richter scale for code performance.

Write a function that calculates the sum of all numbers from 1 up to (and including) n.

```js
function addUpToV1(n) {
  let total = 0;
  for (let i = 1; i <= n; i++) {
    total += i;
  }
  return total;
}

function addUpToV2(n) {
  return n * (n + 1) / 2;
}
```
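To convince ourselves they agree, here's a quick standalone sanity check (both functions restated so the snippet runs on its own):

```js
// Restating both versions so this snippet is self-contained.
function addUpToV1(n) {
  let total = 0;
  for (let i = 1; i <= n; i++) {
    total += i;
  }
  return total;
}

function addUpToV2(n) {
  return n * (n + 1) / 2;
}

// The closed form should match the loop for any n.
console.log(addUpToV1(100)); // 5050
console.log(addUpToV2(100)); // 5050
```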

These both work! But which is better? And what does *better* mean? Usually some mix of:

- Faster
- Less memory usage
- More readable

…alas, readability tends to be the lowest ranked in this hierarchy `ಥ_ಥ`

Let's determine which one is faster! We'll use `performance.now()` to help determine this.

```js
let t1 = performance.now();
addUpToV1(1000000000);
let t2 = performance.now();
console.log(`Time Elapsed: ${(t2 - t1) / 1000} seconds.`);

let t3 = performance.now();
addUpToV2(1000000000);
let t4 = performance.now();
console.log(`Time Elapsed: ${(t4 - t3) / 1000} seconds.`);
```

While this process works, it isn't heaps reliable, and time is a hairy thing to talk about. Different computers will record different times, and even the same computer will record different times across runs. Also, for fast algorithms like these, timing isn't useful because both are wicked quick — the smallest interval of measurable time may not be precise enough to tell them apart!

## ENTER BIG O!

`console.log('big O!');`

Rather than counting time, let's count the number of operations the computer has to perform!

What is an operation? `addUpToV2` does 3 operations:

- Multiplication
- Addition
- Division

Regardless of what value is passed for n, it'll always do 3 operations.

`addUpToV1`, now, is a different story! The number of operations is tied to the value of n — as n grows, so does the count!

Don’t be heaps concerned with the exact number of operations — what is important here is to understand the big picture. Big O Notation is a way to formalize fuzzy counting. With it, we can talk about how the runtime of an algorithm grows as the inputs grow. It is an expression of the relationship between the size of input and runtime.
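To make the fuzzy counting concrete, here's one way to tally the simple operations in `addUpToV1`'s loop. (This tally — 5 operations per iteration, a few outside — is just one reasonable way to count; the exact numbers don't matter, only that the total grows linearly with n.)

```js
// A rough tally of the simple operations addUpToV1 performs.
// The exact bookkeeping is fuzzy by design — what matters is the shape.
function countOps(n) {
  // 2 ops before the loop: total = 0, i = 1
  let ops = 2;
  for (let i = 1; i <= n; i++) {
    // per iteration: the i <= n comparison, total += i (add + assign),
    // and i++ (add + assign) — call it 5 ops
    ops += 5;
  }
  // 1 more op: the final i <= n comparison that ends the loop
  ops += 1;
  return ops; // works out to 5n + 3
}

console.log(countOps(10));  // 53
console.log(countOps(100)); // 503
```

Whether you call it 5n + 3 or 2n + 2 depends on how you count, but either way the tally grows in direct proportion to n — which is exactly the "big picture" Big O captures.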

Big O is all about the broad trends.

Legit definition:

An algorithm is O(f(n)) if the number of simple operations the computer has to do is eventually less than a constant times f(n), as n increases

Big O helps answer the question “when n grows, how does it grow?”

Some general guidance to follow:

- Constants don’t matter! O(500) is simplified to O(1), O(13n^2) -> O(n^2)
- Smaller terms don’t matter! O(n+10) -> O(n), O(n^2+5n+8) -> O(n^2)
- Arithmetic operations are constant
- Variable assignment is constant
- Accessing elements in an array (by index) or object (by key) is constant
- In a loop, the complexity is the length of the loop multiplied by the complexity of whatever is happening inside the loop
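That last rule is why nested loops compound: an O(n) loop inside another O(n) loop gives O(n * n) = O(n^2). A small sketch:

```js
// O(n^2): the outer loop runs n times, and for each of those
// the inner loop also runs n times — n * n pairs in total.
function printAllPairs(n) {
  const pairs = [];
  for (let i = 0; i < n; i++) {
    for (let j = 0; j < n; j++) {
      pairs.push([i, j]);
    }
  }
  return pairs; // pairs.length === n * n
}

console.log(printAllPairs(3).length); // 9
```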

```js
// O(n): as n grows, the number of loop iterations grows with it
function logAtLeast5(n) {
  for (let i = 1; i <= Math.max(5, n); i++) {
    console.log(i);
  }
}

// O(1): no matter how big n gets, the loop is capped at 5 iterations
function logAtMost5(n) {
  for (let i = 1; i <= Math.min(5, n); i++) {
    console.log(i);
  }
}

let t5 = performance.now();
logAtLeast5(30);
let t6 = performance.now();
console.log(`Time Elapsed: ${(t6 - t5) / 1000} seconds.`);

let t7 = performance.now();
logAtMost5(30);
let t8 = performance.now();
console.log(`Time Elapsed: ${(t8 - t7) / 1000} seconds.`);
```