Determines whether a string is legal – syntax.

Determines whether a string has meaning – static semantics.

Assigns a meaning to a legal sentence – semantics

## Guess and check:

### Bisection search:
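A minimal Python sketch of bisection search, here used to approximate a square root (the function and variable names are my own, not from the notes): halve the search interval each iteration, keeping the half that must contain the answer.

```python
# Bisection search sketch: approximate sqrt(x) by repeatedly halving
# the interval [low, high] that contains the answer.
def bisection_sqrt(x, epsilon=0.01):
    low, high = 0.0, max(1.0, x)
    guess = (low + high) / 2
    while abs(guess**2 - x) >= epsilon:
        if guess**2 < x:
            low = guess      # answer lies in the upper half
        else:
            high = guess     # answer lies in the lower half
        guess = (low + high) / 2
    return guess
```

Because the interval halves each step, the number of iterations grows logarithmically with the size of the search space.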

### Newton-Raphson: next_guess = guess - p(guess)/p'(guess)
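A sketch of the update rule above for the specific case p(x) = x^2 - k, whose root is sqrt(k); the derivative p'(x) = 2x is hardcoded (a hypothetical example, assuming k > 0).

```python
# Newton-Raphson sketch for p(x) = x**2 - k, i.e. approximating sqrt(k).
def newton_sqrt(k, epsilon=0.01):
    guess = k / 2                    # assumes k > 0 so guess != 0
    while abs(guess**2 - k) >= epsilon:
        # next_guess = guess - p(guess) / p'(guess)
        guess = guess - (guess**2 - k) / (2 * guess)
    return guess
```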

### Converting decimal to binary:

take a decimal number -> 0.375

How can we represent this in binary?

Multiply the decimal number by a power of 2 that turns it into a whole number: here 2^3 = 8, which gives 8 * 0.375 = 3.

Then convert this whole number to binary (3 → 11).

Divide by 2^3 (shift right three places) to get **0.011 (binary) = 0\*2^-1 + 1\*2^-2 + 1\*2^-3**

If we can't find a p such that x * 2^p is a whole number, then we can only get an approximate binary representation.
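The steps above can be sketched as a small Python function (my own illustrative names; it assumes a fraction 0 <= x < 1 whose binary expansion terminates, otherwise the search for p never ends):

```python
# Scale the fraction x by 2**p until it becomes a whole number,
# convert that whole number to binary, then shift the point back left.
def frac_to_binary(x):
    p = 0
    while x * (2**p) % 1 != 0:    # find p such that x * 2**p is whole
        p += 1
    num = int(x * (2**p))
    bits = bin(num)[2:]           # whole number in binary, e.g. 3 -> '11'
    bits = bits.zfill(p)          # pad so the shift right by p digits works
    return '0.' + bits            # place the binary point

frac_to_binary(0.375)  # '0.011'
```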

### Recursion:

Can a problem be represented as a smaller version of the same problem?

For example, the factorial of n is n times the factorial of (n-1) – a smaller version of the same problem.

Multiplication of a times b can be computed as a + (a times (b-1)).
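Both examples above as recursive Python sketches (the multiplication version assumes b is a non-negative integer):

```python
# Factorial as a smaller version of the same problem: n! = n * (n-1)!
def factorial(n):
    if n <= 1:                    # base case
        return 1
    return n * factorial(n - 1)

# Multiplication by repeated addition: a * b = a + a * (b-1), b >= 0
def mult(a, b):
    if b == 0:                    # base case: a * 0 = 0
        return 0
    return a + mult(a, b - 1)
```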

Towers of Hanoi is the classic problem popularly used to illustrate recursion.
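A sketch of the recursive solution (peg names and the `moves` list are illustrative): to move n disks, first move n-1 disks out of the way, move the largest disk, then move the n-1 disks back on top.

```python
# Towers of Hanoi: record the sequence of (from_peg, to_peg) moves.
def hanoi(n, source, target, spare, moves):
    if n == 1:
        moves.append((source, target))
        return
    hanoi(n - 1, source, spare, target, moves)  # clear n-1 disks onto spare
    moves.append((source, target))              # move the largest disk
    hanoi(n - 1, spare, target, source, moves)  # bring n-1 disks back on top

moves = []
hanoi(3, 'A', 'C', 'B', moves)
len(moves)  # 2**3 - 1 = 7 moves, which is why this is exponential
```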

gcd(a, b) is the same as gcd(b, a % b); when b becomes 0, the answer is a.
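Euclid's algorithm, exactly as stated above:

```python
# gcd(a, b) == gcd(b, a % b); when b hits 0, a is the answer.
def gcd(a, b):
    if b == 0:
        return a
    return gcd(b, a % b)
```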

Fibonacci is another example of self-reference: a function calling itself from inside its own body with smaller arguments (fib(n) = fib(n-1) + fib(n-2)).
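The textbook self-referencing definition as Python; note this naive version recomputes the same subproblems many times, which is what makes its running time grow exponentially.

```python
# Naive recursive Fibonacci: fib(0) = 0, fib(1) = 1,
# fib(n) = fib(n-1) + fib(n-2) otherwise.
def fib(n):
    if n <= 1:          # base cases
        return n
    return fib(n - 1) + fib(n - 2)
```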

## Complexity:

Evaluating the efficiency of programs:

- Measure with a timer.
- Count the operations.
- The abstract notion of order of growth.

Big O notation: an upper bound on the asymptotic growth, often called the order of growth.

- Big oh or O() describes the worst case.
- For example, the complexity of 5*n + 2 is order n, O(n).
- Ignore additive and multiplicative constants.
- Focus on dominant terms.

- Order of dominance –
- c^n (exponential) > n^c (polynomial) > n*log(n) (log-linear) > n (linear) > log(n) > 1
- efficiency in the opposite order 🙂

- Law of addition (sequential statements): O(n) + O(n*n) = O(n + n^2) = O(n^2)
- Law of multiplication (nested statements):
**O(n) * O(n) = O(n^2)**

Logarithmic complexity:

- bisection search.
- binary search of a list.
- look out for division operations as well (heuristic).

Linear complexity:

- Factorial – both iterative and recursive.
- n dependent loop.

Log-linear complexity:

- Merge sort.

Polynomial complexity:

- Quadratic – nested loops.

Exponential complexity:

- Mostly in recursive cases – Towers of Hanoi, generating all subsets (power set).

**Bogo/Monkey/stupid sort:**

Shuffling and checking iteratively until all the elements are sorted.
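A sketch of the idea (function names are my own): shuffle, check, repeat. Only sensible for tiny lists, since the expected running time blows up factorially.

```python
import random

def is_sorted(lst):
    return all(lst[i] <= lst[i + 1] for i in range(len(lst) - 1))

# Bogo sort: keep shuffling until the list happens to be sorted.
def bogo_sort(lst):
    while not is_sorted(lst):
        random.shuffle(lst)
    return lst
```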

**Bubble Sort:**

Compare consecutive elements and swap them if they are out of order. Repeat passes over the whole list until a pass makes no swaps.

O(n^2)

In-place sort, efficient in terms of space.
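A minimal in-place sketch, stopping as soon as a full pass makes no swaps:

```python
# Bubble sort: repeatedly swap out-of-order neighbours; a pass with no
# swaps means the list is sorted.
def bubble_sort(lst):
    swapped = True
    while swapped:
        swapped = False
        for i in range(len(lst) - 1):
            if lst[i] > lst[i + 1]:
                lst[i], lst[i + 1] = lst[i + 1], lst[i]
                swapped = True
    return lst
```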

**Selection Sort:**

Find the smallest element and put it at the beginning, then find the next smallest and put it in second place, and so on.

O(n^2) – Quadratic, more efficient than bubble sort.

The advantage is that you can stop once you have a certain number of sorted elements.
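An in-place sketch of the procedure above: scan the unsorted remainder for its minimum and swap it into the next position.

```python
# Selection sort: position i gets the smallest element of lst[i:].
def selection_sort(lst):
    for i in range(len(lst)):
        min_idx = i
        for j in range(i + 1, len(lst)):
            if lst[j] < lst[min_idx]:
                min_idx = j
        lst[i], lst[min_idx] = lst[min_idx], lst[i]
    return lst
```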

**Merge sort:**

Divide and conquer – split the list into halves, sort each half recursively, then merge the two sorted halves.

Time complexity – O(n*log(n)) – log-linear

Space Complexity – O(n)
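A sketch of the divide-and-conquer structure: log(n) levels of splitting, with an O(n) merge at each level, giving the log-linear time and the O(n) extra space for the merged copies.

```python
# Merge sort: split, sort each half recursively, merge in order.
def merge_sort(lst):
    if len(lst) <= 1:
        return lst
    mid = len(lst) // 2
    left = merge_sort(lst[:mid])
    right = merge_sort(lst[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # merge step: O(n)
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]      # append the leftover tail
```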

**Quicksort**:

Pick a pivot and move all values larger than it after it and all values smaller than it before it, then recurse on each side. Usually the last element is picked as the pivot.

Time Complexity –

worst case Time complexity – O(n^2)

Average case Time complexity – O(n*log(n))

In-place sort, efficient in terms of space.
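A sketch using the last element as the pivot, as described above. Note this copying version is the clearest to read but uses extra space; the classic version partitions in place, which is where the space efficiency comes from.

```python
# Quicksort sketch: partition around the last element, recurse on each side.
def quick_sort(lst):
    if len(lst) <= 1:
        return lst
    pivot = lst[-1]
    smaller = [x for x in lst[:-1] if x <= pivot]
    larger = [x for x in lst[:-1] if x > pivot]
    return quick_sort(smaller) + [pivot] + quick_sort(larger)
```

The O(n^2) worst case happens when the pivot is always an extreme value, e.g. on an already-sorted list with this last-element pivot choice.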