This is the algorithm of the night…

Lane Garner
4 min read · Nov 16, 2020
Photo by Roman Mager on Unsplash

Tell us about something you learned this week.

Short-circuit evaluation is an excellent way to keep your code short, clean, and D.R.Y. It is essentially a way to shorten a conditional expression, and it works with all three logical operators: ||, &&, and !. For example, say you had an object that tracks whether a user is online and a function to greet the user:

const user = {
  name: 'Brian',
  online: true
}
const greet = () => console.log(`Hello ${user.name}, you are online!`)

We can use a normal conditional statement:

if (user.online) {
greet()
}
// Hello Brian, you are online!

This can also be done with a short-circuit evaluation:

user.online && greet()
// Hello Brian, you are online!
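The example above covers &&; for completeness, here is a quick sketch of the other two operators (the variable names are illustrative, not from the example above):

```javascript
const settings = { theme: undefined }

// || falls back to the right side when the left side is falsy
const theme = settings.theme || 'dark'

// ! flips a truthy/falsy value into its boolean opposite
const isOffline = !true

console.log(theme)     // 'dark'
console.log(isOffline) // false
```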

Ahhhh. Clean code.

What are the pros and cons of immutability? How can you achieve immutability in your own code?

We can declare variables as reassignable or not using let or const respectively. If you know that a variable will not change, then const is the way to go; likewise, if you plan on changing a variable, go with let. The main pro of immutability is predictability: a value that cannot change can never be modified unexpectedly somewhere else in the program, which makes code easier to reason about and debug. The main con is that producing updated copies instead of mutating in place can cost extra memory and work. Another place where let is especially useful is declaring an undefined variable at a wider scope, which can then be accessed and modified by different functions.
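A minimal sketch of both declaration styles (the counter example is illustrative):

```javascript
const limit = 10
// limit = 11 // would throw: TypeError: Assignment to constant variable.

// let declared at a wider scope, assigned later by a function
let counter
const increment = () => { counter = (counter || 0) + 1 }

increment()
increment()
console.log(counter) // 2
```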

What are Divide and Conquer algorithms? Describe how they work. Can you give any common examples of the types of problems where this approach might be used?

Divide and Conquer algorithms recursively divide a problem into smaller versions of the same problem, conquer each smaller problem, and finally combine the smaller solutions to form a full solution. An example of where this method is used is when sorting with quicksort or merge sort.
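Before getting to the sorting examples, here is a tiny divide-and-conquer sketch that is not from the article: summing an array by splitting it in half, solving each half, and combining the results.

```javascript
// Divide: split the array in half.
// Conquer: recursively sum each half.
// Combine: add the two partial sums.
const sumDC = (arr) => {
  if (arr.length <= 1) return arr.length ? arr[0] : 0 // base case
  const mid = Math.floor(arr.length / 2)
  return sumDC(arr.slice(0, mid)) + sumDC(arr.slice(mid))
}

console.log(sumDC([1, 2, 3, 4, 5])) // 15
```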

How do insertion sort, heap sort, quick sort, and merge sort work? What are the key advantages of insertion sort, quick sort, heap sort and merge sort? Discuss best, average, and worst-case time and memory complexity.

Insertion sort: Non-recursive. Everything to the left of the current position is already sorted, so the algorithm repeatedly takes the next unsorted element on the right and inserts it into place. Corresponds to how most people would sort a hand of cards. Works best when the input is already nearly sorted. Its main advantages are simplicity and speed on small lists, but it scales poorly on large lists compared to O(n log n) algorithms. Worst case: O(n²). Best case: O(n). Average performance: O(n²). Memory: O(n) total, O(1) auxiliary.
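A minimal sketch of insertion sort along the lines described above:

```javascript
const insertionSort = (arr) => {
  const a = [...arr] // copy so the input stays untouched
  for (let i = 1; i < a.length; i++) {
    const current = a[i]
    let j = i - 1
    // shift sorted elements on the left rightward until current's spot opens up
    while (j >= 0 && a[j] > current) {
      a[j + 1] = a[j]
      j--
    }
    a[j + 1] = current
  }
  return a
}

console.log(insertionSort([5, 2, 4, 1, 3])) // [1, 2, 3, 4, 5]
```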

Heapsort: Non-recursive. Build a max-heap, then repeatedly swap the maximum value to the end of the array. This divides the array into a sorted portion on the right and an unsorted heap on the left; continue until the heap is empty. This is a very efficient algorithm for sorting LARGE lists of numbers because, unlike quadratic algorithms that slow down sharply as the list grows, heapsort's running time grows only as n log n. Worst case: O(n log n). Best case: O(n log n) for distinct keys, or O(n) for equal keys. Average performance: O(n log n). Memory: O(n) total, O(1) auxiliary.
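A sketch of the steps above (build a max-heap, then repeatedly move the maximum to the end):

```javascript
const heapsort = (arr) => {
  const a = [...arr];
  // restore the max-heap property for the subtree rooted at start
  const siftDown = (start, end) => {
    let root = start;
    while (2 * root + 1 <= end) {
      let child = 2 * root + 1;
      if (child + 1 <= end && a[child + 1] > a[child]) child++; // pick larger child
      if (a[root] >= a[child]) return;
      [a[root], a[child]] = [a[child], a[root]];
      root = child;
    }
  };
  // build a max-heap from the unsorted array
  for (let i = Math.floor(a.length / 2) - 1; i >= 0; i--) siftDown(i, a.length - 1);
  // repeatedly swap the max to the end, shrinking the heap
  for (let end = a.length - 1; end > 0; end--) {
    [a[0], a[end]] = [a[end], a[0]];
    siftDown(0, end - 1);
  }
  return a;
};

console.log(heapsort([4, 1, 3, 9, 7])) // [1, 3, 4, 7, 9]
```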

Quicksort: Uses recursion (divide & conquer). Choose a pivot and partition the array so that everything to the left of the pivot is smaller and everything to the right is larger, then recursively sort each side of the partition until the whole array is sorted. Quicksort is often regarded as one of the best general-purpose sorting algorithms because it works well with large lists of numbers, while its primary (and minimal) disadvantage is that on smaller lists it performs about the same as simpler methods. Worst case: O(n²). Best case: O(n log n) for a simple partition, or O(n) for a three-way partition with equal keys. Average performance: O(n log n). Memory: O(n) auxiliary naively, or O(log n) auxiliary.
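A compact (not in-place) sketch of the pivot-and-partition idea, taking the first element as the pivot:

```javascript
const quicksort = (arr) => {
  if (arr.length <= 1) return arr // base case
  const [pivot, ...rest] = arr
  // partition: smaller values go left of the pivot, larger (or equal) go right
  const smaller = rest.filter((n) => n < pivot)
  const larger = rest.filter((n) => n >= pivot)
  return [...quicksort(smaller), pivot, ...quicksort(larger)]
}

console.log(quicksort([3, 6, 1, 8, 2])) // [1, 2, 3, 6, 8]
```

The in-place version described in the text swaps elements within the array instead of building new ones; this copying version trades the O(log n) auxiliary memory for readability.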

Merge sort: Uses recursion (divide & conquer). Split the array into two halves, recursively sort each half, then merge the two sorted halves. Rule for merging: always choose the smaller of the two front items. Unlike quicksort, merge sort's worst case stays at O(n log n), though it typically needs O(n) extra memory. Worst case: O(n log n). Best case: Ω(n log n) typically, or Ω(n) for the natural variant. Average performance: Θ(n log n). Memory: O(n) total with O(n) auxiliary, or O(1) with linked lists.
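The split-and-merge steps above can be sketched as:

```javascript
// merge two already-sorted arrays by always taking the smaller front item
const merge = (left, right) => {
  const out = []
  while (left.length && right.length) {
    out.push(left[0] <= right[0] ? left.shift() : right.shift())
  }
  return [...out, ...left, ...right] // one side may still have leftovers
}

const mergeSort = (arr) => {
  if (arr.length <= 1) return arr // base case
  const mid = Math.floor(arr.length / 2)
  return merge(mergeSort(arr.slice(0, mid)), mergeSort(arr.slice(mid)))
}

console.log(mergeSort([5, 3, 8, 1, 9, 2])) // [1, 2, 3, 5, 8, 9]
```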

Explain the difference between mutable and immutable objects.

Mutable objects are those whose values can change after being created, while immutable objects are those whose values cannot change once created.
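In JavaScript, objects are mutable by default; Object.freeze is one way to make one immutable (a small illustrative sketch):

```javascript
// A mutable object: its contents can change after creation
const mutable = { name: 'Brian' }
mutable.name = 'Lane' // allowed

// Object.freeze makes an object's own properties immutable
const frozen = Object.freeze({ name: 'Brian' })
// frozen.name = 'Lane' // silently ignored (TypeError in strict mode)

console.log(mutable.name)            // 'Lane'
console.log(Object.isFrozen(frozen)) // true
```

Note that const alone does not make an object immutable; it only prevents reassigning the binding.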

What are the three laws of algorithm recursion? Describe them in your own words and what they mean to you.

  1. A recursive algorithm must have a base case. This means the problem must eventually become simple enough to have a direct solution.
  2. A recursive algorithm must change its state and move toward the base case. This basically means the input must change on each call, and as it changes it gets closer and closer to the base case.
  3. A recursive algorithm must call itself, recursively. This is the definition of recursion: it must call itself.
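All three laws show up in even the smallest recursive function; factorial (an illustrative example, not from the article) works nicely:

```javascript
const factorial = (n) => {
  if (n <= 1) return 1        // law 1: base case with a direct solution
  return n * factorial(n - 1) // law 2: n - 1 moves toward the base case
                              // law 3: the function calls itself
}

console.log(factorial(5)) // 120
```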
