Backbone of the Plan
- Algorithm
- NeetCode
- OOP Design
- System Design
- Python (quick review)
- Golang (quick review)
- Java (quick review)
- JS (quick review)
- LeetCode: more than 1000 problems
- Garbage Collector
- Programming Techniques (TDD, OOP)
- https://adventofcode.com/
- Competitive programming
- SQL
Language Resource
Algorithm Resource
- Data Structures and Algorithms in Python (also read the Java edition, and the equivalent for Go)
- Introduction to Algorithms
- Algorithm Design Manual
Calculate Big O
- Harvard CS50 - Asymptotic Notation (video)
- Big O Notations (general quick tutorial) (video)
- Big O Notation (and Omega and Theta) - best mathematical explanation (video)
- Skiena (video)
- UC Berkeley Big O (video)
- Amortized Analysis (video)
- TopCoder (includes recurrence relations and master theorem):
- Cheat sheet
- [Review] Big-O notation in 5 minutes (video)
Data Structures
Arrays
- About Arrays:
- Implement a vector (mutable array with automatic resizing):
- Practice coding using arrays and pointers, and pointer math to jump to an index instead of using indexing.
- New raw data array with allocated memory
- can allocate int array under the hood, just not use its features
- start with 16; if the starting number is greater, use a power of 2: 16, 32, 64, 128
- size() - number of items
- capacity() - number of items it can hold
- is_empty()
- at(index) - returns the item at a given index, blows up if index out of bounds
- push(item)
- insert(index, item) - inserts item at index, shifts that index’s value and trailing elements to the right
- prepend(item) - can use insert above at index 0
- pop() - remove from end, return value
- delete(index) - delete item at index, shifting all trailing elements left
- remove(item) - looks for value and removes index holding it (even if in multiple places)
- find(item) - looks for value and returns first index with that value, -1 if not found
- resize(new_capacity) // private function
- when you reach capacity, resize to double the size
- when popping an item, if the size is 1/4 of capacity, resize to half
- Time
- O(1) to add/remove at end (amortized for allocations for more space), index, or update
- O(n) to insert/remove elsewhere
- Space
- contiguous in memory, so proximity helps performance
- space needed = (array capacity, which is >= n) * size of item, but even if 2n, still O(n)
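The operations above can be sketched as a small Python class. This is purely illustrative (Python lists already resize themselves); names follow the list above, and error handling is kept minimal:

```python
class Vector:
    """Mutable array with automatic resizing (illustrative sketch)."""

    def __init__(self):
        self._capacity = 16                 # start with 16 slots
        self._size = 0
        self._data = [None] * self._capacity

    def size(self):
        return self._size

    def capacity(self):
        return self._capacity

    def is_empty(self):
        return self._size == 0

    def at(self, index):
        if not 0 <= index < self._size:
            raise IndexError("index out of bounds")
        return self._data[index]

    def push(self, item):
        if self._size == self._capacity:
            self._resize(self._capacity * 2)   # double when full
        self._data[self._size] = item
        self._size += 1

    def insert(self, index, item):
        if not 0 <= index <= self._size:
            raise IndexError("index out of bounds")
        if self._size == self._capacity:
            self._resize(self._capacity * 2)
        for i in range(self._size, index, -1):  # shift trailing elements right
            self._data[i] = self._data[i - 1]
        self._data[index] = item
        self._size += 1

    def prepend(self, item):
        self.insert(0, item)

    def pop(self):
        if self.is_empty():
            raise IndexError("pop from empty vector")
        self._size -= 1
        value = self._data[self._size]
        self._data[self._size] = None
        if 0 < self._size <= self._capacity // 4:
            self._resize(self._capacity // 2)  # shrink at 1/4 full
        return value

    def delete(self, index):
        self.at(index)                          # bounds check
        for i in range(index, self._size - 1):  # shift trailing elements left
            self._data[i] = self._data[i + 1]
        self._size -= 1
        self._data[self._size] = None

    def remove(self, item):
        i = 0
        while i < self._size:   # removes every occurrence of item
            if self._data[i] == item:
                self.delete(i)
            else:
                i += 1

    def find(self, item):
        for i in range(self._size):
            if self._data[i] == item:
                return i
        return -1

    def _resize(self, new_capacity):
        new_data = [None] * new_capacity
        for i in range(self._size):
            new_data[i] = self._data[i]
        self._data = new_data
        self._capacity = new_capacity
```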
Linked Lists
- Description:
- C Code (video) - not the whole video, just portions about Node struct and memory allocation
- Linked List vs Arrays:
- Why you should avoid linked lists (video)
- Gotcha: you need pointer-to-pointer knowledge (for when you pass a pointer to a function that may change the address the pointer points to). This page is just to get a grasp of pointer to pointer; I don’t recommend this list traversal style, since readability and maintainability suffer due to cleverness.
- Implement (I did with tail pointer & without):
- size() - returns the number of data elements in the list
- empty() - bool returns true if empty
- value_at(index) - returns the value of the nth item (starting at 0 for first)
- push_front(value) - adds an item to the front of the list
- pop_front() - remove the front item and return its value
- push_back(value) - adds an item at the end
- pop_back() - removes end item and returns its value
- front() - get the value of the front item
- back() - get the value of the end item
- insert(index, value) - insert value at index, so the current item at that index is pointed to by the new item at the index
- erase(index) - removes node at given index
- value_n_from_end(n) - returns the value of the node at the nth position from the end of the list
- reverse() - reverses the list
- remove_value(value) - removes the first item in the list with this value
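A sketch of the list above in Python, without the tail pointer. insert(index, value) and erase(index) follow the same walk-then-relink pattern and are omitted for brevity; treating n=0 in value_n_from_end as the last node is my own convention:

```python
class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

class LinkedList:
    """Singly linked list sketch (no tail pointer)."""

    def __init__(self):
        self.head = None

    def size(self):
        count, node = 0, self.head
        while node:
            count, node = count + 1, node.next
        return count

    def empty(self):
        return self.head is None

    def value_at(self, index):
        node = self.head
        for _ in range(index):
            node = node.next
        return node.value

    def push_front(self, value):
        self.head = Node(value, self.head)

    def pop_front(self):
        value = self.head.value
        self.head = self.head.next
        return value

    def push_back(self, value):
        if self.head is None:
            self.head = Node(value)
            return
        node = self.head
        while node.next:
            node = node.next
        node.next = Node(value)

    def pop_back(self):
        if self.head.next is None:
            return self.pop_front()
        node = self.head
        while node.next.next:       # walk to the next-to-last node
            node = node.next
        value = node.next.value
        node.next = None
        return value

    def front(self):
        return self.head.value

    def back(self):
        node = self.head
        while node.next:
            node = node.next
        return node.value

    def reverse(self):
        prev, node = None, self.head
        while node:
            node.next, prev, node = prev, node, node.next
        self.head = prev

    def value_n_from_end(self, n):
        # two pointers n apart; when lead hits the last node, lag is the answer
        lead = lag = self.head
        for _ in range(n):
            lead = lead.next
        while lead.next:
            lead, lag = lead.next, lag.next
        return lag.value

    def remove_value(self, value):
        if self.head and self.head.value == value:
            self.head = self.head.next
            return
        node = self.head
        while node and node.next:
            if node.next.value == value:
                node.next = node.next.next
                return
            node = node.next
```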
- Doubly-linked List
- Description (video)
- No need to implement
Stack
- Stacks (video)
- [Review] Stacks in 3 minutes (video)
- Will not implement. Implementing with an array is trivial.
Queue
- Queue (video)
- Circular buffer/FIFO
- [Review] Queues in 3 minutes (video)
- Implement using linked-list, with tail pointer:
- enqueue(value) - adds value at a position at the tail
- dequeue() - returns value and removes least recently added element (front)
- empty()
- Implement using a fixed-sized array:
- enqueue(value) - adds item at end of available storage
- dequeue() - returns value and removes least recently added element
- empty()
- full()
- Cost:
- a bad implementation using a linked list, where you enqueue at the head and dequeue at the tail, would be O(n): each dequeue needs the next-to-last element, causing a full traversal
- enqueue: O(1) (amortized, linked list and array [probing])
- dequeue: O(1) (linked list and array)
- empty: O(1) (linked list and array)
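Both versions above can be sketched in Python. The linked-list version keeps a tail pointer so enqueue and dequeue are both O(1); the fixed-array version is a circular buffer driven by a head index and a count:

```python
class Queue:
    """FIFO queue over a singly linked list with a tail pointer."""

    class _Node:
        def __init__(self, value):
            self.value = value
            self.next = None

    def __init__(self):
        self.head = None   # dequeue end (front)
        self.tail = None   # enqueue end (back)

    def enqueue(self, value):
        node = self._Node(value)
        if self.tail is None:
            self.head = self.tail = node
        else:
            self.tail.next = node
            self.tail = node

    def dequeue(self):
        if self.head is None:
            raise IndexError("dequeue from empty queue")
        value = self.head.value
        self.head = self.head.next
        if self.head is None:
            self.tail = None
        return value

    def empty(self):
        return self.head is None


class RingQueue:
    """Fixed-size array queue (circular buffer)."""

    def __init__(self, capacity):
        self.data = [None] * capacity
        self.head = 0      # index of the front element
        self.count = 0

    def enqueue(self, value):
        if self.full():
            raise IndexError("queue full")
        tail = (self.head + self.count) % len(self.data)
        self.data[tail] = value
        self.count += 1

    def dequeue(self):
        if self.empty():
            raise IndexError("queue empty")
        value = self.data[self.head]
        self.head = (self.head + 1) % len(self.data)
        self.count -= 1
        return value

    def empty(self):
        return self.count == 0

    def full(self):
        return self.count == len(self.data)
```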
Hash table
Videos:
- Hashing with Chaining (video)
- Table Doubling, Karp-Rabin (video)
- Open Addressing, Cryptographic Hashing (video)
- PyCon 2010: The Mighty Dictionary (video)
- PyCon 2017: The Dictionary Even Mightier (video)
- (Advanced) Randomization: Universal & Perfect Hashing (video)
- (Advanced) Perfect hashing (video)
- [Review] Hash tables in 4 minutes (video)
Online Courses:
Implement with array using linear probing:
- hash(k, m) - m is the size of the hash table
- add(key, value) - if the key already exists, update value
- exists(key)
- get(key)
- remove(key)
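A sketch of the linear-probing table above (fixed size m, no resizing). Two assumptions beyond the list: a tombstone marker (`_DELETED`), which open addressing needs so probe chains survive removals, and Python's built-in `hash` standing in for a real hash(k, m):

```python
_DELETED = object()   # tombstone: keeps probe chains intact after removal

class HashTable:
    """Fixed-size open-addressing hash table with linear probing (sketch)."""

    def __init__(self, m=16):
        self.m = m
        self.slots = [None] * m   # each slot: None, _DELETED, or (key, value)

    def hash(self, key, m):
        return hash(key) % m      # built-in hash as a stand-in

    def _probe(self, key):
        """Yield slot indices in probe order starting at hash(key, m)."""
        start = self.hash(key, self.m)
        for i in range(self.m):
            yield (start + i) % self.m

    def add(self, key, value):
        first_free = None
        for i in self._probe(key):
            slot = self.slots[i]
            if slot is None:                  # end of probe chain
                target = first_free if first_free is not None else i
                self.slots[target] = (key, value)
                return
            if slot is _DELETED:
                if first_free is None:
                    first_free = i            # reuse the first tombstone
            elif slot[0] == key:              # key exists: update value
                self.slots[i] = (key, value)
                return
        if first_free is not None:
            self.slots[first_free] = (key, value)
        else:
            raise RuntimeError("table full")

    def get(self, key):
        for i in self._probe(key):
            slot = self.slots[i]
            if slot is None:
                break                         # chain ended: key absent
            if slot is not _DELETED and slot[0] == key:
                return slot[1]
        raise KeyError(key)

    def exists(self, key):
        try:
            self.get(key)
            return True
        except KeyError:
            return False

    def remove(self, key):
        for i in self._probe(key):
            slot = self.slots[i]
            if slot is None:
                break
            if slot is not _DELETED and slot[0] == key:
                self.slots[i] = _DELETED
                return
        raise KeyError(key)
```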
More Knowledge
Binary search
- Binary Search (video)
- Binary Search (video)
- detail
- blueprint
- [Review] Binary search in 4 minutes (video)
- Implement:
- binary search (on a sorted array of integers)
- binary search using recursion
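Both versions above, sketched on a sorted array of integers (returning the index of the target, or -1 if absent):

```python
def binary_search(arr, target):
    """Iterative binary search on a sorted array."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        if arr[mid] < target:
            lo = mid + 1       # target is in the right half
        else:
            hi = mid - 1       # target is in the left half
    return -1

def binary_search_recursive(arr, target, lo=0, hi=None):
    """The same search expressed recursively."""
    if hi is None:
        hi = len(arr) - 1
    if lo > hi:
        return -1
    mid = (lo + hi) // 2
    if arr[mid] == target:
        return mid
    if arr[mid] < target:
        return binary_search_recursive(arr, target, mid + 1, hi)
    return binary_search_recursive(arr, target, lo, mid - 1)
```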
Bitwise operations
- Bits cheat sheet
- you should know many of the powers of 2, from 2^1 to 2^16, and 2^32
- Get a really good understanding of manipulating bits with: &, |, ^, ~, >>, <<
- Two’s and one’s complement
- Count set bits
- Swap values:
- Absolute value:
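Sketches of the three tricks above. Python ints are arbitrary-precision, so the absolute-value trick simulates a fixed width with an explicit `bits` parameter (my own addition for illustration):

```python
def count_set_bits(x):
    """Kernighan's trick: each iteration clears the lowest set bit."""
    count = 0
    while x:
        x &= x - 1
        count += 1
    return count

def swap_xor(a, b):
    """Swap two integers without a temporary, using XOR."""
    a ^= b
    b ^= a
    a ^= b
    return a, b

def abs_branchless(x, bits=32):
    """Branchless absolute value for a `bits`-wide two's-complement int."""
    mask = x >> (bits - 1)     # all 1s if negative, all 0s if non-negative
    return (x + mask) ^ mask
```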
Trees
Trees - Intro
- Intro to Trees (video)
- Tree Traversal (video)
- BFS(breadth-first search) and DFS(depth-first search) (video)
- BFS notes:
- level order (BFS, using queue)
- time complexity: O(n)
- space complexity: best: O(1), worst: O(n/2)=O(n)
- DFS notes:
- time complexity: O(n)
- space complexity: best: O(log n) - avg. height of tree worst: O(n)
- inorder (DFS: left, self, right)
- postorder (DFS: left, right, self)
- preorder (DFS: self, left, right)
- [Review] Breadth-first search in 4 minutes (video)
- [Review] Depth-first search in 4 minutes (video)
- [Review] Tree Traversal (playlist) in 11 minutes (video)
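The traversal orders above, sketched on a minimal binary tree node: BFS (level order) with a queue, and the three DFS orders by recursion:

```python
from collections import deque

class TreeNode:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def bfs_level_order(root):
    """Level order (BFS) using a queue; O(n) time."""
    order, queue = [], deque([root] if root else [])
    while queue:
        node = queue.popleft()
        order.append(node.value)
        if node.left:
            queue.append(node.left)
        if node.right:
            queue.append(node.right)
    return order

def dfs_inorder(node, order=None):
    """Inorder DFS: left, self, right."""
    if order is None:
        order = []
    if node:
        dfs_inorder(node.left, order)
        order.append(node.value)
        dfs_inorder(node.right, order)
    return order

def dfs_preorder(node, order=None):
    """Preorder DFS: self, left, right."""
    if order is None:
        order = []
    if node:
        order.append(node.value)
        dfs_preorder(node.left, order)
        dfs_preorder(node.right, order)
    return order

def dfs_postorder(node, order=None):
    """Postorder DFS: left, right, self."""
    if order is None:
        order = []
    if node:
        dfs_postorder(node.left, order)
        dfs_postorder(node.right, order)
        order.append(node.value)
    return order
```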
Binary search trees: BSTs
- Binary Search Tree Review (video)
- Introduction (video)
- MIT (video)
- C/C++:
- Binary search tree - Implementation in C/C++ (video)
- BST implementation - memory allocation in stack and heap (video)
- Find min and max element in a binary search tree (video)
- Find the height of a binary tree (video)
- Binary tree traversal - breadth-first and depth-first strategies (video)
- Binary tree: Level Order Traversal (video)
- Binary tree traversal: Preorder, Inorder, Postorder (video)
- Check if a binary tree is a binary search tree or not (video)
- Delete a node from Binary Search Tree (video)
- Inorder Successor in a binary search tree (video)
- Implement:
- insert // insert value into tree
- get_node_count // get count of values stored
- print_values // prints the values in the tree, from min to max
- delete_tree
- is_in_tree // returns true if a given value exists in the tree
- get_height // returns the height in nodes (single node’s height is 1)
- get_min // returns the minimum value stored in the tree
- get_max // returns the maximum value stored in the tree
- is_binary_search_tree
- delete_value
- get_successor // returns the next-highest value in the tree after given value, -1 if none
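A sketch of most of the operations above. print_values and delete_tree are omitted (in Python, dropping the root reference frees the tree); sending duplicates to the right subtree is an assumption:

```python
class BSTNode:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

def insert(root, value):
    """Insert value; returns the (possibly new) root."""
    if root is None:
        return BSTNode(value)
    if value < root.value:
        root.left = insert(root.left, value)
    else:
        root.right = insert(root.right, value)
    return root

def get_node_count(root):
    if root is None:
        return 0
    return 1 + get_node_count(root.left) + get_node_count(root.right)

def is_in_tree(root, value):
    if root is None:
        return False
    if value == root.value:
        return True
    return is_in_tree(root.left if value < root.value else root.right, value)

def get_height(root):
    """Height in nodes; a single node's height is 1."""
    if root is None:
        return 0
    return 1 + max(get_height(root.left), get_height(root.right))

def get_min(root):
    while root.left:
        root = root.left
    return root.value

def get_max(root):
    while root.right:
        root = root.right
    return root.value

def is_binary_search_tree(root, lo=float("-inf"), hi=float("inf")):
    """Every node must fall inside the bounds set by its ancestors."""
    if root is None:
        return True
    if not lo <= root.value <= hi:
        return False
    return (is_binary_search_tree(root.left, lo, root.value)
            and is_binary_search_tree(root.right, root.value, hi))

def get_successor(root, value):
    """Next-highest value in the tree after `value`, or -1 if none."""
    successor = -1
    while root:
        if root.value > value:
            successor = root.value   # candidate; look for a smaller one left
            root = root.left
        else:
            root = root.right
    return successor

def delete_value(root, value):
    if root is None:
        return None
    if value < root.value:
        root.left = delete_value(root.left, value)
    elif value > root.value:
        root.right = delete_value(root.right, value)
    else:
        if root.left is None:
            return root.right
        if root.right is None:
            return root.left
        # two children: replace with the inorder successor, then delete it
        root.value = get_min(root.right)
        root.right = delete_value(root.right, root.value)
    return root
```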
Heap / Priority Queue / Binary Heap
- visualized as a tree, but is usually linear in storage (array, linked list)
- Heap
- Introduction (video)
- Binary Trees (video)
- Tree Height Remark (video)
- Basic Operations (video)
- Complete Binary Trees (video)
- Pseudocode (video)
- Heap Sort - jumps to start (video)
- Heap Sort (video)
- Building a heap (video)
- MIT: Heaps and Heap Sort (video)
- CS 61B Lecture 24: Priority Queues (video)
- Linear Time BuildHeap (max-heap)
- [Review] Heap (playlist) in 13 minutes (video)
- Implement a max-heap:
- insert
- sift_up - needed for insert
- get_max - returns the max item, without removing it
- get_size() - return number of elements stored
- is_empty() - returns true if the heap contains no elements
- extract_max - returns the max item, removing it
- sift_down - needed for extract_max
- remove(x) - removes item at index x
- heapify - create a heap from an array of elements, needed for heap_sort
- heap_sort() - take an unsorted array and turn it into a sorted array in place using a max heap or min heap
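A sketch of the max-heap above. One deviation from the list: this heap_sort returns a new sorted list rather than sorting in place, to keep the sketch short:

```python
class MaxHeap:
    """Array-backed binary max-heap (sketch)."""

    def __init__(self):
        self.data = []

    def get_size(self):
        return len(self.data)

    def is_empty(self):
        return not self.data

    def get_max(self):
        return self.data[0]

    def insert(self, value):
        self.data.append(value)
        self._sift_up(len(self.data) - 1)

    def extract_max(self):
        top = self.data[0]
        last = self.data.pop()
        if self.data:
            self.data[0] = last        # move last item to root, restore order
            self._sift_down(0)
        return top

    def remove(self, x):
        """Remove the item at index x: float it to the top, then extract."""
        self.data[x] = float("inf")
        self._sift_up(x)
        self.extract_max()

    def _sift_up(self, i):
        while i > 0:
            parent = (i - 1) // 2
            if self.data[i] <= self.data[parent]:
                break
            self.data[i], self.data[parent] = self.data[parent], self.data[i]
            i = parent

    def _sift_down(self, i):
        n = len(self.data)
        while True:
            left, right, largest = 2 * i + 1, 2 * i + 2, i
            if left < n and self.data[left] > self.data[largest]:
                largest = left
            if right < n and self.data[right] > self.data[largest]:
                largest = right
            if largest == i:
                return
            self.data[i], self.data[largest] = self.data[largest], self.data[i]
            i = largest

def heapify(values):
    """Build a heap in O(n) by sifting down from the last parent."""
    h = MaxHeap()
    h.data = list(values)
    for i in range(len(h.data) // 2 - 1, -1, -1):
        h._sift_down(i)
    return h

def heap_sort(values):
    """Sort ascending by repeatedly extracting the max."""
    h = heapify(values)
    out = []
    while not h.is_empty():
        out.append(h.extract_max())
    return out[::-1]
```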
Sorting
Notes:
- Implement sorts & know best case/worst case, average complexity of each:
- no bubble sort - it’s terrible - O(n^2), except when n <= 16
- Stability in sorting algorithms (“Is Quicksort stable?“)
- Which algorithms can be used on linked lists? Which on arrays? Which of both?
- I wouldn’t recommend sorting a linked list, but merge sort is doable.
- Merge Sort For Linked List
For heapsort, see the Heap data structure above. Heap sort is great, but not stable.
UC Berkeley:
Merge sort code:
Quick sort code:
Implement:
- Mergesort: O(n log n) average and worst case
- Quicksort: O(n log n) average case, O(n^2) worst case
- Selection sort and insertion sort are both O(n^2) average and worst-case
- For heapsort, see Heap data structure above
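Sketches of the sorts above. Merge sort returns a new list and is stable; quicksort is in place and uses the Lomuto partition scheme (one common choice); insertion sort stands in for the O(n^2) pair:

```python
def merge_sort(arr):
    """O(n log n) average and worst case, stable; returns a new sorted list."""
    if len(arr) <= 1:
        return list(arr)
    mid = len(arr) // 2
    left, right = merge_sort(arr[:mid]), merge_sort(arr[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:        # <= keeps equal keys in order (stable)
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

def quick_sort(arr, lo=0, hi=None):
    """In-place quicksort, Lomuto partition; O(n log n) average, O(n^2) worst."""
    if hi is None:
        hi = len(arr) - 1
    if lo < hi:
        pivot, p = arr[hi], lo
        for i in range(lo, hi):        # move items smaller than pivot left
            if arr[i] < pivot:
                arr[i], arr[p] = arr[p], arr[i]
                p += 1
        arr[p], arr[hi] = arr[hi], arr[p]   # place pivot at its final index
        quick_sort(arr, lo, p - 1)
        quick_sort(arr, p + 1, hi)
    return arr

def insertion_sort(arr):
    """O(n^2), but fast on small or nearly-sorted inputs; in place."""
    for i in range(1, len(arr)):
        key, j = arr[i], i - 1
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key
    return arr
```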
Not required, but I recommended them:
As a summary, here is a visual representation of 15 sorting algorithms. If you need more detail on this subject, see the “Sorting” section in Additional Detail on Some Subjects
Graphs
Graphs can be used to represent many problems in computer science, so this section is long, like trees and sorting.
Notes:
- There are 4 basic ways to represent a graph in memory:
- objects and pointers
- adjacency matrix
- adjacency list
- adjacency map
- Familiarize yourself with each representation and its pros & cons
- BFS and DFS - know their computational complexity, their trade-offs, and how to implement them in real code
- When asked a question, look for a graph-based solution first, then move on if none
MIT(videos):
Skiena Lectures - great intro:
- CSE373 2020 - Lecture 10 - Graph Data Structures (video)
- CSE373 2020 - Lecture 11 - Graph Traversal (video)
- CSE373 2020 - Lecture 12 - Depth First Search (video)
- CSE373 2020 - Lecture 13 - Minimum Spanning Trees (video)
- CSE373 2020 - Lecture 14 - Minimum Spanning Trees (con’t) (video)
- CSE373 2020 - Lecture 15 - Graph Algorithms (con’t 2) (video)
Graphs (review and more):
- 6.006 Single-Source Shortest Paths Problem (video)
- 6.006 Dijkstra (video)
- 6.006 Bellman-Ford (video)
- 6.006 Speeding Up Dijkstra (video)
- Aduni: Graph Algorithms I - Topological Sorting, Minimum Spanning Trees, Prim’s Algorithm - Lecture 6 (video)
- Aduni: Graph Algorithms II - DFS, BFS, Kruskal’s Algorithm, Union Find Data Structure - Lecture 7 (video)
- Aduni: Graph Algorithms III: Shortest Path - Lecture 8 (video)
- Aduni: Graph Alg. IV: Intro to geometric algorithms - Lecture 9 (video)
- CS 61B 2014: Weighted graphs (video)
- Greedy Algorithms: Minimum Spanning Tree (video)
- Strongly Connected Components Kosaraju’s Algorithm Graph Algorithm (video)
- [Review] Shortest Path Algorithms (playlist) in 16 minutes (video)
- [Review] Minimum Spanning Trees (playlist) in 4 minutes (video)
Full Coursera Course:
I’ll implement:
- DFS with adjacency list (recursive)
- DFS with adjacency list (iterative with stack)
- DFS with adjacency matrix (recursive)
- DFS with adjacency matrix (iterative with stack)
- BFS with adjacency list
- BFS with adjacency matrix
- single-source shortest path (Dijkstra)
- minimum spanning tree
- DFS-based algorithms (see Aduni videos above):
- check for a cycle (needed for topological sort, since we’ll check for the cycle before starting)
- topological sort
- count connected components in a graph
- list strongly connected components
- check for bipartite graph
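A Python sketch of a few of the items above, using adjacency lists (dict of node → neighbor list). Dijkstra, MST, and the matrix variants are longer and not sketched here:

```python
from collections import deque

def dfs_recursive(adj, start, visited=None, order=None):
    """Recursive DFS over an adjacency list; returns visit order."""
    if visited is None:
        visited, order = set(), []
    visited.add(start)
    order.append(start)
    for nxt in adj.get(start, []):
        if nxt not in visited:
            dfs_recursive(adj, nxt, visited, order)
    return order

def bfs(adj, start):
    """BFS over an adjacency list; visits nodes in order of distance."""
    visited, order, queue = {start}, [], deque([start])
    while queue:
        node = queue.popleft()
        order.append(node)
        for nxt in adj.get(node, []):
            if nxt not in visited:
                visited.add(nxt)
                queue.append(nxt)
    return order

def has_cycle(adj, nodes):
    """Directed cycle check; colors: 0 unvisited, 1 in progress, 2 done."""
    color = {n: 0 for n in nodes}
    def visit(n):
        color[n] = 1
        for nxt in adj.get(n, []):
            if color[nxt] == 1:                  # back edge: cycle
                return True
            if color[nxt] == 0 and visit(nxt):
                return True
        color[n] = 2
        return False
    return any(color[n] == 0 and visit(n) for n in nodes)

def topological_sort(adj, nodes):
    """Topological order via DFS finish times (assumes no cycle; check first)."""
    seen, order = set(), []
    def visit(n):
        seen.add(n)
        for nxt in adj.get(n, []):
            if nxt not in seen:
                visit(nxt)
        order.append(n)        # post-order: append after all descendants
    for n in nodes:
        if n not in seen:
            visit(n)
    return order[::-1]
```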
Even More Knowledge
Recursion
- Stanford lectures on recursion & backtracking:
- When is it appropriate to use it?
- How is tail recursion better than non-tail recursion?
- 5 Simple Steps for Solving Any Recursive Problem (video)
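A minimal illustration of the tail-recursion question above, using factorial. Note CPython does not eliminate tail calls, so in Python this is conceptual; in languages that do, the tail form runs in constant stack space:

```python
def factorial(n):
    """Plain recursion: the multiply happens after the call returns,
    so every frame must stay on the stack until the base case."""
    if n <= 1:
        return 1
    return n * factorial(n - 1)

def factorial_tail(n, acc=1):
    """Tail-recursive form: the recursive call is the last action,
    carrying the running product in an accumulator."""
    if n <= 1:
        return acc
    return factorial_tail(n - 1, acc * n)
```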
Dynamic Programming
- You probably won’t see any dynamic programming problems in your interview, but it’s worth being able to recognize a problem as being a candidate for dynamic programming.
- This subject can be pretty difficult, as each DP-soluble problem must be defined as a recurrence relation, and coming up with one can be tricky.
- I suggest looking at many examples of DP problems until you have a solid understanding of the pattern involved.
- Videos:
- Skiena: CSE373 2020 - Lecture 19 - Introduction to Dynamic Programming (video)
- Skiena: CSE373 2020 - Lecture 20 - Edit Distance (video)
- Skiena: CSE373 2020 - Lecture 20 - Edit Distance (continued) (video)
- Skiena: CSE373 2020 - Lecture 21 - Dynamic Programming (video)
- Skiena: CSE373 2020 - Lecture 22 - Dynamic Programming and Review (video)
- Simonson: Dynamic Programming 0 (starts at 59:18) (video)
- Simonson: Dynamic Programming I - Lecture 11 (video)
- Simonson: Dynamic programming II - Lecture 12 (video)
- List of individual DP problems (each is short): Dynamic Programming (video)
- Yale Lecture notes:
- Coursera:
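As a toy instance of the pattern (not one of the lecture problems), Fibonacci shows both DP styles: top-down memoization of the recurrence, and bottom-up tabulation in dependency order:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib_memo(n):
    """Top-down: write the recurrence, then cache overlapping subproblems."""
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

def fib_table(n):
    """Bottom-up: fill values in dependency order; O(n) time, and since
    each value needs only the previous two, O(1) space here."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a
```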
Interview Prep Books
You don’t need to buy a bunch of these. Honestly “Cracking the Coding Interview” is probably enough, but I bought more to give myself more practice. But I always do too much.
I bought both of these. They gave me plenty of practice.