
Commit 9386d1e

Big oh uniform usage (#82)
1 parent 131f827 commit 9386d1e

19 files changed (+52 -98)

README.md (-1)

@@ -75,7 +75,6 @@ Welcome to **Data Structures and Algorithms in Go**! 🎉 This project is design
 * [Palindrome](./recursion/is_palindrome_test.go)
 * [Climbing Stairs](./recursion/climbing_stairs_test.go)
 * [Exponentiation](./recursion/exponentiation_test.go)
-* [Permutations](./recursion/permutations_test.go)
 * [Regular Expressions Matching](./recursion/)
 * [Divide and Conquer](./dnc//README.md)
 * [Binary Search](./dnc/binary_search_test.go)

complexity.md (+33 -22)

@@ -32,7 +32,7 @@ t│ t│ ... t│
 └────────────────────────► └────────────────────────► └────────────────────────►
             n                          n                          n

-    O(N Log N)                  O(Log 2^n)                    O(2^n)
+    O(n*Log n)                  O(Log 2^n)                    O(2^n)
 ▲         .                 ▲         .                 ▲         .
 │        ..                 │         .                 │         .
 │        .                  │         .                 │        .
@@ -75,57 +75,68 @@ However, it is essential to note that this is not always the case. In practice,

 Big O notation of an algorithm can be simplified using the following two rules:

-1. Remove constants. `O(n) + 2*O(n Log n) + 3*O(K) + 5` is simplified to `O(n) + O(n Log n) + O(K)`.
-2. Remove non dominant, or slower terms. `O(n) + O(n Log n) + O(K)` is simplified to `O(n Log n)` because `O(n Log n)` is the most dominant term..
+1. Remove constants. `O(n) + 2*O(n*Log n) + 3*O(K) + 5` is simplified to `O(n) + O(n*Log n) + O(K)`.
+2. Remove non-dominant, or slower, terms. `O(n) + O(n*Log n) + O(K)` is simplified to `O(n*Log n)` because `O(n*Log n)` is the dominant term.

 ### Constant - O(K) or O(1)

 Constant time complexity represents the most efficient scenario for an algorithm, where the execution time remains constant regardless of the input size. Achieving constant time complexity often involves eliminating loops and recursive calls. Examples:

-* Reads and writes in a [hash table](../hashtable)
-* Enqueue and Dequeue in a [queue](../queue)
-* Push and Pop in a [stack](../stack)
-* Finding the minimum or maximum in [heap](../heap)
-* Removing the last element of a [doubly linked list](../linkedlist)
+* Reads and writes in a [hash table](./hashtable/README.md)
+* Enqueue and Dequeue in a [queue](./queue/README.md)
+* Push and Pop in a [stack](./stack/README.md)
+* Finding the minimum or maximum in [heap](./heap/README.md)
+* Removing the last element of a [doubly linked list](./linkedlist/README.md)
+* [Max without conditions](./bit/max_function_without_conditions.go)

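As a quick illustration of the constant-time bucket above, here is a minimal, self-contained Go sketch (hypothetical code, not part of this commit): hash-table access and slice-backed stack operations cost O(1) regardless of how large the collection grows.

```go
package main

import "fmt"

func main() {
	// Hash table reads and writes: O(1) on average, whatever the map size.
	ages := map[string]int{"ada": 36, "alan": 41}
	ages["grace"] = 85       // constant-time write
	fmt.Println(ages["ada"]) // constant-time read

	// Stack push, peek, and pop on a slice: O(1) amortized.
	stack := []int{}
	stack = append(stack, 7)     // push
	top := stack[len(stack)-1]   // peek
	stack = stack[:len(stack)-1] // pop
	fmt.Println(top, len(stack))
}
```
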
 ### Logarithmic - O(Log n)

 Attaining logarithmic time complexity in an algorithm is highly desirable as it eliminates the need to iterate through every input in order to solve a given problem. Examples:

-* Searching sorted items using [Binary Search](../dnc)
-* Inserting, Deleting and Searching in a [Binary Search Tree](../tree)
-* Push and Pop in [heap](../heap)
+* Searching sorted items using [Binary Search](./dnc/binary_search.go)
+* Inserting, Deleting and Searching in a [Binary Search Tree](./tree/README.md)
+* Push and Pop in [heap](./heap/README.md)
+* [Square Root](./dnc/square_root.go)
+* [Median in a Stream](./heap/median_in_a_stream.go)

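For the logarithmic bucket, a standalone iterative binary search sketch (hypothetical code; the repository's own ./dnc/binary_search.go delegates to a recursive helper, as its diff further down shows). Each iteration halves the remaining range, so a sorted slice of n elements is searched in O(Log n) steps.

```go
package main

import "fmt"

// binarySearch returns the index of target in a sorted slice, or -1 if absent.
func binarySearch(sorted []int, target int) int {
	low, high := 0, len(sorted)-1
	for low <= high {
		mid := low + (high-low)/2 // avoids overflow on very large slices
		switch {
		case sorted[mid] == target:
			return mid
		case sorted[mid] < target:
			low = mid + 1 // discard the lower half
		default:
			high = mid - 1 // discard the upper half
		}
	}
	return -1
}

func main() {
	fmt.Println(binarySearch([]int{1, 3, 5, 8, 13}, 8)) // 3
}
```
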
 ### Linear - O(n)

 Linear time complexity is considered favorable when an algorithm necessitates traversing every input, with no feasible way to avoid it. Examples:

-* Removing the last element in a [singly linked list](../linkedlist)
-* Searching an unsorted [array](../array) or [linked list](../linklist)
+* Removing the last element in a [singly linked list](./linkedlist/README.md)
+* Searching an unsorted [array](./array/README.md) or [linked list](./linkedlist/README.md)
+* [Number of Islands](./graph/number_of_islands.go)
+* [Missing Number](./hashtable/missing_number.go)
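For contrast, a linear-time sketch (hypothetical code): searching an unsorted slice cannot skip any element, so the worst case inspects all n of them.

```go
package main

import "fmt"

// contains reports whether target occurs in an unsorted slice. Without extra
// structure (sorting, a hash table), every element may need checking: O(n).
func contains(unsorted []int, target int) bool {
	for _, value := range unsorted {
		if value == target {
			return true
		}
	}
	return false
}

func main() {
	fmt.Println(contains([]int{4, 2, 9, 1}, 9)) // true
}
```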

-### O(n Log n)
+### O(n*Log n)

-The time complexity of O(n log n) is commonly observed when it is necessary to iterate through all inputs, and can yield an out come at the same time through an efficient operation. Sorting is a common example. It's not possible to sort items faster than O(n log n). Examples:
+The time complexity of O(n*Log n) is commonly observed when an algorithm must iterate through all inputs while producing an outcome at the same time through an efficient operation. Sorting is a common example; it is not possible to sort items faster than O(n*Log n) using comparisons. Examples:

-* [Merge Sort](../dnc) and [Heap Sort](../heap)
-* In order traversal of a [Binary Search Tree](../tree)
+* [Merge Sort](./dnc/merge_sort.go) and [Heap Sort](./heap/README.md)
+* [Knapsack](./greedy/knapsack.go)
+* [Find Anagrams](./hashtable/find_anagrams.go)
+* In order traversal of a [Binary Search Tree](./tree/README.md)
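A self-contained merge sort sketch showing where O(n*Log n) comes from (hypothetical code; the repository's ./dnc/merge_sort.go differs in its details): the input is halved Log n times, and each level of recursion does O(n) merge work.

```go
package main

import "fmt"

// mergeSort recursively halves the list, then merges the sorted halves.
func mergeSort(list []int) []int {
	if len(list) <= 1 {
		return list
	}
	mid := len(list) / 2
	return merge(mergeSort(list[:mid]), mergeSort(list[mid:]))
}

// merge combines two sorted slices into one sorted slice in O(n) time.
func merge(a, b []int) []int {
	out := make([]int, 0, len(a)+len(b))
	i, j := 0, 0
	for i < len(a) && j < len(b) {
		if a[i] <= b[j] {
			out = append(out, a[i])
			i++
		} else {
			out = append(out, b[j])
			j++
		}
	}
	out = append(out, a[i:]...) // append the leftovers from either half
	return append(out, b[j:]...)
}

func main() {
	fmt.Println(mergeSort([]int{5, 2, 8, 1, 9})) // [1 2 5 8 9]
}
```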

 ### Polynomial - O(n^2)

 Polynomial time complexity marks the initial threshold of problematic time complexity for algorithms. This complexity often arises when an algorithm includes nested loops, involving both an inner loop and an outer loop. Examples:

-* Bubble sort rehearsal problem in [array](../array)
-* Naive way of searching an unsorted [array](../array) for duplicates by using nested loops
+* [Bubble Sort](./array/bubble_sort.go)
+* [Cheapest Flight](./graph/cheapest_flights.go)
+* [Remove Invalid Parenthesis](./graph/remove_invalid_parentheses.go)
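The nested-loop shape behind O(n^2), sketched with bubble sort (hypothetical code; the repository's version is at ./array/bubble_sort.go): an outer pass per element and an inner pass over the unsorted remainder.

```go
package main

import "fmt"

// bubbleSort repeatedly swaps adjacent out-of-order pairs. The outer and
// inner loops together perform O(n^2) comparisons in the worst case.
func bubbleSort(list []int) {
	for i := 0; i < len(list); i++ {
		for j := 0; j < len(list)-i-1; j++ {
			if list[j] > list[j+1] {
				list[j], list[j+1] = list[j+1], list[j]
			}
		}
	}
}

func main() {
	nums := []int{3, 1, 4, 1, 5}
	bubbleSort(nums)
	fmt.Println(nums) // [1 1 3 4 5]
}
```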

 ### Exponential O(2^n)

 Exponential complexity is considered highly undesirable; however, it represents only the second-worst complexity scenario. Examples:

-* Basic [Recursive](../recursion) implementation of Fibonacci
-* Tower of Hanoi rehearsal in [divide and conquer](../dnc)
+* [Climbing Stairs](./recursion/climbing_stairs.go)
+* [Tower of Hanoi](./dnc/towers_of_hanoi.go)
+* [Generate Parenthesis](./backtracking/generate_parenthesis.go)
+* Basic [Recursive](./recursion/README.md) implementation of Fibonacci
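The exponential bucket, sketched with the basic recursive Fibonacci the list mentions (hypothetical code): each call spawns two more calls, so the call tree has on the order of 2^n nodes.

```go
package main

import "fmt"

// fib is the naive doubly-recursive Fibonacci: O(2^n) time, because the
// same subproblems are recomputed over and over.
func fib(n int) int {
	if n < 2 {
		return n
	}
	return fib(n-1) + fib(n-2)
}

func main() {
	fmt.Println(fib(10)) // 55
}
```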

 ### Factorial O(n!)

 Factorial time complexity represents the most severe time complexity for an algorithm. Understanding the scale of factorials is crucial, as even the estimated total number of atoms in the universe, which is approximately 10^80, is smaller than the factorial of 59. Examples:

-* Permutations rehearsal in [back tracking](../backtracking)
+* [N queens](./backtracking/n_queens.go)
+* [Permutations](./backtracking/permutations.go)
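And the factorial bucket, sketched as swap-based permutation generation (hypothetical code; the repository's version now lives at ./backtracking/permutations.go): n choices for the first position, n-1 for the second, and so on, giving n! results.

```go
package main

import "fmt"

// permute swaps each remaining candidate into position i, recurses, then
// undoes the swap (backtracks). The recursion tree has n! leaves.
func permute(list []int, i int, out *[][]int) {
	if i == len(list)-1 {
		*out = append(*out, append([]int{}, list...)) // record a copy
		return
	}
	for j := i; j < len(list); j++ {
		list[i], list[j] = list[j], list[i]
		permute(list, i+1, out)
		list[i], list[j] = list[j], list[i] // backtrack
	}
}

func main() {
	var out [][]int
	permute([]int{1, 2, 3}, 0, &out)
	fmt.Println(len(out), out) // 6 = 3! permutations
}
```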

dnc/README.md (+1 -1)

@@ -84,7 +84,7 @@ func search(list []int, target int) int {

 ## Complexity

-If used inappropriately, DNC algorithms can lead to an exponential number of unnecessary recursive calls, resulting in a time complexity of O(2^n). However, if an appropriate dividing strategy and base case that can be solved directly are identified, DNC algorithms can be very effective, with a time complexity as low as O(log n) in the case of binary search. As DNC algorithms are recursive in nature, their complexity analysis is analogous to that of [recursive](../recursion) algorithms.
+If used inappropriately, DNC algorithms can lead to an exponential number of unnecessary recursive calls, resulting in a time complexity of O(2^n). However, if an appropriate dividing strategy and base case that can be solved directly are identified, DNC algorithms can be very effective, with a time complexity as low as O(Log n) in the case of binary search. As DNC algorithms are recursive in nature, their complexity analysis is analogous to that of [recursive](../recursion) algorithms.

 ## Application
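The O(Log n) and O(2^n) figures quoted in that paragraph fall out of the standard divide-and-conquer recurrences, for example:

```latex
T(n) = T(n/2) + O(1)  \;\Rightarrow\; T(n) = O(\log n)   % binary search: recurse into one half
T(n) = 2T(n/2) + O(n) \;\Rightarrow\; T(n) = O(n \log n) % merge sort: recurse into both halves
T(n) = 2T(n-1) + O(1) \;\Rightarrow\; T(n) = O(2^n)      % branching without shrinking the input quickly
```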

dnc/binary_search.go (+1 -1)

@@ -1,6 +1,6 @@
 package dnc

-// BinarySearch solves the problem in O(log n) time and O(1) space.
+// BinarySearch solves the problem in O(Log n) time and O(1) space.
 func BinarySearch(list []int, search int) int {
 	return binarySearchRecursive(list, 0, len(list), search)
 }

dnc/merge_sort.go (+1 -1)

@@ -1,6 +1,6 @@
 package dnc

-// MergeSort solves the problem in O(n log n) time and O(n) space.
+// MergeSort solves the problem in O(n*Log n) time and O(n) space.
 func MergeSort(list []int) []int {
 	if len(list) <= 1 {
 		return list

dnc/square_root.go (+1 -1)

@@ -1,6 +1,6 @@
 package dnc

-// SquareRoot solves the problem in O(log n) time and O(1) space.
+// SquareRoot solves the problem in O(Log n) time and O(1) space.
 func SquareRoot(number, precision int) float64 {
 	start := 0
 	end := number

graph/network_delay_time.go (+1 -1)

@@ -12,7 +12,7 @@ const (
 	edgeDestination, edgeCost = 0, 1
 )

-// NetworkDelayTime solves the problem in O(n log n) time and O(n) space.
+// NetworkDelayTime solves the problem in O(n*Log n) time and O(n) space.
 func NetworkDelayTime(n, k int, edges [][3]int) int {
 	var (
 		verticesMap, edgesHeap = verticesAndEdges(edges, k)

greedy/knapsack.go (+1 -1)

@@ -8,7 +8,7 @@ type KnapsackItem struct {
 	Value int
 }

-// Knapsack solves the problem in O(n*log(n)) time and O(1) space.
+// Knapsack solves the problem in O(n*Log n) time and O(1) space.
 func Knapsack(items []KnapsackItem, capacity int) int {
 	sort.Slice(items, func(i, j int) bool {
 		return items[i].Value/items[i].Weight > items[j].Value/items[j].Weight

greedy/task_scheduling.go (+1 -1)

@@ -8,7 +8,7 @@ type Event struct {
 	EndTime int
 }

-// Solves the problem in O(n*log(n)) time and O(1) space.
+// Solves the problem in O(n*Log n) time and O(1) space.
 func ScheduleEvents(events []Event) []Event {
 	sort.Slice(events, func(i, j int) bool {
 		return events[i].EndTime < events[j].EndTime

hashtable/find_anagrams.go (+1 -1)

@@ -4,7 +4,7 @@ import "sort"

 type sortRunes []rune

-// FindAnagrams solves the problem in O(n*log(n)) time and O(n) space.
+// FindAnagrams solves the problem in O(n*Log n) time and O(n) space.
 func FindAnagrams(words []string) [][]string {
 	anagrams := make(map[string][]string)
 	for _, word := range words {

heap/README.md (+3 -3)

@@ -7,7 +7,7 @@ A heap must satisfy two conditions:
 1. The structure property requires that the heap be a complete binary [tree](../tree), where each level is filled left to right, and all levels except the bottom are full.
 2. The heap property requires that the children of a node be larger than or equal to the parent node in a min heap and smaller than or equal to the parent in a max heap, meaning that the root is the minimum in a min heap and the maximum in a max heap.

-As a result, if you push elements to the min or max heap and then pop them one by one, you will obtain a list that is sorted in ascending or descending order, respectively. This sorting technique is also an O(NLogN) algorithm known as heap sort. Although there are other sorting algorithms available, none of them are faster than O(NLogN).
+As a result, if you push elements to the min or max heap and then pop them one by one, you will obtain a list that is sorted in ascending or descending order, respectively. This sorting technique is also an O(n*Log n) algorithm known as heap sort. Although there are other sorting algorithms available, none of them are faster than O(n*Log n).

 When pushing a new element to a heap, because of the structure property we always add the new element to the first available position on the lowest level of the heap, filling from left to right. Then to maintain the heap property, if the newly inserted element is smaller than its parent in a min heap (larger in a max heap), then we swap it with its parent. We continue swapping the swapped element with its parent until the heap property is achieved.

@@ -81,11 +81,11 @@ In Go, the heap implementation is based on slices. The heap property is maintain

 ## Complexity

-The time complexity of pushing and popping heap elements is O(LogN). On the other hand, initializing a heap, which involves pushing N elements, has a time complexity of O(NLogN).
+The time complexity of pushing and popping heap elements is O(Log n). On the other hand, initializing a heap, which involves pushing n elements, has a time complexity of O(n*Log n).

 The insertion strategy entails percolating the new element up the heap until the correct location is identified. Similarly, the deletion strategy involves percolating down the heap.

-Pushing and Popping heap elements are all O(LogN) operations. The strategy for inserting is the new element is percolating up the heap until the correct location is found. similarly the strategy for deletion is to percolate down.
+Pushing and popping heap elements are both O(Log n) operations. The strategy for insertion is to percolate the new element up the heap until the correct location is found; similarly, the strategy for deletion is to percolate down.

 ## Application
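Go's standard container/heap package captures the slice-backed implementation this README describes. A minimal usage sketch (hypothetical code, not the repository's heap) showing the O(Log n) push/pop and the heap-sort effect:

```go
package main

import (
	"container/heap"
	"fmt"
)

// minHeap adapts a plain int slice to heap.Interface.
type minHeap []int

func (h minHeap) Len() int            { return len(h) }
func (h minHeap) Less(i, j int) bool  { return h[i] < h[j] }
func (h minHeap) Swap(i, j int)       { h[i], h[j] = h[j], h[i] }
func (h *minHeap) Push(x interface{}) { *h = append(*h, x.(int)) }
func (h *minHeap) Pop() interface{} {
	old := *h
	x := old[len(old)-1]
	*h = old[:len(old)-1]
	return x
}

func main() {
	h := &minHeap{5, 2, 8}
	heap.Init(h)    // establish the heap property
	heap.Push(h, 1) // percolates the new element up: O(Log n)

	// Popping repeatedly yields ascending order: the heap-sort effect.
	for h.Len() > 0 {
		fmt.Print(heap.Pop(h), " ") // 1 2 5 8
	}
}
```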

heap/k_closest_points_to_origin.go (+1 -1)

@@ -13,7 +13,7 @@ type (
 	pointsHeap []*point
 )

-// KClosestPointToOrigin solves the problem in O(nlogk) time and O(k) space.
+// KClosestPointToOrigin solves the problem in O(n*Log k) time and O(k) space.
 func KClosestPointToOrigin(points [][]int, k int) [][]int {
 	if len(points) <= 1 {
 		return points

heap/median_in_a_stream.go (+1 -1)

@@ -15,7 +15,7 @@ func newMedianKeeper() medianKeeper {
 	return medianKeeper{&maxHeap{}, &minHeap{}}
 }

-// addNumber solves the problem in O(log n) time and O(n) space.
+// addNumber solves the problem in O(Log n) time and O(n) space.
 func (m *medianKeeper) addNumber(num int) {
 	if m.len()%2 == 0 {
 		if m.len() == 0 {

heap/merge_sorted_list.go (+1 -1)

@@ -10,7 +10,7 @@ type (
 	priorityQueue []*linkedlist.Node
 )

-// MergeSortedLists solves the problem in O(nlogk) time and O(k) space.
+// MergeSortedLists solves the problem in O(n*Log k) time and O(k) space.
 func MergeSortedLists(lists []*linkedlist.Node) *linkedlist.Node {
 	pq := new(priorityQueue)

heap/sliding_maximum.go (+1 -1)

@@ -4,7 +4,7 @@ import "container/heap"

 type slidingWindow []int

-// MaxSlidingWindow solves the problem in O(nlogk) time and O(k) space.
+// MaxSlidingWindow solves the problem in O(n*Log k) time and O(k) space.
 func MaxSlidingWindow(numbers []int, k int) []int {
 	output := []int{}
 	if len(numbers) <= 1 || len(numbers) < k {

recursion/README.md (-4)

@@ -73,10 +73,6 @@ Given n the number of steps, return in how many ways you can climb these stairs

 Given x and n, return x raised to the power of n in an efficient manner. [Solution](exponentiation.go) [Test](exponentiation_test.go)

-### Permutations
-
-Given a set of integers like `{1,2}`, return all possible permutations like `{1,2},{2,1}`. [Solution](permutations.go) [Test](permutations_test.go)
-
 ### Regular Expressions Matching

 Given an input and a regular expression pattern where `.` denotes any character and `*` denotes zero or more of the preceding character, write a recursive function to return true if the input matches the pattern and false otherwise. [Solution](regular_expressions.go) [Test](regular_expressions_test.go)

recursion/permutations.go (-20)

This file was deleted.

recursion/permutations_test.go (-32)

This file was deleted.

tree/README.md (+4 -4)

@@ -40,21 +40,21 @@ There are many types of trees. Some important tree types include:

 A Binary Search Tree (BST) is a type of sorted tree where, for every node n, the values of all nodes in its left subtree are less than n and the values of all nodes in its right subtree are greater than n.

-Performing an In-Order traversal of a binary search tree and outputting each visited node results in a sorted (In-Order) list of nodes. This is known as the tree sort algorithm, which has a time complexity of O(NLogN). While there are other sorting algorithms available, none are more efficient than O(NLogN).
+Performing an In-Order traversal of a binary search tree and outputting each visited node results in a sorted (In-Order) list of nodes. This is known as the tree sort algorithm, which has a time complexity of O(n*Log n). While there are other sorting algorithms available, none are more efficient than O(n*Log n).

 ### BST Complexity

 The time complexity of operations such as Search, Deletion, Insertion, and finding the minimum and maximum values in a binary search tree is O(h), where h represents the height of the tree.

 ## AVL - Height Balanced BST

-A height balanced binary search tree has a height of O(log n) and its left and right subtrees of all nodes have equal heights.
+A height balanced binary search tree has a height of O(Log n), and the heights of the left and right subtrees of every node differ by at most one.

 In order to maintain balance after an insertion, a single rotation is needed if the insertion was on the outer side, either left-left or right-right, while a double rotation is required if the insertion was on the inner side, either left-right or right-left.

 ### AVL Complexity

-Same as a Binary Search Tree except that the height of the tree is known. So Search, Deletion, Insertion, and finding Min and Max in an AVL tree are all O(LogN) operations.
+Same as a Binary Search Tree, except that the height of the tree is guaranteed to be logarithmic. So Search, Deletion, Insertion, and finding Min and Max in an AVL tree are all O(Log n) operations.

@@ -66,7 +66,7 @@ Insertion and Search are done in O(K), where K is the length of the word.

 ## Application

-Trees, such as Binary Search Trees (BSTs), can offer a time complexity of O(log n) for searches, as opposed to the linear access time of linked lists. Trees are widely employed in search systems, and operating systems can represent file information using tree structures.
+Trees, such as Binary Search Trees (BSTs), can offer a time complexity of O(Log n) for searches, as opposed to the linear access time of linked lists. Trees are widely employed in search systems, and operating systems can represent file information using tree structures.

 ## Rehearsal
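A minimal BST sketch (hypothetical code, not the repository's tree implementation) showing the O(h) root-to-leaf walk for insertion and the sorted output of the In-Order traversal that tree sort relies on:

```go
package main

import "fmt"

// node is a BST node: smaller keys go left, larger keys go right.
type node struct {
	key         int
	left, right *node
}

// insert walks a single root-to-leaf path, so it costs O(h), where h is
// the tree height: O(Log n) when balanced, O(n) in the worst case.
func insert(root *node, key int) *node {
	if root == nil {
		return &node{key: key}
	}
	if key < root.key {
		root.left = insert(root.left, key)
	} else if key > root.key {
		root.right = insert(root.right, key)
	}
	return root
}

// inOrder visits left subtree, node, right subtree: keys come out sorted.
func inOrder(root *node, visit func(int)) {
	if root == nil {
		return
	}
	inOrder(root.left, visit)
	visit(root.key)
	inOrder(root.right, visit)
}

func main() {
	var root *node
	for _, k := range []int{5, 2, 8, 1} {
		root = insert(root, k)
	}
	inOrder(root, func(k int) { fmt.Print(k, " ") }) // 1 2 5 8
}
```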
