The Big O notation of an algorithm can be simplified using the following two rules:
1. Remove constants. `O(n) + 2*O(n*Log n) + 3*O(K) + 5` is simplified to `O(n) + O(n*Log n) + O(K)`.
2. Remove non-dominant, or slower, terms. `O(n) + O(n*Log n) + O(K)` is simplified to `O(n*Log n)` because `O(n*Log n)` is the dominant term.
### Constant - O(K) or O(1)
Constant time complexity represents the most efficient scenario for an algorithm: the execution time remains the same regardless of the input size. Achieving constant time complexity often involves eliminating loops and recursive calls. Examples (a short sketch follows the list):
* Reads and writes in a [hash table](./hashtable/README.md)
* Enqueue and Dequeue in a [queue](./queue/README.md)
* Push and Pop in a [stack](./stack/README.md)
* Finding the minimum or maximum in a [heap](./heap/README.md)
* Removing the last element of a [doubly linked list](./linkedlist/README.md)
90
+
* [Max without conditions](./bit/max_function_without_conditions.go)
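
The sketch below is illustrative rather than taken from this repository; it shows why hash table reads and writes and slice-backed stack operations run in constant time:

```go
package main

import "fmt"

func main() {
	// Reads and writes in a hash table are O(1) on average: the cost
	// does not grow with the number of stored keys.
	ages := map[string]int{"ada": 36, "alan": 41}
	ages["grace"] = 37       // constant-time write
	fmt.Println(ages["ada"]) // constant-time read

	// Push and Pop on a slice-backed stack are O(1) as well
	// (amortized, in the case of append).
	stack := []int{}
	stack = append(stack, 1)     // Push
	top := stack[len(stack)-1]   // Peek
	stack = stack[:len(stack)-1] // Pop
	fmt.Println(top, len(stack))
}
```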
### Logarithmic - O(Log n)
Attaining logarithmic time complexity in an algorithm is highly desirable as it eliminates the need to iterate through every input in order to solve a given problem. Examples (a binary search sketch follows the list):
* Searching sorted items using [Binary Search](./dnc/binary_search.go)
* Inserting, Deleting and Searching in a [Binary Search Tree](./tree/README.md)
* Push and Pop in a [heap](./heap/README.md)
* [Square Root](./dnc/square_root.go)
* [Median in a Stream](./heap/median_in_a_stream.go)
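
As a minimal sketch (not necessarily the implementation in [binary_search.go](./dnc/binary_search.go)), binary search halves the search space on every iteration, so it needs at most O(Log n) comparisons:

```go
// binarySearch returns the index of target in the sorted slice items,
// or -1 if it is absent. Each iteration halves the remaining range.
func binarySearch(items []int, target int) int {
	low, high := 0, len(items)-1
	for low <= high {
		mid := low + (high-low)/2
		switch {
		case items[mid] == target:
			return mid
		case items[mid] < target:
			low = mid + 1
		default:
			high = mid - 1
		}
	}
	return -1
}
```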
### Linear - O(n)
Linear time complexity is considered favorable when an algorithm must traverse every input, with no feasible way to avoid it. Examples (a short sketch follows the list):
* Removing the last element in a [singly linked list](./linkedlist/README.md)
107
+
* Searching an unsorted [array](./array/README.md) or [linked list](./linkedlist/README.md)
108
+
* [Number of Islands](./graph/number_of_islands.go)
* [Missing Number](./hashtable/missing_number.go)
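
A minimal sketch of why linear algorithms cannot skip any input: to find the maximum of an unsorted slice, every element has to be examined once. The function name is illustrative only:

```go
// maxValue returns the largest value in the unsorted slice items and
// false if the slice is empty. Every element must be visited exactly
// once, so the running time grows linearly with the input size, O(n).
func maxValue(items []int) (int, bool) {
	if len(items) == 0 {
		return 0, false
	}
	best := items[0]
	for _, v := range items[1:] {
		if v > best {
			best = v
		}
	}
	return best, true
}
```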
### O(n*Log n)
The O(n*Log n) time complexity is commonly observed when an algorithm must iterate through all inputs while performing an efficient, typically halving, operation along the way. Sorting is a common example; no comparison-based algorithm can sort items faster than O(n*Log n). Examples (a merge sort sketch follows the list):
* [Merge Sort](./dnc/merge_sort.go) and [Heap Sort](./heap/README.md)
* [Knapsack](./greedy/knapsack.go)
* [Find Anagrams](./hashtable/find_anagrams.go)
* In-order traversal of a [Binary Search Tree](./tree/README.md)
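
A sketch of merge sort over int slices (the repository's [merge_sort.go](./dnc/merge_sort.go) may differ): the slice is split in half O(Log n) times, and each level of merging costs O(n), giving O(n*Log n) overall:

```go
// mergeSort sorts items by recursively splitting the slice in half
// and merging the sorted halves back together.
func mergeSort(items []int) []int {
	if len(items) <= 1 {
		return items
	}
	mid := len(items) / 2
	return merge(mergeSort(items[:mid]), mergeSort(items[mid:]))
}

// merge combines two sorted slices into one sorted slice in O(n).
func merge(a, b []int) []int {
	out := make([]int, 0, len(a)+len(b))
	i, j := 0, 0
	for i < len(a) && j < len(b) {
		if a[i] <= b[j] {
			out = append(out, a[i])
			i++
		} else {
			out = append(out, b[j])
			j++
		}
	}
	out = append(out, a[i:]...)
	return append(out, b[j:]...)
}
```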
### Polynomial - O(n^2)
Polynomial time complexity marks the initial threshold of problematic time complexity for algorithms. This complexity often arises when an algorithm includes nested loops, involving both an inner loop and an outer loop. Examples (a short sketch follows the list):
* Bubble sort rehearsal problem in [array](./array/README.md)
* Naive way of searching an unsorted [array](./array/README.md) for duplicates by using nested loops
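
A minimal sketch of that naive duplicate search: the nested loops perform n*(n-1)/2 comparisons in the worst case, hence O(n^2):

```go
// hasDuplicates reports whether items contains a repeated value by
// comparing every pair of elements with nested loops.
func hasDuplicates(items []int) bool {
	for i := 0; i < len(items); i++ {
		for j := i + 1; j < len(items); j++ {
			if items[i] == items[j] {
				return true
			}
		}
	}
	return false
}
```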

### Exponential - O(2^n)

Exponential time complexity typically arises when a recursive algorithm branches into two or more recursive calls per input. Example:

* Basic [Recursive](./recursion/README.md) implementation of Fibonacci
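
A minimal sketch of that naive recursion: each call fans out into two further calls, so the size of the call tree roughly doubles with every increment of n:

```go
// fibonacci computes the n-th Fibonacci number with plain recursion.
// Each call spawns two more calls, giving O(2^n) calls in total.
func fibonacci(n int) int {
	if n < 2 {
		return n
	}
	return fibonacci(n-1) + fibonacci(n-2)
}
```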
### Factorial - O(n!)
Factorial time complexity represents the most severe time complexity for an algorithm. Understanding the scale of factorials is crucial: even the estimated total number of atoms in the universe, approximately 10^80, is smaller than the factorial of 60. Example (a sketch follows):
* Permutations rehearsal in [backtracking](./backtracking/README.md)
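
A sketch of generating permutations by swapping elements in place and backtracking, assuming a slice of ints; since n items have n! permutations, merely producing the output takes factorial time:

```go
// permutations returns all orderings of nums. Each position is filled
// by swapping in every remaining candidate, recursing, then undoing
// the swap (backtracking).
func permutations(nums []int) [][]int {
	var result [][]int
	var backtrack func(i int)
	backtrack = func(i int) {
		if i == len(nums)-1 {
			perm := make([]int, len(nums))
			copy(perm, nums)
			result = append(result, perm)
			return
		}
		for j := i; j < len(nums); j++ {
			nums[i], nums[j] = nums[j], nums[i]
			backtrack(i + 1)
			nums[i], nums[j] = nums[j], nums[i]
		}
	}
	backtrack(0)
	return result
}
```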

# dnc/README.md

If used inappropriately, DNC algorithms can lead to an exponential number of unnecessary recursive calls, resulting in a time complexity of O(2^n). However, if an appropriate dividing strategy and base case that can be solved directly are identified, DNC algorithms can be very effective, with a time complexity as low as O(Log n) in the case of binary search. As DNC algorithms are recursive in nature, their complexity analysis is analogous to that of [recursive](../recursion) algorithms.

# heap/README.md
A heap must satisfy two conditions:
1. The structure property requires that the heap be a complete binary [tree](../tree), where each level is filled left to right, and all levels except the bottom are full.
2. The heap property requires that the children of a node be larger than or equal to the parent node in a min heap and smaller than or equal to the parent in a max heap, meaning that the root is the minimum in a min heap and the maximum in a max heap.
As a result, if you push elements to the min or max heap and then pop them one by one, you will obtain a list that is sorted in ascending or descending order, respectively. This sorting technique is also an O(n*Log n) algorithm known as heap sort. Although there are other sorting algorithms available, no comparison-based sort is faster than O(n*Log n).
When pushing a new element to a heap, because of the structure property we always add the new element to the first available position on the lowest level of the heap, filling from left to right. Then to maintain the heap property, if the newly inserted element is smaller than its parent in a min heap (larger in a max heap), then we swap it with its parent. We continue swapping the swapped element with its parent until the heap property is achieved.
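
A minimal sketch of that push operation for a slice-backed min heap, illustrative rather than this repository's implementation; the parent of the element at index i lives at index (i-1)/2, and the loop percolates the new element up in at most O(Log n) swaps:

```go
// push appends value to the slice-backed min heap and percolates it up
// until its parent is no larger, restoring the heap property.
func push(heap []int, value int) []int {
	heap = append(heap, value) // first available spot on the lowest level
	i := len(heap) - 1
	for i > 0 {
		parent := (i - 1) / 2
		if heap[parent] <= heap[i] {
			break // heap property restored
		}
		heap[i], heap[parent] = heap[parent], heap[i]
		i = parent
	}
	return heap
}
```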
In Go, the heap implementation is based on slices.
## Complexity
The time complexity of pushing and popping heap elements is O(Log n). On the other hand, initializing a heap, which involves pushing n elements, has a time complexity of O(n*Log n).
The insertion strategy entails percolating the new element up the heap until the correct location is identified. Similarly, the deletion strategy involves percolating down the heap.
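
For reference, here is a minimal min heap built on Go's standard `container/heap` package, which performs the percolation described above; the `minHeap` type is illustrative:

```go
package main

import (
	"container/heap"
	"fmt"
)

// minHeap implements heap.Interface on top of a slice.
type minHeap []int

func (h minHeap) Len() int           { return len(h) }
func (h minHeap) Less(i, j int) bool { return h[i] < h[j] }
func (h minHeap) Swap(i, j int)      { h[i], h[j] = h[j], h[i] }
func (h *minHeap) Push(x any)        { *h = append(*h, x.(int)) }
func (h *minHeap) Pop() any {
	old := *h
	n := len(old)
	x := old[n-1]
	*h = old[:n-1]
	return x
}

func main() {
	h := &minHeap{5, 2, 8}
	heap.Init(h)              // establish the heap invariants
	heap.Push(h, 1)           // percolates up, O(Log n)
	fmt.Println(heap.Pop(h))  // 1; percolates down, O(Log n)
}
```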

# recursion/README.md
Given n, the number of steps, return in how many ways you can climb these stairs.
Given x and n, return x raised to the power of n in an efficient manner. [Solution](exponentiation.go) [Test](exponentiation_test.go)
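
A minimal sketch of one efficient approach, exponentiation by squaring for a non-negative integer exponent; n is halved at each step, so only O(Log n) multiplications are needed (the repository's [Solution](exponentiation.go) may differ):

```go
// power computes x raised to n (n >= 0) by repeated squaring: at each
// step the exponent is halved and the base is squared, multiplying the
// result in whenever the current exponent bit is set.
func power(x, n int) int {
	result := 1
	for n > 0 {
		if n%2 == 1 {
			result *= x
		}
		x *= x
		n /= 2
	}
	return result
}
```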
### Regular Expressions Matching
Given an input and a regular expression pattern where `.` denotes any character and `*` denotes zero or more of the preceding character, write a recursive function that returns true if the input matches the pattern and false otherwise. [Solution](regular_expressions.go) [Test](regular_expressions_test.go)
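
A compact sketch of one possible recursive matcher (the repository's [Solution](regular_expressions.go) may differ):

```go
// match reports whether input matches pattern, where '.' matches any
// single character and '*' matches zero or more of the preceding one.
func match(input, pattern string) bool {
	if pattern == "" {
		return input == ""
	}
	// Does the first character of the input match the first pattern item?
	first := input != "" && (pattern[0] == '.' || pattern[0] == input[0])
	if len(pattern) >= 2 && pattern[1] == '*' {
		// Either skip "x*" entirely, or consume one matching character
		// and keep the "x*" pattern for further repetitions.
		return match(input, pattern[2:]) || (first && match(input[1:], pattern))
	}
	return first && match(input[1:], pattern[1:])
}
```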

# tree/README.md
There are many types of trees. Some important tree types include:
A Binary Search Tree (BST) is a type of sorted tree where, for every node n, the values of all nodes in its left subtree are less than n and the values of all nodes in its right subtree are greater than n.
Performing an In-Order traversal of a binary search tree and outputting each visited node results in a sorted (In-Order) list of nodes. This is known as the tree sort algorithm, which has a time complexity of O(n*Log n). While there are other sorting algorithms available, no comparison-based sort is more efficient than O(n*Log n).
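
A minimal sketch of the In-Order traversal behind tree sort, using a hypothetical `node` type with left and right child pointers:

```go
// node is an illustrative BST node type.
type node struct {
	value       int
	left, right *node
}

// inOrder appends the values of the tree rooted at n in sorted order:
// left subtree first, then the node itself, then the right subtree.
// Visiting every node once makes the traversal itself O(n); building
// the BST beforehand is what costs O(n*Log n) in tree sort.
func inOrder(n *node, out *[]int) {
	if n == nil {
		return
	}
	inOrder(n.left, out)
	*out = append(*out, n.value)
	inOrder(n.right, out)
}
```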
### BST Complexity
The time complexity of operations such as Search, Deletion, Insertion, and finding the minimum and maximum values in a binary search tree is O(h), where h represents the height of the tree. In an unbalanced BST, h can grow to n, which is what height-balanced variants such as the AVL tree below prevent.
## AVL - Height Balanced BST
A height balanced binary search tree has a height of O(Log n); the heights of the left and right subtrees of every node differ by at most one.
In order to maintain balance after an insertion, a single rotation is needed if the insertion was on the outer side, either left-left or right-right, while a double rotation is required if the insertion was on the inner side, either left-right or right-left.
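
A minimal sketch of the single right rotation used for the left-left (outer) case, reusing the hypothetical `node` type from the traversal sketch above; a double rotation is simply two single rotations composed:

```go
// rightRotate rebalances the subtree rooted at y after an insertion
// into its left-left subtree: y's left child x becomes the new root,
// x's right subtree is reattached as y's left subtree, and y becomes
// x's right child. The BST ordering is preserved throughout.
func rightRotate(y *node) *node {
	x := y.left
	y.left = x.right
	x.right = y
	return x
}
```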
### AVL Complexity
Same as a Binary Search Tree, except that the height of the tree is guaranteed to be O(Log n). So Search, Deletion, Insertion, and finding Min and Max in an AVL tree are all O(Log n) operations.
## Trie
Insertion and Search are done in O(K), where K is the length of the word.
## Application
Trees, such as Binary Search Trees (BSTs), can offer a time complexity of O(Log n) for searches, as opposed to the linear access time of linked lists. Trees are widely employed in search systems, and operating systems can represent file information using tree structures.