Big-O | Name | Description
------| ---- | -----------
**O(n^3)** | cubic | **Poor performance.** If you have 100 items, this does 100^3 = 1,000,000 units of work. Doubling the input size makes it eight times slower. Example: matrix multiplication.
**O(2^n)** | exponential | **Very poor performance.** You want to avoid these kinds of algorithms, but sometimes you have no choice. Adding just one bit to the input doubles the running time. Example: traveling salesperson problem.
**O(n!)** | factorial | **Intolerably slow.** It literally takes a million years to do anything.
Below are some examples for each category of performance:
**O(1)**
The most common example with O(1) complexity is accessing an array index.
```swift
let value = array[5]
```
Another example of O(1) is pushing onto and popping from a stack.
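As a sketch, a minimal array-backed stack might look like the following (the `Stack` type and its method names here are illustrative, not from a particular library):

```swift
// A minimal array-backed stack. Appending to and removing from the *end*
// of a Swift Array are O(1) operations (push is amortized O(1) because
// the array occasionally resizes its storage).
struct Stack<Element> {
    private var elements: [Element] = []

    mutating func push(_ element: Element) {
        elements.append(element)    // amortized O(1)
    }

    mutating func pop() -> Element? {
        return elements.popLast()   // O(1); nil when the stack is empty
    }
}
```

Note that both operations touch only the top of the stack, so their cost does not grow with the number of stored elements.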
**O(log n)**
```swift
var j = 1
while j < n {
    // do constant time stuff
    j *= 2
}
```
Instead of simply incrementing, `j` is doubled on each iteration, so the loop body runs only about log2(n) times before `j` reaches `n`.
Binary Search Algorithm is an example of O(log n) complexity.
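To make the O(log n) behavior concrete, here is one possible binary search sketch over a sorted array (the function name and signature are illustrative):

```swift
// Binary search over a sorted array: the search range is halved on every
// iteration, so at most about log2(n) comparisons are made.
func binarySearch(_ array: [Int], for key: Int) -> Int? {
    var low = 0
    var high = array.count
    while low < high {
        let mid = low + (high - low) / 2
        if array[mid] == key {
            return mid            // found the key at index mid
        } else if array[mid] < key {
            low = mid + 1         // discard the lower half
        } else {
            high = mid            // discard the upper half
        }
    }
    return nil                    // key is not present
}
```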
**O(n)**
```swift
for i in stride(from: 0, to: n, by: 1) {
    print(array[i])
}
```
Array Traversal and Linear Search are examples of O(n) complexity.
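A linear search can be sketched as follows (names are illustrative); in the worst case it examines every one of the **n** elements:

```swift
// Linear search: scans the array front to back, so the worst case
// (key absent, or in the last slot) costs n comparisons.
func linearSearch(_ array: [Int], for key: Int) -> Int? {
    for i in 0..<array.count {
        if array[i] == key {
            return i    // found; stopped after i + 1 comparisons
        }
    }
    return nil          // looked at all n elements without a match
}
```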
**O(n log n)**
```swift
for i in stride(from: 0, to: n, by: 1) {
    var j = 1
    while j < n {
        j *= 2
        // do constant time stuff
    }
}
```
OR
```swift
for i in stride(from: 0, to: n, by: 1) {
    func index(after i: Int) -> Int? { // multiplies `i` by 2 until `i` >= `n`
        return i < n ? i * 2 : nil
    }
    for j in sequence(first: 1, next: index(after:)) {
        // do constant time stuff
    }
}
```
Merge Sort and Heap Sort are examples of O(n log n) complexity.
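As a sketch of why merge sort is O(n log n): the array is halved about log2(n) times, and each level of recursion does O(n) work merging. A minimal (non-in-place) version might look like this; names and the top-down structure are illustrative:

```swift
// Top-down merge sort: split in half, sort each half recursively,
// then merge the two sorted halves in linear time.
func mergeSort(_ array: [Int]) -> [Int] {
    guard array.count > 1 else { return array }
    let mid = array.count / 2
    let left = mergeSort(Array(array[..<mid]))
    let right = mergeSort(Array(array[mid...]))

    // Merge step: O(n) for the n elements at this level.
    var merged: [Int] = []
    var i = 0, j = 0
    while i < left.count && j < right.count {
        if left[i] <= right[j] {
            merged.append(left[i]); i += 1
        } else {
            merged.append(right[j]); j += 1
        }
    }
    merged += left[i...]    // at most one of these
    merged += right[j...]   // two slices is non-empty
    return merged
}
```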
**O(n^2)**
```swift
for i in 0..<n {
    for j in 0..<n {
        // do constant time stuff
    }
}
```
Traversing a simple 2-D array and Bubble Sort are examples of O(n^2) complexity.
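A bubble sort can be sketched as below (the function name is illustrative): the two nested loops, each bounded by **n**, are exactly the pattern that gives O(n^2):

```swift
// Bubble sort: repeatedly sweeps the array, swapping adjacent elements
// that are out of order. Two nested loops bounded by n => O(n^2).
func bubbleSort(_ array: [Int]) -> [Int] {
    var a = array
    for i in 0..<a.count {
        // After pass i, the last i elements are already in place.
        for j in 0..<a.count - i - 1 where a[j] > a[j + 1] {
            a.swapAt(j, j + 1)   // bubble the larger element upward
        }
    }
    return a
}
```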
**O(n^3)**
```swift
for i in 0..<n {
    for j in 0..<n {
        for k in 0..<n {
            // do constant time stuff
        }
    }
}
```
**O(2^n)**
Algorithms with running time O(2^N) are often recursive algorithms that solve a problem of size N by recursively solving two smaller problems of size N-1.
The following example prints all the moves necessary to solve the famous "Towers of Hanoi" problem for N disks.
```swift
func solveHanoi(n: Int, from: String, to: String, spare: String) {
    guard n >= 1 else { return }
    solveHanoi(n: n - 1, from: from, to: spare, spare: to)  // move n-1 disks out of the way
    print("Move disk \(n) from \(from) to \(to)")           // move the largest disk
    solveHanoi(n: n - 1, from: spare, to: to, spare: from)  // move the n-1 disks back on top
}
```

Each call of size n makes two recursive calls of size n-1, so the running time satisfies T(n) = 2T(n-1) + O(1), which is O(2^n).
**O(n!)**
The most trivial example of a function that takes O(n!) time is given below.
```swift
func nFactFunc(_ n: Int) {
    for _ in 0..<n {
        nFactFunc(n - 1)   // n recursive calls, each of which does (n-1)! work
    }
}
```
Often you don't need math to figure out the Big-O of an algorithm; you can simply use your intuition. If your code uses a single loop that looks at all **n** elements of your input, the algorithm is **O(n)**. If the code has two nested loops, it is **O(n^2)**. Three nested loops gives **O(n^3)**, and so on.
Note that Big-O notation is an estimate and is only really useful for large values of **n**. For example, the worst-case running time for the [insertion sort](Insertion%20Sort/) algorithm is **O(n^2)**. In theory that is worse than the running time for [merge sort](Merge%20Sort/), which is **O(n log n)**. But for small amounts of data, insertion sort is actually faster, especially if the array is partially sorted already!