Commit cc338d6

Merge pull request #656 from artkirillov/master
Fix nonconventional parameter naming in Big-O notation article
2 parents 2e5ad76 + d38c428

File tree

1 file changed: +30 -30 lines

Big-O Notation.markdown (+30 -30)
````diff
@@ -16,21 +16,21 @@ Big-O | Name | Description
 **O(n^3)** | cubic | **Poor performance.** If you have 100 items, this does 100^3 = 1,000,000 units of work. Doubling the input size makes it eight times slower. Example: matrix multiplication.
 **O(2^n)** | exponential | **Very poor performance.** You want to avoid these kinds of algorithms, but sometimes you have no choice. Adding just one bit to the input doubles the running time. Example: traveling salesperson problem.
 **O(n!)** | factorial | **Intolerably slow.** It literally takes a million years to do anything.
-
+
 
 Below are some examples for each category of performance:
 
 **O(1)**
 
 The most common example with O(1) complexity is accessing an array index.
-
+
 ```swift
 let value = array[5]
 ```
-
+
 Another example of O(1) is pushing and popping from Stack.
-
-
+
+
 **O(log n)**
 
 ```swift
````
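
The stack push/pop mentioned in this hunk has no accompanying code in the file. As a side reference, a minimal array-backed sketch (the `Stack` type here is illustrative, not taken from the repository) shows why both operations are amortized O(1):

```swift
// Illustrative sketch: an array-backed stack.
// Swift's Array gives amortized O(1) append and O(1) popLast.
struct Stack<Element> {
    private var storage: [Element] = []

    mutating func push(_ element: Element) {
        storage.append(element)   // amortized O(1)
    }

    mutating func pop() -> Element? {
        storage.popLast()         // O(1); returns nil when empty
    }
}
```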
````diff
@@ -40,23 +40,23 @@ Below are some examples for each category of performance:
     j *= 2
 }
 ```
-
+
 Instead of simply incrementing, 'j' is increased by 2 times itself in each run.
-
+
 Binary Search Algorithm is an example of O(log n) complexity.
-
-
+
+
 **O(n)**
 
 ```swift
 for i in stride(from: 0, to: n, by: 1) {
     print(array[i])
 }
 ```
-
+
 Array Traversal and Linear Search are examples of O(n) complexity.
-
-
+
+
 **O(n log n)**
 
 ```swift
````
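
For the Binary Search reference in this hunk, a minimal sketch of the idea (illustrative only, not code from this file): the search range over a sorted array is halved on every iteration, so at most log n probes are needed.

```swift
// Binary search over a sorted array: each step halves the range, so O(log n).
func binarySearch(_ a: [Int], for key: Int) -> Int? {
    var low = 0
    var high = a.count - 1
    while low <= high {
        let mid = low + (high - low) / 2   // midpoint without Int overflow
        if a[mid] == key { return mid }
        if a[mid] < key {
            low = mid + 1                  // key can only be in the upper half
        } else {
            high = mid - 1                 // key can only be in the lower half
        }
    }
    return nil                             // key is not present
}
```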
````diff
@@ -68,23 +68,23 @@ Below are some examples for each category of performance:
     }
 }
 ```
-
+
 OR
-
+
 ```swift
 for i in stride(from: 0, to: n, by: 1) {
     func index(after i: Int) -> Int? { // multiplies `i` by 2 until `i` >= `n`
-        return i < n ? i * 2 : nil 
+        return i < n ? i * 2 : nil
     }
     for j in sequence(first: 1, next: index(after:)) {
         // do constant time stuff
     }
 }
 ```
-
+
 Merge Sort and Heap Sort are examples of O(n log n) complexity.
-
-
+
+
 **O(n^2)**
 
 ```swift
````
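
Merge Sort, cited in this hunk, is likewise not shown in the file. A compact sketch (illustrative, not the repository's implementation) makes the O(n log n) bound visible: the input is halved log n times, and each level of recursion does O(n) merge work.

```swift
// Merge sort sketch: log n levels of splitting, O(n) merge work per level.
func mergeSort(_ a: [Int]) -> [Int] {
    guard a.count > 1 else { return a }
    let mid = a.count / 2
    let left = mergeSort(Array(a[..<mid]))
    let right = mergeSort(Array(a[mid...]))

    // Merge the two sorted halves in a single O(n) pass.
    var result: [Int] = []
    result.reserveCapacity(a.count)
    var i = 0, j = 0
    while i < left.count && j < right.count {
        if left[i] <= right[j] { result.append(left[i]); i += 1 }
        else { result.append(right[j]); j += 1 }
    }
    result.append(contentsOf: left[i...])   // whichever half has leftovers
    result.append(contentsOf: right[j...])
    return result
}
```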
````diff
@@ -94,10 +94,10 @@ Below are some examples for each category of performance:
     }
 }
 ```
-
+
 Traversing a simple 2-D array and Bubble Sort are examples of O(n^2) complexity.
-
-
+
+
 **O(n^3)**
 
 ```swift
````
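
Bubble Sort, mentioned alongside the 2-D traversal, follows exactly the two-nested-loops pattern of the snippet in this hunk; a minimal sketch (illustrative only):

```swift
// Bubble sort sketch: two nested passes over the array give O(n^2) comparisons.
func bubbleSort(_ a: inout [Int]) {
    for pass in 0..<a.count {
        // After each pass the largest remaining element has bubbled to the end.
        for j in 1..<(a.count - pass) {
            if a[j - 1] > a[j] { a.swapAt(j - 1, j) }
        }
    }
}
```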
````diff
@@ -109,24 +109,24 @@ Below are some examples for each category of performance:
     }
 }
 ```
-
+
 **O(2^n)**
 
 Algorithms with running time O(2^N) are often recursive algorithms that solve a problem of size N by recursively solving two smaller problems of size N-1.
 The following example prints all the moves necessary to solve the famous "Towers of Hanoi" problem for N disks.
 
 ```swift
-func solveHanoi(N: Int, from: String, to: String, spare: String) {
+func solveHanoi(n: Int, from: String, to: String, spare: String) {
     guard n >= 1 else { return }
-    if N > 1 {
-        solveHanoi(N: N - 1, from: from, to: spare, spare: to)
+    if n > 1 {
+        solveHanoi(n: n - 1, from: from, to: spare, spare: to)
     } else {
-        solveHanoi(N: N-1, from: spare, to: to, spare: from)
+        solveHanoi(n: n - 1, from: spare, to: to, spare: from)
    }
 }
 ```
-
-
+
+
 **O(n!)**
 
 The most trivial example of function that takes O(n!) time is given below.
````
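
One editorial observation on this hunk: the rename from `N` to `n` is purely cosmetic, and the example still prints nothing even though the surrounding text says it prints every move. A conventional recursive solution, offered only as a sketch and not as part of this commit, would be:

```swift
// Editorial sketch (not part of this commit): a conventional recursive Hanoi
// that prints each move. Moving n disks takes 2^n - 1 moves, hence O(2^n).
func solveHanoi(n: Int, from: String, to: String, spare: String) {
    guard n >= 1 else { return }
    solveHanoi(n: n - 1, from: from, to: spare, spare: to)   // move n-1 disks aside
    print("Move disk \(n) from \(from) to \(to)")            // move the largest disk
    solveHanoi(n: n - 1, from: spare, to: to, spare: from)   // stack n-1 disks on top
}
```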
````diff
@@ -137,8 +137,8 @@ Below are some examples for each category of performance:
         nFactFunc(n - 1)
     }
 }
-```
-
+```
+
 Often you don't need math to figure out what the Big-O of an algorithm is but you can simply use your intuition. If your code uses a single loop that looks at all **n** elements of your input, the algorithm is **O(n)**. If the code has two nested loops, it is **O(n^2)**. Three nested loops gives **O(n^3)**, and so on.
 
 Note that Big-O notation is an estimate and is only really useful for large values of **n**. For example, the worst-case running time for the [insertion sort](Insertion%20Sort/) algorithm is **O(n^2)**. In theory that is worse than the running time for [merge sort](Merge%20Sort/), which is **O(n log n)**. But for small amounts of data, insertion sort is actually faster, especially if the array is partially sorted already!
````
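
Since the O(n!) function is only partially visible in the last hunk, a self-contained illustration of factorial growth (an editorial sketch, not from this file) is enumerating every permutation of a list:

```swift
// Editorial sketch: printing every permutation of `items` touches all n!
// orderings, so the running time grows as O(n!).
func permutations(_ items: [Int], _ current: [Int] = []) {
    guard !items.isEmpty else {
        print(current)                       // one complete ordering
        return
    }
    for (index, item) in items.enumerated() {
        var rest = items
        rest.remove(at: index)               // fix `item` in this position
        permutations(rest, current + [item]) // permute the remaining n-1 items
    }
}
```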
