Naive Bayes Classifier/NaiveBayes.playground/Contents.swift

import Foundation
/*:
## Naive Bayes Classifier

This playground demonstrates the Naive Bayes algorithm on some example datasets.

### Gaussian Naive Bayes

- Note:
When using Gaussian NB, your features have to be continuous (`Double`).

For this example we are going to use a famous dataset with different types of wine. The labels of the features can be viewed [here](https://gist.github.com/tijptjik/9408623).
I can assure you that ***class 1*** is the correct result, and as you can see the classifier thinks it is ***99.99%*** likely too.
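
The playground's own classifier type isn't shown in this excerpt, but a minimal sketch of what Gaussian NB computes under the hood looks like this (the function names here are hypothetical, not the playground's actual API):

```swift
import Foundation

// Gaussian probability density: how likely a continuous value x is
// under a feature's per-class mean and standard deviation.
func gaussianDensity(x: Double, mean: Double, stdDev: Double) -> Double {
    let exponent = -pow(x - mean, 2) / (2 * stdDev * stdDev)
    return exp(exponent) / (stdDev * sqrt(2 * .pi))
}

// For each class, Gaussian NB combines the densities of all features
// with the class prior and picks the class with the highest score.
func classify(sample: [Double],
              means: [[Double]], stdDevs: [[Double]], priors: [Double]) -> Int {
    var bestClass = 0
    var bestScore = -Double.infinity
    for c in 0..<priors.count {
        // Sum log-probabilities instead of multiplying raw densities,
        // which avoids floating-point underflow on many features.
        var score = log(priors[c])
        for (i, x) in sample.enumerated() {
            score += log(gaussianDensity(x: x, mean: means[c][i], stdDev: stdDevs[c][i]))
        }
        if score > bestScore {
            bestScore = score
            bestClass = c
        }
    }
    return bestClass
}
```

Training Gaussian NB is just estimating `means`, `stdDevs`, and `priors` per class from the wine dataset; prediction is the argmax above.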

### Multinomial Naive Bayes

- Note:
When using Multinomial NB, your features have to be categorical (`Int`).

This dataset is commonly used to illustrate the classification problem, and it is categorical: instead of real values you only have categorical data, as stated before. The structure of this dataset is as follows.

Outlook,Temperature,Humidity,Windy

***Outlook***: 0 = rainy, 1 = overcast, 2 = sunny

***Temperature***: 0 = hot, 1 = mild, 2 = cool

***Humidity***: 0 = high, 1 = normal

***Windy***: 0 = false, 1 = true

The classes state whether he will play golf or not, depending on the weather conditions (0 = won't play, 1 = will play).
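
A dataset with this layout can be classified by counting how often each category co-occurs with each class. The sketch below (hypothetical names, not the playground's API) shows that idea with Laplace smoothing, so an unseen category never zeroes out a class's score:

```swift
import Foundation

// Categorical NB over integer-coded features like the golf dataset above:
// each row of `data` is one day's (Outlook, Temperature, Humidity, Windy),
// `labels` holds 0 = won't play, 1 = will play.
func predict(sample: [Int], data: [[Int]], labels: [Int], classCount: Int) -> Int {
    let n = Double(labels.count)
    var bestClass = 0
    var bestScore = -Double.infinity
    for c in 0..<classCount {
        let rows = data.indices.filter { labels[$0] == c }
        // Class prior P(c) = fraction of samples with this label.
        var score = log(Double(rows.count) / n)
        for (f, value) in sample.enumerated() {
            // P(feature f = value | c) with +1 Laplace smoothing.
            let matches = rows.filter { data[$0][f] == value }.count
            let categories = Set(data.map { $0[f] }).count
            score += log(Double(matches + 1) / Double(rows.count + categories))
        }
        if score > bestScore {
            bestScore = score
            bestClass = c
        }
    }
    return bestClass
}
```

The "naive" assumption is visible in the inner loop: each feature's probability is estimated independently of the others, given the class.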