Naive Bayes Classifier/NaiveBayes.playground/Contents.swift (+48 −3)
@@ -7,7 +7,7 @@ import Foundation
 ### Gaussian Naive Bayes

 - Note:
-When using Gaussian NB you have to have continuous features (Double).
+When using Gaussian NB you have to have continuous features (Double).

 For this example we are going to use a famous dataset with different types of wine. The labels of the features can be viewed [here](https://gist.github.com/tijptjik/9408623)
 */
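The Gaussian part of the classifier boils down to estimating a per-class mean and variance for each continuous feature and scoring samples with the normal density. The following is a minimal sketch of that idea, not the playground's actual `NaiveBayes` implementation; every name and the toy data here are illustrative, and uniform class priors are assumed for brevity.

```swift
import Foundation

// Hypothetical sketch of Gaussian Naive Bayes; names and toy data
// are illustrative, not the playground's actual API.

func mean(_ xs: [Double]) -> Double {
    return xs.reduce(0, +) / Double(xs.count)
}

// Sample variance (n - 1 denominator).
func variance(_ xs: [Double]) -> Double {
    let m = mean(xs)
    return xs.map { ($0 - m) * ($0 - m) }.reduce(0, +) / Double(xs.count - 1)
}

// Normal density N(x | mean, variance): the per-feature likelihood.
func gaussianDensity(_ x: Double, mean m: Double, variance v: Double) -> Double {
    return exp(-(x - m) * (x - m) / (2 * v)) / sqrt(2 * Double.pi * v)
}

// Per-feature (mean, variance) pairs estimated from one class's rows.
func featureStats(_ rows: [[Double]]) -> [(mean: Double, variance: Double)] {
    return (0..<rows[0].count).map { i in
        let column = rows.map { $0[i] }
        return (mean(column), variance(column))
    }
}

// Pick the class whose product of per-feature likelihoods is largest
// (uniform priors assumed for brevity).
func classify(sample: [Double],
              stats: [Int: [(mean: Double, variance: Double)]]) -> Int {
    var best = (label: -1, score: -Double.infinity)
    for (label, perFeature) in stats {
        var score = 1.0
        for (x, s) in zip(sample, perFeature) {
            score *= gaussianDensity(x, mean: s.mean, variance: s.variance)
        }
        if score > best.score { best = (label, score) }
    }
    return best.label
}

// Two tiny single-feature classes, deliberately far apart.
let model = [0: featureStats([[1.0], [1.2], [0.9]]),
             1: featureStats([[5.0], [5.2], [4.9]])]
```

On the wine data the same mechanics apply, just with thirteen feature columns instead of one.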
@@ -52,9 +52,54 @@ let data = wineData.map { row in
-I can assure you that this is the correct result and as you can see the classifier thinks that its ***99.99%*** correct too.
+I can assure you that ***class 1*** is the correct result and as you can see the classifier thinks it's ***99.99%*** likely too.

 ### Multinomial Naive Bayes
+
+ - Note:
+When using Multinomial NB you have to have categorical features (Int).
+
+This dataset is commonly used to describe the classification problem, and it is categorical, which means you don't have real values, just categorical data, as stated before. The structure of this dataset is as follows:
+
+Outlook,Temperature,Humidity,Windy
+
+***Outlook***: 0 = rainy, 1 = overcast, 2 = sunny
+
+***Temperature***: 0 = hot, 1 = mild, 2 = cool
+
+***Humidity***: 0 = high, 1 = normal
+
+***Windy***: 0 = false, 1 = true
+
+The classes are whether he will play golf or not, depending on the weather conditions (0 = won't play, 1 = will play).
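For categorical data like this, the classifier reduces to counting: estimate P(class) and, for each feature, the Laplace-smoothed frequency of each category value within that class, then pick the class with the highest log score. Below is a toy sketch under the encodings listed above; the rows, labels, and function names are made up for illustration and are not the playground's actual data or API.

```swift
import Foundation

// Hypothetical sketch of Naive Bayes over categorical features,
// using the Outlook/Temperature/Humidity/Windy encodings above.
// Rows and labels are invented toy data, not the real golf dataset.

// Number of possible values per feature (used for Laplace smoothing).
let categoryCounts = [3, 3, 2, 2]

let rows: [[Int]] = [
    [0, 1, 0, 1],   // rainy, mild, high, windy
    [0, 2, 0, 0],   // rainy, cool, high, calm
    [1, 0, 1, 0],   // overcast, hot, normal, calm
    [2, 1, 1, 0],   // sunny, mild, normal, calm
    [1, 2, 1, 1],   // overcast, cool, normal, windy
    [2, 0, 0, 0],   // sunny, hot, high, calm
]
let labels = [0, 0, 1, 1, 1, 1]   // 0 = won't play, 1 = will play

// Score each class with log P(class) + Σ log P(value_i | class),
// using add-one (Laplace) smoothing on the per-feature counts.
func classify(_ sample: [Int]) -> Int {
    var best = (label: 0, score: -Double.infinity)
    for label in Set(labels) {
        let members = zip(rows, labels).filter { $0.1 == label }.map { $0.0 }
        var score = log(Double(members.count) / Double(rows.count))
        for (i, value) in sample.enumerated() {
            let matches = members.filter { $0[i] == value }.count
            score += log(Double(matches + 1) /
                         Double(members.count + categoryCounts[i]))
        }
        if score > best.score { best = (label, score) }
    }
    return best.label
}
```

Working in log space avoids the underflow you would get from multiplying many small probabilities, and the smoothing keeps a single unseen category value from zeroing out an entire class.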