Implementation of Naïve Bayes Classifier Using Python


Transcript

Hello everyone, welcome to the course on machine learning with Python. In this video, we shall learn how to implement Naïve Bayes classification in Python. We shall use the wine dataset, which is available under the library sklearn.datasets. So first we'll be importing the necessary libraries: from sklearn.datasets import load_wine, and import numpy as np. So let's go ahead and run this particular cell.
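
A minimal sketch of the imports described above (the names follow scikit-learn's public API):

from sklearn.datasets import load_wine   # loader for the built-in wine dataset
import numpy as np                        # NumPy, used later to sum the diagonal of the confusion matrix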

Now we'll be loading the dataset: wine = load_wine(). So inside this wine variable, the data of the wine dataset will be loaded. Now we'll be exploring the data. So what are the features? wine.feature_names will contain the names of all the features. There are alcohol, malic acid, ash, alcalinity of ash, magnesium, etc.
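
The loading and feature-exploration step might look like this sketch (wine is the variable name used in the lecture):

wine = load_wine()          # Bunch object holding the data, targets and metadata
print(wine.feature_names)   # ['alcohol', 'malic_acid', 'ash', 'alcalinity_of_ash', 'magnesium', ...]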

And wine.target_names will contain the target names, or the labels. Similarly, we can print the description of the entire dataset just by using the command print(wine.DESCR), where DESCR stands for description. So there are a total of 178 instances across the three classes, the number of attributes is 13 numeric, predictive attributes plus the class, and the attribute information lists alcohol, malic acid, ash, alcalinity of ash, magnesium, total phenols, etc. There are a total of three classes: class_0, class_1 and class_2. It also prints the summary statistics, and there are no missing attribute values.
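
The corresponding commands, roughly as described above:

print(wine.target_names)    # ['class_0', 'class_1', 'class_2']
print(wine.DESCR)           # full dataset description: instances, attributes, summary statistics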

Okay, and the original dataset archive is also referenced over here. Now we'll be loading the feature set X and the target variable y: X = wine.data and y = wine.target. So what is the shape of this feature set? It is (178, 13), because it contains 178 instances or samples, and each sample is a vector of dimension 13, which means it contains 13 features. Now we'll split the dataset into training and test sets; here we shall use 75% of the data for training and 25% for testing.
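
A short sketch of this step, using the variable names from the lecture:

X = wine.data      # feature matrix
y = wine.target    # class labels
print(X.shape)     # (178, 13): 178 samples, 13 features each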

So from sklearn.model_selection we'll be importing train_test_split. Using this function we can split the dataset into train and test sets. X_train will contain the feature set of the training data, X_test will contain the feature set of the test data, y_train will contain the labels of the training dataset and y_test will contain the labels of the test dataset. And I have specified test_size=0.25 so that 25% of the data is used as test data. So let's go ahead and run this particular cell. So what is the shape of X_train? It is (133, 13).
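
A minimal sketch of the split; the random_state value is an assumption added here for reproducibility and is not shown in the lecture:

from sklearn.model_selection import train_test_split

# test_size=0.25 keeps 25% of the 178 samples (45 of them) for testing
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)  # random_state assumed
print(X_train.shape)   # (133, 13)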

So out of 178 instances, 133 instances have been picked randomly as the training samples and the remaining 45 instances have been used for testing. Now we'll be creating the model. From sklearn.naive_bayes we'll import GaussianNB. GaussianNB stands for the Gaussian Naive Bayes model. This is basically a class, and gnb is an object of the class GaussianNB. So we run these cells, and on this object we'll be fitting our X_train and y_train. Okay.
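
The model-creation step, sketched with scikit-learn's GaussianNB class (gnb is simply the object name used here):

from sklearn.naive_bayes import GaussianNB

gnb = GaussianNB()          # Gaussian Naive Bayes classifier with default settings
gnb.fit(X_train, y_train)   # estimate per-class means and variances from the training data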

After that, we'll be testing, or predicting with, the model. For this we'll be using the gnb.predict method: we supply the entire test dataset, and we receive the predicted values, or the predicted classes. Now we'll be evaluating the model performance. From sklearn.metrics we'll be importing confusion_matrix. So cm is our confusion matrix, which takes two arguments, y_test and y_pred. Let's see what our confusion matrix looks like.
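
Prediction and the confusion matrix, roughly as described:

from sklearn.metrics import confusion_matrix

y_pred = gnb.predict(X_test)            # predicted class for each test sample
cm = confusion_matrix(y_test, y_pred)   # rows are true classes, columns are predicted classes
print(cm)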

So this is how our confusion matrix looks. corrects will be np.trace(cm), that is, the sum of the diagonal elements of the confusion matrix, while the sum of all the elements gives the total number of predictions. So out of 45 predictions, the total number of correct predictions is 42. What will be the accuracy, then? The accuracy of the model is 93.33%, which is quite high.
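
The accuracy computation follows directly from the confusion matrix; np.trace sums the diagonal (correct predictions) and cm.sum() counts all predictions:

corrects = np.trace(cm)      # correct predictions: sum of the diagonal elements (42 in the lecture)
total = cm.sum()             # total number of predictions (45 test samples)
accuracy = corrects / total  # roughly 0.9333, i.e. 93.33%
print(accuracy)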

Now from sklearn.metrics we'll be importing classification_report, and we'll be printing the classification report, which is a function that takes two arguments, y_test and y_pred. This will print the precision, recall and F1-score of all the classes. So I hope you have enjoyed this video. In the next video we shall discuss decision tree classification. So see you in the next lecture. Thank you.
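
And a sketch of the final evaluation step:

from sklearn.metrics import classification_report

# per-class precision, recall and F1-score, plus overall averages
print(classification_report(y_test, y_pred))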
