Training Function - Part 1

Transcript

Hello everyone. In this video we are going to start building our training function. Since this function is fairly big, we are going to split it into two parts. In this first part, we are going to focus on the training data, and the next one will cover the testing data, testing the model, and saving its parameters. Before we start writing the code, I would like to go through the arguments of the function.

Our first argument is the model itself; we're going to pass an instance of our model class here. The next one is the number of epochs. Then we have the argument drop rate, which is just short for dropout rate. The fourth argument is batch size. This hyperparameter indicates how many samples, in our case images, the model receives at one time. The batch size could be 128, maybe 512, depending on how much RAM you have.

Okay, now we have data, which is arguably the most important argument. This parameter should be a tuple with four parts: x train, y train, x test and y test, in that order. You can see an example right here. The last one is save path, a string. This is the path to a folder where all model checkpoints will be saved. Now that we have all arguments in order, let's create the function itself.
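
For reference, here is a minimal sketch of the signature described so far; the function name train is my own placeholder, since the video only describes the arguments verbally:

    def train(model, epochs, drop_rate, batch_size, data, save_path):
        # model      - an instance of our model class
        # epochs     - number of training epochs
        # drop_rate  - dropout rate
        # batch_size - number of samples (images) the model receives at one time
        # data       - tuple of (x_train, y_train, x_test, y_test), in that order
        # save_path  - folder where model checkpoints will be saved
        ...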

The first thing to do is to unpack all variables from the data argument. Then we have to start our TensorFlow session, so to do that type tf.Session(). After starting it, we need to initialize all of our model variables; without this step, we won't be able to use our model. To do it, just call session.run and pass tf.global_variables_initializer(). The next step is to define our TensorFlow saver: type tf.train.Saver(). This step is very, very important.

As the name suggests, we will use this saver to save or load the trained model. There is only one more variable that we have to initialize, and that is the best test accuracy, which we will set to zero. This variable will help us decide whether we should save the model or not: basically, if the current accuracy is better than the best accuracy so far, the model will be saved. Now that we have everything set up, we can create the training loop itself.
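
Put together, the setup described in the last two paragraphs might look roughly like this; it is a sketch using the TensorFlow 1.x calls named in the video, with variable names of my own choosing:

    # unpack the data tuple
    x_train, y_train, x_test, y_test = data

    # start the TensorFlow session and initialize all model variables;
    # without the initializer we would not be able to use the model
    session = tf.Session()
    session.run(tf.global_variables_initializer())

    # saver used to save (or later load) the trained model
    saver = tf.train.Saver()

    # best test accuracy seen so far; the model is saved only when it improves
    best_test_accuracy = 0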

The loop will iterate through all epochs. For those who are not familiar, one epoch means we go through the whole data set and update the model for each sample. So in our case, one epoch means going through all 50,000 images, updating the model parameters, and then doing it again. For basic logging purposes, define two lists: one to log training accuracy and another one for training loss. Inside this main loop, define another for loop that will iterate through the whole training data set: write range and provide the length of x train floor-divided by the batch size. Inside this second for loop, type start ID equals ii times batch size, and define end ID equals start ID plus batch size. Before we go any further, let's explain this part of the code, because it could be confusing.
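
As a sketch, the loop structure just described could look like this (the list names are my own):

    for epoch in range(epochs):
        # basic logging: one list for training accuracy, one for training loss
        train_accuracy = []
        train_loss = []

        # iterate through the whole training set, one batch at a time
        for ii in range(len(x_train) // batch_size):
            start_id = ii * batch_size
            end_id = start_id + batch_size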

For this example, set the length of the training set to 100 and the batch size to nine. In this for loop, set the range to the length of x train floor-divided by the batch size. Inside this loop, set start ID equal to ii times batch size, the same as we did before, and end ID equal to start ID plus batch size. Let's print both of them to see what the values are, and execute the cell. And here we go.
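
In a separate cell, the toy example looks roughly like this; with a training-set length of 100 and a batch size of 9 it prints the pairs 0 9, 9 18, 18 27, and so on:

    length_x_train = 100  # stand-in for len(x_train) in this toy example
    batch_size = 9

    for ii in range(length_x_train // batch_size):
        start_id = ii * batch_size
        end_id = start_id + batch_size
        print(start_id, end_id)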

You can see here the start ID and end ID through which we will iterate over the whole data set, and each one increases by the batch size. So in our example we have 0 and 9, 9 and 18, and so on, until it comes to the end of the data set, which in this case is 100. The same operation happens in the training loop as well, but there we iterate through the whole training set. Now that we have made that clear, let's use these IDs to create our batch of data. First, we are going to define x batch equal to x train, and then we are going to use start ID, colon, end ID.

This will take only those samples between these two indices. And for our labels, we have y batch equal to y train with the same IDs. Now that we have a batch of data prepared for feeding to the model, we need to define the feed dict, or feed dictionary. The first key will be the model inputs, which takes the value of x batch. Then we have the model targets as a key, with the value of y batch. And lastly, we have to provide the dropout rate, which will take the value of our argument drop rate.
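
A sketch of the batch creation and the feed dictionary; model.inputs and model.targets are the placeholder attributes referred to in the video, while the name of the dropout placeholder (model.dropout_rate here) is an assumption:

    # take only the samples between start_id and end_id
    x_batch = x_train[start_id:end_id]
    y_batch = y_train[start_id:end_id]

    feed_dict = {model.inputs: x_batch,
                 model.targets: y_batch,
                 model.dropout_rate: drop_rate}  # dropout placeholder name assumed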

Let's optimize our model using the batch data. The run will return the optimizer, the loss and the predictions for the batch of data. For the optimizer, just write an underscore; the loss we will call loss, short for training loss; and lastly preds t for training predictions. To run the optimization step, type session.run and, in brackets, provide the first argument: the list of network variables that we want to fetch. Because we will fetch more than one variable, we need to provide them as a list. The first one is model.opt, our optimizer.

Then we have model.loss, and lastly model.predictions. The second argument of this function is to actually provide the data, in the form of our feed dictionary, so set feed dict equal to feed dict. And that's it. Now that we have all the results, let's log them: append the result of the sparse accuracy function to our training accuracy list. You may remember that this function takes two arguments: the true labels, in this case y batch, and the predictions, which are the result of our run call, preds t. Now we have to log one more thing, the training loss, which we do by simply appending the loss to our training loss list, just to keep everything simple and to stay aware of the training process.
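
The optimization and logging step could then look like this; sparse_accuracy stands for the accuracy helper defined earlier in the course, whose exact name is not repeated here, so treat it as an assumption:

    # fetch the optimizer, the loss and the predictions in a single run call
    _, loss, preds_t = session.run([model.opt, model.loss, model.predictions],
                                   feed_dict=feed_dict)

    # log training accuracy (true labels vs. batch predictions) and training loss
    train_accuracy.append(sparse_accuracy(y_batch, preds_t))
    train_loss.append(loss)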

Let's write a few simple print statements. I'll cut the video here because it is basically just formatting a string. Okay, here it is; you can copy my formatting method (there is a sketch of one possible version below), or you can just come up with your own. Let's stop the video right here and finish it in the next one. If you have any questions or comments so far, just post them in the comment section. Otherwise, I'll see you in the next tutorial.
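
The exact format string is not shown in the transcript, so this is just one possible way to print the per-epoch summary (it assumes NumPy is imported as np to average the logged lists):

    # hypothetical per-epoch summary; adjust the format to your own taste
    print('Epoch: {}/{} | Training loss: {:.4f} | Training accuracy: {:.4f}'.format(
        epoch + 1, epochs, np.mean(train_loss), np.mean(train_accuracy)))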
