APPLICATION - Handling Website Pagination When Extracting Data

Python 3: Automating Your Job Tasks Superhero Level: Automate Web Scraping with Python 3
6 minutes


Okay, great to see you in this lecture, where we are going to upgrade the application we built in the previous video. I'm going to put this new code in a separate file, web_scraper_pagination.py, in the same folder, and I'm only going to highlight the differences compared to the code you've seen earlier. Basically, this new application is going to perform the exact same tasks, meaning extracting the name, link and price of each of the 21 products from a given test website and saving them to an Excel spreadsheet. The new thing here is that this time we don't have all the products listed on a single page. Instead, the website has pagination enabled, meaning there are multiple pages, in this case four of them, that contain our products, and our web scraper should be able to handle such a scenario.

For the purpose of this lecture, I'm going to use another link from the same website. This is the link right here; you can find it attached to this video, as well as in the notebook that follows. Before seeing and testing the code for our new application version, let's take a look at how the information is structured. This time, we have the first six products listed on page one, the next six products on page two, yet another batch of six tablets on page three, and finally the last three products on page four. This means that our application will have to automatically iterate over all four pages and extract the product information from each page, as we already did in the previous video.

To enable this iterating behavior, let's try to find an identifier in the link of each page that uniquely references that particular page. Going back to the original link, as you can see on the screen right now, we don't see any specific identifier. However, as soon as you click on a page number, say page number two, the link changes accordingly and this text gets appended to the initial link: ?page=2. The same happens for page three and page four as well.

So the first thing we should do is define the common part of this link in our code. That is this part right here, up to and including the equal sign. That's exactly what I did in the application, using a variable called link, after importing the modules we need, of course. Next, I created an empty list, the one right here called products, that will eventually hold all the products extracted from all the pages. Now, in order to extract all the products listed on each of the four pages, we need to use the unique link of each page inside the parentheses of the requests.get() method.

Therefore, we have to iterate over the four pages using a basic for loop and the range() function, as you can see right here, where range(1, 5) yields 1, 2, 3 and 4, which are our page numbers. So for each page in this range, we get that page by concatenating the string referenced by the link variable with the corresponding page number, which is, by the way, converted from integer to string; otherwise we wouldn't be able to compose the necessary link. Then we pass the string obtained from the concatenation to the get() method from the requests module. Next, as we iterate over each page, we also load and parse the content of that page. Then we have to identify the div tags corresponding to the products listed on that particular page.
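As a rough sketch of this step (the base URL below is a stand-in, not the actual course link), the per-page links could be composed like this:

```python
# Hypothetical base link ending at the equal sign; the real course URL differs.
link = "https://example-test-site.com/computers/tablets?page="

# range(1, 5) yields the page numbers 1, 2, 3 and 4;
# str(page) converts each number so it can be concatenated to the string.
page_links = [link + str(page) for page in range(1, 5)]
print(page_links)
```

Each resulting string is what would be passed to requests.get() inside the loop.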

This is done the same way as in the previous lecture, using the find_all() method and the correct class value. Finally, inside the same for loop, we have to write another for loop that iterates through all the products identified on each page and appends each product to the general list of products up here, using the append() method. This process is performed for each of the four pages. As soon as the list of products is complete, meaning all four pages have been scanned and all 21 products have been saved to the products list, the rest of the code is exactly the same as in the previous video, performing the same tasks: extracting the name, link and price of each product into three different lists, then zipping the lists together into a list of tuples.
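To illustrate the find_all() and append() steps without hitting the network, here is a minimal sketch that parses a hardcoded HTML snippet standing in for one page of the test site; the class names and markup are illustrative assumptions, not the site's real ones, and on the real site the HTML would come from requests.get(link + str(page)).text instead:

```python
from bs4 import BeautifulSoup

# Hypothetical markup mimicking one page of products; real class names differ.
page_html = """
<div class="product-wrapper">
  <h4 class="price">$295.99</h4>
  <h4><a href="/product/1" class="title">Tablet A</a></h4>
</div>
<div class="product-wrapper">
  <h4 class="price">$109.99</h4>
  <h4><a href="/product/2" class="title">Tablet B</a></h4>
</div>
"""

products = []
soup = BeautifulSoup(page_html, "html.parser")
# Identify the div tag of each product and append it to the general list;
# on the real site this runs once per page, inside the pagination loop.
for div in soup.find_all("div", class_="product-wrapper"):
    products.append(div)

# Extract name, link and price from each product into three separate lists.
names = [p.find("a", class_="title").text for p in products]
links = [p.find("a", class_="title")["href"] for p in products]
prices = [p.find("h4", class_="price").text for p in products]
```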

Then we build the pandas DataFrame and finally write the DataFrame to the Excel file. Now it's time to test our new application version to see if the iteration through all four pages is done correctly and if all the data is indeed saved to the Excel file. I'm going to use the Windows cmd again and run the script: python D:\web_scraping\web_scraper_pagination.py.
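The zipping and writing steps could be sketched like this, with a few made-up products standing in for the 21 real ones (the column names and output filename are assumptions):

```python
import pandas as pd

# Hypothetical scraped data standing in for the real product lists.
names = ["Tablet A", "Tablet B", "Tablet C"]
links = ["/product/1", "/product/2", "/product/3"]
prices = ["$295.99", "$109.99", "$69.99"]

# Zip the three lists into a list of (name, link, price) tuples,
# then build the DataFrame from it.
rows = list(zip(names, links, prices))
df = pd.DataFrame(rows, columns=["Name", "Link", "Price"])

# Writing to Excel requires an engine such as openpyxl to be installed.
try:
    df.to_excel("products_pagination.xlsx", index=False)
    print("Web data successfully written to Excel.")
except ImportError:
    print("No Excel engine installed; skipping the write step.")
```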

This is the second Python script in my folder, and we get "Web data successfully written to Excel. Quitting the program." So no exceptions have been raised. Let's check the folder as well, and there's our products_pagination.xlsx file.

Let's open it, and success! We have all 21 products saved to the Excel spreadsheet, along with their names, links and prices. Basically, the result is identical to the one we got in the previous video, only this time we scraped multiple web pages instead of a single page. Feel free to check out the notebook following this video and download the Python script attached to that notebook to save the upgraded version of our web scraping application. I hope you enjoyed this section on web scraping with Python, and I will see you soon. Bye!
