How to Scrape a Website Using PHP cURL in the Laravel Framework

Juang Salaz Prabowo
4 min read · Feb 1, 2020

Hi, today I want to share how to scrape data using PHP cURL in Laravel. In this tutorial I'll work through a case: I'll scrape data from this website https://decisiondata.org/tv-internet-by-zip/01001-internet/ and pull out the city name, the internet provider names, and the percentage for each service, then export everything into an Excel file with a format like this:

Excel file format

OK, these are the steps in this tutorial:

  1. Create a new Laravel project
  2. Install the Laravel Excel library
  3. Install the PHP Simple HTML DOM Parser library
  4. Code the PHP cURL

Create a New Laravel Project

The first step is to install the Laravel framework. In this tutorial I use Laravel 6; we can install it using Composer like this:

composer create-project --prefer-dist laravel/laravel php_scraping

After the installation completes successfully, we have a project structure like this:

Laravel Project Sources

Install Laravel Excel Library

This library is already very popular, and it makes exporting data into an Excel file a very easy step. You can read the docs of this library at https://docs.laravel-excel.com/3.1/getting-started/

I'm using Composer to install it. Open your terminal, go to your project directory, then type:

composer require maatwebsite/excel

Now we can look at our composer.json file, and the library is already listed there:

Laravel Excel library already installed

Install the PHP Simple HTML DOM Parser Library

Next, we need to install the PHP Simple HTML DOM Parser library to help us get the HTML elements and values we need. This library makes it easy to select HTML elements by class or id, just like jQuery.

You can read the docs of this library at https://github.com/Kub-AT/php-simple-html-dom-parser

We can install it using Composer by typing:

composer require kub-at/php-simple-html-dom-parser

and our composer.json file is updated like this:

PHP simple html dom parser library already installed

Let's Code the PHP cURL

Let's start coding the PHP cURL to scrape the website. We start by creating an Exports folder inside app, and then we create a file "DataExport.php" inside the Exports folder. We'll have this structure now:

File structure

Then open the DataExport.php file and create a function to scrape the website data, like this:
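
A minimal sketch of what this part of DataExport.php could look like, using PHP cURL plus the parser's static HtmlDomParser::str_get_html() call. The fetchHtml() and scrapeData() names, the h1 selector for the city, and the "Provider - 99.9%" text format of each list item are assumptions for illustration, not verified against the live page:

<?php

namespace App\Exports;

use KubAT\PhpSimple\HtmlDomParser; // namespace of the kub-at fork installed above

class DataExport
{
    // Fetch the raw HTML of a page with PHP cURL.
    private function fetchHtml($url)
    {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);   // return the response as a string
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);   // follow any redirects
        curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0'); // some sites block the default cURL agent
        $html = curl_exec($ch);
        curl_close($ch);

        return $html;
    }

    // Scrape the city name and provider percentages, and return them as rows.
    public function scrapeData()
    {
        $url = 'https://decisiondata.org/tv-internet-by-zip/01001-internet/';
        $dom = HtmlDomParser::str_get_html($this->fetchHtml($url));

        $city = trim($dom->find('h1', 0)->plaintext); // assumption: the city is in the first <h1>

        $rows = [];
        foreach ($dom->find('section.internet-data > div.et_column_last > ul > li > p') as $p) {
            // Assumption: each <p> holds text like "Xfinity - 99.9%".
            $parts = array_map('trim', explode('-', $p->plaintext, 2));
            if (count($parts) < 2) {
                continue; // skip anything that doesn't match the expected pattern
            }
            $rows[] = [
                'city'       => $city,
                'provider'   => $parts[0],
                'percentage' => $parts[1],
            ];
        }

        return $rows;
    }
}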

You can read the PHP Simple HTML DOM Parser docs to learn more. As you can see in the sample code above, you can find an HTML element with a specific id or class, like this line:

$dom->find('section.internet-data > div.et_column_last > ul > li > p')

Just like the jQuery way, so it's very easy to grab data from any HTML element. After getting an element, we can process it, extract the text, and then save it into a database or export it into an Excel file.
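
A couple of one-liners showing that pattern, with $dom from the sketch above; the selectors here are placeholders:

$first = $dom->find('section.internet-data p', 0); // a second argument returns one element instead of a list
$text  = $first->plaintext;                        // the element's text content
$href  = $dom->find('a', 0)->href;                 // attributes are readable as properties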

Now that we have the data, the next step is exporting it into an Excel file. In this tutorial I use the export-from-view approach described in the docs: https://docs.laravel-excel.com/3.1/exports/from-view.html

Create a function like this:

Call the scraping function and return its data into a Laravel view
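
Under the from-view pattern linked above, the export class implements Laravel Excel's FromView concern, and its view() method hands the scraped rows to a Blade view. A sketch, reusing the scrapeData() method from the earlier sketch and a hypothetical 'excel_report' view name:

<?php

namespace App\Exports;

use Illuminate\Contracts\View\View;
use Maatwebsite\Excel\Concerns\FromView;

class DataExport implements FromView
{
    // ... fetchHtml() and scrapeData() from the previous step ...

    // Laravel Excel renders this view and converts its HTML table into the Excel sheet.
    public function view(): View
    {
        return view('excel_report', [
            'rows' => $this->scrapeData(),
        ]);
    }
}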

We need to create a view "excel_report.blade.php" inside the resources/views directory, like this:

excel_report.blade.php
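
A minimal version of what this view might contain, assuming the $rows array from the export class above; the three column headers are taken from the Excel screenshot at the top and are otherwise my assumption:

<table>
    <thead>
        <tr>
            <th>City</th>
            <th>Provider</th>
            <th>Percentage</th>
        </tr>
    </thead>
    <tbody>
        @foreach($rows as $row)
            <tr>
                <td>{{ $row['city'] }}</td>
                <td>{{ $row['provider'] }}</td>
                <td>{{ $row['percentage'] }}</td>
            </tr>
        @endforeach
    </tbody>
</table>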

Now for the last step, we need to create a controller to call the Laravel Excel library, like this:

ScrapingController.php
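
Again a sketch of what ScrapingController.php could contain; Excel::download() is the documented Laravel Excel 3.1 call, while the method name and output file name are my choices:

<?php

namespace App\Http\Controllers;

use App\Exports\DataExport;
use Maatwebsite\Excel\Facades\Excel;

class ScrapingController extends Controller
{
    // Scrape the page and send the result to the browser as an Excel download.
    public function export()
    {
        return Excel::download(new DataExport, 'internet_providers.xlsx');
    }
}

And a matching route in routes/web.php (Laravel 6 still accepts the string controller syntax):

Route::get('/export', 'ScrapingController@export');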

OK, now we are ready to run our script. We can type this

php artisan serve

in our terminal, and then open the localhost address in our browser (with the route sketched above, that would be http://localhost:8000/export). The Excel file will be downloaded, and its content matches the format shown at the beginning of this tutorial.

Finished! Now we have successfully scraped the website. You can get the complete version of this project's source code in my repository: https://github.com/juangsalaz/php-scraping

Hopefully this tutorial is useful and helps you all :)

Thanks
