
Data scraping with Node.js and display in Laravel


Hello everyone, today laravelcode shares something new. A few days ago, while working on a Laravel application, we needed data from another site in that application, which meant data scraping. We wrote the scraping logic in Node.js and the display logic in Laravel. Here we share a simple example demo of data scraping with Node.js and how to display the results in a Laravel application.

In today's IT world, something new is required day by day. Sometimes a piece of functionality is very difficult (but not impossible) to write in PHP, so we write that logic in another technology and then merge the two. This method can also give the site better performance.

Node.js and Laravel are currently among the most popular frameworks for web development, and today many projects are developed using Node.js and Laravel on one platform. A real-time chat system is the best example of this.

Many Laravel developers don't know how to use Node.js in a Laravel application, so we share one tutorial here. It may be helpful for learning how to set up Node.js in Laravel, along with some basic stuff.

We start here by installing Node.js and showing how to use it with a Laravel application, step by step.

Our scraped data display preview looks like this:

Step : 1 Install Node.js

For this tutorial we need Node.js, so first install Node.js on your local system. Simply run the following commands in your terminal.


sudo apt-get update
sudo apt-get install nodejs
sudo apt-get install npm

Check the node version: node -v

Check the npm version: npm -v

Step : 2 Create a Laravel application

After installing Node.js, we need to create a dummy Laravel application by running the following command.


composer create-project --prefer-dist laravel/laravel LaravelWithNodeJs

After the Laravel application is created successfully, configure your database, user, and password in the .env file.

Step : 3 Install npm package dependencies

Open the package.json file in your Laravel project's root directory and add the following dependencies to it, like this.


"devDependencies": {
    -------
    -------
    "express"    : "latest",
    "request"    : "latest",
    "cheerio"    : "latest"
  }

After adding the above dependencies, run the following command to install them in your Laravel project directory.


sudo npm install

After running this command, check your project directory: a node_modules folder is created automatically and all your npm and Node.js dependencies are installed there.

Step : 4 Check that Node.js works

Now let's check whether our Node.js code works in the application. First create a node/server.js file and simply write the following code for testing.


var express = require('express');
var app     = express();
app.listen('8001');

console.log('Your node server start....');

exports = module.exports = app;

Then go to your project_directory/node path and run the following command.


sudo node server.js

If you see the output string "Your node server start...." after running this command, your Node.js code runs correctly on your local system.

Step : 5 Write the Node.js scraping code

Now we write the scraping code for the target site in Node.js. Here we target the website https://news.ycombinator.com for the demo.

Before writing the Node.js scraping script, first look at the following HTML structure used on our target site, https://news.ycombinator.com:


<td class="title">
	<a href="link" class="storylink">
		title
	</a>
	<span class="sitebit comhead"> (
	<a href="from?site=decisionsciencenews.com">
	<span class="sitestr">decisionsciencenews.com</span>
	</a>)
	</span>
</td>
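As a simplified illustration of what the cheerio selectors in the next step will do, here is a plain Node.js sketch that pulls the link, title, and site name out of that markup with regular expressions. The class names storylink and sitestr come from the snippet above; in the real script, cheerio handles this parsing far more robustly.

```javascript
// Simplified stand-in for cheerio: extract fields from the
// title-cell markup shown above using regular expressions.
var html =
	'<td class="title">' +
	'<a href="link" class="storylink">title</a>' +
	'<span class="sitebit comhead"> (' +
	'<a href="from?site=decisionsciencenews.com">' +
	'<span class="sitestr">decisionsciencenews.com</span>' +
	'</a>)</span></td>';

// Capture the href and text of the story link
var story = html.match(/<a href="([^"]+)" class="storylink">([^<]+)<\/a>/);
// Capture the site name inside span.sitestr
var site  = html.match(/<span class="sitestr">([^<]+)<\/span>/);

console.log(story[1]); // href of the story link
console.log(story[2]); // link title
console.log(site[1]);  // site name
```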

In this Node.js scraping code we target the span.comhead DOM element and scrape the following structured data:

  • rank
  • title
  • url
  • points
  • username
  • comments
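The subtext values come back from the page as raw strings such as "123 points" or "45 comments", and parseInt() pulls the leading number out of each one. A small sketch of that normalization step (the sample strings here are invented for illustration):

```javascript
// Turn raw subtext strings into the structured metadata object.
// The sample values mimic what the scraper reads from the page.
function toMetadata(rank, title, url, points, username, comments) {
	return {
		rank: parseInt(rank),        // "1." -> 1
		title: title,
		url: url,
		points: parseInt(points),    // "123 points" -> 123
		username: username,
		comments: parseInt(comments) // "45 comments" -> 45
	};
}

var item = toMetadata('1.', 'Example story', 'https://example.com',
                      '123 points', 'someuser', '45 comments');

console.log(item.rank, item.points, item.comments); // 1 123 45
```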

Now open your node/server.js file and put in the following Node.js scraping code.



var express = require('express');
var fs      = require('fs');
var request = require('request');
var cheerio = require('cheerio');
var app     = express();

// Scraping start
app.get('/scrape', function(req, res){

	request('https://news.ycombinator.com', function (error, response, html) {
		if (error || response.statusCode !== 200) {
			return res.status(500).send('Scraping failed...');
		}

		var $ = cheerio.load(html);
		var parsedResults = [];
		$('span.comhead').each(function(i, element){
			// Select the previous element (the story link)
			var a = $(this).prev();
			// Get the rank by parsing the element two levels above the "a" element
			var rank = a.parent().parent().text();
			// Parse the link title
			var title = a.text();
			// Parse the href attribute from the "a" element
			var url = a.attr('href');
			// Get the subtext children from the next row in the HTML table
			var subtext = a.parent().parent().next().children('.subtext').children();
			// Extract the relevant data from the children
			var points = $(subtext).eq(0).text();
			var username = $(subtext).eq(1).text();
			var comments = $(subtext).eq(2).text();
			// Our parsed metadata object
			var metadata = {
				rank: parseInt(rank),
				title: title,
				url: url,
				points: parseInt(points),
				username: username,
				comments: parseInt(comments)
			};
			// Push the metadata into the parsedResults array
			parsedResults.push(metadata);
		});
		// Log the finished parse results in the terminal
		console.log(parsedResults);

		// Write the results as JSON into Laravel's public directory
		fs.writeFile('../public/output.json', JSON.stringify(parsedResults, null, 4), function(err){
			console.log('Scraping data successfully written! - Check your project public/output.json file');
		});

		res.send('Scraping Done...');
	});
});

app.listen('8001');

console.log('Your node server started successfully....');

exports = module.exports = app;

Then go to your project_directory/node path and run the following command.


sudo node server.js

After starting the node server, open your browser and go to the following URL.


localhost:8001/scrape

After opening this URL, you get the message "Scraping Done..." in the browser window; your data scraping is done. Then check your terminal, where you also get a message like: Scraping data successfully written! - Check your project public/output.json file

Now the data scraping process is done, and the scraped data is automatically written in JSON format to the public/output.json file. Next, to display this data on the front end of our Laravel application, simply follow these steps.
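Before wiring up the Laravel side, it can help to sanity-check the shape of the JSON that Laravel's json_decode() will consume. This sketch round-trips one record the way server.js serializes it; the sample record is invented for illustration.

```javascript
// One record in the shape server.js writes to public/output.json
var parsedResults = [{
	rank: 1,
	title: 'Example story',
	url: 'https://example.com',
	points: 123,
	username: 'someuser',
	comments: 45
}];

// server.js serializes with 4-space indentation...
var json = JSON.stringify(parsedResults, null, 4);

// ...and the Laravel side reads the same structure back
var data = JSON.parse(json);
console.log(data[0].title, data[0].points);
```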

Step : 6 Create a route

First create a route for a view/blade file, like this.


Route::get('data-scraping', 'DataScrapingController@index');

Step : 7 Create a controller

Now create the DataScrapingController.php file, which looks like this.


namespace App\Http\Controllers;

class DataScrapingController extends Controller
{
    public function index()
    {
        // Read the JSON written by the Node.js scraper into public/output.json
        $data = json_decode(file_get_contents(public_path('output.json')));

        return view('data-scraping', compact('data'));
    }
}

Step : 8 Create the blade/view file

Lastly, create the resources/views/data-scraping.blade.php file and write HTML code like the following to display the data.


@extends('layouts.app')

@section('content')
<div class="container">
	@foreach($data as $key => $value)
	<div class="col-sm-12">
		<a href="{{ $value->url }}">
			<h3 class="title">{{ $value->title }}</h3>
		</a>
		<p class="text-muted">
			<strong>Points :</strong> {{ $value->points }}
			<strong>Comments :</strong> {{ $value->comments }}
		</p>
		<p class="text-muted">Posted by <a href="#">{{ $value->username }}</a></p>
	</div>
	@endforeach
</div>
@endsection

Now we are ready to run our example, so run the command below for a quick run:

php artisan serve

Now you can open the URL below in your browser:

http://localhost:8000/data-scraping

We hope it can help you...