Following on from the previous two chapters, in this section we will store the scraped data in a MongoDB database.
First, install the mongoose library:
npm install mongoose --save
Then create a db.js file:
'use strict'
import Promise from 'bluebird'
import mongoose from 'mongoose'

// Create the database connection
export function createConnection(uri) {
  return new Promise((resolve, reject) => {
    mongoose.Promise = Promise
    mongoose.connection
      .on('error', error => reject(error))
      .on('close', () => console.log('Database connection closed.'))
      .once('open', () => resolve(mongoose.connections[0]))
    mongoose.connect(uri)
  })
}

// Close the database connection
export function closeConnection() {
  return new Promise((resolve, reject) => {
    mongoose.connection.close(() => {
      resolve()
    })
  })
}
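As a quick sanity check (this snippet is not part of the scraper itself, and the URI is only an example), db.js can be exercised on its own like this:
import { createConnection, closeConnection } from './db'

const uri = 'mongodb://localhost/node_scrapy'

createConnection(uri)
  .then(info => {
    // the resolved value is mongoose.connections[0], which exposes host, port and name
    console.log(`Connected to ${info.host}:${info.port}/${info.name}`)
    return closeConnection()
  })
  .catch(error => console.log(error))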
Next, define the mongoose model. Create a models.js file:
'use strict'
import mongoose from 'mongoose'

const productSchema = new mongoose.Schema({
  title: String,
  url: String,
  img: String,
  price: String
})

export default mongoose.model('Product', productSchema)
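If you want to try the model on its own, something like the following should work once a connection is open (the field values here are made up purely for illustration):
import Product from './models'

const doc = new Product({
  title: 'sample product',          // made-up data for illustration
  url: 'http://example.com/item',
  img: 'http://example.com/item.jpg',
  price: '99.00'
})

// save() returns a promise and validates the document against productSchema
doc.save().then(() => console.log('one product saved'))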
With that in place, we can store the scraped data in MongoDB. Add the following to main.js:
...
import { createConnection, closeConnection } from './db'
import Product from './models'

const dbUri = 'mongodb://localhost/node_scrapy';

(async () => {
  try {
    const html = await requestAsync(url, 'gbk')
    const products = await tmallParse(html)
    // connect to MongoDB
    const info = await createConnection(dbUri)
    console.log(`Connected to ${info.host}:${info.port}/${info.name}`)
    const docs = await Product.collection.insert(products)
    console.log('insert ' + docs.insertedCount + ' rows success')
    // close the connection
    await closeConnection()
  } catch (error) {
    console.log(error)
  }
})();
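A note on Product.collection.insert: it goes through the underlying native driver collection and bypasses mongoose's schema layer. If you prefer to stay on the mongoose model API, insertMany is a possible alternative (a sketch, not what this article uses; it would replace the two insert lines inside the async function):
// insertMany resolves with the array of created documents
const docs = await Product.insertMany(products)
console.log('insert ' + docs.length + ' rows success')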
Before running the program, make sure the MongoDB server is started, otherwise the connection will fail.
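If you do not already have MongoDB running as a service, a local instance can be started with something like the following (assuming MongoDB is installed and the data directory exists; adjust the path to your setup):
mongod --dbpath ./data/db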
With the database running, run:
npm start
Check that the output looks like this:
Connected to localhost:27017/node_scrapy
insert 60 rows success
Database connection closed.
Next, let's add a scheduled task so the scraper runs at a regular interval. For this we need the node-schedule library:
npm install node-schedule --save
Once it is installed, modify main.js:
import schedule from 'node-schedule'

// Scheduling rule
const rule = new schedule.RecurrenceRule()
// Run at the first minute of every hour
rule.minute = 1

// Start the scheduled job
const job = schedule.scheduleJob(rule, () => {
  (async () => {
    ...
  })()
})
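node-schedule also accepts cron-style strings, so the same schedule can be written more compactly if you prefer (a sketch equivalent to the rule.minute = 1 rule above):
const job = schedule.scheduleJob('1 * * * *', () => {
  // runs at minute 1 of every hour, same as the RecurrenceRule version
})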
That's it, our scraper is done. If you are interested, you can also try scraping Taobao data. I originally wanted to scrape JD data as well, but I don't know how JD's ajax data is generated; if anyone has a better approach, please leave me a comment. Thanks.
The code will be published on GitHub.
Download the code