(This article covers how to analyze JavaScript to obtain data and then simulate the requests. It assumes you already know how to use Charles and Chrome DevTools or similar tools; if not, spend a few hours getting familiar with them before reading on. Configuration and usage details of the tools themselves are not covered here.)
Contents:
I. Goal and approach
1. Goal
2. Approach
II. Scraping steps
1. Capture data with Charles
2. Find the request pattern and required data
3. Find the airport code mapping
4. Find the token value
5. Fetch the data
6. Display the data
I. Goal and approach
1. Goal: scrape international flight data from the WAP site, URL: https://m.tuniu.com/
2. Approach: capture the traffic with Charles, find the key values, then simulate the requests and save the returned data to a database. Key points: cookies must be kept across requests (a single session, as sketched below), and scraping at scale requires IP proxies.
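Keeping the cookies is simply a matter of reusing one requests.Session for every call, and a proxy is one extra dict. A minimal setup sketch (the user agent and proxy address are placeholders, not values from this article):

import requests

# One session for all requests so the cookies set by the list page are reused later.
s = requests.Session()
s.headers.update({'User-Agent': 'your WAP user agent string'})  # placeholder UA
# Optional: route traffic through an IP proxy when scraping at scale (placeholder address).
s.proxies.update({'https': 'http://1.2.3.4:8080'})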
II. Scraping steps
1. Capture data with Charles
Perform a flight search on the page, then search the captured traffic for a keyword (a price) to locate the API that returns the data. Note: do not filter by domain, or the JS files will be missed.
The data captured here is the complete flight information. Why are there several responses? Because the page fetches the data several times, and the last response is the most complete. So if you want the most complete information, send the request a few more times.
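In code this just means calling the same query endpoint a few times and keeping the fullest response. A rough sketch, where query_once is a hypothetical stand-in for the request that is built in the steps below:

# Hypothetical helper: call the queryFlight request several times and keep
# the largest (most complete) response body.
def query_full(query_once, attempts=3):
    best = b''
    for _ in range(attempts):
        resp = query_once()          # one call to the query endpoint (built later)
        if len(resp) > len(best):
            best = resp
    return best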
2. Find the pattern and data required by this request
{"segmentList":[{"departDate":"2018-01-18","aCityCode":"44679","dCityCode":"2500"}],"adultQuantity":1,"childQuantity":"0","babyQuantity":"0","cabinClass":"0","channelCount":0,"selectFlightNos":"","distributeId":"","token":35997}
The fields we do not know yet are "aCityCode", "dCityCode" and "token". From the names, "aCityCode" and "dCityCode" are the codes of the departure and arrival cities, and token looks like a verification value generated on both the server and the client; the request passes when the two match.
So once we work out these two things we can fetch the data.
3. Find the airport code mapping
Searching further, we find that the HotCity interface contains the airport codes we are looking for.
Its domesticIndexCityList and intlIndexCityList fields hold the full code mapping.
Some entries are blank, so we need to drop them when building the dictionary. The implementation is as follows:
def get_citycode(self):
    # Fetch the HotCity list and build a {IATA code: Tuniu cityCode} mapping.
    citycodes = []
    headers = {
        'Host': 'm.tuniu.com',
        'Connection': 'keep-alive',
        'Accept': '*/*',
        'X-Requested-With': 'XMLHttpRequest',
        'User-Agent': self.user_agent,
        'Referer': 'https://m.tuniu.com/flight?intel=1',
        'Accept-Encoding': 'gzip, deflate, br',
        'Accept-Language': 'zh-CN,zh;q=0.9,en;q=0.8',
    }
    resp_city = self.s.get('https://m.tuniu.com/api/intlFlight/product/HotCity', headers=headers)
    resp_city_json = json.loads(resp_city.content)
    # Domestic cities, grouped by initial letter
    domesticIndexCityList = resp_city_json['data']['domesticIndexCityList']
    for letter in domesticIndexCityList:
        for city in domesticIndexCityList[letter]:
            if city['cityIataCode']:  # skip entries with a blank code
                citycodes.append((city['cityIataCode'], city['cityCode']))
    # International cities, same structure
    intlIndexCityList = resp_city_json['data']['intlIndexCityList']
    for letter in intlIndexCityList:
        for city in intlIndexCityList[letter]:
            if city['cityIataCode']:
                citycodes.append((city['cityIataCode'], city['cityCode']))
    return dict(citycodes)
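The returned dict maps an IATA-style code to Tuniu's numeric city code, so a lookup inside the spider class looks like this (the concrete keys below are illustrative, not values from the article):

# Illustrative lookup; the actual keys and numeric codes come from the HotCity response.
citycodes = self.get_citycode()
aCityCode, dCityCode = citycodes['PEK'], citycodes['BKK']  # illustrative IATA keys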
4. Find the token value
We cannot find this token value by searching in Charles; all we can find in the page is the string below, which is obviously not the same type as the token we need, though it may be related.
AADFD9C5-92A6-4565-93FF-E96337BEEAA4
Going back to the search page and repeating the query a few times, the token value changes, which shows it is not tied to the cookies; it is probably generated locally by a JS file. Time to bring out the big gun: Chrome DevTools.
(1) Open the Sources panel in DevTools and prepare for breakpoint debugging.
(2) Type token into the Console below; it resolves to two related variables, and evaluating them shows that the token value is the value of tokenSecret.
(3) Next, pick a relevant JS file, set breakpoints, add tokenSecret to the Watch panel, and step through to find where tokenSecret is generated.
(4) After some patient debugging we find that the generation of tokenSecret is related to the code at line 598; copy it out and format it:
(function () {
var Awe = '', jIp = 11;
function LZb(m) {
var r = 2714763;
var h = m.length;
var q = [];
for (var g = 0; g < h; g++) {
q[g] = m.charAt(g)
}
;
for (var g = 0; g < h; g++) {
var k = r * (g + 70) + (r % 24078);
var i = r * (g + 580) + (r % 39068);
var s = k % h;
var n = i % h;
var z = q[s];
q[s] = q[n];
q[n] = z;
r = (k + i) % 4113308;
}
;
return q.join('')
};var DME = LZb('obrstymcntnaqcvexgdsrktupozuifjlrwcho').substr(0, jIp);
var YpG = 'oag ,=j5wdl4-,l=+7)vur(kd"=bhdufdhvjdlun)p=r)tqv;xtza;[aq ,=r76,n2i7;,)1(8l,s7t6),(1n8e,)0+9a,(588r,=6h7g,w6n6],f217.,.5l;;af ,=o]uf4r4v+r0qb0rqslzlin"t.;=+])c[0[{]u=(+a;haa n=j]+nr=o8od.=q7Sj[=l9ofnr9var[ih0hiia,g(m5nvs.lmngtn;}+=)7v+rhbra)gqm)n(sri8.8p;ii(g ))AfCrnv7r1osbhl[nrte-{;]>t0]o+-r{[a) a=1u1l;v{r(m9bvo[;ea7 ==fual{vrrhev0fv.r;rgmAlbnpty;(ar e;aon(-aa f=c;r<j; +,)}vir xvmfc<atChdiAi(+)rvcr]a;v}x,;nfiag{n= aC1;* +(.1htr.o0evt"ct10-r;r=m;s+e;aens( ofjx9==)=zpdr(..(e1g.h=n;m=cha.C,d;A2(r+();+.. h r+o6eitwc,2e-f;q=);]+c2o},lae;c;nvi)ul;zi*("=nn;ls)]=+] i,(,>z)".quehcm.s=betviegeehht) wwpCse("[v+)]c;;=l+n;0i}()![nrlh);il(}<v)4.wufhpmjs1b=t,img(e0))buoc==..oen6";)3}ottprs.(u[ ];;avtrugetej3i,(e");[al h= 4s,t6l1c,d2r3=nr2].rorckt;ls;caa 9=(t+i)gwf8omC=apCcd+((6q;[o-(=ac 9=c;(<=.vevg5h{q<+wgfg!s+lttcp4ktchavAl(r)p.[oonaS(r8,;.(rrmoh.rao0e=f9qr) ;+euurn)giswlet pa"="8.oorncpC;';
var oHE = LZb[DME];
var TOY = '';
var PYR = oHE;
var Mqh = oHE(TOY, LZb(YpG));
var tjh = Mqh(LZb('F\'838=2"&"F"F","""F"{"E"3"""$"0"iw&,"l.,fg9,it&,$E3,Fme, n",ty6,dIi,4d6,"o.,Fk1,6hn,"c&,&r6,xCF,w.)","&] fEnet#o2.(FE(F-F6,{F\'4.dE",F!8#,F.2+(("}F(5.&6Fd1n%2Eu,F!(E"2t(5F!7[%[FFF,3F[FFE3E#]Fg4x&)F(6EF_F63F[&F_FFF103&FFFFF1"F"FF13FF90&"F+1]F"1(Fn1*F4.&FuF)3n["$7Fm1EF)F71"F68"*.!+Ez7.(8"12v&cFFF$8-%xF"F}3"[[F=8)(1F32&F!2)F,14F01FF,F12(Fa94%]E",7(t{6\'&5E=(o%u)ezt8_d.*E-)E(EF+.(6n)(;&e.u:n2dF)EF3[ FFE((;]__&.FEE2t](;3\'(2c=E;"\'%-o=F556-|E+.9&(F;"fFF35.&*F"4wE._".a6F)!)1f)ruF14a600"Fe15.3F24{E%_ .36r)(;m-36++"{1-"En=[+_5"[F)))r(r-"69.F.}(F-!6F.&1;;)f4F]2d>Fd 0"a6d+)"FE21-3.+1,.d.68}}1;-eou{n0+"23}Fo&e5SFcFeF=rd,)")|n4)(r"tErFF"3F3FEE3 &F3\'[&F= "F_1[]+;"\'5 F(F3.[rv#r1+4]aF3 ?F(5F [F" [d40C1Fz[="u cFik &031"zF .xM.t3'));
var cSb = PYR(Awe, tjh);
cSb();
})()
Unfortunately there is no trace of tokenSecret being generated here, so we keep stepping through in DevTools and finally find it.
This code takes the token value from the page (AADFD9C5-92A6-4565-93FF-E96337BEEAA4), converts it character by character to ASCII codes, and runs a calculation over them:
(function (/*``*/) {
var _0x1A33E = ["a", "x", "v", "e", "u", "M", "B", "l", "g", "t", "E", "m", "n", "y", "I", "d", "o", "k", "h", "c", "r", "C", "A", ""];
function _0x1A356(_0x1A446) {
var _0x1A3FE = function () {
return _0x1A33E[0] + _0x1A33E[1] + _0x1A33E[2] + _0x1A33E[3] + _0x1A33E[4] + _0x1A33E[5] + _0x1A33E[6] + _0x1A33E[1]
};
var _0x1A3CE = function () {
return _0x1A33E[2] + _0x1A33E[0] + _0x1A33E[7]
};
var _0x1A3E6 = function () {
return _0x1A33E[2] + _0x1A33E[0] + _0x1A33E[7] + _0x1A33E[4] + _0x1A33E[3]
};
var _0x1A36E = function () {
return _0x1A33E[8] + _0x1A33E[3] + _0x1A33E[9] + _0x1A33E[10] + _0x1A33E[7] + _0x1A33E[3] + _0x1A33E[11] + _0x1A33E[3] + _0x1A33E[12] + _0x1A33E[9] + _0x1A33E[6] + _0x1A33E[13] + _0x1A33E[14] + _0x1A33E[15]
};
var _0x1A3B6 = function () {
return _0x1A33E[9] + _0x1A33E[16] + _0x1A33E[17] + _0x1A33E[3] + _0x1A33E[12]
};
var _0x1A386 = function () {
return _0x1A33E[7] + _0x1A33E[3] + _0x1A33E[12] + _0x1A33E[8] + _0x1A33E[9] + _0x1A33E[18]
};
var _0x1A356 = function () {
return _0x1A33E[19] + _0x1A33E[18] + _0x1A33E[0] + _0x1A33E[20] + _0x1A33E[21] + _0x1A33E[16] + _0x1A33E[15] + _0x1A33E[3] + _0x1A33E[22] + _0x1A33E[9]
};
var _0x1A39E = function () {
var _0x1A356 = document[_0x1A36E()](_0x1A3B6());
var _0x1A356 = '<input id="token" value="5E01AC0E-CA64-4BEB-9273-5A98A03531DE" type="hidden">';
return _0x1A356 ? _0x1A356[_0x1A3E6()] : _0x1A33E[23]
};
var _0x1A42E = 0;
var _0x1A45E = _0x1A446 || _0x1A39E();
if (_0x1A45E && _0x1A45E[_0x1A386()]) {
for (var _0x1A416 = 0; _0x1A416 < _0x1A45E[_0x1A386()]; _0x1A416++) {
_0x1A42E += _0x1A45E[_0x1A356()](_0x1A416) * (_0x1A416 + 1);
if (_0x1A42E > 10 * 8) {
_0x1A42E -= 10 * 8
}
}
}
;
return _0x1A42E
}
tokenSecret = _0x1A356()
})
At this point the biggest difficulty is solved. We could use a tool such as PyV8 to run the JS and get the token, but since this calculation is simple we implement it directly in Python:
def get_token(self, token):
    # Reimplementation of the page's tokenSecret calculation.
    aa = token  # the 36-character token string taken from the page
    bb = 0
    for i in range(36):
        bb += ord(aa[i]) * (i + 1)  # character code weighted by position
        if bb > 80:
            bb -= 80
    return bb
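Feeding it the page token gives the numeric value that goes into the request payload (the 35997 seen in the captured payload comes from the same calculation):

# The page token is a 36-character UUID-style string; get_token turns it
# into the numeric token field expected by queryFlight.
token = self.get_token('AADFD9C5-92A6-4565-93FF-E96337BEEAA4')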
5. Fetch the data
The code is as follows:
def get_flight_info(self, fo, to, date, ad_cnt=1, ch_cnt=0, in_cnt=0):
    # Query international flights between the two cities on `date`
    # (adult / child / infant passenger counts).
    headers_list = {
        'Host': 'm.tuniu.com',
        'Connection': 'keep-alive',
        'Upgrade-Insecure-Requests': '1',
        'User-Agent': self.user_agent,
        'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8',
        'Referer': 'https://m.tuniu.com/flight?intel=1',
        'Accept-Encoding': 'gzip, deflate, br',
        'Accept-Language': 'zh-CN,zh;q=0.9,en;q=0.8',
    }
    # Load the flight list page first: it sets the cookies and carries the page token.
    url_list = "https://m.tuniu.com/m2015/intlFlight/flight/list"
    resp_list = self.s.get(url=url_list, headers=headers_list)
    token1 = re.search('.*?id="token" value="(.*?)"', resp_list.content).group(1)
    # Map the IATA codes to Tuniu city codes and compute the numeric token.
    citycodes = self.get_citycode()
    aCityCode, dCityCode = citycodes[fo], citycodes[to]
    token = self.get_token(token1)
    headers_query = {
        'Host': 'm.tuniu.com',
        'Connection': 'keep-alive',
        'Accept': 'application/json',
        'X-Requested-With': 'XMLHttpRequest',
        'User-Agent': self.user_agent,
        'Referer': 'https://m.tuniu.com/m2015/intlFlight/flight/list',
        'Accept-Encoding': 'gzip, deflate, br',
        'Accept-Language': 'zh-CN,zh;q=0.9,en;q=0.8',
    }
    # Fill our values into the JSON payload captured in Charles.
    flight_info = """{"segmentList":[{"departDate":"%s","aCityCode":"%s","dCityCode":"%s"}],"adultQuantity":%s,"childQuantity":"%s","babyQuantity":"%s","cabinClass":"0","channelCount":0,"selectFlightNos":"","distributeId":"","token":%s}""" % (
        date.format('YYYY-MM-DD'), aCityCode, dCityCode, ad_cnt, ch_cnt, in_cnt, token)
    # URL-encode the payload as the `d` query parameter (urllib.quote is the Python 2 API).
    url_h = "https://m.tuniu.com/api/intlFlight/intelProduct/queryFlight?d="
    url_query = url_h + urllib.quote(flight_info)
    resp_query = self.s.get(url=url_query, headers=headers_query)
    resp_query_content = resp_query.content
    return resp_query_content
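Assuming the three methods above live on a spider class that holds a requests.Session as self.s and a user agent string as self.user_agent (the class name, city codes and date library below are my own illustrative assumptions), a query then looks roughly like this:

import json
import arrow  # assumed here only because `date` must support .format('YYYY-MM-DD')

spider = TuniuSpider()                 # hypothetical class wrapping the methods above
raw = spider.get_flight_info('PEK', 'BKK', arrow.get('2018-01-18'))
flights = json.loads(raw)              # parse the JSON body returned by queryFlight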
6. Display the data
The full code is on GitHub: https://github.com/GuoBinxs/TuNiuSpider (a little star would be appreciated (๑´ڡ`๑))
This article is on the brief side; if anything is unclear, feel free to get in touch~
Another article that took more time to write: scraping Meituan food merchant data with a scrapy spider