1. Create the project
scrapy startproject job51_link
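The command writes a project skeleton to disk; the default layout Scrapy generates looks roughly like this:

```
job51_link/
    scrapy.cfg            # deploy configuration
    job51_link/           # the project's Python package
        __init__.py
        items.py          # item definitions
        middlewares.py    # spider/downloader middlewares
        pipelines.py      # item pipelines
        settings.py       # project settings
        spiders/          # spiders live here
            __init__.py
```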
2. Auto-generate a spider
scrapy genspider job51Hlink 51job.com
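This drops a spider file under `spiders/`. The generated template (contents may vary slightly by Scrapy version) is essentially:

```
import scrapy


class Job51hlinkSpider(scrapy.Spider):
    name = 'job51Hlink'
    allowed_domains = ['51job.com']
    start_urls = ['http://51job.com/']

    def parse(self, response):
        pass
```

The `parse` method is where you fill in the extraction logic for each downloaded page.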
3. Run the crawler
scrapy crawl job51Hlink
4. Use the scrapy shell
scrapy shell -s USER_AGENT="Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/67.0.3396.99 Safari/537.36" "https://jobs.51job.com/ningbo-yzq/108714907.html?s=01&t=0"
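Inside the shell you can prototype selectors against the loaded page, e.g. `response.xpath('//h1/text()').get()`. As a stand-alone illustration of the same extraction idea, here is a minimal sketch using only the standard library on a made-up HTML fragment (the tags and values are assumptions, not the real 51job markup):

```python
import re

# Hypothetical fragment standing in for part of a 51job detail page.
html = '<h1>Python开发工程师</h1><strong>0.8-1.2万/月</strong>'

# Roughly what you would prototype in the shell with
# response.xpath('//h1/text()').get(), done here with regexes.
title = re.search(r'<h1>(.*?)</h1>', html).group(1)
salary = re.search(r'<strong>(.*?)</strong>', html).group(1)

print(title)   # Python开发工程师
print(salary)  # 0.8-1.2万/月
```

For real pages, prefer the shell's `response.xpath()`/`response.css()` selectors over regexes; this sketch only mirrors the shape of the extraction.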
'beijing','tianjin','shanghai','chongqing','hebeisheng',
'shanxisheng','liaoningsheng','jilinsheng','heilongjiangsheng',
'jiangsusheng','zhejiangsheng','anhuisheng','fujiansheng',
'jiangxisheng','shandongsheng','henansheng','hubeisheng',
'hunansheng','guangdongsheng','hainansheng','sichuansheng',
'guizhousheng','yunnansheng','shaanxisheng','gansusheng',
'qinghaisheng','innermongolia','guangxi','tibet','ningxia',
'xinjiang'
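The pinyin slugs above are 51job area identifiers. A hedged sketch of turning them into start URLs for the spider (the exact listing-page URL pattern is an assumption and should be verified against the live site):

```python
# A few area slugs copied from the list above (truncated for brevity).
cities = ['beijing', 'tianjin', 'shanghai', 'chongqing']

# Hypothetical listing-page pattern; check the real 51job URL scheme.
start_urls = [f'https://jobs.51job.com/{city}/' for city in cities]

for url in start_urls:
    print(url)
```

Assigning a list like this to the spider's `start_urls` attribute makes Scrapy schedule one request per city page.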