Python Crawler Self-Study Series (2)

Look at the future · 2021-01-21 03:29:33
Python crawler self-study series



Preface

A little chatter first: once again, this blogger is just looking for a sense of existence.

Looking back: in the previous post we covered basic crawler operations and encapsulated a simple function for fetching web page source data. Not bad, right?
Python Crawler Self-Study Series (1)

Today we're going to extract the data we want from the web page data we've already fetched.
(Note: much of what's in this article has already been covered, so this post is basically just links and it won't be long.)


A brief introduction to HTML pages

XPath, the eternal god

XPath is a way of describing the hierarchical structure of an XML document in terms of node relationships. Since an HTML document is built from XML-like elements, we can use XPath to locate and select elements in an HTML document.

If you want to learn more about XPath, you can click the blue link here.
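
As a quick illustration, here is a minimal sketch of locating elements with XPath, assuming lxml is installed; the HTML snippet and expressions are made up for the example.

# A minimal XPath sketch; the HTML snippet and expressions are hypothetical.
from lxml import etree

html = """
<div class="job-list">
    <a class="job" href="/job/1">Python Developer</a>
    <a class="job" href="/job/2">Data Engineer</a>
</div>
"""

tree = etree.HTML(html)                              # build the element tree
titles = tree.xpath('//a[@class="job"]/text()')      # text of every job link
links = tree.xpath('//a[@class="job"]/@href')        # href attribute of every job link
print(titles)  # ['Python Developer', 'Data Engineer']
print(links)   # ['/job/1', '/job/2']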


As for BeautifulSoup, let's not even mention it.
Don't ask me why; look at the comparison below and you'll see.


Performance comparison

(Figure: performance comparison chart)

See the difference?
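
If you want to check it yourself, here is a minimal benchmark sketch, assuming requests, lxml and beautifulsoup4 are installed; the URL is only a placeholder and the numbers will depend on the page you test against.

# A minimal benchmark sketch; the URL is a placeholder and results will vary by page.
import timeit

import requests
from bs4 import BeautifulSoup
from lxml import etree

html = requests.get("https://www.example.com").text

def parse_with_xpath():
    tree = etree.HTML(html)
    return tree.xpath("//a/@href")

def parse_with_bs4():
    soup = BeautifulSoup(html, "html.parser")
    return [a.get("href") for a in soup.find_all("a")]

print("lxml + XPath :", timeit.timeit(parse_with_xpath, number=1000))
print("BeautifulSoup:", timeit.timeit(parse_with_bs4, number=1000))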


Good, that's the background out of the way. What we need next is the code encapsulation from the project walkthrough.

There is also a project walkthrough: crawling Tencent's 2021 school recruitment postings.

Take a look at that first to get a feel for it, then we'll come back, pull out a few functions, and encapsulate them.


Getting data from a web page

This function pulls the data out in one go, but writing the XPath expression itself is not that easy.

from lxml import etree


def get_data(html_data, Xpath_path):
    '''
    Grab the required data from the web page source data.
    :param html_data: web page source data (a single requests response)
    :param Xpath_path: XPath expression used to locate the data
    :return: a list of matched results
    '''
    data = html_data.content
    data = data.decode().replace("<!--", "").replace("-->", "")  # strip the "<!--" and "-->" markers so commented-out HTML is parsed too
    tree = etree.HTML(data)  # build the element object
    el_list = tree.xpath(Xpath_path)
    return el_list
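
For example, usage might look like this (a minimal sketch; the URL and XPath expression are placeholders, and requests is assumed to be installed):

# Hypothetical usage; the URL and XPath expression are placeholders.
import requests

response = requests.get("https://www.example.com")
links = get_data(response, "//a/@href")  # grab every link on the page
print(links)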

The function above is one-shot. What about reuse? For example, a single page may require grabbing more than one kind of data, which means several XPath expressions. What then?

I have two approaches:
1. Pass the element object around and split the function in two. Take a look:

Reusable approach 1:

Step one: build the element object for the page and return it.

# Get the element object for a page
def get_element(html_data):
    data = html_data.content
    data = data.decode().replace("<!--", "").replace("-->", "")  # strip comment markers
    tree = etree.HTML(data)
    return tree

Step two: extract the data from the element object.

# Extract data from the element object with an XPath expression
def parser_element_data(Tree, Xpath):
    el_list = Tree.xpath(Xpath)
    return el_list
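
Used together, the two functions might look like this (a minimal sketch; the URL and XPath expressions are placeholders):

# Hypothetical usage of the two-step approach; URL and XPaths are placeholders.
import requests

response = requests.get("https://www.example.com")
tree = get_element(response)                       # parse the page once
links = parser_element_data(tree, "//a/@href")     # reuse the tree for each XPath
titles = parser_element_data(tree, "//a/text()")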

This approach is a bit clumsy. When you actually use it, it isn't very elegant either; there's some redundancy.

Let's look at the second approach.


Reusable approach 2:

With this approach, all of the XPath expressions are passed in as a list, and the data is extracted in a loop.

def get_data_2(html_data, Xpath_path_list):
    '''
    Extract data with multiple XPath expressions.
    :param html_data: raw web page data (a single requests response)
    :param Xpath_path_list: list of XPath expressions
    :return: 2-D list, one sub-list of results per XPath expression
    '''
    el_data = []
    data = html_data.content
    data = data.decode().replace("<!--", "").replace("-->", "")  # strip comment markers
    tree = etree.HTML(data)
    for Xpath_path in Xpath_path_list:
        el_list = tree.xpath(Xpath_path)
        el_data.append(el_list)
        el_list = []  # reset it, just to be safe
    return el_data
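
For example (a minimal sketch; the URL and XPath expressions are placeholders):

# Hypothetical usage; URL and XPath expressions are placeholders.
import requests

response = requests.get("https://www.example.com")
xpaths = ["//a/text()", "//a/@href"]
results = get_data_2(response, xpaths)
titles, links = results  # one sub-list per XPath expression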

Practice

This post is relatively short, but the content isn't.
If you're motivated, find a website to practice XPath on, say a job-listing site.


Copyright notice
This article was written by [Look at the future]. Please include a link to the original when reprinting. Thank you.
https://pythonmana.com/2021/01/20210121032906893B.html
