Python Crawler from Getting Started to Giving Up 06 | The Crawler's First Shot at Saving Data

SunriseCai 2020-11-13 11:31:50


This blog post is just a record of my spare-time learning, published for readers only. If anything here infringes on your rights, please let me know and I will delete it.
This article is entirely my own work, with no reference to or plagiarism of other people's articles. I insist on originality!

Preface

Hello. This is the Python Crawler from Getting Started to Giving Up series of articles. I am SunriseCai.

A Python crawler boils down to three steps, and each step gets its own article:

  • Request the web page
  • Get the response and parse the data (the web page)
  • Save the data

This article introduces the third step of a Python crawler: saving the data.

  • It mainly introduces how to store scraped web data; in other words, saving the data. This article covers the following ways to save data:
  1. Save as a txt file
  2. Save as a csv file
  3. Save to a MySQL database
  4. Save to a MongoDB database

This is only a brief introduction to saving data, not a systematic treatment of the topic.

1. Save in text format

For a systematic introduction to this topic, see the runoob beginner tutorial: Python3 File Methods.

Save to file operation flow :

  1. Open the file
  2. Write to the file
  3. Close the file

Open file :

  • Open file with mode mode and write the content:
with open(file, mode) as f:
    f.write(content)  # the with block closes the file automatically on exit

The various modes for opening a file:

  • It is worth noting the difference between w and a: w is overwrite (truncate) mode, a is append mode.

(The image here listed the open() modes; the common ones are r read, w write/truncate, a append, x exclusive create, plus b for binary and + for read-and-write.)
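A quick sketch of the w-versus-a difference (the file name test_modes.txt is just for this demo):

```python
import os

path = "test_modes.txt"  # throwaway file name, only for illustration

# 'w' truncates: each open in 'w' mode discards previous contents
with open(path, "w", encoding="utf-8") as f:
    f.write("first\n")
with open(path, "w", encoding="utf-8") as f:
    f.write("second\n")
with open(path, encoding="utf-8") as f:
    after_w = f.read()   # only "second\n" survives

# 'a' appends: new writes land after the existing contents
with open(path, "a", encoding="utf-8") as f:
    f.write("third\n")
with open(path, encoding="utf-8") as f:
    after_a = f.read()   # "second\nthird\n"

os.remove(path)  # tidy up the demo file
print(after_w, after_a)
```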

1.1 Save to txt

  • Example: pass in the content and save the text as a txt file:
content = 'This is the test save file as TXT.'

def save_txt(content):
    with open('test.txt', 'w', encoding='utf-8') as f:
        f.write(content)  # write the content passed in; with closes the file
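Calling the function and reading the file back as a quick check (the test.txt name is the same one used above):

```python
import os

def save_txt(content):
    # same function as above: write content to test.txt, overwriting it
    with open('test.txt', 'w', encoding='utf-8') as f:
        f.write(content)

save_txt('This is the test save file as TXT.')

# read the file back to confirm the write succeeded
with open('test.txt', encoding='utf-8') as f:
    saved = f.read()
print(saved)

os.remove('test.txt')  # clean up the demo file
```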

1.2 Save as a csv file

Import module first :

import csv

  • csv stands for Comma-Separated Values; a csv file is much like an Excel sheet.
# Write a single row of data: writerow([...])
import csv

def save_file():
    with open('test.csv', 'w', newline='', encoding='utf-8') as f:
        writer = csv.writer(f)
        writer.writerow(['hello world'])

# Write multiple rows of data: writerows([(), (), ()])
# Pass newline='' so the csv module controls line endings itself;
# otherwise blank lines appear between rows on Windows
import csv

def save_file():
    with open('test.csv', 'w', newline='', encoding='utf-8') as f:
        writer = csv.writer(f)
        writer.writerows([('Dobbiaco', '20'), ('sunrisecai', '20')])
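When each row is a dict, csv.DictWriter can write a header row and csv.DictReader can read it back. A small round-trip sketch (the file name and field names are made up for the demo):

```python
import csv
import os

path = "test_dict.csv"  # demo file name
rows = [
    {"name": "sunrisecai", "age": "20"},
    {"name": "hello", "age": "21"},
]

# newline='' is required so the csv module controls line endings itself
with open(path, "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["name", "age"])
    writer.writeheader()      # first row: the column names
    writer.writerows(rows)    # then one row per dict

# read it back: DictReader pairs each row with the header
with open(path, newline="", encoding="utf-8") as f:
    back = list(csv.DictReader(f))

os.remove(path)  # clean up the demo file
print(back)
```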

2. Save to a MySQL database

You need MySQL installed; if you haven't installed it yet, see this article: https://blog.csdn.net/weixin_45081575/article/details/102510115

First install the pymysql module, then import it:

pip install pymysql # step one
import pymysql # step two
import pymysql
# Create two objects: a connection and a cursor
db = pymysql.connect(host='localhost', user='root', password='123456',
                     database='< Database name >', charset='utf8')  # connect to the database
cursor = db.cursor()  # create a cursor
# Execute an SQL command and commit it to the database
# The second argument of execute() supplies the query parameters as a list
cursor.execute('insert into < Data table name > values(%s, %s)', ['SunriseCai', '2020'])
db.commit()  # commit the operation
# Close the cursor, then close the connection
cursor.close()
db.close()
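The snippet above needs a running MySQL server, so it cannot be tried standalone. The connect → cursor → execute → commit → close flow is the same in the standard-library sqlite3 module, so here is a server-free sketch of the identical pattern (the articles table and its columns are made up for the demo; note sqlite3 uses ? placeholders where pymysql uses %s):

```python
import sqlite3

# connect -> cursor -> execute -> commit -> close: same flow as pymysql
db = sqlite3.connect(":memory:")  # in-memory database, no server needed
cursor = db.cursor()
cursor.execute("CREATE TABLE articles (author TEXT, year TEXT)")

# parameterized insert: the driver fills in the placeholders safely
cursor.execute("INSERT INTO articles VALUES (?, ?)", ("SunriseCai", "2020"))
db.commit()  # persist the write

# read the row back to confirm it was saved
cursor.execute("SELECT author, year FROM articles")
row = cursor.fetchone()
print(row)

cursor.close()
db.close()
```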

3. Save to a MongoDB database

You need MongoDB installed; if you haven't installed it yet, see the runoob tutorial: Installing MongoDB on Windows

First install the pymongo module, then import it:

pip install pymongo # step one
import pymongo # step two

Establish a connection to MongoDB:

import pymongo
# Three objects (1. client connection 2. database 3. collection)
# 1. Database connection object
conn = pymongo.MongoClient('localhost', 27017)
# 2. Database object
db = conn['< Database name >']
# 3. Collection object
myset = db['< Collection name >']
# 4. Insert data (a dictionary)
myset.insert_one({'name': 'Dobbiaco'})

Common MongoDB shell operations:

# 1. List all databases
show dbs
# 2. Select a database
use < Database name >
# 3. List all collections
show collections
# 4. View the contents of a collection
db.< Collection name >.find()
# 5. Drop the current database
db.dropDatabase()
# 6. Drop a collection
db.< Collection name >.drop()

Admittedly, this article is only a sketch; I suggest you study the topic systematically through the official documentation.


Finally, a summary of this chapter:

  1. How to write txt and csv files with with open
  2. How to save data to a MySQL database
  3. How to save data to a MongoDB database

sunrisecai

  • Thank you for reading patiently; follow me so you don't get lost.
  • To make it easier for us beginners to help each other, welcome to join the QQ group: 648696280

The next article is 《Python Crawler from Getting Started to Giving Up 07 | Crawler in Action – Download the Complete Collection of Tomb Robbing Notes》.

Copyright notice
This article was created by [SunriseCai]; please include a link to the original when reposting. Thanks.
