Peeping into the future is not a dream: Python data analysis makes it easy

Java architects Alliance 2021-02-23 01:45:05


Preface

What is programming? Put simply, it's just writing code, right? Such a boring job, so why do so many people want to get into this industry? As a long-time "senior CV (copy-paste) engineer", I'll be honest: when I took the college entrance exam I applied to six schools with six majors each, and not one of them had anything to do with computers. Yet by now I have completely fallen in love with the computer industry. The reason is simple: when the countless lines of code you have written finally turn into working demos, feature after feature, and save a lot of people a lot of effort, those demos add up to the whole project. That, really, is the charm of programming: teaching the computer to do all kinds of difficult things for you in the shortest possible time, and then using that to help others.

Since I have already shared some Python crawler cases with you before, a question that comes up a lot lately is: what can Python actually do for me? It doesn't seem to have much to do with my day-to-day work. Take a moment to read on. Beyond high-end fields like artificial intelligence and big data, here is how it can help you in everyday life.

 

A first look at data analysis

 

If you work in accounting, administration, journalism, medicine, or a similar profession, data analysis can help you organize and report on large amounts of data.

What era is this? It is the era of data. Every day you have to deal with piles of data reports, statements, even customer relationship updates. And as the saying goes, numbers are not as clear as tables, and tables are not as clear as charts. I don't know whether your year-end report is finished yet, so let me use the year-end report as an example. Put plainly, a year-end report has to present the data for the whole year, and the data for the future. For past data you can more or less work out of the box: with Excel and similar tools, even if it's a bit fiddly, you can still produce some tables (ps: Office gurus excepted; an Office noob like me shouldn't even try). But once the volume of data grows, even an Office guru can't keep up, let alone use the past data to project the future. The year-end report decides the size of your year-end bonus and your career development over the coming year. So at this point, processing, filtering, and visualizing the data to reach a conclusion is what helps you make decisions.

For all of us, the days of rubbing your head and deciding on gut feeling are over; this is the age of data-driven decision making. Suppose you write a year-end report and the boss asks how you reached your conclusion. Are you going to say you just felt it? If every conclusion in your report is derived from the data, step by step, aren't you more likely to see that reflected in your year-end bonus?

Enough talk. So what can Python do for you?

Let's look at the finished result first.

[Chart: US GDP over the years, drawn with matplotlib from the downloaded data]

This is the most basic case: drawing a chart from the corresponding data. The data source is simple, a table of US GDP figures that can be downloaded from the internet. But if you actually had to do it by hand, it would give you a headache; try it if you don't believe me.

This is the data format:

[Screenshot: the source data table, with columns for year, GDP, and share of world GDP]

Data source address: https://www.kylc.com/stats/global/yearly_per_country/g_gdp/usa.html

 

Just pasting and organizing the data by hand would already take you a long time. Hand the same job over to Python, though, and how much time does that save you? The first step is data acquisition.

 

 

Crawler code to fetch the data

# -*- coding: utf-8 -*-
import requests
from lxml import etree
import openpyxl


def get_html(url):
    headers = {
        "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/83.0.4103.106 Safari/537.36"
    }
    # Pass the headers along with the request; defining them without using them has no effect
    html = requests.get(url, headers=headers)
    return html.text


def parse_content(content):
    e = etree.HTML(content)
    # The page is one big table: column 1 is the year, column 2 the GDP, column 3 the share of world GDP
    year_list = e.xpath("//tr/td[1]/text()")
    gdp_list = e.xpath("//tr/td[2]/text()")
    percents_list = e.xpath("//tr/td[3]/text()")
    # Keep only cells whose text really is a year, then convert every column to numbers
    year_list = list(filter(is_correct, year_list))
    year_list = list(map(int, year_list))
    gdp_list = list(map(extract, gdp_list))
    percents_list = list(map(delete_char, percents_list))
    print(year_list)
    print(gdp_list)
    print(percents_list)
    save_data(year_list, gdp_list, percents_list)


def save_data(year_list, gdp_list, percents_list):
    # Write one row per year into an Excel workbook
    wk = openpyxl.Workbook()
    sheet = wk.active
    for i in range(len(year_list)):
        sheet.append([year_list[i], gdp_list[i], percents_list[i]])
    wk.save("t2.xlsx")


def extract(s):
    # Pull the number between the parentheses out of the GDP cell and strip the thousands separators
    return int(s[s.find('(') + 1:s.rfind(')')].replace(",", ""))


def delete_char(s):
    # Turn a percentage string such as "24.5%" into a float
    return float(s.replace("%", ""))


def is_correct(s):
    # A valid year cell is a non-empty string made up only of digits
    s = s.strip()
    if s:
        return s.isdigit()
    return False


if __name__ == '__main__':
    content = get_html("https://www.kylc.com/stats/global/yearly_per_country/g_gdp/usa.html")
    parse_content(content)

 

Once the data has been fetched and organized according to certain rules (for the detailed steps, you can refer to my earlier Python crawler posts), it ends up saved in t2.xlsx by the code above.
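A quick way to confirm the file looks right is to read it back. The snippet below is only a minimal sketch: it loads the t2.xlsx written by save_data() above and prints the first few rows (the five-row limit is an arbitrary choice of mine).

import openpyxl

# Read back the workbook written by the crawler and print a few rows as a sanity check
wb = openpyxl.load_workbook("t2.xlsx")
sheet = wb.active
for row in sheet.iter_rows(min_row=1, max_row=5, values_only=True):
    year, gdp, percent = row
    print(year, gdp, percent)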

With the data in hand, we can analyze it. I'll only show some of the key code here; I have uploaded the full code to my Gitee repository, so friends who need it can help themselves:

The repository address is here

 


Data analysis code

 

# Draw the US GDP change chart
import xlrd
import matplotlib.pyplot as plt
import numpy as np

# 1. Get the data
# Note: xlrd 2.x no longer reads .xlsx files; use xlrd 1.2.0 or read the file with openpyxl instead
data = xlrd.open_workbook(r'E:\Python\com\test\GDP\t4.xlsx')
# print(data.sheet_names())
excel = data.sheet_by_name('Sheet')   # read the sheet that holds the table
# print(excel.nrows)

# 2. Set up the chart
plt.rcParams['font.sans-serif'] = ['FangSong']   # set a font so Chinese labels display correctly
plt.figure(figsize=(16, 9))
plt.title("US GDP change chart")
plt.grid(linestyle='-.')                         # build the x/y axes
plt.xlabel('Year')
plt.ylabel('GDP (trillions of dollars)')

# 3. Fill the chart with data
# Loop over every row and store the value of each cell in the matching list
year_list = []
gdp_list = []
percent_list = []
for i in range(excel.nrows):
    year_list.append(excel.cell_value(i, 0))      # column 0: year
    gdp_list.append(excel.cell_value(i, 1))       # column 1: GDP
    percent_list.append(excel.cell_value(i, 2))   # column 2: share of world GDP
# The key step: merge the data into one multidimensional array
arr = np.array(list(zip(year_list, gdp_list, percent_list)), dtype=np.float64)

# 4. Display
plt.plot(arr[:, [0]], arr[:, [1]] / 1000000000000, "dg", label="GDP")
plt.plot(arr[:, [0]], arr[:, [2]], "--r", label="Share of world GDP (%)")
# print(year_list)
plt.legend()
plt.show()
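As for "peeping into the future" from the title: this next snippet is not from the repository code, but one naive way to sketch the future is to fit a straight-line trend to the historical GDP and extend it a few years forward. The arr below merely stands in for the array built above; its rows are approximate figures included only for illustration.

import numpy as np
import matplotlib.pyplot as plt

# Stand-in for the (year, GDP, share) array built in the analysis code above
arr = np.array([
    [2016, 18.7e12, 24.6],
    [2017, 19.5e12, 24.1],
    [2018, 20.5e12, 23.9],
    [2019, 21.4e12, 24.5],
], dtype=np.float64)

years = arr[:, 0]
gdp_trillions = arr[:, 1] / 1_000_000_000_000

# Fit a straight-line trend and extend it five years into the future
coeffs = np.polyfit(years, gdp_trillions, 1)
future_years = np.arange(years.max() + 1, years.max() + 6)
forecast = np.polyval(coeffs, future_years)

plt.plot(years, gdp_trillions, "dg", label="GDP (historical)")
plt.plot(future_years, forecast, "or", label="Naive trend forecast")
plt.legend()
plt.show()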

 

Note:

If you have the latest version of numpy but are following an older online tutorial, you will see a complaint like this:

[Screenshot: the deprecation message numpy prints about np.float]

Don't panic; it doesn't affect how the program runs, it only prints that message. The fix is also very simple.

The old code used np.float to define the data type; after the fix it looks like this:

[Screenshot: the corrected line, using np.float64 instead]

That's all it takes.
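In case the screenshots are hard to read, here is a minimal sketch of the same change; the short sample lists are made up purely so the snippet runs on its own.

import numpy as np

year_list = [2018, 2019, 2020]
gdp_list = [20.5e12, 21.4e12, 20.9e12]
percent_list = [23.9, 24.5, 24.7]

# Old (deprecated since numpy 1.20): dtype=np.float, which was only an alias for Python's float
# New: use the builtin float or an explicit numpy dtype such as np.float64
arr = np.array(list(zip(year_list, gdp_list, percent_list)), dtype=np.float64)
print(arr)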

 

This is a problem that comes with library updates in general: a lot of older code no longer works, or at least complains, and that is very unfriendly to beginners. You may wade through piles of material without realizing that the reason, and often the answer, is printed right there on the console.
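When an old tutorial misbehaves, a first sanity check is to confirm which versions you actually have installed and compare them with what the tutorial used. A minimal sketch:

import sys

import matplotlib
import numpy as np

# Print the interpreter and library versions currently in use
print("Python    :", sys.version.split()[0])
print("numpy     :", np.__version__)
print("matplotlib:", matplotlib.__version__)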

 


—END—

 

Finally, a small New Year's gift: if you need these learning materials, you can follow me and message the background assistant (python1180) to get them.

 

Copyright notice
This article was created by [Java architects Alliance]. Please include a link to the original when reposting. Thanks.
https://pythonmana.com/2021/02/20210222192903167W.html
