Conquer Antarctica
2021-01-22 03:46:56

Steps to master Python machine learning

Selected from KDnuggets

Python is the most popular machine learning language right now, and there is no shortage of resources for it online. Thinking about starting machine learning from Python? This tutorial may help you get started and go from 0 to 1 in Python machine learning; going from 1 to 100 and becoming a machine learning expert depends on your own efforts. The original tutorial is in two parts, which Machine Heart (Synced) has combined into this article; for the originals, see http://suo.im/KUWgl and http://suo.im/96wD3. The author is KDnuggets associate editor and data scientist Matthew Mayo.

"Getting started" is often the hardest part, especially when there are too many choices and it's difficult to make a decision. The purpose of this tutorial is to help novices with almost no Python or machine learning background grow into knowledgeable practitioners, using only free materials and resources along the way. The main goal of this outline is to guide you through the vast number of available resources. No doubt there are plenty of resources, but which are the best? Which complement each other? And in what order should you work through them?

First, I assume you're not an expert in:

machine learning

Python

any Python library for machine learning, scientific computing, or data analysis

Of course, it helps if you have some basic knowledge of the first two topics, but that's not necessary; you'll just need to spend a little more time early on catching up.

If we're going to use Python to perform machine learning, some basic understanding of Python is important. Fortunately, because Python is a widely used general-purpose programming language with applications in scientific computing and machine learning, beginner tutorials are not hard to find. Your level of Python and programming experience matters for how you get started.

First, you need to install Python. Because we'll be using scientific computing and machine learning packages later, I suggest you install Anaconda, an industrial-strength Python distribution for Linux, OS X, and Windows that bundles the packages needed for machine learning, including numpy, scikit-learn, and matplotlib. It also includes iPython Notebook, the interactive environment used in many of our tutorials. I recommend installing Python 2.7.
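To confirm the environment is ready, you can run a quick sanity check like the following (a minimal sketch; it only assumes the standard import names of the packages Anaconda ships):

```python
# Sanity check: import the core scientific packages bundled with Anaconda.
# If any import fails, the corresponding package still needs to be installed.
import numpy
import matplotlib
import sklearn  # the import name for scikit-learn

print("numpy", numpy.__version__)
print("scikit-learn", sklearn.__version__)
```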

If you don't know how to program, I suggest you start with the following free online book and then move on to the subsequent materials:

Learn Python the Hard Way, by Zed A. Shaw: https://learnpythonthehardway.org/book/

If you have programming experience but don't know Python, or know it only at a very basic level, I suggest you take the following two courses:

Google Developers Python course (strongly recommended for visual learners): http://suo.im/toMzq

An Introduction to Python for Scientific Computing (from M. Scott Shell of UCSB Engineering; a good introduction, about 60 pages): http://suo.im/2cXycM

If you want a 30-minute crash course in Python, look below:

Learn X in Y Minutes (X = Python): http://suo.im/zm6qX

Of course, if you're already an experienced Python programmer, you can skip this step. Even so, I suggest you keep the Python documentation handy: https://www.python.org/doc/

KDnuggets' Zachary Lipton has pointed out that people now judge a "data scientist" by many different standards. This actually reflects the field of machine learning itself, because much of what data scientists do involves using machine learning algorithms to varying degrees. Is deep familiarity with kernel methods necessary to effectively create and gain insight from support vector machines? Of course not. Like almost everything in life, the required depth of theory depends on the practical application. A deep understanding of machine learning algorithms is beyond the scope of this article; it usually requires devoting substantial time to more academic coursework, or at least intensive self-study.

The good news is that for practical purposes you don't need a doctorate in machine learning, just as you don't need a theoretical computer science education to be an effective programmer.

Andrew Ng's machine learning course on Coursera is consistently well received; however, my suggestion is to browse the course notes that a previous student posted online. Skip the notes specific to Octave (a Matlab-like language irrelevant to your Python learning). Be aware that these are not official notes, but they capture the relevant content of Andrew Ng's course material. Of course, if you have the time and interest, you can take Andrew Ng's machine learning course on Coursera now: http://suo.im/2o1uD

Unofficial notes for Andrew Ng's course: http://www.holehouse.org/mlclass/

In addition to the Andrew Ng course mentioned above, there are many other courses to choose from online if you need them. For example, I like Tom Mitchell; here's a video of his recent lectures (together with Maria-Florina Balcan), which is very approachable:

Tom Mitchell's machine learning course: http://suo.im/497arw

You don't need all the notes and videos right now. An effective approach is to go straight to the specific exercises below when you feel ready, and refer back to the relevant parts of the notes and videos as needed.

OK, so we've covered Python programming and gained some understanding of machine learning. Beyond Python itself, there are open-source software libraries commonly used to perform actual machine learning. Broadly speaking, many so-called scientific Python libraries can be used for basic machine learning tasks (this judgment is admittedly somewhat subjective):

numpy, mainly useful for its N-dimensional array objects: http://www.numpy.org/

pandas, a Python data analysis library, including structures such as dataframes: http://pandas.pydata.org/

matplotlib, a 2D plotting library that produces publication-quality charts: http://matplotlib.org/

scikit-learn, machine learning algorithms for data analysis and data mining: http://scikit-learn.org/stable/
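As a tiny taste of how these libraries fit together, here's a minimal sketch (the data is invented for illustration) that builds a numpy array, wraps it in a pandas dataframe, and computes a column statistic:

```python
import numpy as np
import pandas as pd

data = np.arange(12).reshape(4, 3)                 # a small N-dimensional array
df = pd.DataFrame(data, columns=["a", "b", "c"])   # a pandas dataframe
print(df["a"].mean())                              # column "a" holds 0, 3, 6, 9 -> prints 4.5
```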

A good way to get to know these libraries is to work through the following materials:

Scipy Lecture Notes, by Gaël Varoquaux, Emmanuelle Gouillart, and Olav Vahtras: http://www.scipy-lectures.org/

This pandas tutorial is also very good: 10 Minutes to Pandas: http://suo.im/4an6gY

You'll see some other packages later in this tutorial, such as Seaborn, a data visualization library based on matplotlib. The packages above are just some of the core libraries commonly used in Python machine learning, but understanding them should help you avoid confusion when you encounter other packages later.

Let's get started!

First, check your preparation:

Python: ready

Machine learning basics: ready

Numpy: ready

Pandas: ready

Matplotlib: ready

Now it's time to implement machine learning algorithms with scikit-learn, the standard Python machine learning library.

scikit-learn flowchart

Many of the following tutorials and exercises use iPython (Jupyter) Notebook, an interactive environment for executing Python statements. iPython Notebook can easily be found online or downloaded to your local machine.

An iPython Notebook overview from Stanford: http://cs231n.github.io/ipython-tutorial/

Also note that the tutorials below consist of a series of online resources. If you think something in a course is wrong, you can take it up with the author. Our first tutorials start with scikit-learn; I suggest you read the following articles in order before continuing.

Here's an introductory article on scikit-learn, the most commonly used general-purpose machine learning library in Python; it covers the K-nearest neighbors algorithm:

A brief introduction to scikit-learn, by Jake VanderPlas: http://suo.im/3bMdEd

The next one is a deeper, more extended introduction, including starting and completing a project on a famous dataset:

Machine learning case notes, by Randal Olson: http://suo.im/RcPR6

The next one focuses on strategies for evaluating different models in scikit-learn, including training/test set splits:

Model evaluation, by Kevin Markham: http://suo.im/2HIXDD

With the basics of scikit-learn in hand, we can explore some more general-purpose and practical algorithms. We begin with the well-known k-means clustering algorithm, a very simple and efficient method for unsupervised learning problems:

K-means clustering: http://suo.im/40R8zf
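Before diving into the tutorial, a minimal k-means sketch with scikit-learn may help set expectations (the toy points below are invented for illustration):

```python
import numpy as np
from sklearn.cluster import KMeans

# Two obvious groups of 2-D points.
X = np.array([[1.0, 1.0], [1.2, 0.9], [8.0, 8.0], [8.1, 7.9]])
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(sorted(km.labels_.tolist()))  # two points fall in each cluster: [0, 0, 1, 1]
```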

Now we can move back to classification and learn one of the most popular classification algorithms:

Decision trees: http://thegrimmscientist.com/tutorial-decision-trees/
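A decision tree can be fit in a few lines with scikit-learn; this sketch uses the built-in iris dataset, just to show the API shape:

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
tree = DecisionTreeClassifier(random_state=0).fit(X, y)
# A fully grown tree memorizes its training data, hence perfect training accuracy.
print(tree.score(X, y))  # prints 1.0
```

(Evaluating on the training set like this is only to show the call; real evaluation needs a held-out test set, as the model-evaluation tutorial above explains.)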

Having understood classification, we can continue with continuous-value prediction:

Linear regression: http://suo.im/3EV4Qn
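The continuous-value idea in a nutshell, on made-up data that exactly follows y = 2x:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([2.0, 4.0, 6.0, 8.0])               # exactly y = 2x, for illustration
reg = LinearRegression().fit(X, y)
print(round(float(reg.predict([[5.0]])[0]), 3))  # extrapolates to 10.0
```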

We can also apply the idea of regression to classification problems, namely logistic regression:

Logistic regression: http://suo.im/S2beL
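The regression-for-classification idea can be sketched on a toy 1-D dataset (invented for illustration) where the class flips between x = 1 and x = 2:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0, 0, 1, 1])              # class boundary lies between 1 and 2
clf = LogisticRegression().fit(X, y)
print(clf.predict([[-1.0]])[0], clf.predict([[4.0]])[0])  # prints: 0 1
```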

Now that we're familiar with scikit-learn, we can look at more advanced algorithms. First is the support vector machine, a nonlinear classifier that relies on transforming the data into a high-dimensional space.

Support vector machines: http://suo.im/2iZLLa
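A minimal SVM sketch with scikit-learn's `SVC`; the RBF kernel is what performs the implicit high-dimensional transformation mentioned above:

```python
from sklearn.datasets import load_iris
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
# The RBF kernel maps the data into a high-dimensional space implicitly.
svm = SVC(kernel="rbf").fit(X, y)
print(svm.score(X, y) > 0.9)  # prints True
```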

Then we can use the Kaggle Titanic competition to study random forests as an ensemble classifier:

Kaggle Titanic competition (using random forests): http://suo.im/1o7ofe

Dimensionality reduction algorithms are often used to reduce the number of variables in a problem. Principal component analysis is a particular form of unsupervised dimensionality reduction:

Dimensionality reduction: http://suo.im/2k5y2E
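As a quick sketch of dimensionality reduction with PCA (on the built-in iris data, just to show the mechanics):

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X, _ = load_iris(return_X_y=True)
pca = PCA(n_components=2)
X_reduced = pca.fit_transform(X)  # 4 original features -> 2 principal components
print(X_reduced.shape)            # prints (150, 2)
print(round(sum(pca.explained_variance_ratio_), 2))  # fraction of variance retained
```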

Before moving on to step seven, take a moment to reflect on the progress made in a relatively short time.

Using Python and its machine learning libraries, we've covered some of the most common and well-known machine learning algorithms (k-nearest neighbors, k-means clustering, support vector machines, etc.), as well as a powerful ensemble technique (random forests) and some additional machine learning tasks (dimensionality reduction and model validation techniques). Along with some basic machine learning skills, we've started building a useful toolkit.

Next, we'll learn the new must-have tools.

Neural networks have many layers

Deep learning is everywhere. Deep learning builds on neural networks that date back decades, but recent advances, starting a few years ago, have dramatically improved the perceived capability of deep neural networks and generated enormous interest. If you're not familiar with neural networks, KDnuggets has many articles detailing the recent innovations, achievements, and acclaim of deep learning.

This final step does not attempt to review every kind of deep learning, but rather to explore a few simple network implementations in two advanced contemporary Python deep learning libraries. For readers interested in deep learning, I suggest starting with this free online book:

Neural Networks and Deep Learning, by Michael Nielsen: http://neuralnetworksanddeeplearning.com/

1. Theano

Link: http://deeplearning.net/software/theano/

Theano is the first Python deep learning library we'll discuss. In the authors' own words:

Theano is a Python library that allows you to define, optimize, and evaluate mathematical expressions involving multi-dimensional arrays efficiently.

The following introductory deep learning tutorial with Theano is a bit long, but good enough: vividly written and highly rated:

Theano deep learning tutorial, by Colin Raffel: http://suo.im/1mPGHe

2. Caffe

Link: http://caffe.berkeleyvision.org/

The other library we'll test-drive is Caffe. Once again, let's start with the authors:

Caffe is a deep learning framework made with expression, speed, and modularity in mind. It was developed by the Berkeley Vision and Learning Center and community contributors.

This tutorial is one of the best in this article. We've looked at a few interesting examples above, but none competes with the following one, which implements Google's DeepDream via Caffe. It's quite wonderful! After mastering the tutorial, try letting your processor run free, just for fun.

Implementing Google DeepDream with Caffe: http://suo.im/2cUSXS

I can't guarantee it will be quick or easy, but if you put in the time and complete these 7 steps, you'll have become quite good at understanding a large number of machine learning algorithms and implementing them in Python with popular libraries, including some of the most cutting-edge libraries in current deep learning research.

Machine learning algorithms

This article follows the 7-step series on mastering the basics of machine learning with Python. If you've worked through the first part of the series, you should have reached a satisfactory learning pace and skill level; if not, you may want to review the first part first; how long that takes depends on your current level of understanding. I promise it will be worth it. After a quick review, this article focuses more specifically on several machine-learning-related sets of tasks. Because some basic modules can safely be skipped, Python basics, machine learning basics, and so on, we can go straight into the different machine learning algorithms. This time we can better group the tutorials by function.

The first part included the following steps:

1. Python basic skills

2. Machine learning basic skills

3. Python packages overview

4. Getting started with machine learning in Python: introduction & model evaluation

5. Machine learning topics with Python: k-means clustering, decision trees, linear regression & logistic regression

6. Advanced machine learning topics with Python: support vector machines, random forests, PCA dimensionality reduction

7. Deep learning in Python

As mentioned above, if you're starting from scratch, I suggest reading the first part in order. I'll also list all the entry-level material for beginners; installation instructions were included in the previous article.

However, if you've already read it, I'll start with the basics below:

Machine learning key terms explained, by Matthew Mayo. Address: http://suo.im/2URQGm

Wikipedia entry: Statistical classification. Address: http://suo.im/mquen

Machine learning: a complete and detailed overview, by Alex Castrounis. Address: http://suo.im/1yjSSq

If you're looking for alternative or complementary ways to learn the basics of machine learning, I can recommend the video lectures by Shai Ben-David and the textbook by Shai Shalev-Shwartz that I happen to be watching:

Shai Ben-David's introductory machine learning video lectures, University of Waterloo. Address: http://suo.im/1TFlK6

Understanding Machine Learning: From Theory to Algorithms, by Shai Ben-David & Shai Shalev-Shwartz. Address: http://suo.im/1NL0ix

Remember, you don't need to read all of this introductory material to begin the series. The video lectures, textbook, and other resources can be consulted when implementing models with machine learning algorithms, or when applying the relevant concepts in later steps. Judge for yourself.

We start the new material by consolidating our classification techniques and introducing some additional algorithms. While the first part covered decision trees, support vector machines, logistic regression, and the ensemble classifier random forest, here we'll add k-nearest neighbors, the naive Bayes classifier, and the multilayer perceptron.

Scikit-learn classifiers

k-nearest neighbors (kNN) is an example of a simple classifier and a lazy learner, in which all computation happens at classification time (rather than up front during a training step). kNN is non-parametric and decides how to classify an instance by comparing it with its k nearest instances.

k-nearest neighbors classification with Python. Address: http://suo.im/2zqW0t
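The lazy-learner behavior is easy to see in a minimal sketch (toy 1-D data invented for illustration):

```python
from sklearn.neighbors import KNeighborsClassifier

X = [[0.0], [1.0], [10.0], [11.0]]
y = [0, 0, 1, 1]
knn = KNeighborsClassifier(n_neighbors=3).fit(X, y)
# The 3 nearest neighbors of x=2 are 1, 0, and 10 -> majority class 0.
print(knn.predict([[2.0]])[0])  # prints 0
```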

Naive Bayes is a classifier based on Bayes' theorem. It assumes independence among features, i.e., the presence of any particular feature in a class is unrelated to the presence of any other feature in the same class.

Document classification with Scikit-learn, by Zac Stewart. Address: http://suo.im/2uwBm3
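A hedged sketch of naive Bayes document classification in the spirit of that tutorial (the four-document "corpus" and its spam/ham labels are entirely made up):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

docs = ["free money now", "cheap money offer",
        "meeting at noon", "project meeting notes"]
labels = ["spam", "spam", "ham", "ham"]  # invented toy labels

# Bag-of-words counts feed a multinomial naive Bayes classifier.
model = make_pipeline(CountVectorizer(), MultinomialNB()).fit(docs, labels)
print(model.predict(["free offer"])[0])  # prints spam
```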

The multilayer perceptron (MLP) is a simple feedforward neural network consisting of multiple layers of nodes, with each layer fully connected to the one that follows. The multilayer perceptron was introduced in Scikit-learn version 0.18.

First read an overview of the MLP classifier in the Scikit-learn documentation, then practice implementing it with a tutorial.

Neural network models (supervised), Scikit-learn documentation. Address: http://suo.im/3oR76l

A beginner's guide to neural networks with Python and Scikit-learn 0.18!, by Jose Portilla. Address: http://suo.im/2tX6rG
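A minimal MLP sketch on the built-in iris data; the scaler and the small hidden layer are illustrative choices, not prescriptions:

```python
from sklearn.datasets import load_iris
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)
# Scaling the inputs helps the gradient-based MLP training converge.
mlp = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000,
                                  random_state=0))
mlp.fit(X, y)
print(mlp.score(X, y) > 0.9)  # prints True
```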

Let's move on to clustering, a form of unsupervised learning. In the first part we covered the k-means algorithm; here we introduce DBSCAN and expectation-maximization (EM).

Scikit-learn clustering algorithms

First, read these introductory articles; the first is a quick comparison of k-means and EM clustering and a good lead-in to the new forms of clustering, and the second is an overview of the clustering techniques available in Scikit-learn:

Comparing clustering techniques: a concise technical overview, by Matthew Mayo. Address: http://suo.im/4ctIvI

Comparing different clustering algorithms on toy datasets, Scikit-learn documentation. Address: http://suo.im/4uvbbM

Expectation-maximization (EM) is a probabilistic clustering algorithm that determines the probability that an instance belongs to a particular cluster. EM approximates the maximum likelihood or maximum a posteriori estimates of the parameters of a statistical model (Han, Kamber, and Pei). The EM process starts from a set of parameters and iterates until the clustering is maximized with respect to the k clusters.

First read a tutorial on the EM algorithm. Next, look at the relevant Scikit-learn documentation. Finally, follow a tutorial to implement EM clustering yourself in Python.

Expectation-maximization (EM) algorithm tutorial, by Elena Sharova. Address: http://suo.im/33ukYd

Gaussian mixture models, Scikit-learn documentation. Address: http://suo.im/20C2tZ

A quick introduction to building Gaussian mixture models with Python, by Tiago Ramalho. Address: http://suo.im/4oxFsj

If Gaussian mixture models look confusing at first, this passage from the Scikit-learn documentation should ease any unnecessary concerns:

The Gaussian mixture object implements the expectation-maximization (EM) algorithm for fitting mixtures of Gaussian models.
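To make that concrete, here's a small sketch fitting a two-component Gaussian mixture by EM on synthetic data (two well-separated blobs, generated here purely for illustration):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.RandomState(0)
# Two well-separated 1-D Gaussian blobs of 50 points each.
X = np.vstack([rng.normal(0.0, 0.5, (50, 1)),
               rng.normal(5.0, 0.5, (50, 1))])
gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
# Each blob should be assigned to its own mixture component.
print(sorted(np.bincount(gmm.predict(X)).tolist()))  # prints [50, 50]
```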

Density-based spatial clustering of applications with noise (DBSCAN) operates by grouping dense data points together and designating low-density data points as outliers.

First read and follow an example implementation of DBSCAN from Scikit-learn, then follow this concise tutorial:

Demo of the DBSCAN clustering algorithm, Scikit-learn documentation. Address: http://suo.im/1l9tvX

Density-based clustering algorithm (DBSCAN) and implementation. Address: http://suo.im/1LEoXC
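The dense-groups-versus-outliers behavior shows up even in a toy sketch (the points below are invented for illustration):

```python
import numpy as np
from sklearn.cluster import DBSCAN

X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],   # dense group 1
              [5.0, 5.0], [5.1, 5.0], [5.0, 5.1],   # dense group 2
              [20.0, 20.0]])                        # isolated point
db = DBSCAN(eps=0.5, min_samples=3).fit(X)
print(db.labels_.tolist())  # -1 marks the low-density outlier: [0, 0, 0, 1, 1, 1, -1]
```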

The first part dealt with a single ensemble method: random forests (RF). RF has achieved great success as a top-tier classifier over the past few years, but it is certainly not the only ensemble classifier. Here we'll look at bagging, boosting, and voting.

Give me a boost

First, read an overview of these ensemble learners; the first is general, and the second covers them as they relate to Scikit-learn:

Introduction to ensemble learners, by Matthew Mayo. Address: http://suo.im/cLESw

Ensemble methods in Scikit-learn, Scikit-learn documentation. Address: http://suo.im/yFuY9

Then, before continuing with the new ensemble methods, take a quick refresher on random forests with a new tutorial:

Random forests in Python, from Yhat. Address: http://suo.im/2eujI

Bagging, boosting, and voting are different forms of ensemble classifiers, all of which involve building multiple models; however, which algorithms build the models, which data the models use, and how the results are combined vary from scheme to scheme.

Bagging: building multiple models with the same classification algorithm, using different (independent) samples of the training data. Scikit-learn implements a bagging classifier.

Boosting: building multiple models with the same classification algorithm, chaining the models one after another so that each subsequent model improves on the learning of the previous ones. Scikit-learn implements AdaBoost.

Voting: building multiple models with different classification algorithms and using some criterion to determine how best to combine them. Scikit-learn implements a voting classifier.
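The voting idea can be sketched in a few lines with scikit-learn's `VotingClassifier` (the three base estimators below are illustrative choices, on the built-in iris data):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
# Three different algorithms vote on each prediction.
vote = VotingClassifier(estimators=[
    ("lr", LogisticRegression(max_iter=1000)),
    ("dt", DecisionTreeClassifier(random_state=0)),
    ("nb", GaussianNB()),
])
vote.fit(X, y)
print(vote.score(X, y) > 0.9)  # prints True
```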

So why combine models? To approach this question from one particular angle, here is an overview of the bias-variance tradeoff as it relates to boosting, from the Scikit-learn documentation:

Single estimator vs. bagging: bias-variance decomposition, Scikit-learn documentation. Address: http://suo.im/3izlRB

Now that you've read some introductory material on ensemble learners and have a basic understanding of several specific ensemble classifiers, here's how to implement ensemble classifiers in Python with Scikit-learn, from Machine Learning Mastery:

Implementing ensemble machine learning algorithms in Python with Scikit-learn, by Jason Brownlee. Address: http://suo.im/9WEAr

Next we'll continue with ensemble classifiers and explore one of the most popular machine learning algorithms in the world. Gradient boosting has recently made a significant impact on machine learning, becoming one of the most popular and successful algorithms in Kaggle competitions.

Give me a gradient boost

First, read an overview of gradient boosting:

Wikipedia entry: Gradient boosting. Address: http://suo.im/TslWi

Next, learn why gradient boosting is the "winningest" approach in Kaggle competitions:

Why does gradient boosting solve so many Kaggle problems so well? Quora. Address: http://suo.im/3rS6ZO

A Kaggle master explains what gradient boosting is, by Ben Gorman. Address: http://suo.im/3nXlWR

Although Scikit-learn has its own gradient boosting implementation, we'll change things up slightly and use the XGBoost library, which we've mentioned is a faster implementation.
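Before switching over, the core idea can be sketched with Scikit-learn's own built-in implementation (a minimal illustration on the built-in iris data, not a substitute for the XGBoost tutorials below):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import GradientBoostingClassifier

X, y = load_iris(return_X_y=True)
# Each of the 50 shallow trees corrects the errors of the ones before it.
gb = GradientBoostingClassifier(n_estimators=50, random_state=0).fit(X, y)
print(gb.score(X, y) > 0.9)  # prints True
```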

The following links provide some additional information about the XGBoost library, as well as gradient boosting (out of necessity):

Wikipedia entry: XGBoost. Address: http://suo.im/2UlJ3V

The XGBoost library on GitHub. Address: http://suo.im/2JeQI8

XGBoost documentation. Address: http://suo.im/QRRrm

Now, follow this tutorial to bring it all together:

A guide to gradient boosted trees with XGBoost in Python, by Jesse Steinweg-Woods. Address: http://suo.im/4FTqD5

You can also reinforce what you've learned with these more concise examples:

An XGBoost example on Kaggle (Python). Address: http://suo.im/4F9A1J

A simple tutorial on the Iris dataset and XGBoost, by Ieva Zarina. Address: http://suo.im/2Lyb1a

Dimensionality reduction uses a process to obtain a set of principal variables, reducing the number of variables used to build a model from the initial count to a smaller one.

There are two main forms of dimensionality reduction:

1. Feature selection: selecting a subset of the relevant features. Address: http://suo.im/4wlkrj

2. Feature extraction: constructing an informative, non-redundant set of derived feature values. Address: http://suo.im/3Gf0Yw

The following is a pair of commonly used feature extraction methods.

Principal component analysis (PCA) is a statistical procedure that uses an orthogonal transformation to convert a set of observations of possibly correlated variables into a set of linearly uncorrelated variables called principal components. The number of principal components is less than or equal to the number of original variables. The transformation is defined so that the first principal component has the largest possible variance (that is, it accounts for as much of the variability in the data as possible).

The definition above comes from the PCA Wikipedia entry, which you can read further if interested. However, the following overview/tutorial is quite thorough:

Principal component analysis in 3 simple steps, by Sebastian Raschka. Address: http://suo.im/1ahFdW

Linear discriminant analysis (LDA) is a generalization of Fisher's linear discriminant, a method used in statistics, pattern recognition, and machine learning to find a linear combination of features that characterizes or separates two or more classes of objects or events. The resulting combination can be used as a linear classifier or, more commonly, for dimensionality reduction before subsequent classification.

LDA is closely related to analysis of variance (ANOVA) and regression analysis, which likewise attempt to express one dependent variable as a linear combination of other features or measurements. However, ANOVA uses categorical independent variables and a continuous dependent variable, whereas discriminant analysis has continuous independent variables and a categorical dependent variable (the class label).

The definition above also comes from Wikipedia. Here's a complete read:

Linear discriminant analysis, bit by bit, by Sebastian Raschka. Address: http://suo.im/gyDOb

Confused about the actual difference between PCA and LDA for dimensionality reduction? Sebastian Raschka offers the following clarification:

Linear discriminant analysis (LDA) and principal component analysis (PCA) are both linear transformation techniques that are commonly used for dimensionality reduction. PCA can be described as an "unsupervised" algorithm, since it "ignores" class labels and its goal is to find the directions (the so-called principal components) that maximize the variance in the dataset. In contrast to PCA, LDA is "supervised" and computes the directions ("linear discriminants") of the axes that maximize the separation between multiple classes.

For a brief account of this, read the following:

What's the difference between LDA and PCA for dimensionality reduction? by Sebastian Raschka. Address: http://suo.im/2IPt0U
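The supervised/unsupervised distinction is visible right in the API: PCA's `fit_transform` takes only the features, while LDA's also takes the class labels (a minimal sketch on the built-in iris data):

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)
X_pca = PCA(n_components=2).fit_transform(X)        # unsupervised: ignores y
X_lda = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y)  # supervised: uses y
print(X_pca.shape, X_lda.shape)  # prints (150, 2) (150, 2)
```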

The first part provided an entry point for learning about neural networks and deep learning. If your study has gone well so far and you want to consolidate your understanding of neural networks and practice implementing several common neural network models, please keep reading.

First, look at some deep learning fundamentals:

Deep learning key terms and explanations, by Matthew Mayo

7 steps to understanding deep learning, by Matthew Mayo. Address: http://suo.im/3QmEfV

Next, try some concise overviews/tutorials of TensorFlow, Google's open-source machine intelligence library (an effective deep learning framework and now nearly the best neural network tool):

Machine learning made accessible: an introduction to TensorFlow anyone can understand (parts 1 & 2)

An entry-level explanation: an introduction to TensorFlow even beginners can understand (parts 3 & 4)

Finally, try these tutorials straight from the TensorFlow website, which implement some of the most popular and common neural network models:

Recurrent neural networks, Google TensorFlow tutorial. Address: http://suo.im/2gtkze

Convolutional neural networks, Google TensorFlow tutorial. Address: http://suo.im/g8Lbg

In addition, an article on 7 steps to mastering deep learning is in the works, focusing on using the high-level APIs on top of TensorFlow to make model implementation easier and more flexible. I'll add a link here once it's finished.

Related:

5 e-books you should read before entering the machine learning industry. Address: http://suo.im/SlZKt

7 steps to understanding deep learning. Address: http://suo.im/3QmEfV

Machine learning key terms and explanations. Address: http://suo.im/2URQGm
