Accelerated PyTorch learning, Day 5: nn.functional and nn.Module

The sky is full of stars_ 2020-11-13 00:39:01

1. nn.functional and nn.Module

Earlier we introduced PyTorch's tensor structure operations and some of its commonly used mathematical operation APIs.

With these tensor APIs we can build the components of a neural network (such as activation functions, model layers, and loss functions).

In PyTorch, most of the functional components related to neural networks are packaged under the torch.nn module.

Most of these components have both a function implementation and a class implementation.

Among them, nn.functional (usually renamed F after import) contains the function implementations of the various components. For example:

(Activation functions)

  • F.relu
  • F.sigmoid
  • F.tanh
  • F.softmax

(Model layers)

  • F.linear
  • F.conv2d
  • F.max_pool2d
  • F.dropout2d
  • F.embedding

(Loss functions)

  • F.binary_cross_entropy
  • F.mse_loss
  • F.cross_entropy
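As a quick illustration, these functional forms can be chained directly in a forward computation; the shapes below are arbitrary choices for the sketch:

```python
import torch
import torch.nn.functional as F

# A tiny functional-style forward pass: every operation is a plain
# function call, and the weights are passed in explicitly.
x = torch.randn(4, 3)            # batch of 4 samples, 3 features
w = torch.randn(2, 3)            # weight for a 3 -> 2 linear map
b = torch.zeros(2)

h = F.relu(F.linear(x, w, b))    # linear transform + activation
p = F.softmax(h, dim=-1)         # normalize each row to probabilities

print(p.shape)                   # torch.Size([4, 2])
print(p.sum(dim=-1))             # each row sums to 1
```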

To make parameter management easier, these components are usually converted into class implementations by inheriting from nn.Module, and packaged directly under the nn module. For example:

(Activation functions)

  • nn.ReLU
  • nn.Sigmoid
  • nn.Tanh
  • nn.Softmax

(Model layers)

  • nn.Linear
  • nn.Conv2d
  • nn.MaxPool2d
  • nn.Dropout2d
  • nn.Embedding

(Loss functions)

  • nn.BCELoss
  • nn.MSELoss
  • nn.CrossEntropyLoss

In fact, nn.Module can manage not only the parameters it references but also the submodules it references, which makes it very powerful.

2. Using nn.Module to manage parameters

In PyTorch, model parameters need to be trained by an optimizer, so they are usually set up as tensors with requires_grad = True.

At the same time, a model may contain many parameters, and managing them all by hand is not easy.

PyTorch generally represents a parameter as an nn.Parameter, and uses nn.Module to manage all the parameters under its structure.

import torch
from torch import nn
import torch.nn.functional as F

# nn.Parameter has the requires_grad = True attribute
w = nn.Parameter(torch.randn(2, 2))
print(w.requires_grad)  # True

# nn.ParameterList combines multiple nn.Parameters into a list
params_list = nn.ParameterList([nn.Parameter(torch.rand(8, i)) for i in range(1, 3)])

# nn.ParameterDict combines multiple nn.Parameters into a dictionary
params_dict = nn.ParameterDict({"a": nn.Parameter(torch.rand(2, 2)),
                                "b": nn.Parameter(torch.zeros(2))})
print(params_dict)
# ParameterDict(
#     (a): Parameter containing: [torch.FloatTensor of size 2x2]
#     (b): Parameter containing: [torch.FloatTensor of size 2]
# )

# A Module can manage them all
# module.parameters() returns a generator over all the parameters
module = nn.Module()
module.w = w
module.params_list = params_list
module.params_dict = params_dict

num_param = 0
for param in module.parameters():
    num_param = num_param + 1
print("number of Parameters =", num_param)
# number of Parameters = 5


# In practice, a module class is usually built by inheriting from nn.Module,
# placing every part that contains learnable parameters in the constructor.
# The following example is a simplified version of the nn.Linear source code in PyTorch.
# Note that it places the learnable parameters in the __init__ constructor,
# and calls the F.linear function in forward to implement the computation.

class Linear(nn.Module):
    __constants__ = ['in_features', 'out_features']

    def __init__(self, in_features, out_features, bias=True):
        super(Linear, self).__init__()
        self.in_features = in_features
        self.out_features = out_features
        self.weight = nn.Parameter(torch.Tensor(out_features, in_features))
        if bias:
            self.bias = nn.Parameter(torch.Tensor(out_features))
        else:
            self.register_parameter('bias', None)

    def forward(self, input):
        return F.linear(input, self.weight, self.bias)
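A usage sketch of the simplified class, reproduced here with a stand-in random initialization so it is runnable (the real nn.Linear uses a Kaiming-style reset_parameters):

```python
import torch
from torch import nn
import torch.nn.functional as F

class Linear(nn.Module):
    """Simplified Linear, with a concrete random initialization as a stand-in."""
    def __init__(self, in_features, out_features, bias=True):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features))
        if bias:
            self.bias = nn.Parameter(torch.zeros(out_features))
        else:
            self.register_parameter('bias', None)

    def forward(self, input):
        return F.linear(input, self.weight, self.bias)

layer = Linear(3, 2)
out = layer(torch.randn(4, 3))
print(out.shape)                                        # torch.Size([4, 2])
print([name for name, _ in layer.named_parameters()])   # ['weight', 'bias']

# With bias=False, the attribute is registered as None and simply skipped.
layer_no_bias = Linear(3, 2, bias=False)
print(layer_no_bias.bias is None)                       # True
```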

3. Using nn.Module to manage submodules

In general, we seldom use nn.Parameter directly to define the parameters of a model. Instead, models are built by assembling common model layers.

These model layers also inherit from nn.Module; they contain parameters themselves and are submodules of the module we are defining.

nn.Module provides several methods for managing these submodules:

  • children(): returns a generator over all direct submodules of the module.

  • named_children(): returns a generator over all direct submodules of the module, together with their names.

  • modules(): returns a generator over all modules at every level under the module, including the module itself.

  • named_modules(): returns a generator over all modules at every level under the module together with their names, including the module itself.

The children() and named_children() methods are used more often.

The modules() and named_modules() methods are used less often; their functionality can be achieved by nesting multiple named_children() calls.
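A minimal sketch of the difference between children() and modules(), using an arbitrary two-level network:

```python
import torch
from torch import nn

# An arbitrary two-level model: one direct Linear child and one nested Sequential.
model = nn.Sequential(
    nn.Linear(4, 8),
    nn.Sequential(nn.ReLU(), nn.Linear(8, 1)),
)

# children(): only the two direct submodules.
print(len(list(model.children())))    # 2

# modules(): the model itself plus every module at every depth:
# model, Linear, inner Sequential, ReLU, inner Linear = 5
print(len(list(model.modules())))     # 5

# named_children() pairs each direct child with its name.
for name, child in model.named_children():
    print(name, type(child).__name__)
```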

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.embedding = nn.Embedding(num_embeddings=10000, embedding_dim=3, padding_idx=1)
        self.conv = nn.Sequential()
        self.conv.add_module("conv_1", nn.Conv1d(in_channels=3, out_channels=16, kernel_size=5))
        self.conv.add_module("pool_1", nn.MaxPool1d(kernel_size=2))
        self.conv.add_module("conv_2", nn.Conv1d(in_channels=16, out_channels=128, kernel_size=2))
        self.conv.add_module("pool_2", nn.MaxPool1d(kernel_size=2))
        self.dense = nn.Sequential()

    def forward(self, x):
        x = self.embedding(x).transpose(1, 2)
        x = self.conv(x)
        y = self.dense(x)
        return y

net = Net()

i = 0
for child in net.children():
    i += 1
    print(child, "\n")
print("child number", i)

Now let's use named_children to find the embedding layer and set its parameters to untrainable (which is equivalent to freezing the embedding layer).

children_dict = {name: module for name, module in net.named_children()}
embedding = children_dict["embedding"]
embedding.requires_grad_(False)  # freeze its parameters

# The parameters of the first layer can no longer be trained.
for param in embedding.parameters():
    print(param.requires_grad)  # False

from torchkeras import summary
summary(net, input_shape=(200,), input_dtype=torch.LongTensor)
# The number of untrainable parameters has increased.
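If torchkeras is not available, the same check can be done by counting parameters by hand. The toy network below is a stand-in, not the Net defined above:

```python
import torch
from torch import nn

# A stand-in toy network: an embedding followed by a small head.
toy_net = nn.Sequential()
toy_net.add_module("embedding", nn.Embedding(num_embeddings=100, embedding_dim=3))
toy_net.add_module("flatten", nn.Flatten())
toy_net.add_module("linear", nn.Linear(30, 1))

def count_params(module):
    trainable = sum(p.numel() for p in module.parameters() if p.requires_grad)
    total = sum(p.numel() for p in module.parameters())
    return trainable, total

print(count_params(toy_net))     # (331, 331): everything trainable at first

# Freeze the embedding via named_children, as above.
dict(toy_net.named_children())["embedding"].requires_grad_(False)
trainable, total = count_params(toy_net)
print(trainable, total)          # 31 331: the 300 embedding weights are now frozen
```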

This article was created by [The sky is full of stars_]; when reprinting, please include a link to the original.
