Broadcast mechanism in PyTorch
2022-07-18 08:10:00 【cv_lhp】
1. Definition of broadcasting mechanism
If a PyTorch operation supports broadcasting, its tensor arguments can be automatically expanded to equal sizes without copying data. Usually, the smaller tensor is broadcast to match the larger one, so that the two end up with the same shape.
2. Broadcast mechanism rules
2.1 Two tensors are "broadcastable" if the following rules hold:
- each tensor has at least one dimension;
- when iterating over the dimension sizes, starting from the trailing dimension (i.e. from right to left), each pair of dimensions must satisfy one of the following:
- the dimension sizes are equal;
- the sizes differ and one of them is 1;
- the sizes differ and one of them does not exist.
2.2 If two tensors are "broadcastable", the result is computed as follows:
- If the two tensors have different numbers of dimensions, dimensions of size 1 are prepended to the tensor with fewer dimensions until both have the same number.
- For each dimension, the size of the result is the larger of the two tensors' sizes.
- Expanding a tensor along a dimension copies its values.
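The rules above can be sketched in plain Python. This is only an illustration of the rules, not PyTorch's actual implementation, and the helper name broadcast_shape is made up here:

```python
def broadcast_shape(a, b):
    """Return the broadcast result shape of two shapes, following the
    rules above, or raise ValueError if they are not broadcastable."""
    if len(a) == 0 or len(b) == 0:
        raise ValueError("each tensor must have at least one dimension")
    result = []
    # Traverse the dimension sizes from right to left.
    for i in range(1, max(len(a), len(b)) + 1):
        da = a[-i] if i <= len(a) else 1  # a missing dimension acts like 1
        db = b[-i] if i <= len(b) else 1
        if da == db or da == 1 or db == 1:
            result.append(max(da, db))  # the result takes the larger size
        else:
            raise ValueError(f"shapes {a} and {b} are not broadcastable")
    return tuple(reversed(result))

print(broadcast_shape((5, 3, 4, 1), (3, 1, 1)))  # (5, 3, 4, 1)
```

The same shape pairs reappear in the examples below, so this helper can be used to predict which of them succeed.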
3. Code examples
3.1 Tensors with the same shape can always be broadcast.
# Tensors with the same shape can always be broadcast
import torch

x = torch.ones(5, 7, 3)
y = torch.ones(5, 7, 3)
z = x + y
x.shape, y.shape, z.shape
The output is as follows:
(torch.Size([5, 7, 3]), torch.Size([5, 7, 3]), torch.Size([5, 7, 3]))
3.2 x and y cannot be broadcast, because x does not satisfy "each tensor has at least one dimension", so the addition raises an error.
# x and y cannot be broadcast: x does not have at least one dimension
x = torch.ones((0,))
y = torch.ones(5, 7, 3)
z = x + y  # raises RuntimeError
x.shape, y.shape, z.shape

3.3 x and y can be broadcast.
# x and y can be broadcast
x = torch.ones(5, 3, 4, 1)
y = torch.ones(   3, 1, 1)
z = x + y
x.shape, y.shape, z.shape
# Traverse from the trailing dimension:
# 1st trailing dimension: x and y are both 1 (equal).
# 2nd trailing dimension: y is 1 and x is 4; the sizes differ and one is 1, so broadcast to 4.
# 3rd trailing dimension: x and y are both 3 (equal).
# 4th trailing dimension: y's dimension does not exist and x is 5; the sizes differ and one does not exist, so broadcast to 5.
The output is as follows:
(torch.Size([5, 3, 4, 1]), torch.Size([3, 1, 1]), torch.Size([5, 3, 4, 1]))
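Rule 2.2 says that expanding a tensor copies its values. A tiny pure-Python illustration of that copying, for a shape-(3,) row added to a (2, 3) matrix (plain lists here, not PyTorch tensors):

```python
# Broadcasting a shape-(3,) row against a (2, 3) matrix behaves as if
# the row were copied once per matrix row before the elementwise add.
row = [10, 20, 30]
mat = [[1, 2, 3], [4, 5, 6]]
result = [[m + r for m, r in zip(mat_row, row)] for mat_row in mat]
print(result)  # [[11, 22, 33], [14, 25, 36]]
```

PyTorch performs this expansion without materializing the copies, but the arithmetic result is the same.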
3.4 x and y cannot be broadcast, because in the third trailing dimension x is 2 and y is 3: the sizes differ and neither of them is 1.
# x and y cannot be broadcast: in the third trailing dimension x is 2 and y is 3
x = torch.ones(5, 2, 4, 1)
y = torch.ones(   3, 1, 1)
z = x + y  # raises RuntimeError
x.shape, y.shape, z.shape

3.5 x and y can be broadcast: dimensions of size 1 are prepended to y, the tensor with fewer dimensions, until both tensors have the same number of dimensions.
# x and y can be broadcast: dimensions are prepended to y until the counts match
x = torch.ones(5, 2, 4, 1)
y = torch.ones(1, 1)
z = x + y
x.shape, y.shape, z.shape
The output is as follows:
(torch.Size([5, 2, 4, 1]), torch.Size([1, 1]), torch.Size([5, 2, 4, 1]))
4. In-place semantics
An in-place operation modifies a tensor directly in its original memory, without making a copy. In PyTorch, in-place operators are marked with a trailing underscore "_", e.g. .add_() and .scatter_(). An in-place operation is not allowed to use broadcasting to change the shape of the tensor it modifies, as the following example shows.
# x and y cannot be broadcast in place
x = torch.empty(1, 3, 1)
y = torch.empty(3, 1, 7)
z = x.add_(y)  # raises RuntimeError: x's shape would have to grow to (3, 3, 7)
x.shape, y.shape, z.shape
