torch.Tensor.repeat() in PyTorch Explained
2022-07-18 08:10:00 【cv_lhp】
One. torch.Tensor.repeat() function analysis
1. Description
Official docs: torch.Tensor.repeat(); the function description is shown in the figure below:

2. Functionality
torch.Tensor.repeat() tiles a tensor by repeating it along each dimension.
1) With two arguments (row multiple, column multiple): each value gives the number of repetitions along that dimension, and 1 means no repetition.
2) With three arguments (channel multiple, row multiple, column multiple): likewise, 1 means no repetition along that dimension.
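The rules above can be summarized as a shape check: each repeat argument multiplies the size of the corresponding dimension, and extra leading arguments prepend new dimensions. A minimal sketch (variable names are arbitrary):

```python
import torch

# 2-D input: 2 rows, 3 columns
a = torch.randn(2, 3)

# Two arguments: rows x2, columns x4
r2 = a.repeat(2, 4)
print(r2.shape)        # torch.Size([4, 12])

# Three arguments: a leading "channel" dimension of size 5 is prepended
r3 = a.repeat(5, 2, 4)
print(r3.shape)        # torch.Size([5, 4, 12])
```

In general, out.shape[i] = reps[i] * in.shape[i], where the input is treated as having size-1 dimensions prepended to match the number of arguments.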
3. Code examples
3.1 One-dimensional input with a single argument: the tensor is repeated n times along the columns
import torch

a = torch.randn(3)
a, a.repeat(4)
The result is as follows:
(tensor([ 0.81, -0.57, 0.10]),
tensor([ 0.81, -0.57, 0.10, 0.81, -0.57, 0.10, 0.81, -0.57, 0.10, 0.81,
-0.57, 0.10]))
3.2 One-dimensional input with two arguments (m, n): the tensor is repeated n times along the columns and m times along the rows; the output is two-dimensional
a = torch.randn(3)
a,a.repeat(4,2)
(tensor([ 0.06, -0.76, -0.59]),
tensor([[ 0.06, -0.76, -0.59, 0.06, -0.76, -0.59],
[ 0.06, -0.76, -0.59, 0.06, -0.76, -0.59],
[ 0.06, -0.76, -0.59, 0.06, -0.76, -0.59],
[ 0.06, -0.76, -0.59, 0.06, -0.76, -0.59]]))
3.3 One-dimensional input with three arguments (b, m, n): the tensor is repeated n times along the columns, m times along the rows, and b times along the channels; the output is three-dimensional
a = torch.randn(3)
a,a.repeat(3,4,2)
The output is as follows:
(tensor([2.25, 0.49, 1.47]),
tensor([[[2.25, 0.49, 1.47, 2.25, 0.49, 1.47],
[2.25, 0.49, 1.47, 2.25, 0.49, 1.47],
[2.25, 0.49, 1.47, 2.25, 0.49, 1.47],
[2.25, 0.49, 1.47, 2.25, 0.49, 1.47]],
[[2.25, 0.49, 1.47, 2.25, 0.49, 1.47],
[2.25, 0.49, 1.47, 2.25, 0.49, 1.47],
[2.25, 0.49, 1.47, 2.25, 0.49, 1.47],
[2.25, 0.49, 1.47, 2.25, 0.49, 1.47]],
[[2.25, 0.49, 1.47, 2.25, 0.49, 1.47],
[2.25, 0.49, 1.47, 2.25, 0.49, 1.47],
[2.25, 0.49, 1.47, 2.25, 0.49, 1.47],
[2.25, 0.49, 1.47, 2.25, 0.49, 1.47]]]))
3.4 Two-dimensional input with two arguments (m, n): the tensor is repeated n times along the columns and m times along the rows; the output is two-dimensional. (Note that the number of arguments must be no fewer than the number of input tensor dimensions.)
a = torch.randn(3,2)
a,a.repeat(4,2)
The output is as follows:
(tensor([[-0.58, -1.21],
[-0.35, 0.68],
[ 0.33, 0.70]]),
tensor([[-0.58, -1.21, -0.58, -1.21],
[-0.35, 0.68, -0.35, 0.68],
[ 0.33, 0.70, 0.33, 0.70],
[-0.58, -1.21, -0.58, -1.21],
[-0.35, 0.68, -0.35, 0.68],
[ 0.33, 0.70, 0.33, 0.70],
[-0.58, -1.21, -0.58, -1.21],
[-0.35, 0.68, -0.35, 0.68],
[ 0.33, 0.70, 0.33, 0.70],
[-0.58, -1.21, -0.58, -1.21],
[-0.35, 0.68, -0.35, 0.68],
[ 0.33, 0.70, 0.33, 0.70]]))
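The note above about the number of arguments can be verified directly: passing fewer repeat arguments than the tensor has dimensions raises an error. A minimal sketch:

```python
import torch

a = torch.randn(3, 2)   # two-dimensional input

# One repeat argument for a 2-D tensor is invalid:
try:
    a.repeat(4)
except RuntimeError as e:
    print("RuntimeError:", e)
```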
3.5 Two-dimensional input with three arguments (b, m, n): the tensor is repeated n times along the columns, m times along the rows, and b times along the channels; the output is three-dimensional. (Note that the number of output tensor dimensions equals the number of arguments.)
a = torch.randn(3,2)
a,a.repeat(3,4,2)
The output is as follows:
(tensor([[-0.75, 1.20],
[-1.50, 1.75],
[-0.05, 0.40]]),
tensor([[[-0.75, 1.20, -0.75, 1.20],
[-1.50, 1.75, -1.50, 1.75],
[-0.05, 0.40, -0.05, 0.40],
[-0.75, 1.20, -0.75, 1.20],
[-1.50, 1.75, -1.50, 1.75],
[-0.05, 0.40, -0.05, 0.40],
[-0.75, 1.20, -0.75, 1.20],
[-1.50, 1.75, -1.50, 1.75],
[-0.05, 0.40, -0.05, 0.40],
[-0.75, 1.20, -0.75, 1.20],
[-1.50, 1.75, -1.50, 1.75],
[-0.05, 0.40, -0.05, 0.40]],
[[-0.75, 1.20, -0.75, 1.20],
[-1.50, 1.75, -1.50, 1.75],
[-0.05, 0.40, -0.05, 0.40],
[-0.75, 1.20, -0.75, 1.20],
[-1.50, 1.75, -1.50, 1.75],
[-0.05, 0.40, -0.05, 0.40],
[-0.75, 1.20, -0.75, 1.20],
[-1.50, 1.75, -1.50, 1.75],
[-0.05, 0.40, -0.05, 0.40],
[-0.75, 1.20, -0.75, 1.20],
[-1.50, 1.75, -1.50, 1.75],
[-0.05, 0.40, -0.05, 0.40]],
[[-0.75, 1.20, -0.75, 1.20],
[-1.50, 1.75, -1.50, 1.75],
[-0.05, 0.40, -0.05, 0.40],
[-0.75, 1.20, -0.75, 1.20],
[-1.50, 1.75, -1.50, 1.75],
[-0.05, 0.40, -0.05, 0.40],
[-0.75, 1.20, -0.75, 1.20],
[-1.50, 1.75, -1.50, 1.75],
[-0.05, 0.40, -0.05, 0.40],
[-0.75, 1.20, -0.75, 1.20],
[-1.50, 1.75, -1.50, 1.75],
[-0.05, 0.40, -0.05, 0.40]]]))
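Because the examples above use torch.randn, their values change on every run. A deterministic recap with torch.arange makes the tiling pattern explicit, and also shows that repeat() returns a new tensor with its own copy of the data:

```python
import torch

a = torch.arange(3)      # tensor([0, 1, 2])

# One argument: repeat along the columns
print(a.repeat(2))       # tensor([0, 1, 2, 0, 1, 2])

# Two arguments: 2 rows, each a column-wise repetition
print(a.repeat(2, 2))
# tensor([[0, 1, 2, 0, 1, 2],
#         [0, 1, 2, 0, 1, 2]])

# repeat() copies the data: modifying the result leaves the original intact
b = a.repeat(2)
b[0] = 99
print(a[0])              # tensor(0)
```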