Neural Network Basics: Week 2 Exercises of the Deep Learning Course
2022-07-19 08:35:00 【l8947943】
Neural Network Basics
- In logistic regression, given the input $x$ and parameters $w \in \mathbb{R}^{n_x}$, $b \in \mathbb{R}$, how do we generate the output $\hat{y}$?
- $\sigma(Wx)$
- $Wx+b$
- $\sigma(Wx+b)$
- $\tanh(Wx+b)$
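The third option describes the standard logistic-regression forward pass, $\hat{y} = \sigma(w^\top x + b)$. A minimal NumPy sketch (the feature count, seed, and bias value below are illustrative assumptions, not from the quiz):

```python
import numpy as np

def sigmoid(z):
    # Plain sigmoid; fine for small illustrative inputs
    return 1.0 / (1.0 + np.exp(-z))

n_x = 3                              # assumed number of input features
rng = np.random.default_rng(0)
w = rng.standard_normal((n_x, 1))    # parameters w in R^{n_x}
b = 0.5                              # assumed scalar bias
x = rng.standard_normal((n_x, 1))    # a single input example

y_hat = sigmoid(w.T @ x + b)         # output: a probability in (0, 1)
```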
- Suppose that $\hat{y} = 0.9$ and $y = 1$. What is the value of the "Logistic Loss"? Choose the best option.
- $0.005$
- $0.105$
- $+\infty$
- $L(\hat{y},y) = -(\hat{y}\log y + (1-\hat{y})\log(1-y))$
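With $y = 1$, the cross-entropy loss reduces to $-\log\hat{y} = -\log 0.9 \approx 0.105$, which is why the second option is the one usually marked correct. A quick numeric check:

```python
import numpy as np

y_hat, y = 0.9, 1
# Logistic (cross-entropy) loss for a single example
loss = -(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))
print(round(float(loss), 3))  # 0.105
```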
- Suppose img is a (32, 32, 3) array, representing a 32x32 image with 3 color channels: red, green, and blue. How do you reshape this into a column vector $x$?
- x = img.reshape((3,32*32))
- x = img.reshape((32*32,3))
- x = img.reshape((1,32*32,3))
- x = img.reshape((32*32*3,1))
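Only the last option produces a single column containing all $32 \times 32 \times 3 = 3072$ pixel values, which is the layout the course uses for input vectors. Quick verification with a random stand-in image:

```python
import numpy as np

img = np.random.randn(32, 32, 3)     # stand-in for a real image
x = img.reshape((32 * 32 * 3, 1))    # one column vector of all pixels
print(x.shape)  # (3072, 1)
```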
- Consider the following random arrays $a$, $b$, and $c$:
a = np.random.randn(3, 4)  # a.shape = (3, 4)
b = np.random.randn(1, 4)  # b.shape = (1, 4)
c = a + b
What will be the shape of $c$?
- The computation cannot happen because it is not possible to broadcast more than one dimension.
- c.shape = (3, 4)
- c.shape = (1, 4)
- c.shape = (3, 1)
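NumPy broadcasting stretches b's single row across a's three rows, so the sum has a's shape, (3, 4). Checking:

```python
import numpy as np

a = np.random.randn(3, 4)
b = np.random.randn(1, 4)
c = a + b                  # b's row is broadcast over each of a's 3 rows
print(c.shape)  # (3, 4)
```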
- Consider the two following random arrays $a$ and $b$:
a = np.random.randn(1, 3)  # a.shape = (1, 3)
b = np.random.randn(3, 3)  # b.shape = (3, 3)
c = a * b
What will be the shape of $c$?
- The computation cannot happen because the sizes don’t match.
- c.shape = (3, 3)
- c.shape = (1, 3)
- The computation cannot happen because it is not possible to broadcast more than one dimension.
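In NumPy, `*` is element-wise, and a (1, 3) row broadcasts against a (3, 3) matrix, so the result is (3, 3). Checking:

```python
import numpy as np

a = np.random.randn(1, 3)
b = np.random.randn(3, 3)
c = a * b                  # element-wise product; a's row broadcasts down b's rows
print(c.shape)  # (3, 3)
```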
- Suppose you have $n_x$ input features per example. Recall that $X = [x^{(1)}\; x^{(2)}\; \dots\; x^{(m)}]$. What is the dimension of $X$?
- $(m, n_x)$
- $(m, 1)$
- $(n_x, m)$
- $(1, m)$
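The course's convention stacks each example $x^{(i)}$ as a column, so $X$ has shape $(n_x, m)$. A small sketch with assumed sizes:

```python
import numpy as np

n_x, m = 4, 5                                           # assumed sizes
examples = [np.random.randn(n_x, 1) for _ in range(m)]  # m column vectors
X = np.hstack(examples)                                 # one column per example
print(X.shape)  # (4, 5), i.e. (n_x, m)
```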
- Consider the following array:
a = np.array([[2, 1], [1, 3]])
What is the result of np.dot(a,a)?
- $\begin{pmatrix} 5 & 5 \\ 5 & 10 \end{pmatrix}$
- $\begin{pmatrix} 4 & 2 \\ 2 & 6 \end{pmatrix}$
- The computation cannot happen because the sizes don't match. It's going to be an "Error"!
- $\begin{pmatrix} 4 & 1 \\ 1 & 9 \end{pmatrix}$
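np.dot on two 2-D arrays is ordinary matrix multiplication (not an element-wise square), so the top-left entry is $2\cdot2 + 1\cdot1 = 5$ and the first option matches. Verifying:

```python
import numpy as np

a = np.array([[2, 1], [1, 3]])
result = np.dot(a, a)       # matrix product of a with itself
print(result)
# [[ 5  5]
#  [ 5 10]]
```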
- Consider the following code snippet:
a.shape = (4, 3)
b.shape = (4, 1)
for i in range(3):
    for j in range(4):
        c[i][j] = a[j][i] + b[j]
How do you vectorize this?
- c = a + b.T
- c = a + b
- c = a.T + b.T
- c = a.T + b
- Consider the following code:
a = np.random.randn(3, 3)
b = np.random.randn(3, 1)
c = a * b
What will be $c$? (If you're not sure, feel free to run this in Python to find out.)
- This will multiply a 3x3 matrix a with a 3x1 vector, thus resulting in a 3x1 vector. That is, c.shape = (3,1).
- This will invoke broadcasting, so b is copied three times to become (3, 3), and ∗ invokes a matrix multiplication operation of two 3x3 matrices so c.shape will be (3, 3)
- This will invoke broadcasting, so b is copied three times to become (3,3), and ∗ is an element-wise product so c.shape will be (3, 3)
- It will lead to an error since you cannot use “*” to operate on these two matrices. You need to instead use np.dot(a,b)
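As in the earlier question, `*` is element-wise with broadcasting: the (3, 1) column b is effectively copied across a's three columns, giving a (3, 3) result, which is what the third option describes. Checking against an explicit copy:

```python
import numpy as np

a = np.random.randn(3, 3)
b = np.random.randn(3, 1)
c = a * b                          # element-wise, b's column broadcast
tiled = a * np.tile(b, (1, 3))     # same thing with b copied 3 times explicitly
print(c.shape)  # (3, 3)
```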
- Consider the following computational graph (the graph figure is missing from this copy):
- $(a+c), (b-1)$
- $ab+bc+ac$
- $(a-1), (b+c)$
- $(c-1), (a+c)$