Table of Contents
- Preface
- What I Learned Today
- 1. Defining the Model Class
- 2. Model Layers
- 3. Model Parameters
Preface:
In Lesson 6 we studied network construction and learned that a neural network model is composed of neural network layers and Tensor operations. The mindspore.nn module we use provides implementations of common neural network layers, and the Cell class serves as the base class for building all networks: a network is assembled from different sub-Cells, and a Cell can represent a complete model or a single layer within it. In this lesson we build a neural network model for classifying the MNIST dataset, to make network construction more concrete and vivid.
What I Learned Today:
1. Defining the Model Class
First we build the model as follows; once it is defined, we instantiate a Network object and inspect its structure:
import mindspore
from mindspore import nn, ops

class Network(nn.Cell):
    def __init__(self):
        super().__init__()
        self.flatten = nn.Flatten()
        self.dense_relu_sequential = nn.SequentialCell(
            nn.Dense(28*28, 512, weight_init="normal", bias_init="zeros"),
            nn.ReLU(),
            nn.Dense(512, 512, weight_init="normal", bias_init="zeros"),
            nn.ReLU(),
            nn.Dense(512, 10, weight_init="normal", bias_init="zeros")
        )

    def construct(self, x):
        x = self.flatten(x)
        logits = self.dense_relu_sequential(x)
        return logits

model = Network()
print(model)
Its printed structure looks like this:
Network<
  (flatten): Flatten<>
  (dense_relu_sequential): SequentialCell<
    (0): Dense<input_channels=784, output_channels=512, has_bias=True>
    (1): ReLU<>
    (2): Dense<input_channels=512, output_channels=512, has_bias=True>
    (3): ReLU<>
    (4): Dense<input_channels=512, output_channels=10, has_bias=True>
    >
  >
Building on this, I also learned how to construct an input, call the model to obtain a 10-dimensional Tensor of raw outputs (logits), and then obtain prediction probabilities by passing them through an nn.Softmax layer; a sketch of this step follows.
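Here is a minimal sketch of that step, following the pattern from the tutorial. The input is simply a tensor of ones standing in for a single 28×28 image, and since the weights are untrained the predicted class is meaningless; it only demonstrates the calls:

X = ops.ones((1, 28, 28), mindspore.float32)  # dummy input: one 28x28 "image" of ones
logits = model(X)                             # raw 10-dimensional model output
pred_probab = nn.Softmax(axis=1)(logits)      # turn logits into class probabilities
y_pred = pred_probab.argmax(1)                # index of the most probable class
print(f"Predicted class: {y_pred}")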
2. Model Layers
For the model layers, we decompose the neural network built above into its individual layers and observe what each layer does.
First we construct a sample input of shape (3, 28, 28), i.e. a minibatch of three 28×28 images (here simply filled with ones):
input_image = ops.ones((3, 28, 28), mindspore.float32)
print(input_image.shape)
(3, 28, 28)
The code and output for each layer are as follows:
# nn.Flatten: flattens each 28x28 image into a contiguous 784-dim vector
flatten = nn.Flatten()
flat_image = flatten(input_image)
print(flat_image.shape)
(3, 784)
Note that the batch dimension (3) is preserved; only the 28×28 image dimensions are flattened into 784.
# nn.Dense: fully connected layer that linearly transforms 784 inputs into 20 outputs
layer1 = nn.Dense(in_channels=28*28, out_channels=20)
hidden1 = layer1(flat_image)
print(hidden1.shape)
(3, 20)
# nn.ReLU: element-wise nonlinear activation that zeroes out negative values
print(f"Before ReLU: {hidden1}\n\n")
hidden1 = nn.ReLU()(hidden1)
print(f"After ReLU: {hidden1}")
Before ReLU: [[-0.04736331  0.2939465  -0.02713677 -0.30988005 -0.11504349 -0.11661264
   0.18007928  0.43213072  0.12091967 -0.17465964  0.53133243  0.12605792
   0.01825903  0.01287796  0.17238477 -0.1621131  -0.0080034  -0.24523425
  -0.10083733  0.05171938]
 [-0.04736331  0.2939465  -0.02713677 -0.30988005 -0.11504349 -0.11661264
   0.18007928  0.43213072  0.12091967 -0.17465964  0.53133243  0.12605792
   0.01825903  0.01287796  0.17238477 -0.1621131  -0.0080034  -0.24523425
  -0.10083733  0.05171938]
 [-0.04736331  0.2939465  -0.02713677 -0.30988005 -0.11504349 -0.11661264
   0.18007928  0.43213072  0.12091967 -0.17465964  0.53133243  0.12605792
   0.01825903  0.01287796  0.17238477 -0.1621131  -0.0080034  -0.24523425
  -0.10083733  0.05171938]]

After ReLU: [[0.         0.2939465  0.         0.         0.         0.
  0.18007928 0.43213072 0.12091967 0.         0.53133243 0.12605792
  0.01825903 0.01287796 0.17238477 0.         0.         0.
  0.         0.05171938]
 [0.         0.2939465  0.         0.         0.         0.
  0.18007928 0.43213072 0.12091967 0.         0.53133243 0.12605792
  0.01825903 0.01287796 0.17238477 0.         0.         0.
  0.         0.05171938]
 [0.         0.2939465  0.         0.         0.         0.
  0.18007928 0.43213072 0.12091967 0.         0.53133243 0.12605792
  0.01825903 0.01287796 0.17238477 0.         0.         0.
  0.         0.05171938]]
# nn.SequentialCell: an ordered container that runs its sub-Cells in sequence
seq_modules = nn.SequentialCell(
    flatten,
    layer1,
    nn.ReLU(),
    nn.Dense(20, 10)
)

logits = seq_modules(input_image)
print(logits.shape)
(3, 10)
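As a small sanity check of my own (not from the tutorial), a SequentialCell call should be equivalent to applying its sub-Cells one by one in order; reusing flatten and layer1 from above:

# My own sanity check: SequentialCell applies its sub-Cells in order,
# so the chained call should match calling the layers manually.
out_layer = nn.Dense(20, 10)
seq = nn.SequentialCell(flatten, layer1, nn.ReLU(), out_layer)

chained = seq(input_image)
manual = out_layer(nn.ReLU()(layer1(flatten(input_image))))
print((chained == manual).all())  # expect True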
3. Model Parameters
I further learned that the layers inside the network carry weight and bias parameters, which are updated as training optimizes the model. We can retrieve each parameter's name and value with model.parameters_and_names():
print(f"Model structure: {model}\n\n")for name, param in model.parameters_and_names():print(f"Layer: {name}\nSize: {param.shape}\nValues : {param[:2]} \n")
Model structure: Network<
  (flatten): Flatten<>
  (dense_relu_sequential): SequentialCell<
    (0): Dense<input_channels=784, output_channels=512, has_bias=True>
    (1): ReLU<>
    (2): Dense<input_channels=512, output_channels=512, has_bias=True>
    (3): ReLU<>
    (4): Dense<input_channels=512, output_channels=10, has_bias=True>
    >
  >

Layer: dense_relu_sequential.0.weight
Size: (512, 784)
Values : [[-0.01491369  0.00353318 -0.00694948 ...  0.01226766 -0.00014423
   0.00544263]
 [ 0.00212971  0.0019974  -0.00624789 ... -0.01214037  0.00118004
  -0.01594325]]

Layer: dense_relu_sequential.0.bias
Size: (512,)
Values : [0. 0.]

Layer: dense_relu_sequential.2.weight
Size: (512, 512)
Values : [[ 0.00565423  0.00354313  0.00637383 ... -0.00352688  0.00262949
   0.01157355]
 [-0.01284141  0.00657666 -0.01217057 ...  0.00318963  0.00319115
  -0.00186801]]

Layer: dense_relu_sequential.2.bias
Size: (512,)
Values : [0. 0.]

Layer: dense_relu_sequential.4.weight
Size: (10, 512)
Values : [[ 0.0087168  -0.00381866 -0.00865665 ... -0.00273731 -0.00391623
   0.00612853]
 [-0.00593031  0.0008721  -0.0060081  ... -0.00271535 -0.00850481
  -0.00820513]]

Layer: dense_relu_sequential.4.bias
Size: (10,)
Values : [0. 0.]
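As an extra note of my own (not from the tutorial): Cell also provides trainable_params(), so we can sum up the total number of trainable parameters. For this network that is 784*512 + 512 + 512*512 + 512 + 512*10 + 10 = 669,706:

# My own addition: total number of trainable parameters in the model.
# Tensor.size is the number of elements in each parameter tensor.
total = sum(p.size for p in model.trainable_params())
print(f"Total trainable parameters: {total}")  # 669706 for this network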
That's all I learned today~
For more on mindspore.nn, see: mindspore.nn link
For the Cell class, see: Cell class link