J2 - ResNet-50v2 in Practice

  • 🍨 This post is a learning-record blog for the 🔗 365天深度学习训练营 (365-day deep learning training camp)
  • 🍖 Original author: K同学啊 | tutoring and project customization available

Contents

  • Environment
  • Steps
    • Environment Setup
    • Data Preparation
      • Inspecting the Images
    • Model Design
      • The ResidualBlock
      • Stacking Blocks: ResidualStack
      • The ResNet50v2 Model
    • Model Training
    • Results
  • Summary and Takeaways


Environment

  • OS: Linux
  • Language: Python 3.8.10
  • Deep learning framework: PyTorch 2.0.0+cu118

Steps

Environment Setup

Package imports:

import torch
import torch.nn as nn
import torch.optim as optim
from torch.utils.data import DataLoader, random_split
from torchvision import datasets, transforms
import copy, random, pathlib
import matplotlib.pyplot as plt
from PIL import Image
from torchinfo import summary
import numpy as np

Set up a global device so that the model and the data are placed on the same device:

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

Data Preparation

Download the bird dataset from the cloud drive provided by K同学啊 and extract it into the data directory. The dataset is laid out as follows:
(figure: directory structure)
Each subfolder under bird_photos contains images of one bird species. This layout can be loaded directly with torchvision.datasets.ImageFolder.

Inspecting the Images

  1. Collect all image paths
root_dir = 'data/bird_photos'
root_directory = pathlib.Path(root_dir)
# materialize the glob generator into a list so random.choice works below
image_list = list(root_directory.glob("*/*"))
  2. Print the shapes of 5 random images
for _ in range(5):
    print(np.array(Image.open(str(random.choice(image_list)))).shape)

(output: image shapes)
All images turn out to be 224x224 with three channels, so the Resize step can be omitted from the dataset transform, or a Resize to 224 kept as a safeguard against outliers.
  3. Display 20 random images

plt.figure(figsize=(20, 4))
for i in range(20):plt.subplot(2, 10, i+1)plt.axis('off')image = random.choice(image_list)class_name = image.parts[-2]plt.title(class_name)plt.imshow(Image.open(str(image)))

(figure: sample images)
  4. Build the dataset
First, define the image preprocessing:

transform = transforms.Compose([
    # images are already 224x224; a transforms.Resize((224, 224)) could be added here as a safeguard
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

Then load the folder with datasets.ImageFolder:

dataset = datasets.ImageFolder(root_dir, transform=transform)

Extract the class names from the dataset:

class_names = [x for x in dataset.class_to_idx]

Split into training and validation sets:

train_size = int(len(dataset) * 0.8)
test_size = len(dataset) - train_size
train_dataset, test_dataset = random_split(dataset, [train_size, test_size])

Finally, wrap the datasets into batched loaders:

batch_size = 8
train_loader = DataLoader(train_dataset, shuffle=True, batch_size=batch_size)
test_loader = DataLoader(test_dataset, batch_size=batch_size)
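As an optional sanity check, we can pull one batch and confirm the tensor shapes match what the model expects (with batch_size=8 as configured above):

# one batch should contain 8 images of 3x224x224 plus 8 labels
images, labels = next(iter(train_loader))
print(images.shape, labels.shape)  # expect torch.Size([8, 3, 224, 224]) and torch.Size([8])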

Model Design

Building on the ResNet-50 from the previous post, the main change in v2 is the order of BatchNormalization and the activation: each residual unit applies BN and ReLU before the convolution (pre-activation) rather than after it.
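The difference is easiest to see side by side. A minimal sketch of the two orderings (illustrative only, not the actual blocks defined below):

# ResNet v1: post-activation -- conv, then BN, then ReLU
v1_unit = nn.Sequential(nn.Conv2d(64, 64, 3, padding=1, bias=False),
                        nn.BatchNorm2d(64),
                        nn.ReLU())

# ResNet v2: pre-activation -- BN and ReLU come before the conv,
# which keeps the residual addition path clean (the identity is never re-normalized)
v2_unit = nn.Sequential(nn.BatchNorm2d(64),
                        nn.ReLU(),
                        nn.Conv2d(64, 64, 3, padding=1, bias=False))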

The ResidualBlock

class ResidualBlock(nn.Module):
    def __init__(self, input_size, filters, kernel_size=3, stride=1, conv_shortcut=False):
        super().__init__()
        # v2 pre-activation: BN + ReLU run before the convolutions
        self.preact = nn.Sequential(nn.BatchNorm2d(input_size), nn.ReLU())
        self.conv_shortcut = conv_shortcut
        if conv_shortcut:
            # 1x1 conv shortcut to match the 4x channel expansion (and stride)
            self.shortcut = nn.Conv2d(input_size, 4*filters, 1, stride=stride, bias=False)
        elif stride > 1:
            # downsample the identity path with a 1x1 max pool
            self.shortcut = nn.MaxPool2d(1, stride=stride)
        else:
            self.shortcut = nn.Identity()
        self.conv1 = nn.Sequential(
            nn.Conv2d(input_size, filters, 1, stride=1, bias=False),
            nn.BatchNorm2d(filters),
            nn.ReLU())
        self.conv2 = nn.Sequential(
            nn.Conv2d(filters, filters, kernel_size, padding=1, stride=stride, bias=False),
            nn.BatchNorm2d(filters),
            nn.ReLU())
        self.conv3 = nn.Conv2d(filters, 4*filters, 1, bias=False)

    def forward(self, x):
        pre = self.preact(x)
        # the conv shortcut branches off the pre-activated input;
        # the identity/pooling shortcut uses the raw input
        if self.conv_shortcut:
            shortcut = self.shortcut(pre)
        else:
            shortcut = self.shortcut(x)
        x = self.conv1(pre)
        x = self.conv2(x)
        x = self.conv3(x)
        x = x + shortcut
        return x
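A quick shape check of the block (hypothetical values, assuming the definition above):

block = ResidualBlock(64, 64, stride=2, conv_shortcut=True)
x = torch.randn(1, 64, 56, 56)
print(block(x).shape)  # torch.Size([1, 256, 28, 28]): channels x4, spatial halved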

Stacking Blocks: ResidualStack

class ResidualStack(nn.Module):
    def __init__(self, input_size, filters, blocks, stride=2):
        super().__init__()
        # the first block widens the channels via a conv shortcut
        self.first = ResidualBlock(input_size, filters, conv_shortcut=True)
        # the middle blocks keep the shape unchanged
        self.module_list = nn.ModuleList([])
        for i in range(2, blocks):
            self.module_list.append(ResidualBlock(filters*4, filters))
        # the last block downsamples (except the final stack, which passes stride=1)
        self.last = ResidualBlock(filters*4, filters, stride=stride)

    def forward(self, x):
        x = self.first(x)
        for layer in self.module_list:
            x = layer(x)
        x = self.last(x)
        return x
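Note how stacks chain together: each stack outputs 4*filters channels, which becomes the next stack's input_size. A small illustrative check:

s1 = ResidualStack(64, 64, 3)    # 64  -> 256 channels
s2 = ResidualStack(256, 128, 4)  # 256 -> 512 channels
x = torch.randn(1, 64, 56, 56)
print(s2(s1(x)).shape)           # torch.Size([1, 512, 14, 14])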

The ResNet50v2 Model

class ResNet50v2(nn.Module):
    def __init__(self, include_top=True, preact=True, use_bias=True,
                 input_size=None, pooling=None, classes=1000,
                 classifier_activation='softmax'):
        super().__init__()
        self.input_conv = nn.Conv2d(3, 64, 7, padding=3, stride=2, bias=use_bias)
        # with pre-activation there is no BN+ReLU right after the stem conv
        if not preact:
            self.pre = nn.Sequential(nn.BatchNorm2d(64), nn.ReLU())
        else:
            self.pre = nn.Identity()
        self.pool = nn.MaxPool2d(3, padding=1, stride=2)
        self.stack1 = ResidualStack(64, 64, 3)
        self.stack2 = ResidualStack(256, 128, 4)
        self.stack3 = ResidualStack(512, 256, 6)
        self.stack4 = ResidualStack(1024, 512, 3, stride=1)
        # pre-activation networks need one final BN+ReLU before the head
        if preact:
            self.post = nn.Sequential(nn.BatchNorm2d(2048), nn.ReLU())
        else:
            self.post = nn.Identity()
        if include_top:
            self.final = nn.Sequential(
                nn.AdaptiveAvgPool2d(1),
                nn.Flatten(),
                nn.Linear(2048, classes),
                # dim=1 makes the softmax dimension explicit (avoids the implicit-dim warning)
                nn.Softmax(dim=1) if 'softmax' == classifier_activation else nn.Identity())
        elif 'avg' == pooling:
            self.final = nn.AdaptiveAvgPool2d(1)
        elif 'max' == pooling:
            self.final = nn.AdaptiveMaxPool2d(1)
        else:
            # fallback so forward() never hits a missing attribute
            self.final = nn.Identity()

    def forward(self, x):
        x = self.input_conv(x)
        x = self.pre(x)
        x = self.pool(x)
        x = self.stack1(x)
        x = self.stack2(x)
        x = self.stack3(x)
        x = self.stack4(x)
        x = self.post(x)
        x = self.final(x)
        return x

Create the model and print a summary:

model = ResNet50v2(classes=len(class_names)).to(device)
summary(model, input_size=(8, 3, 224, 224))
===============================================================================================
Layer (type:depth-idx)                        Output Shape              Param #
===============================================================================================
ResNet50v2                                    [8, 4]                    --
├─Conv2d: 1-1                                 [8, 64, 112, 112]         9,472
├─Identity: 1-2                               [8, 64, 112, 112]         --
├─MaxPool2d: 1-3                              [8, 64, 56, 56]           --
├─ResidualStack: 1-4                          [8, 256, 28, 28]          --
│    └─ResidualBlock: 2-1                     [8, 256, 56, 56]          --
│    │    └─Sequential: 3-1                   [8, 64, 56, 56]           128
│    │    └─Conv2d: 3-2                       [8, 256, 56, 56]          16,384
│    │    └─Sequential: 3-3                   [8, 64, 56, 56]           4,224
│    │    └─Sequential: 3-4                   [8, 64, 56, 56]           36,992
│    │    └─Conv2d: 3-5                       [8, 256, 56, 56]          16,384
│    └─ModuleList: 2-2                        --                        --
│    │    └─ResidualBlock: 3-6                [8, 256, 56, 56]          70,400
│    └─ResidualBlock: 2-3                     [8, 256, 28, 28]          --
│    │    └─Sequential: 3-7                   [8, 256, 56, 56]          512
│    │    └─MaxPool2d: 3-8                    [8, 256, 28, 28]          --
│    │    └─Sequential: 3-9                   [8, 64, 56, 56]           16,512
│    │    └─Sequential: 3-10                  [8, 64, 28, 28]           36,992
│    │    └─Conv2d: 3-11                      [8, 256, 28, 28]          16,384
├─ResidualStack: 1-5                          [8, 512, 14, 14]          --
│    └─ResidualBlock: 2-4                     [8, 512, 28, 28]          --
│    │    └─Sequential: 3-12                  [8, 256, 28, 28]          512
│    │    └─Conv2d: 3-13                      [8, 512, 28, 28]          131,072
│    │    └─Sequential: 3-14                  [8, 128, 28, 28]          33,024
│    │    └─Sequential: 3-15                  [8, 128, 28, 28]          147,712
│    │    └─Conv2d: 3-16                      [8, 512, 28, 28]          65,536
│    └─ModuleList: 2-5                        --                        --
│    │    └─ResidualBlock: 3-17               [8, 512, 28, 28]          280,064
│    │    └─ResidualBlock: 3-18               [8, 512, 28, 28]          280,064
│    └─ResidualBlock: 2-6                     [8, 512, 14, 14]          --
│    │    └─Sequential: 3-19                  [8, 512, 28, 28]          1,024
│    │    └─MaxPool2d: 3-20                   [8, 512, 14, 14]          --
│    │    └─Sequential: 3-21                  [8, 128, 28, 28]          65,792
│    │    └─Sequential: 3-22                  [8, 128, 14, 14]          147,712
│    │    └─Conv2d: 3-23                      [8, 512, 14, 14]          65,536
├─ResidualStack: 1-6                          [8, 1024, 7, 7]           --
│    └─ResidualBlock: 2-7                     [8, 1024, 14, 14]         --
│    │    └─Sequential: 3-24                  [8, 512, 14, 14]          1,024
│    │    └─Conv2d: 3-25                      [8, 1024, 14, 14]         524,288
│    │    └─Sequential: 3-26                  [8, 256, 14, 14]          131,584
│    │    └─Sequential: 3-27                  [8, 256, 14, 14]          590,336
│    │    └─Conv2d: 3-28                      [8, 1024, 14, 14]         262,144
│    └─ModuleList: 2-8                        --                        --
│    │    └─ResidualBlock: 3-29               [8, 1024, 14, 14]         1,117,184
│    │    └─ResidualBlock: 3-30               [8, 1024, 14, 14]         1,117,184
│    │    └─ResidualBlock: 3-31               [8, 1024, 14, 14]         1,117,184
│    │    └─ResidualBlock: 3-32               [8, 1024, 14, 14]         1,117,184
│    └─ResidualBlock: 2-9                     [8, 1024, 7, 7]           --
│    │    └─Sequential: 3-33                  [8, 1024, 14, 14]         2,048
│    │    └─MaxPool2d: 3-34                   [8, 1024, 7, 7]           --
│    │    └─Sequential: 3-35                  [8, 256, 14, 14]          262,656
│    │    └─Sequential: 3-36                  [8, 256, 7, 7]            590,336
│    │    └─Conv2d: 3-37                      [8, 1024, 7, 7]           262,144
├─ResidualStack: 1-7                          [8, 2048, 7, 7]           --
│    └─ResidualBlock: 2-10                    [8, 2048, 7, 7]           --
│    │    └─Sequential: 3-38                  [8, 1024, 7, 7]           2,048
│    │    └─Conv2d: 3-39                      [8, 2048, 7, 7]           2,097,152
│    │    └─Sequential: 3-40                  [8, 512, 7, 7]            525,312
│    │    └─Sequential: 3-41                  [8, 512, 7, 7]            2,360,320
│    │    └─Conv2d: 3-42                      [8, 2048, 7, 7]           1,048,576
│    └─ModuleList: 2-11                       --                        --
│    │    └─ResidualBlock: 3-43               [8, 2048, 7, 7]           4,462,592
│    └─ResidualBlock: 2-12                    [8, 2048, 7, 7]           --
│    │    └─Sequential: 3-44                  [8, 2048, 7, 7]           4,096
│    │    └─Identity: 3-45                    [8, 2048, 7, 7]           --
│    │    └─Sequential: 3-46                  [8, 512, 7, 7]            1,049,600
│    │    └─Sequential: 3-47                  [8, 512, 7, 7]            2,360,320
│    │    └─Conv2d: 3-48                      [8, 2048, 7, 7]           1,048,576
├─Sequential: 1-8                             [8, 2048, 7, 7]           --
│    └─BatchNorm2d: 2-13                      [8, 2048, 7, 7]           4,096
│    └─ReLU: 2-14                             [8, 2048, 7, 7]           --
├─Sequential: 1-9                             [8, 4]                    --
│    └─AdaptiveAvgPool2d: 2-15                [8, 2048, 1, 1]           --
│    └─Flatten: 2-16                          [8, 2048]                 --
│    └─Linear: 2-17                           [8, 4]                    8,196
│    └─Softmax: 2-18                          [8, 4]                    --
===============================================================================================
Total params: 23,508,612
Trainable params: 23,508,612
Non-trainable params: 0
Total mult-adds (G): 27.85
===============================================================================================
Input size (MB): 4.82
Forward/backward pass size (MB): 1051.69
Params size (MB): 94.03
Estimated Total Size (MB): 1150.54
===============================================================================================

Model Training

The training code is identical to the previous post, so it is not repeated here; a rough sketch follows, and after it the full training log.
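For reference only, a minimal sketch of the kind of loop that could produce the log below. The optimizer, scheduler, and best-model tracking here are assumptions (the Lr column suggests a 0.95 decay roughly every 5 epochs), not the original code:

# hypothetical training setup; see the previous post for the original
loss_fn = nn.CrossEntropyLoss()
optimizer = optim.Adam(model.parameters(), lr=1e-5)
scheduler = optim.lr_scheduler.StepLR(optimizer, step_size=5, gamma=0.95)

best_acc, best_weights = 0.0, copy.deepcopy(model.state_dict())
for epoch in range(1, 209):
    model.train()
    for x, y in train_loader:
        x, y = x.to(device), y.to(device)
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
    scheduler.step()

    # evaluate on the held-out split and keep the best weights
    model.eval()
    correct = 0
    with torch.no_grad():
        for x, y in test_loader:
            x, y = x.to(device), y.to(device)
            correct += (model(x).argmax(1) == y).sum().item()
    acc = 100 * correct / len(test_dataset)
    if acc > best_acc:
        best_acc, best_weights = acc, copy.deepcopy(model.state_dict())

One side note on the numbers: if the loop really uses nn.CrossEntropyLoss, the Softmax already applied in the model's head gets softmaxed a second time by the loss, which is why TrainLoss bottoms out near 0.75 (about ln(1 + 3/e) ≈ 0.744 for 4 classes) rather than approaching 0, even at 100% accuracy.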

Epoch: 1, Lr:1e-05, TrainAcc: 34.1, TrainLoss: 1.368, TestAcc: 49.6, TestLoss: 1.338
Epoch: 2, Lr:1e-05, TrainAcc: 52.4, TrainLoss: 1.297, TestAcc: 63.7, TestLoss: 1.192
Epoch: 3, Lr:1e-05, TrainAcc: 59.3, TrainLoss: 1.190, TestAcc: 68.1, TestLoss: 1.090
Epoch: 4, Lr:1e-05, TrainAcc: 61.1, TrainLoss: 1.150, TestAcc: 74.3, TestLoss: 1.049
Epoch: 5, Lr:9.5e-06, TrainAcc: 69.0, TrainLoss: 1.078, TestAcc: 74.3, TestLoss: 1.019
Epoch: 6, Lr:9.5e-06, TrainAcc: 75.7, TrainLoss: 1.029, TestAcc: 77.0, TestLoss: 0.984
Epoch: 7, Lr:9.5e-06, TrainAcc: 75.9, TrainLoss: 1.011, TestAcc: 78.8, TestLoss: 0.972
Epoch: 8, Lr:9.5e-06, TrainAcc: 78.5, TrainLoss: 0.986, TestAcc: 76.1, TestLoss: 0.974
Epoch: 9, Lr:9.5e-06, TrainAcc: 79.6, TrainLoss: 0.973, TestAcc: 82.3, TestLoss: 0.934
Epoch: 10, Lr:9.025e-06, TrainAcc: 83.6, TrainLoss: 0.938, TestAcc: 80.5, TestLoss: 0.931
Epoch: 11, Lr:9.025e-06, TrainAcc: 86.3, TrainLoss: 0.914, TestAcc: 80.5, TestLoss: 0.940
Epoch: 12, Lr:9.025e-06, TrainAcc: 86.1, TrainLoss: 0.914, TestAcc: 79.6, TestLoss: 0.934
Epoch: 13, Lr:9.025e-06, TrainAcc: 87.2, TrainLoss: 0.893, TestAcc: 84.1, TestLoss: 0.918
Epoch: 14, Lr:9.025e-06, TrainAcc: 89.6, TrainLoss: 0.879, TestAcc: 84.1, TestLoss: 0.917
Epoch: 15, Lr:8.573749999999999e-06, TrainAcc: 91.6, TrainLoss: 0.853, TestAcc: 80.5, TestLoss: 0.944
Epoch: 16, Lr:8.573749999999999e-06, TrainAcc: 90.3, TrainLoss: 0.863, TestAcc: 83.2, TestLoss: 0.919
Epoch: 17, Lr:8.573749999999999e-06, TrainAcc: 93.6, TrainLoss: 0.837, TestAcc: 77.0, TestLoss: 0.979
Epoch: 18, Lr:8.573749999999999e-06, TrainAcc: 93.1, TrainLoss: 0.838, TestAcc: 81.4, TestLoss: 0.934
Epoch: 19, Lr:8.573749999999999e-06, TrainAcc: 94.9, TrainLoss: 0.819, TestAcc: 81.4, TestLoss: 0.932
Epoch: 20, Lr:8.1450625e-06, TrainAcc: 94.7, TrainLoss: 0.825, TestAcc: 85.8, TestLoss: 0.918
Epoch: 21, Lr:8.1450625e-06, TrainAcc: 95.8, TrainLoss: 0.810, TestAcc: 80.5, TestLoss: 0.928
Epoch: 22, Lr:8.1450625e-06, TrainAcc: 93.8, TrainLoss: 0.823, TestAcc: 84.1, TestLoss: 0.900
Epoch: 23, Lr:8.1450625e-06, TrainAcc: 94.0, TrainLoss: 0.821, TestAcc: 86.7, TestLoss: 0.891
Epoch: 24, Lr:8.1450625e-06, TrainAcc: 94.5, TrainLoss: 0.829, TestAcc: 84.1, TestLoss: 0.909
Epoch: 25, Lr:7.737809374999999e-06, TrainAcc: 92.3, TrainLoss: 0.836, TestAcc: 77.0, TestLoss: 0.938
Epoch: 26, Lr:7.737809374999999e-06, TrainAcc: 93.8, TrainLoss: 0.828, TestAcc: 85.8, TestLoss: 0.897
Epoch: 27, Lr:7.737809374999999e-06, TrainAcc: 95.1, TrainLoss: 0.806, TestAcc: 85.0, TestLoss: 0.912
Epoch: 28, Lr:7.737809374999999e-06, TrainAcc: 96.9, TrainLoss: 0.794, TestAcc: 81.4, TestLoss: 0.919
Epoch: 29, Lr:7.737809374999999e-06, TrainAcc: 96.9, TrainLoss: 0.796, TestAcc: 83.2, TestLoss: 0.901
Epoch: 30, Lr:7.350918906249998e-06, TrainAcc: 97.3, TrainLoss: 0.789, TestAcc: 85.8, TestLoss: 0.885
Epoch: 31, Lr:7.350918906249998e-06, TrainAcc: 96.2, TrainLoss: 0.805, TestAcc: 83.2, TestLoss: 0.898
Epoch: 32, Lr:7.350918906249998e-06, TrainAcc: 97.8, TrainLoss: 0.791, TestAcc: 81.4, TestLoss: 0.920
Epoch: 33, Lr:7.350918906249998e-06, TrainAcc: 95.4, TrainLoss: 0.802, TestAcc: 82.3, TestLoss: 0.910
Epoch: 34, Lr:7.350918906249998e-06, TrainAcc: 96.5, TrainLoss: 0.792, TestAcc: 87.6, TestLoss: 0.882
Epoch: 35, Lr:6.983372960937498e-06, TrainAcc: 96.2, TrainLoss: 0.793, TestAcc: 85.8, TestLoss: 0.877
Epoch: 36, Lr:6.983372960937498e-06, TrainAcc: 98.2, TrainLoss: 0.780, TestAcc: 86.7, TestLoss: 0.874
Epoch: 37, Lr:6.983372960937498e-06, TrainAcc: 97.8, TrainLoss: 0.775, TestAcc: 87.6, TestLoss: 0.875
Epoch: 38, Lr:6.983372960937498e-06, TrainAcc: 97.6, TrainLoss: 0.790, TestAcc: 87.6, TestLoss: 0.881
Epoch: 39, Lr:6.983372960937498e-06, TrainAcc: 96.9, TrainLoss: 0.787, TestAcc: 81.4, TestLoss: 0.923
Epoch: 40, Lr:6.634204312890623e-06, TrainAcc: 98.5, TrainLoss: 0.774, TestAcc: 85.0, TestLoss: 0.903
Epoch: 41, Lr:6.634204312890623e-06, TrainAcc: 97.3, TrainLoss: 0.793, TestAcc: 90.3, TestLoss: 0.866
Epoch: 42, Lr:6.634204312890623e-06, TrainAcc: 98.2, TrainLoss: 0.779, TestAcc: 87.6, TestLoss: 0.882
Epoch: 43, Lr:6.634204312890623e-06, TrainAcc: 97.6, TrainLoss: 0.784, TestAcc: 84.1, TestLoss: 0.902
Epoch: 44, Lr:6.634204312890623e-06, TrainAcc: 97.3, TrainLoss: 0.786, TestAcc: 85.8, TestLoss: 0.889
Epoch: 45, Lr:6.302494097246091e-06, TrainAcc: 99.1, TrainLoss: 0.774, TestAcc: 86.7, TestLoss: 0.883
Epoch: 46, Lr:6.302494097246091e-06, TrainAcc: 98.2, TrainLoss: 0.774, TestAcc: 83.2, TestLoss: 0.891
Epoch: 47, Lr:6.302494097246091e-06, TrainAcc: 98.9, TrainLoss: 0.770, TestAcc: 85.0, TestLoss: 0.890
Epoch: 48, Lr:6.302494097246091e-06, TrainAcc: 99.1, TrainLoss: 0.766, TestAcc: 85.0, TestLoss: 0.886
Epoch: 49, Lr:6.302494097246091e-06, TrainAcc: 98.7, TrainLoss: 0.771, TestAcc: 86.7, TestLoss: 0.884
Epoch: 50, Lr:5.987369392383788e-06, TrainAcc: 99.3, TrainLoss: 0.758, TestAcc: 87.6, TestLoss: 0.874
Epoch: 51, Lr:5.987369392383788e-06, TrainAcc: 98.5, TrainLoss: 0.772, TestAcc: 87.6, TestLoss: 0.870
Epoch: 52, Lr:5.987369392383788e-06, TrainAcc: 98.7, TrainLoss: 0.767, TestAcc: 82.3, TestLoss: 0.900
Epoch: 53, Lr:5.987369392383788e-06, TrainAcc: 100.0, TrainLoss: 0.754, TestAcc: 85.8, TestLoss: 0.882
Epoch: 54, Lr:5.987369392383788e-06, TrainAcc: 98.7, TrainLoss: 0.770, TestAcc: 85.0, TestLoss: 0.884
Epoch: 55, Lr:5.688000922764597e-06, TrainAcc: 100.0, TrainLoss: 0.756, TestAcc: 85.8, TestLoss: 0.870
Epoch: 56, Lr:5.688000922764597e-06, TrainAcc: 98.5, TrainLoss: 0.772, TestAcc: 86.7, TestLoss: 0.881
Epoch: 57, Lr:5.688000922764597e-06, TrainAcc: 98.5, TrainLoss: 0.768, TestAcc: 85.0, TestLoss: 0.884
Epoch: 58, Lr:5.688000922764597e-06, TrainAcc: 98.9, TrainLoss: 0.766, TestAcc: 85.8, TestLoss: 0.883
Epoch: 59, Lr:5.688000922764597e-06, TrainAcc: 99.6, TrainLoss: 0.765, TestAcc: 86.7, TestLoss: 0.880
Epoch: 60, Lr:5.403600876626367e-06, TrainAcc: 99.3, TrainLoss: 0.766, TestAcc: 85.8, TestLoss: 0.880
Epoch: 61, Lr:5.403600876626367e-06, TrainAcc: 99.8, TrainLoss: 0.755, TestAcc: 85.8, TestLoss: 0.876
Epoch: 62, Lr:5.403600876626367e-06, TrainAcc: 99.6, TrainLoss: 0.759, TestAcc: 85.0, TestLoss: 0.874
Epoch: 63, Lr:5.403600876626367e-06, TrainAcc: 97.6, TrainLoss: 0.772, TestAcc: 86.7, TestLoss: 0.878
Epoch: 64, Lr:5.403600876626367e-06, TrainAcc: 99.6, TrainLoss: 0.761, TestAcc: 88.5, TestLoss: 0.866
Epoch: 65, Lr:5.133420832795049e-06, TrainAcc: 98.5, TrainLoss: 0.771, TestAcc: 85.0, TestLoss: 0.890
Epoch: 66, Lr:5.133420832795049e-06, TrainAcc: 98.9, TrainLoss: 0.763, TestAcc: 85.8, TestLoss: 0.887
Epoch: 67, Lr:5.133420832795049e-06, TrainAcc: 99.1, TrainLoss: 0.766, TestAcc: 85.0, TestLoss: 0.889
Epoch: 68, Lr:5.133420832795049e-06, TrainAcc: 98.7, TrainLoss: 0.763, TestAcc: 88.5, TestLoss: 0.872
Epoch: 69, Lr:5.133420832795049e-06, TrainAcc: 99.1, TrainLoss: 0.762, TestAcc: 87.6, TestLoss: 0.872
Epoch: 70, Lr:4.876749791155296e-06, TrainAcc: 99.6, TrainLoss: 0.757, TestAcc: 85.8, TestLoss: 0.876
Epoch: 71, Lr:4.876749791155296e-06, TrainAcc: 99.3, TrainLoss: 0.762, TestAcc: 85.0, TestLoss: 0.884
Epoch: 72, Lr:4.876749791155296e-06, TrainAcc: 98.9, TrainLoss: 0.767, TestAcc: 85.0, TestLoss: 0.882
Epoch: 73, Lr:4.876749791155296e-06, TrainAcc: 99.3, TrainLoss: 0.758, TestAcc: 85.0, TestLoss: 0.892
Epoch: 74, Lr:4.876749791155296e-06, TrainAcc: 98.7, TrainLoss: 0.768, TestAcc: 84.1, TestLoss: 0.880
Epoch: 75, Lr:4.632912301597531e-06, TrainAcc: 99.1, TrainLoss: 0.763, TestAcc: 87.6, TestLoss: 0.863
Epoch: 76, Lr:4.632912301597531e-06, TrainAcc: 99.6, TrainLoss: 0.757, TestAcc: 87.6, TestLoss: 0.865
Epoch: 77, Lr:4.632912301597531e-06, TrainAcc: 99.1, TrainLoss: 0.759, TestAcc: 81.4, TestLoss: 0.913
Epoch: 78, Lr:4.632912301597531e-06, TrainAcc: 99.3, TrainLoss: 0.758, TestAcc: 86.7, TestLoss: 0.870
Epoch: 79, Lr:4.632912301597531e-06, TrainAcc: 99.3, TrainLoss: 0.760, TestAcc: 81.4, TestLoss: 0.911
Epoch: 80, Lr:4.401266686517654e-06, TrainAcc: 99.3, TrainLoss: 0.761, TestAcc: 86.7, TestLoss: 0.883
Epoch: 81, Lr:4.401266686517654e-06, TrainAcc: 99.6, TrainLoss: 0.761, TestAcc: 92.0, TestLoss: 0.844
Epoch: 82, Lr:4.401266686517654e-06, TrainAcc: 99.6, TrainLoss: 0.758, TestAcc: 86.7, TestLoss: 0.870
Epoch: 83, Lr:4.401266686517654e-06, TrainAcc: 99.8, TrainLoss: 0.757, TestAcc: 88.5, TestLoss: 0.865
Epoch: 84, Lr:4.401266686517654e-06, TrainAcc: 99.6, TrainLoss: 0.756, TestAcc: 88.5, TestLoss: 0.860
Epoch: 85, Lr:4.181203352191771e-06, TrainAcc: 98.2, TrainLoss: 0.769, TestAcc: 87.6, TestLoss: 0.884
Epoch: 86, Lr:4.181203352191771e-06, TrainAcc: 99.3, TrainLoss: 0.756, TestAcc: 90.3, TestLoss: 0.858
Epoch: 87, Lr:4.181203352191771e-06, TrainAcc: 99.1, TrainLoss: 0.758, TestAcc: 85.0, TestLoss: 0.878
Epoch: 88, Lr:4.181203352191771e-06, TrainAcc: 99.6, TrainLoss: 0.756, TestAcc: 91.2, TestLoss: 0.853
Epoch: 89, Lr:4.181203352191771e-06, TrainAcc: 99.8, TrainLoss: 0.752, TestAcc: 86.7, TestLoss: 0.878
Epoch: 90, Lr:3.972143184582182e-06, TrainAcc: 98.5, TrainLoss: 0.762, TestAcc: 87.6, TestLoss: 0.874
Epoch: 91, Lr:3.972143184582182e-06, TrainAcc: 100.0, TrainLoss: 0.751, TestAcc: 88.5, TestLoss: 0.865
Epoch: 92, Lr:3.972143184582182e-06, TrainAcc: 99.3, TrainLoss: 0.756, TestAcc: 85.0, TestLoss: 0.898
Epoch: 93, Lr:3.972143184582182e-06, TrainAcc: 99.3, TrainLoss: 0.758, TestAcc: 85.0, TestLoss: 0.873
Epoch: 94, Lr:3.972143184582182e-06, TrainAcc: 99.3, TrainLoss: 0.755, TestAcc: 85.8, TestLoss: 0.894
Epoch: 95, Lr:3.7735360253530726e-06, TrainAcc: 99.3, TrainLoss: 0.759, TestAcc: 85.8, TestLoss: 0.888
Epoch: 96, Lr:3.7735360253530726e-06, TrainAcc: 99.6, TrainLoss: 0.755, TestAcc: 86.7, TestLoss: 0.868
Epoch: 97, Lr:3.7735360253530726e-06, TrainAcc: 99.3, TrainLoss: 0.756, TestAcc: 78.8, TestLoss: 0.940
Epoch: 98, Lr:3.7735360253530726e-06, TrainAcc: 98.7, TrainLoss: 0.761, TestAcc: 85.0, TestLoss: 0.893
Epoch: 99, Lr:3.7735360253530726e-06, TrainAcc: 99.8, TrainLoss: 0.755, TestAcc: 85.0, TestLoss: 0.888
Epoch: 100, Lr:3.584859224085419e-06, TrainAcc: 98.9, TrainLoss: 0.763, TestAcc: 87.6, TestLoss: 0.866
Epoch: 101, Lr:3.584859224085419e-06, TrainAcc: 99.8, TrainLoss: 0.753, TestAcc: 86.7, TestLoss: 0.879
Epoch: 102, Lr:3.584859224085419e-06, TrainAcc: 99.3, TrainLoss: 0.758, TestAcc: 85.0, TestLoss: 0.879
Epoch: 103, Lr:3.584859224085419e-06, TrainAcc: 99.3, TrainLoss: 0.758, TestAcc: 85.0, TestLoss: 0.885
Epoch: 104, Lr:3.584859224085419e-06, TrainAcc: 98.9, TrainLoss: 0.758, TestAcc: 87.6, TestLoss: 0.871
Epoch: 105, Lr:3.4056162628811484e-06, TrainAcc: 99.8, TrainLoss: 0.755, TestAcc: 91.2, TestLoss: 0.853
Epoch: 106, Lr:3.4056162628811484e-06, TrainAcc: 99.3, TrainLoss: 0.753, TestAcc: 87.6, TestLoss: 0.872
Epoch: 107, Lr:3.4056162628811484e-06, TrainAcc: 99.8, TrainLoss: 0.755, TestAcc: 91.2, TestLoss: 0.861
Epoch: 108, Lr:3.4056162628811484e-06, TrainAcc: 99.8, TrainLoss: 0.754, TestAcc: 90.3, TestLoss: 0.859
Epoch: 109, Lr:3.4056162628811484e-06, TrainAcc: 99.8, TrainLoss: 0.753, TestAcc: 89.4, TestLoss: 0.870
Epoch: 110, Lr:3.2353354497370905e-06, TrainAcc: 99.6, TrainLoss: 0.754, TestAcc: 85.0, TestLoss: 0.878
Epoch: 111, Lr:3.2353354497370905e-06, TrainAcc: 99.3, TrainLoss: 0.760, TestAcc: 86.7, TestLoss: 0.869
Epoch: 112, Lr:3.2353354497370905e-06, TrainAcc: 99.8, TrainLoss: 0.750, TestAcc: 87.6, TestLoss: 0.870
Epoch: 113, Lr:3.2353354497370905e-06, TrainAcc: 100.0, TrainLoss: 0.752, TestAcc: 88.5, TestLoss: 0.870
Epoch: 114, Lr:3.2353354497370905e-06, TrainAcc: 99.6, TrainLoss: 0.755, TestAcc: 89.4, TestLoss: 0.865
Epoch: 115, Lr:3.073568677250236e-06, TrainAcc: 100.0, TrainLoss: 0.752, TestAcc: 88.5, TestLoss: 0.865
Epoch: 116, Lr:3.073568677250236e-06, TrainAcc: 99.8, TrainLoss: 0.751, TestAcc: 86.7, TestLoss: 0.877
Epoch: 117, Lr:3.073568677250236e-06, TrainAcc: 99.6, TrainLoss: 0.754, TestAcc: 89.4, TestLoss: 0.864
Epoch: 118, Lr:3.073568677250236e-06, TrainAcc: 99.3, TrainLoss: 0.759, TestAcc: 85.0, TestLoss: 0.894
Epoch: 119, Lr:3.073568677250236e-06, TrainAcc: 99.8, TrainLoss: 0.754, TestAcc: 87.6, TestLoss: 0.867
Epoch: 120, Lr:2.919890243387724e-06, TrainAcc: 99.6, TrainLoss: 0.755, TestAcc: 89.4, TestLoss: 0.860
Epoch: 121, Lr:2.919890243387724e-06, TrainAcc: 99.1, TrainLoss: 0.758, TestAcc: 86.7, TestLoss: 0.884
Epoch: 122, Lr:2.919890243387724e-06, TrainAcc: 100.0, TrainLoss: 0.749, TestAcc: 86.7, TestLoss: 0.879
Epoch: 123, Lr:2.919890243387724e-06, TrainAcc: 100.0, TrainLoss: 0.749, TestAcc: 87.6, TestLoss: 0.876
Epoch: 124, Lr:2.919890243387724e-06, TrainAcc: 99.8, TrainLoss: 0.749, TestAcc: 89.4, TestLoss: 0.866
Epoch: 125, Lr:2.7738957312183377e-06, TrainAcc: 99.3, TrainLoss: 0.764, TestAcc: 83.2, TestLoss: 0.895
Epoch: 126, Lr:2.7738957312183377e-06, TrainAcc: 100.0, TrainLoss: 0.747, TestAcc: 86.7, TestLoss: 0.875
Epoch: 127, Lr:2.7738957312183377e-06, TrainAcc: 100.0, TrainLoss: 0.748, TestAcc: 90.3, TestLoss: 0.859
Epoch: 128, Lr:2.7738957312183377e-06, TrainAcc: 99.8, TrainLoss: 0.752, TestAcc: 88.5, TestLoss: 0.871
Epoch: 129, Lr:2.7738957312183377e-06, TrainAcc: 99.8, TrainLoss: 0.750, TestAcc: 88.5, TestLoss: 0.866
Epoch: 130, Lr:2.6352009446574206e-06, TrainAcc: 100.0, TrainLoss: 0.748, TestAcc: 89.4, TestLoss: 0.873
Epoch: 131, Lr:2.6352009446574206e-06, TrainAcc: 99.8, TrainLoss: 0.749, TestAcc: 90.3, TestLoss: 0.861
Epoch: 132, Lr:2.6352009446574206e-06, TrainAcc: 100.0, TrainLoss: 0.746, TestAcc: 89.4, TestLoss: 0.868
Epoch: 133, Lr:2.6352009446574206e-06, TrainAcc: 99.8, TrainLoss: 0.750, TestAcc: 87.6, TestLoss: 0.867
Epoch: 134, Lr:2.6352009446574206e-06, TrainAcc: 99.6, TrainLoss: 0.754, TestAcc: 89.4, TestLoss: 0.868
Epoch: 135, Lr:2.5034408974245495e-06, TrainAcc: 100.0, TrainLoss: 0.748, TestAcc: 87.6, TestLoss: 0.873
Epoch: 136, Lr:2.5034408974245495e-06, TrainAcc: 99.6, TrainLoss: 0.754, TestAcc: 89.4, TestLoss: 0.863
Epoch: 137, Lr:2.5034408974245495e-06, TrainAcc: 100.0, TrainLoss: 0.749, TestAcc: 88.5, TestLoss: 0.860
Epoch: 138, Lr:2.5034408974245495e-06, TrainAcc: 99.8, TrainLoss: 0.751, TestAcc: 89.4, TestLoss: 0.861
Epoch: 139, Lr:2.5034408974245495e-06, TrainAcc: 99.8, TrainLoss: 0.754, TestAcc: 88.5, TestLoss: 0.853
Epoch: 140, Lr:2.378268852553322e-06, TrainAcc: 100.0, TrainLoss: 0.751, TestAcc: 91.2, TestLoss: 0.861
Epoch: 141, Lr:2.378268852553322e-06, TrainAcc: 100.0, TrainLoss: 0.755, TestAcc: 89.4, TestLoss: 0.852
Epoch: 142, Lr:2.378268852553322e-06, TrainAcc: 100.0, TrainLoss: 0.751, TestAcc: 88.5, TestLoss: 0.865
Epoch: 143, Lr:2.378268852553322e-06, TrainAcc: 99.8, TrainLoss: 0.752, TestAcc: 85.8, TestLoss: 0.872
Epoch: 144, Lr:2.378268852553322e-06, TrainAcc: 99.8, TrainLoss: 0.751, TestAcc: 85.0, TestLoss: 0.883
Epoch: 145, Lr:2.2593554099256557e-06, TrainAcc: 100.0, TrainLoss: 0.750, TestAcc: 88.5, TestLoss: 0.857
Epoch: 146, Lr:2.2593554099256557e-06, TrainAcc: 100.0, TrainLoss: 0.747, TestAcc: 89.4, TestLoss: 0.866
Epoch: 147, Lr:2.2593554099256557e-06, TrainAcc: 99.8, TrainLoss: 0.751, TestAcc: 91.2, TestLoss: 0.856
Epoch: 148, Lr:2.2593554099256557e-06, TrainAcc: 100.0, TrainLoss: 0.749, TestAcc: 88.5, TestLoss: 0.850
Epoch: 149, Lr:2.2593554099256557e-06, TrainAcc: 99.6, TrainLoss: 0.754, TestAcc: 92.0, TestLoss: 0.856
Epoch: 150, Lr:2.146387639429373e-06, TrainAcc: 99.8, TrainLoss: 0.748, TestAcc: 85.0, TestLoss: 0.883
Epoch: 151, Lr:2.146387639429373e-06, TrainAcc: 100.0, TrainLoss: 0.748, TestAcc: 89.4, TestLoss: 0.856
Epoch: 152, Lr:2.146387639429373e-06, TrainAcc: 100.0, TrainLoss: 0.750, TestAcc: 88.5, TestLoss: 0.854
Epoch: 153, Lr:2.146387639429373e-06, TrainAcc: 99.8, TrainLoss: 0.747, TestAcc: 90.3, TestLoss: 0.863
Epoch: 154, Lr:2.146387639429373e-06, TrainAcc: 100.0, TrainLoss: 0.747, TestAcc: 89.4, TestLoss: 0.849
Epoch: 155, Lr:2.039068257457904e-06, TrainAcc: 99.6, TrainLoss: 0.751, TestAcc: 87.6, TestLoss: 0.866
Epoch: 156, Lr:2.039068257457904e-06, TrainAcc: 99.8, TrainLoss: 0.751, TestAcc: 91.2, TestLoss: 0.855
Epoch: 157, Lr:2.039068257457904e-06, TrainAcc: 99.8, TrainLoss: 0.752, TestAcc: 90.3, TestLoss: 0.858
Epoch: 158, Lr:2.039068257457904e-06, TrainAcc: 99.6, TrainLoss: 0.752, TestAcc: 87.6, TestLoss: 0.867
Epoch: 159, Lr:2.039068257457904e-06, TrainAcc: 99.8, TrainLoss: 0.748, TestAcc: 88.5, TestLoss: 0.859
Epoch: 160, Lr:1.937114844585009e-06, TrainAcc: 100.0, TrainLoss: 0.748, TestAcc: 89.4, TestLoss: 0.849
Epoch: 161, Lr:1.937114844585009e-06, TrainAcc: 100.0, TrainLoss: 0.748, TestAcc: 86.7, TestLoss: 0.874
Epoch: 162, Lr:1.937114844585009e-06, TrainAcc: 99.3, TrainLoss: 0.754, TestAcc: 90.3, TestLoss: 0.859
Epoch: 163, Lr:1.937114844585009e-06, TrainAcc: 99.3, TrainLoss: 0.754, TestAcc: 90.3, TestLoss: 0.853
Epoch: 164, Lr:1.937114844585009e-06, TrainAcc: 100.0, TrainLoss: 0.748, TestAcc: 86.7, TestLoss: 0.868
Epoch: 165, Lr:1.8402591023557584e-06, TrainAcc: 99.3, TrainLoss: 0.754, TestAcc: 87.6, TestLoss: 0.861
Epoch: 166, Lr:1.8402591023557584e-06, TrainAcc: 99.8, TrainLoss: 0.750, TestAcc: 90.3, TestLoss: 0.853
Epoch: 167, Lr:1.8402591023557584e-06, TrainAcc: 99.3, TrainLoss: 0.759, TestAcc: 90.3, TestLoss: 0.854
Epoch: 168, Lr:1.8402591023557584e-06, TrainAcc: 99.6, TrainLoss: 0.753, TestAcc: 90.3, TestLoss: 0.854
Epoch: 169, Lr:1.8402591023557584e-06, TrainAcc: 100.0, TrainLoss: 0.746, TestAcc: 86.7, TestLoss: 0.869
Epoch: 170, Lr:1.7482461472379704e-06, TrainAcc: 100.0, TrainLoss: 0.747, TestAcc: 90.3, TestLoss: 0.858
Epoch: 171, Lr:1.7482461472379704e-06, TrainAcc: 99.8, TrainLoss: 0.751, TestAcc: 85.0, TestLoss: 0.870
Epoch: 172, Lr:1.7482461472379704e-06, TrainAcc: 99.8, TrainLoss: 0.749, TestAcc: 87.6, TestLoss: 0.870
Epoch: 173, Lr:1.7482461472379704e-06, TrainAcc: 100.0, TrainLoss: 0.748, TestAcc: 89.4, TestLoss: 0.853
Epoch: 174, Lr:1.7482461472379704e-06, TrainAcc: 98.2, TrainLoss: 0.764, TestAcc: 90.3, TestLoss: 0.857
Epoch: 175, Lr:1.6608338398760719e-06, TrainAcc: 99.8, TrainLoss: 0.751, TestAcc: 90.3, TestLoss: 0.849
Epoch: 176, Lr:1.6608338398760719e-06, TrainAcc: 99.3, TrainLoss: 0.753, TestAcc: 84.1, TestLoss: 0.892
Epoch: 177, Lr:1.6608338398760719e-06, TrainAcc: 100.0, TrainLoss: 0.746, TestAcc: 88.5, TestLoss: 0.866
Epoch: 178, Lr:1.6608338398760719e-06, TrainAcc: 99.1, TrainLoss: 0.758, TestAcc: 88.5, TestLoss: 0.864
Epoch: 179, Lr:1.6608338398760719e-06, TrainAcc: 100.0, TrainLoss: 0.750, TestAcc: 90.3, TestLoss: 0.861
Epoch: 180, Lr:1.577792147882268e-06, TrainAcc: 100.0, TrainLoss: 0.747, TestAcc: 89.4, TestLoss: 0.863
Epoch: 181, Lr:1.577792147882268e-06, TrainAcc: 100.0, TrainLoss: 0.747, TestAcc: 91.2, TestLoss: 0.851
Epoch: 182, Lr:1.577792147882268e-06, TrainAcc: 100.0, TrainLoss: 0.748, TestAcc: 90.3, TestLoss: 0.863
Epoch: 183, Lr:1.577792147882268e-06, TrainAcc: 100.0, TrainLoss: 0.746, TestAcc: 89.4, TestLoss: 0.855
Epoch: 184, Lr:1.577792147882268e-06, TrainAcc: 100.0, TrainLoss: 0.748, TestAcc: 90.3, TestLoss: 0.846
Epoch: 185, Lr:1.4989025404881547e-06, TrainAcc: 100.0, TrainLoss: 0.747, TestAcc: 89.4, TestLoss: 0.853
Epoch: 186, Lr:1.4989025404881547e-06, TrainAcc: 100.0, TrainLoss: 0.747, TestAcc: 88.5, TestLoss: 0.866
Epoch: 187, Lr:1.4989025404881547e-06, TrainAcc: 100.0, TrainLoss: 0.747, TestAcc: 87.6, TestLoss: 0.862
Epoch: 188, Lr:1.4989025404881547e-06, TrainAcc: 100.0, TrainLoss: 0.747, TestAcc: 86.7, TestLoss: 0.870
Epoch: 189, Lr:1.4989025404881547e-06, TrainAcc: 99.8, TrainLoss: 0.748, TestAcc: 87.6, TestLoss: 0.858
Epoch: 190, Lr:1.4239574134637468e-06, TrainAcc: 100.0, TrainLoss: 0.748, TestAcc: 90.3, TestLoss: 0.841
Epoch: 191, Lr:1.4239574134637468e-06, TrainAcc: 99.6, TrainLoss: 0.752, TestAcc: 87.6, TestLoss: 0.869
Epoch: 192, Lr:1.4239574134637468e-06, TrainAcc: 99.3, TrainLoss: 0.757, TestAcc: 88.5, TestLoss: 0.859
Epoch: 193, Lr:1.4239574134637468e-06, TrainAcc: 100.0, TrainLoss: 0.751, TestAcc: 85.0, TestLoss: 0.886
Epoch: 194, Lr:1.4239574134637468e-06, TrainAcc: 100.0, TrainLoss: 0.746, TestAcc: 92.0, TestLoss: 0.854
Epoch: 195, Lr:1.3527595427905593e-06, TrainAcc: 99.6, TrainLoss: 0.752, TestAcc: 86.7, TestLoss: 0.867
Epoch: 196, Lr:1.3527595427905593e-06, TrainAcc: 100.0, TrainLoss: 0.746, TestAcc: 90.3, TestLoss: 0.858
Epoch: 197, Lr:1.3527595427905593e-06, TrainAcc: 99.8, TrainLoss: 0.750, TestAcc: 84.1, TestLoss: 0.880
Epoch: 198, Lr:1.3527595427905593e-06, TrainAcc: 99.8, TrainLoss: 0.752, TestAcc: 87.6, TestLoss: 0.863
Epoch: 199, Lr:1.3527595427905593e-06, TrainAcc: 100.0, TrainLoss: 0.746, TestAcc: 89.4, TestLoss: 0.860
Epoch: 200, Lr:1.2851215656510314e-06, TrainAcc: 100.0, TrainLoss: 0.747, TestAcc: 87.6, TestLoss: 0.857
Epoch: 201, Lr:1.2851215656510314e-06, TrainAcc: 100.0, TrainLoss: 0.748, TestAcc: 88.5, TestLoss: 0.858
Epoch: 202, Lr:1.2851215656510314e-06, TrainAcc: 99.8, TrainLoss: 0.753, TestAcc: 87.6, TestLoss: 0.876
Epoch: 203, Lr:1.2851215656510314e-06, TrainAcc: 100.0, TrainLoss: 0.746, TestAcc: 91.2, TestLoss: 0.858
Epoch: 204, Lr:1.2851215656510314e-06, TrainAcc: 99.8, TrainLoss: 0.748, TestAcc: 90.3, TestLoss: 0.848
Epoch: 205, Lr:1.2208654873684798e-06, TrainAcc: 100.0, TrainLoss: 0.751, TestAcc: 90.3, TestLoss: 0.855
Epoch: 206, Lr:1.2208654873684798e-06, TrainAcc: 100.0, TrainLoss: 0.748, TestAcc: 90.3, TestLoss: 0.865
Epoch: 207, Lr:1.2208654873684798e-06, TrainAcc: 100.0, TrainLoss: 0.746, TestAcc: 88.5, TestLoss: 0.858
Epoch: 208, Lr:1.2208654873684798e-06, TrainAcc: 98.9, TrainLoss: 0.766, TestAcc: 88.5, TestLoss: 0.861

Results

(figure: training and test loss curves)

(figure: training and test accuracy curves)
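The curves can be reproduced with matplotlib, assuming the per-epoch metrics were collected into lists during training (train_loss_history and friends are hypothetical names):

epochs = range(1, len(train_acc_history) + 1)
plt.figure(figsize=(12, 4))
plt.subplot(1, 2, 1)
plt.plot(epochs, train_loss_history, label='Train Loss')
plt.plot(epochs, test_loss_history, label='Test Loss')
plt.xlabel('Epoch')
plt.legend()
plt.subplot(1, 2, 2)
plt.plot(epochs, train_acc_history, label='Train Acc')
plt.plot(epochs, test_acc_history, label='Test Acc')
plt.xlabel('Epoch')
plt.legend()
plt.show()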

Summary and Takeaways

With BN and ReLU moved in front of the convolutions, the model converges faster and also does better on the test set. Compared with the original author's results, my improvement is more pronounced. My reading is that the author trained a deeper configuration, where the change mainly raises the model's ceiling, whereas my run had probably not fully converged yet. The takeaway is that the pre-activation ordering of BN and ReLU extracts features more effectively and also brings some gain in expressive power.
