MNIST handwritten digit recognition: how I reached 98.2% accuracy with only 886 trainable parameters (parameter / model compression, Keras implementation, model optimization)

1. Project Results

As shown below, the validation accuracy reached 0.9823; in another run (also shown below) it reached 0.9839, so I conservatively quote 0.982.
[screenshot: training results]
[screenshot: training results]

2. Parameter Comparison

First, let's look at the structure and parameters of LeNet-5, which has 61,706 parameters.
[screenshot: LeNet-5 model summary]
Here is my Keras version: only 886 parameters.
[screenshot: my_network model summary]
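Both totals can be reproduced with a few lines of parameter arithmetic: a regular convolution has `k*k*cin*cout + cout` weights, while a depthwise-separable convolution has `k*k*cin` (depthwise) plus `cin*cout + cout` (pointwise). A quick check, assuming the classic LeNet-5 layout (5x5 convolutions with 6 and 16 filters, then 120-84-10 dense layers):

```python
def conv(k, cin, cout):      # regular convolution: k*k*cin*cout weights + cout biases
    return k * k * cin * cout + cout

def sep_conv(k, cin, cout):  # depthwise (k*k*cin) + pointwise (cin*cout + cout)
    return k * k * cin + cin * cout + cout

def dense(n_in, n_out):
    return n_in * n_out + n_out

# Classic LeNet-5: conv 6@5x5 -> pool -> conv 16@5x5 -> pool -> 120 -> 84 -> 10
lenet5 = (conv(5, 1, 6) + conv(5, 6, 16)
          + dense(16 * 5 * 5, 120) + dense(120, 84) + dense(84, 10))

# This article's model: two regular convs, three separable convs, 1x1 classifier
mine = (conv(3, 1, 4) + conv(3, 4, 6)
        + sep_conv(3, 6, 8) + sep_conv(3, 8, 10) + sep_conv(3, 10, 12)
        + conv(1, 12, 10))

print(lenet5, mine)  # 61706 886
```

The arithmetic matches both summaries exactly, and it makes clear where the savings come from: the separable layers and the absence of dense layers.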

Project Code

Let's look at the model code. The conventional convolutions have been replaced with depthwise-separable convolutions.
The zero padding is there mainly to make the later stride-2 downsampling come out evenly.
Five convolutional layers are used: with fewer layers the receptive field would be too small, so extra layers are added to enlarge it (at this model size, I think the receptive field matters more than model complexity), since the digits occupy roughly 70-85% of the image area.

The code is also fairly heavily commented, so feel free to study it.
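The receptive-field claim can be checked with the standard recursion: each layer grows the receptive field by `(k - 1) * jump`, where `jump` (the input-pixel stride of the current feature map) multiplies by each layer's stride. For the five 3x3 convolutions below (strides 1, 2, 2, 2, 2):

```python
# (kernel, stride) of conv1..conv5 in the model below
layers = [(3, 1), (3, 2), (3, 2), (3, 2), (3, 2)]

rf, jump = 1, 1
for k, s in layers:
    rf += (k - 1) * jump  # receptive field grows by (k-1) * current jump
    jump *= s             # stride compounds the jump for deeper layers

print(rf)  # 33
```

A receptive field of 33 pixels covers the entire 32x32 padded input, which is consistent with the reasoning above.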

```python
# I trained on a GPU; if training errors out, replace SeparableConv2D with Conv2D,
# train for 1 epoch, then switch back to SeparableConv2D as below.
def my_network(input_shape=(32, 32, 1), classes=10):
    X_input = Input(input_shape)
    # The padding is mainly there to make the later downsampling come out evenly.
    # Five conv layers are used because fewer layers would give too small a
    # receptive field; the digits occupy roughly 70-85% of the image area.
    X = ZeroPadding2D((2, 2))(X_input)
    X = Conv2D(4, (3, 3), strides=(1, 1), padding='valid', activation='elu', name='conv1')(X)
    X = Conv2D(6, (3, 3), strides=(2, 2), padding='valid', activation='elu', name='conv2')(X)
    X = SeparableConv2D(8, (3, 3), strides=(2, 2), padding='valid', activation='elu', name='conv3')(X)
    X = SeparableConv2D(10, (3, 3), strides=(2, 2), padding='same', activation='elu', name='conv4')(X)
    X = SeparableConv2D(12, (3, 3), strides=(2, 2), padding='valid', activation='elu', name='conv5')(X)
    # A 1x1 convolution serves as the output layer; note the softmax activation.
    X = Conv2D(classes, (1, 1), strides=(1, 1), padding='same', activation='softmax')(X)
    X = keras.layers.Reshape((classes,))(X)
    model = Model(inputs=X_input, outputs=X, name='my_network')
    return model

model_con = my_network(input_shape=(28, 28, 1), classes=10)
# The lr here can be left unset if a learning-rate decay schedule is used.
optimizer = keras.optimizers.Adam(lr=0.001)
model_con.compile(optimizer=optimizer, loss='categorical_crossentropy', metrics=['accuracy'])
model_con.summary()
```
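The role of the `ZeroPadding2D` layer becomes clear when you propagate the spatial size through the network: 28 + 2 + 2 = 32 shrinks cleanly down to a 1x1 map, so the final 1x1 softmax convolution acts directly as the classifier. A small check of the output sizes:

```python
import math

def out_valid(n, k, s):  # output size with 'valid' padding
    return (n - k) // s + 1

def out_same(n, s):      # output size with 'same' padding
    return math.ceil(n / s)

n = 28 + 2 + 2            # ZeroPadding2D((2, 2)) -> 32
n = out_valid(n, 3, 1)    # conv1 -> 30
n = out_valid(n, 3, 2)    # conv2 -> 14
n = out_valid(n, 3, 2)    # conv3 -> 6
n = out_same(n, 2)        # conv4 -> 3
n = out_valid(n, 3, 2)    # conv5 -> 1
print(n)  # 1
```

The 1x1x10 output is then reshaped to the 10 class scores, which is exactly what the `Reshape((classes,))` layer does.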

Training Process

Training uses the Adam optimizer with a learning-rate decay strategy: start with a large learning rate to speed up training, then step down to smaller rates so that the later training converges to a stable result.
Below is the complete output of a 100-epoch run; note that here the initial lr is 0.01 rather than 0.001.

Epoch 1/100
lr change to 0.01
235/235 [==============================] - 2s 7ms/step - loss: 0.1065 - acc: 0.9673 - val_loss: 0.0792 - val_acc: 0.9755
Epoch 2/100
lr change to 0.01
235/235 [==============================] - 1s 6ms/step - loss: 0.0886 - acc: 0.9726 - val_loss: 0.1013 - val_acc: 0.9686
Epoch 3/100
lr change to 0.01
235/235 [==============================] - 1s 6ms/step - loss: 0.0884 - acc: 0.9732 - val_loss: 0.0744 - val_acc: 0.9768
Epoch 4/100
lr change to 0.01
235/235 [==============================] - 2s 6ms/step - loss: 0.0873 - acc: 0.9730 - val_loss: 0.0815 - val_acc: 0.9740
Epoch 5/100
lr change to 0.01
235/235 [==============================] - 1s 6ms/step - loss: 0.0851 - acc: 0.9733 - val_loss: 0.0724 - val_acc: 0.9768
Epoch 6/100
lr change to 0.01
235/235 [==============================] - 2s 6ms/step - loss: 0.0817 - acc: 0.9752 - val_loss: 0.0826 - val_acc: 0.9745
Epoch 7/100
lr change to 0.01
235/235 [==============================] - 1s 6ms/step - loss: 0.0855 - acc: 0.9741 - val_loss: 0.0830 - val_acc: 0.9740
Epoch 8/100
lr change to 0.01
235/235 [==============================] - 2s 6ms/step - loss: 0.0870 - acc: 0.9728 - val_loss: 0.0847 - val_acc: 0.9729
Epoch 9/100
lr change to 0.01
235/235 [==============================] - 2s 7ms/step - loss: 0.0813 - acc: 0.9752 - val_loss: 0.0701 - val_acc: 0.9787
Epoch 10/100
lr change to 0.01
235/235 [==============================] - 2s 7ms/step - loss: 0.0815 - acc: 0.9748 - val_loss: 0.0742 - val_acc: 0.9765
Epoch 11/100
lr change to 0.01
235/235 [==============================] - 2s 7ms/step - loss: 0.0809 - acc: 0.9749 - val_loss: 0.0721 - val_acc: 0.9770
Epoch 12/100
lr change to 0.005
235/235 [==============================] - 1s 6ms/step - loss: 0.0672 - acc: 0.9796 - val_loss: 0.0635 - val_acc: 0.9788
Epoch 13/100
lr change to 0.005
235/235 [==============================] - 2s 6ms/step - loss: 0.0662 - acc: 0.9800 - val_loss: 0.0575 - val_acc: 0.9808
Epoch 14/100
lr change to 0.005
235/235 [==============================] - 2s 6ms/step - loss: 0.0659 - acc: 0.9796 - val_loss: 0.0624 - val_acc: 0.9803
Epoch 15/100
lr change to 0.005
235/235 [==============================] - 1s 6ms/step - loss: 0.0662 - acc: 0.9793 - val_loss: 0.0631 - val_acc: 0.9791
Epoch 16/100
lr change to 0.005
235/235 [==============================] - 2s 6ms/step - loss: 0.0655 - acc: 0.9802 - val_loss: 0.0561 - val_acc: 0.9816
Epoch 17/100
lr change to 0.005
235/235 [==============================] - 2s 7ms/step - loss: 0.0659 - acc: 0.9799 - val_loss: 0.0653 - val_acc: 0.9808
Epoch 18/100
lr change to 0.005
235/235 [==============================] - 2s 6ms/step - loss: 0.0653 - acc: 0.9798 - val_loss: 0.0765 - val_acc: 0.9748
Epoch 19/100
lr change to 0.005
235/235 [==============================] - 2s 7ms/step - loss: 0.0656 - acc: 0.9792 - val_loss: 0.0565 - val_acc: 0.9811
Epoch 20/100
lr change to 0.005
235/235 [==============================] - 2s 7ms/step - loss: 0.0645 - acc: 0.9800 - val_loss: 0.0569 - val_acc: 0.9814
Epoch 21/100
lr change to 0.005
235/235 [==============================] - 2s 7ms/step - loss: 0.0639 - acc: 0.9804 - val_loss: 0.0566 - val_acc: 0.9814
Epoch 22/100
lr change to 0.005
235/235 [==============================] - 2s 7ms/step - loss: 0.0637 - acc: 0.9801 - val_loss: 0.0672 - val_acc: 0.9778
Epoch 23/100
lr change to 0.005
235/235 [==============================] - 2s 7ms/step - loss: 0.0631 - acc: 0.9807 - val_loss: 0.0607 - val_acc: 0.9804
Epoch 24/100
lr change to 0.0025
235/235 [==============================] - 2s 6ms/step - loss: 0.0562 - acc: 0.9825 - val_loss: 0.0587 - val_acc: 0.9804
Epoch 25/100
lr change to 0.0025
235/235 [==============================] - 2s 6ms/step - loss: 0.0550 - acc: 0.9831 - val_loss: 0.0542 - val_acc: 0.9819
Epoch 26/100
lr change to 0.0025
235/235 [==============================] - 2s 7ms/step - loss: 0.0541 - acc: 0.9833 - val_loss: 0.0541 - val_acc: 0.9818
Epoch 27/100
lr change to 0.0025
235/235 [==============================] - 2s 7ms/step - loss: 0.0544 - acc: 0.9829 - val_loss: 0.0545 - val_acc: 0.9825
Epoch 28/100
lr change to 0.0025
235/235 [==============================] - 2s 7ms/step - loss: 0.0548 - acc: 0.9829 - val_loss: 0.0514 - val_acc: 0.9827
Epoch 29/100
lr change to 0.0025
235/235 [==============================] - 2s 7ms/step - loss: 0.0536 - acc: 0.9833 - val_loss: 0.0587 - val_acc: 0.9805
Epoch 30/100
lr change to 0.0025
235/235 [==============================] - 2s 7ms/step - loss: 0.0534 - acc: 0.9831 - val_loss: 0.0683 - val_acc: 0.9783
Epoch 31/100
lr change to 0.0025
235/235 [==============================] - 2s 7ms/step - loss: 0.0535 - acc: 0.9839 - val_loss: 0.0517 - val_acc: 0.9831
Epoch 32/100
lr change to 0.0025
235/235 [==============================] - 2s 7ms/step - loss: 0.0537 - acc: 0.9831 - val_loss: 0.0524 - val_acc: 0.9821
Epoch 33/100
lr change to 0.0025
235/235 [==============================] - 2s 8ms/step - loss: 0.0533 - acc: 0.9833 - val_loss: 0.0543 - val_acc: 0.9820
Epoch 34/100
lr change to 0.0025
235/235 [==============================] - 2s 8ms/step - loss: 0.0527 - acc: 0.9836 - val_loss: 0.0529 - val_acc: 0.9814
Epoch 35/100
lr change to 0.0025
235/235 [==============================] - 2s 7ms/step - loss: 0.0534 - acc: 0.9835 - val_loss: 0.0558 - val_acc: 0.9814
Epoch 36/100
lr change to 0.00125
235/235 [==============================] - 2s 7ms/step - loss: 0.0487 - acc: 0.9852 - val_loss: 0.0523 - val_acc: 0.9823
Epoch 37/100
lr change to 0.00125
235/235 [==============================] - 2s 7ms/step - loss: 0.0478 - acc: 0.9854 - val_loss: 0.0500 - val_acc: 0.9837
Epoch 38/100
lr change to 0.00125
235/235 [==============================] - 2s 7ms/step - loss: 0.0482 - acc: 0.9851 - val_loss: 0.0532 - val_acc: 0.9823
Epoch 39/100
lr change to 0.00125
235/235 [==============================] - 2s 7ms/step - loss: 0.0480 - acc: 0.9849 - val_loss: 0.0509 - val_acc: 0.9832
Epoch 40/100
lr change to 0.00125
235/235 [==============================] - 2s 7ms/step - loss: 0.0477 - acc: 0.9850 - val_loss: 0.0509 - val_acc: 0.9824
Epoch 41/100
lr change to 0.00125
235/235 [==============================] - 2s 7ms/step - loss: 0.0481 - acc: 0.9850 - val_loss: 0.0509 - val_acc: 0.9831
Epoch 42/100
lr change to 0.00125
235/235 [==============================] - 2s 7ms/step - loss: 0.0482 - acc: 0.9847 - val_loss: 0.0511 - val_acc: 0.9827
Epoch 43/100
lr change to 0.00125
235/235 [==============================] - 2s 7ms/step - loss: 0.0480 - acc: 0.9851 - val_loss: 0.0513 - val_acc: 0.9838
Epoch 44/100
lr change to 0.00125
235/235 [==============================] - 2s 7ms/step - loss: 0.0474 - acc: 0.9850 - val_loss: 0.0530 - val_acc: 0.9831
Epoch 45/100
lr change to 0.00125
235/235 [==============================] - 2s 7ms/step - loss: 0.0476 - acc: 0.9852 - val_loss: 0.0513 - val_acc: 0.9831
Epoch 46/100
lr change to 0.00125
235/235 [==============================] - 2s 7ms/step - loss: 0.0470 - acc: 0.9855 - val_loss: 0.0554 - val_acc: 0.9817
Epoch 47/100
lr change to 0.00125
235/235 [==============================] - 2s 7ms/step - loss: 0.0471 - acc: 0.9850 - val_loss: 0.0527 - val_acc: 0.9817
Epoch 48/100
lr change to 0.000625
235/235 [==============================] - 2s 7ms/step - loss: 0.0451 - acc: 0.9857 - val_loss: 0.0506 - val_acc: 0.9828
Epoch 49/100
lr change to 0.000625
235/235 [==============================] - 2s 7ms/step - loss: 0.0448 - acc: 0.9861 - val_loss: 0.0501 - val_acc: 0.9838
Epoch 50/100
lr change to 0.000625
235/235 [==============================] - 2s 7ms/step - loss: 0.0448 - acc: 0.9859 - val_loss: 0.0490 - val_acc: 0.9829
Epoch 51/100
lr change to 0.000625
235/235 [==============================] - 2s 7ms/step - loss: 0.0447 - acc: 0.9865 - val_loss: 0.0501 - val_acc: 0.9834
Epoch 52/100
lr change to 0.000625
235/235 [==============================] - 2s 7ms/step - loss: 0.0444 - acc: 0.9864 - val_loss: 0.0488 - val_acc: 0.9826
Epoch 53/100
lr change to 0.000625
235/235 [==============================] - 2s 7ms/step - loss: 0.0446 - acc: 0.9860 - val_loss: 0.0507 - val_acc: 0.9826
Epoch 54/100
lr change to 0.000625
235/235 [==============================] - 2s 7ms/step - loss: 0.0446 - acc: 0.9856 - val_loss: 0.0511 - val_acc: 0.9827
Epoch 55/100
lr change to 0.000625
235/235 [==============================] - 2s 7ms/step - loss: 0.0446 - acc: 0.9862 - val_loss: 0.0506 - val_acc: 0.9830
Epoch 56/100
lr change to 0.000625
235/235 [==============================] - 2s 7ms/step - loss: 0.0446 - acc: 0.9858 - val_loss: 0.0500 - val_acc: 0.9830
Epoch 57/100
lr change to 0.000625
235/235 [==============================] - 2s 7ms/step - loss: 0.0443 - acc: 0.9861 - val_loss: 0.0517 - val_acc: 0.9831
Epoch 58/100
lr change to 0.000625
235/235 [==============================] - 2s 7ms/step - loss: 0.0442 - acc: 0.9860 - val_loss: 0.0507 - val_acc: 0.9828
Epoch 59/100
lr change to 0.000625
235/235 [==============================] - 2s 7ms/step - loss: 0.0445 - acc: 0.9864 - val_loss: 0.0504 - val_acc: 0.9833
Epoch 60/100
lr change to 0.0003125
235/235 [==============================] - 2s 7ms/step - loss: 0.0430 - acc: 0.9868 - val_loss: 0.0499 - val_acc: 0.9825
Epoch 61/100
lr change to 0.0003125
235/235 [==============================] - 2s 7ms/step - loss: 0.0429 - acc: 0.9865 - val_loss: 0.0507 - val_acc: 0.9834
Epoch 62/100
lr change to 0.0003125
235/235 [==============================] - 2s 7ms/step - loss: 0.0430 - acc: 0.9866 - val_loss: 0.0494 - val_acc: 0.9830
Epoch 63/100
lr change to 0.0003125
235/235 [==============================] - 2s 7ms/step - loss: 0.0432 - acc: 0.9865 - val_loss: 0.0501 - val_acc: 0.9836
Epoch 64/100
lr change to 0.0003125
235/235 [==============================] - 2s 7ms/step - loss: 0.0426 - acc: 0.9866 - val_loss: 0.0518 - val_acc: 0.9830
Epoch 65/100
lr change to 0.0003125
235/235 [==============================] - 2s 7ms/step - loss: 0.0428 - acc: 0.9865 - val_loss: 0.0500 - val_acc: 0.9824
Epoch 66/100
lr change to 0.0003125
235/235 [==============================] - 2s 7ms/step - loss: 0.0427 - acc: 0.9866 - val_loss: 0.0506 - val_acc: 0.9824
Epoch 67/100
lr change to 0.0003125
235/235 [==============================] - 2s 7ms/step - loss: 0.0430 - acc: 0.9866 - val_loss: 0.0504 - val_acc: 0.9826
Epoch 68/100
lr change to 0.0003125
235/235 [==============================] - 2s 7ms/step - loss: 0.0427 - acc: 0.9868 - val_loss: 0.0501 - val_acc: 0.9832
Epoch 69/100
lr change to 0.0003125
235/235 [==============================] - 2s 7ms/step - loss: 0.0425 - acc: 0.9866 - val_loss: 0.0505 - val_acc: 0.9823
Epoch 70/100
lr change to 0.0003125
235/235 [==============================] - 2s 7ms/step - loss: 0.0427 - acc: 0.9869 - val_loss: 0.0505 - val_acc: 0.9839
Epoch 71/100
lr change to 0.0003125
235/235 [==============================] - 2s 7ms/step - loss: 0.0432 - acc: 0.9868 - val_loss: 0.0494 - val_acc: 0.9838
Epoch 72/100
lr change to 0.00015625
235/235 [==============================] - 2s 7ms/step - loss: 0.0419 - acc: 0.9870 - val_loss: 0.0493 - val_acc: 0.9833
Epoch 73/100
lr change to 0.00015625
235/235 [==============================] - 2s 7ms/step - loss: 0.0419 - acc: 0.9871 - val_loss: 0.0498 - val_acc: 0.9835
Epoch 74/100
lr change to 0.00015625
235/235 [==============================] - 2s 7ms/step - loss: 0.0419 - acc: 0.9871 - val_loss: 0.0496 - val_acc: 0.9831
Epoch 75/100
lr change to 0.00015625
235/235 [==============================] - 2s 7ms/step - loss: 0.0419 - acc: 0.9868 - val_loss: 0.0499 - val_acc: 0.9834
Epoch 76/100
lr change to 0.00015625
235/235 [==============================] - 2s 7ms/step - loss: 0.0419 - acc: 0.9868 - val_loss: 0.0496 - val_acc: 0.9835
Epoch 77/100
lr change to 0.00015625
235/235 [==============================] - 2s 7ms/step - loss: 0.0420 - acc: 0.9870 - val_loss: 0.0500 - val_acc: 0.9834
Epoch 78/100
lr change to 0.00015625
235/235 [==============================] - 2s 7ms/step - loss: 0.0419 - acc: 0.9871 - val_loss: 0.0496 - val_acc: 0.9833
Epoch 79/100
lr change to 0.00015625
235/235 [==============================] - 2s 7ms/step - loss: 0.0419 - acc: 0.9868 - val_loss: 0.0497 - val_acc: 0.9835
Epoch 80/100
lr change to 0.00015625
235/235 [==============================] - 2s 7ms/step - loss: 0.0419 - acc: 0.9869 - val_loss: 0.0499 - val_acc: 0.9835
Epoch 81/100
lr change to 0.00015625
235/235 [==============================] - 2s 7ms/step - loss: 0.0420 - acc: 0.9866 - val_loss: 0.0499 - val_acc: 0.9830
Epoch 82/100
lr change to 0.00015625
235/235 [==============================] - 2s 7ms/step - loss: 0.0418 - acc: 0.9870 - val_loss: 0.0503 - val_acc: 0.9839
Epoch 83/100
lr change to 0.00015625
235/235 [==============================] - 2s 7ms/step - loss: 0.0420 - acc: 0.9868 - val_loss: 0.0494 - val_acc: 0.9832
Epoch 84/100
lr change to 7.8125e-05
235/235 [==============================] - 2s 7ms/step - loss: 0.0414 - acc: 0.9873 - val_loss: 0.0499 - val_acc: 0.9837
Epoch 85/100
lr change to 7.8125e-05
235/235 [==============================] - 2s 7ms/step - loss: 0.0416 - acc: 0.9871 - val_loss: 0.0496 - val_acc: 0.9833
Epoch 86/100
lr change to 7.8125e-05
235/235 [==============================] - 2s 7ms/step - loss: 0.0414 - acc: 0.9872 - val_loss: 0.0501 - val_acc: 0.9837
Epoch 87/100
lr change to 7.8125e-05
235/235 [==============================] - 2s 7ms/step - loss: 0.0415 - acc: 0.9871 - val_loss: 0.0502 - val_acc: 0.9836
Epoch 88/100
lr change to 7.8125e-05
235/235 [==============================] - 2s 7ms/step - loss: 0.0415 - acc: 0.9871 - val_loss: 0.0501 - val_acc: 0.9834
Epoch 89/100
lr change to 7.8125e-05
235/235 [==============================] - 2s 7ms/step - loss: 0.0414 - acc: 0.9870 - val_loss: 0.0496 - val_acc: 0.9835
Epoch 90/100
lr change to 7.8125e-05
235/235 [==============================] - 2s 7ms/step - loss: 0.0414 - acc: 0.9872 - val_loss: 0.0498 - val_acc: 0.9837
Epoch 91/100
lr change to 7.8125e-05
235/235 [==============================] - 2s 7ms/step - loss: 0.0416 - acc: 0.9870 - val_loss: 0.0502 - val_acc: 0.9839
Epoch 92/100
lr change to 7.8125e-05
235/235 [==============================] - 2s 7ms/step - loss: 0.0414 - acc: 0.9872 - val_loss: 0.0503 - val_acc: 0.9834
Epoch 93/100
lr change to 7.8125e-05
235/235 [==============================] - 2s 7ms/step - loss: 0.0415 - acc: 0.9870 - val_loss: 0.0500 - val_acc: 0.9836
Epoch 94/100
lr change to 7.8125e-05
235/235 [==============================] - 2s 7ms/step - loss: 0.0415 - acc: 0.9869 - val_loss: 0.0499 - val_acc: 0.9832
Epoch 95/100
lr change to 7.8125e-05
235/235 [==============================] - 2s 7ms/step - loss: 0.0415 - acc: 0.9871 - val_loss: 0.0502 - val_acc: 0.9836
Epoch 96/100
lr change to 6.25e-05
235/235 [==============================] - 2s 7ms/step - loss: 0.0413 - acc: 0.9871 - val_loss: 0.0501 - val_acc: 0.9838
Epoch 97/100
lr change to 6.25e-05
235/235 [==============================] - 2s 7ms/step - loss: 0.0413 - acc: 0.9872 - val_loss: 0.0501 - val_acc: 0.9837
Epoch 98/100
lr change to 6.25e-05
235/235 [==============================] - 2s 7ms/step - loss: 0.0413 - acc: 0.9872 - val_loss: 0.0503 - val_acc: 0.9834
Epoch 99/100
lr change to 6.25e-05
235/235 [==============================] - 2s 7ms/step - loss: 0.0413 - acc: 0.9870 - val_loss: 0.0499 - val_acc: 0.9833
Epoch 100/100
lr change to 6.25e-05
235/235 [==============================] - 2s 7ms/step - loss: 0.0417 - acc: 0.9870 - val_loss: 0.0500 - val_acc: 0.9834
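The `lr change to ...` lines in the log above are consistent with halving the learning rate every 12 epochs from 0.01, with a floor of 6.25e-05. A sketch of such a schedule (reconstructed from the log; the author's exact callback is not shown, so treat this as an assumption):

```python
def lr_schedule(epoch):
    """Halve an initial lr of 0.01 every 12 epochs, floored at 6.25e-05.
    `epoch` is 0-indexed, as Keras passes it to LearningRateScheduler."""
    return max(0.01 * 0.5 ** ((epoch + 1) // 12), 6.25e-5)

# It would be plugged into training roughly like this (batch_size=256 is a guess
# that matches the 235 steps/epoch over 60,000 training images):
# model_con.fit(x_train, y_train, epochs=100, batch_size=256,
#               validation_data=(x_test, y_test),
#               callbacks=[keras.callbacks.LearningRateScheduler(lr_schedule, verbose=1)])

print(lr_schedule(0), lr_schedule(11), lr_schedule(99))  # 0.01 0.005 6.25e-05
```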

Summary

From this project I learned that the receptive field can sometimes matter more than model complexity, and that the learning-rate decay strategy also has a considerable effect on the result.
And with this few parameters, training is fast even on a CPU.

The Jupyter notebook for this project is attached below.

Link: https://pan.baidu.com/s/1EEzfRDD_PAgSeS999Eq9-Q  extraction code: 3131

If you find this useful, please give me a star; if you have any questions, feel free to contact me.
