[Implementing a CNN in Python] Activation Layer Implementation

Date: 2024-04-30 13:23:27

Code source: https://github.com/eriklindernoren/ML-From-Scratch

Conv2D convolutional layer (with stride and padding) implementation: https://www.cnblogs.com/xiximayou/p/12706576.html

Activation function implementations (sigmoid, softmax, tanh, relu, leakyrelu, elu, selu, softplus): https://www.cnblogs.com/xiximayou/p/12713081.html

Loss function definitions (mean squared error, cross-entropy loss): https://www.cnblogs.com/xiximayou/p/12713198.html

Optimizer implementations (SGD, Nesterov, Adagrad, Adadelta, RMSprop, Adam): https://www.cnblogs.com/xiximayou/p/12713594.html

Backward pass of the convolutional layer: https://www.cnblogs.com/xiximayou/p/12713930.html

Fully connected layer implementation: https://www.cnblogs.com/xiximayou/p/12720017.html

Batch normalization layer implementation: https://www.cnblogs.com/xiximayou/p/12720211.html

Pooling layer implementation: https://www.cnblogs.com/xiximayou/p/12720324.html

Padding2D implementation: https://www.cnblogs.com/xiximayou/p/12720454.html

Flatten layer implementation: https://www.cnblogs.com/xiximayou/p/12720518.html

Upsampling layer (UpSampling2D) implementation: https://www.cnblogs.com/xiximayou/p/12720558.html

Dropout layer implementation: https://www.cnblogs.com/xiximayou/p/12720589.html

The forward and backward computations for all of these activation functions were already defined in the activation-functions post linked above; here we only need to wrap them in a layer class.
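For reference, every activation-function class in that post follows the same small interface: __call__ computes the forward value and gradient computes the elementwise derivative. Below is a minimal sketch of ReLU in that style (the full set lives in the activation-functions post):

import numpy as np

class ReLU():
    # Forward: max(0, x). Derivative: 1 where x >= 0, else 0.
    def __call__(self, x):
        return np.where(x >= 0, x, 0)

    def gradient(self, x):
        return np.where(x >= 0, 1, 0)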

The Layer base class and the activation classes (ReLU, Sigmoid, etc.) referenced below are the ones defined in the earlier posts in this series.

activation_functions = {
    'relu': ReLU,
    'sigmoid': Sigmoid,
    'selu': SELU,
    'elu': ELU,
    'softmax': Softmax,
    'leaky_relu': LeakyReLU,
    'tanh': TanH,
    'softplus': SoftPlus
}

class Activation(Layer):
    """A layer that applies an activation operation to the input.

    Parameters:
    -----------
    name: string
        The name of the activation function that will be used.
    """

    def __init__(self, name):
        self.activation_name = name
        self.activation_func = activation_functions[name]()
        self.trainable = True

    def layer_name(self):
        return "Activation (%s)" % (self.activation_func.__class__.__name__)

    def forward_pass(self, X, training=True):
        # Cache the input; backward_pass needs it to evaluate the local derivative.
        self.layer_input = X
        return self.activation_func(X)

    def backward_pass(self, accum_grad):
        # Chain rule: upstream gradient times the activation's elementwise derivative.
        return accum_grad * self.activation_func.gradient(self.layer_input)

    def output_shape(self):
        # Activations are elementwise, so the output shape equals the input shape.
        return self.input_shape
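To see the layer in action, here is a hypothetical usage sketch. It assumes the Layer base class and the Sigmoid activation class from the posts above are in scope, and it sets input_shape by hand since no full model is wired up (normally the model assigns it when layers are stacked):

import numpy as np

layer = Activation('sigmoid')
layer.input_shape = (3,)            # normally set by the model during set-up

X = np.array([[-1.0, 0.0, 2.0]])    # one sample with 3 features
out = layer.forward_pass(X)
print(layer.layer_name())           # Activation (Sigmoid)
print(out)                          # sigmoid applied elementwise

# backward_pass applies the chain rule: the upstream gradient is
# multiplied by sigma'(x) = sigma(x) * (1 - sigma(x)), evaluated at
# the input cached during the forward pass.
grad = layer.backward_pass(np.ones_like(X))
print(grad)

Because the layer caches its input in forward_pass, backward_pass must be called after a forward pass on the same batch; this is the same convention the other layers in this series follow.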