Keras Loss Functions: Source Code and Formulas

Date: 2023-02-06 19:55:45


 

1.MSE


# K is the Keras backend (keras.backend); math_ops comes from
# tensorflow.python.ops in the TF Keras source.
def mean_squared_error(y_true, y_pred):
    return K.mean(math_ops.square(y_pred - y_true), axis=-1)

Taking semantic segmentation as an example:

       y_true: shape (224, 224, 2)

       y_pred: shape (224, 224, 2)

       In the code, axis=-1 means the mean is taken over the last dimension, so N = 2 in the formula MSE = (1/N) Σᵢ (y_predᵢ − y_trueᵢ)².
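The same reduction can be sketched in NumPy (a sketch of my own, with names suffixed `_np`; the Keras source operates on tensors, but the arithmetic is identical):

```python
import numpy as np

def mean_squared_error_np(y_true, y_pred):
    # Mean of squared differences over the last axis (axis=-1),
    # yielding one loss value per sample, as in the Keras source.
    return np.mean(np.square(y_pred - y_true), axis=-1)

y_true = np.array([[1.0, 0.0],
                   [0.0, 1.0]])
y_pred = np.array([[0.9, 0.1],
                   [0.2, 0.8]])
print(mean_squared_error_np(y_true, y_pred))  # → [0.01 0.04]
```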

 

2. MAE (mean absolute error)


def mean_absolute_error(y_true, y_pred):
    return K.mean(math_ops.abs(y_pred - y_true), axis=-1)
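A NumPy sketch of the same computation (the `_np` name is mine, not from Keras):

```python
import numpy as np

def mean_absolute_error_np(y_true, y_pred):
    # Mean of absolute differences over the last axis, per sample
    return np.mean(np.abs(y_pred - y_true), axis=-1)

y_true = np.array([[1.0, 0.0], [0.0, 1.0]])
y_pred = np.array([[0.9, 0.1], [0.2, 0.8]])
print(mean_absolute_error_np(y_true, y_pred))  # → [0.1 0.2]
```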

 

3. MAPE (mean absolute percentage error)

           100 × |y_pred − y_true| / |y_true|

def mean_absolute_percentage_error(y_true, y_pred):
    diff = math_ops.abs(
        (y_true - y_pred) / K.clip(math_ops.abs(y_true), K.epsilon(), None))
    return 100. * K.mean(diff, axis=-1)
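In NumPy the same steps look like this (a sketch; `EPSILON` stands in for `K.epsilon()`):

```python
import numpy as np

EPSILON = 1e-7  # stand-in for K.epsilon()

def mean_absolute_percentage_error_np(y_true, y_pred):
    # Clip |y_true| away from zero before dividing, as the Keras source does
    diff = np.abs((y_true - y_pred) / np.clip(np.abs(y_true), EPSILON, None))
    return 100.0 * np.mean(diff, axis=-1)

y_true = np.array([[100.0, 200.0]])
y_pred = np.array([[110.0, 180.0]])
print(mean_absolute_percentage_error_np(y_true, y_pred))  # → [10.]
```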

4. MSLE (MSE on log-transformed values)

def mean_squared_logarithmic_error(y_true, y_pred):
    first_log = math_ops.log(K.clip(y_pred, K.epsilon(), None) + 1.)
    second_log = math_ops.log(K.clip(y_true, K.epsilon(), None) + 1.)
    return K.mean(math_ops.square(first_log - second_log), axis=-1)
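A NumPy sketch: clip, take log(1 + x), then an ordinary MSE on the log values.

```python
import numpy as np

EPSILON = 1e-7  # stand-in for K.epsilon()

def mean_squared_logarithmic_error_np(y_true, y_pred):
    # log(1 + x) after clipping, then MSE on the log-transformed values
    first_log = np.log(np.clip(y_pred, EPSILON, None) + 1.0)
    second_log = np.log(np.clip(y_true, EPSILON, None) + 1.0)
    return np.mean(np.square(first_log - second_log), axis=-1)

# Chosen so that log(1 + (e - 1)) = 1 and log(1 + 0) ≈ 0
y_true = np.array([[np.e - 1.0, 0.0]])
y_pred = np.array([[0.0, np.e - 1.0]])
print(mean_squared_logarithmic_error_np(y_true, y_pred))  # ≈ [1.]
```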

 

5. Hinge loss


def hinge(y_true, y_pred):
    return K.mean(math_ops.maximum(1. - y_true * y_pred, 0.), axis=-1)
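A NumPy sketch; note that hinge loss expects labels in {−1, +1}, and the loss is zero once the margin y_true · y_pred exceeds 1:

```python
import numpy as np

def hinge_np(y_true, y_pred):
    # max(1 - y_true * y_pred, 0), averaged over the last axis
    return np.mean(np.maximum(1.0 - y_true * y_pred, 0.0), axis=-1)

y_true = np.array([[1.0, -1.0]])
y_pred = np.array([[0.5, -2.0]])
# First entry: max(1 - 0.5, 0) = 0.5; second: max(1 - 2, 0) = 0
print(hinge_np(y_true, y_pred))  # → [0.25]
```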

 

6. categorical_hinge loss

def categorical_hinge(y_true, y_pred):
    pos = math_ops.reduce_sum(y_true * y_pred, axis=-1)
    neg = math_ops.reduce_max((1. - y_true) * y_pred, axis=-1)
    return math_ops.maximum(0., neg - pos + 1.)
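Here `y_true` is one-hot: `pos` picks out the score of the true class and `neg` the best score among the other classes. A NumPy sketch:

```python
import numpy as np

def categorical_hinge_np(y_true, y_pred):
    # pos: score of the true class; neg: best score among the other classes
    pos = np.sum(y_true * y_pred, axis=-1)
    neg = np.max((1.0 - y_true) * y_pred, axis=-1)
    return np.maximum(0.0, neg - pos + 1.0)

y_true = np.array([[0.0, 1.0, 0.0]])
y_pred = np.array([[0.1, 0.7, 0.2]])
# pos = 0.7, neg = 0.2, so the loss is max(0, 0.2 - 0.7 + 1) = 0.5
print(categorical_hinge_np(y_true, y_pred))  # → [0.5]
```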

 

7. Cross-entropy


def categorical_crossentropy(y_true, y_pred):
    return K.categorical_crossentropy(y_true, y_pred)
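The backend call hides the arithmetic; with the default `from_logits=False`, it amounts to normalizing the predictions, clipping, and computing −Σ y_true · log(p). A NumPy sketch under that assumption:

```python
import numpy as np

EPSILON = 1e-7  # stand-in for K.epsilon()

def categorical_crossentropy_np(y_true, y_pred):
    # Assumes y_pred already holds probabilities (from_logits=False):
    # normalize, clip away from 0 and 1, then -sum(y_true * log(p))
    y_pred = y_pred / np.sum(y_pred, axis=-1, keepdims=True)
    y_pred = np.clip(y_pred, EPSILON, 1.0 - EPSILON)
    return -np.sum(y_true * np.log(y_pred), axis=-1)

y_true = np.array([[0.0, 1.0]])
y_pred = np.array([[0.2, 0.8]])
print(categorical_crossentropy_np(y_true, y_pred))  # ≈ [0.2231], i.e. -log(0.8)
```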

8. Sparse (integer-label) cross-entropy

Note that, contrary to a common mix-up, sparse_categorical_crossentropy takes integer class indices for y_true; the one-hot version is categorical_crossentropy above.

def sparse_categorical_crossentropy(y_true, y_pred):
    return K.sparse_categorical_crossentropy(y_true, y_pred)
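A NumPy sketch showing the integer-label convention: each label simply indexes into the predicted probability row, so the value matches the one-hot version.

```python
import numpy as np

EPSILON = 1e-7  # stand-in for K.epsilon()

def sparse_categorical_crossentropy_np(y_true, y_pred):
    # y_true holds integer class indices, not one-hot vectors
    y_pred = np.clip(y_pred, EPSILON, 1.0 - EPSILON)
    return -np.log(y_pred[np.arange(len(y_true)), y_true])

y_true = np.array([1])                 # class index, not one-hot
y_pred = np.array([[0.2, 0.8]])
print(sparse_categorical_crossentropy_np(y_true, y_pred))  # ≈ [0.2231]
```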

9. Binary cross-entropy

def binary_crossentropy(y_true, y_pred):
    return K.mean(K.binary_crossentropy(y_true, y_pred), axis=-1)
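A NumPy sketch of the elementwise binary cross-entropy followed by the mean over the last axis (assuming probabilities, i.e. `from_logits=False`):

```python
import numpy as np

EPSILON = 1e-7  # stand-in for K.epsilon()

def binary_crossentropy_np(y_true, y_pred):
    # Elementwise -(y log p + (1 - y) log(1 - p)), then mean over axis=-1
    y_pred = np.clip(y_pred, EPSILON, 1.0 - EPSILON)
    bce = -(y_true * np.log(y_pred) + (1.0 - y_true) * np.log(1.0 - y_pred))
    return np.mean(bce, axis=-1)

y_true = np.array([[1.0, 0.0]])
y_pred = np.array([[0.9, 0.1]])
# Both entries contribute -log(0.9), so the mean is about 0.1054
print(binary_crossentropy_np(y_true, y_pred))  # ≈ [0.1054]
```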