tf 2.x keras: Losses, Optimizers, Metrics, and Learning Rate Scheduling





1. Losses

CategoricalCrossentropy: use this crossentropy loss when there are two or more label classes and the labels are provided in a one-hot representation.

SparseCategoricalCrossentropy: use this crossentropy loss when there are two or more label classes and the labels are provided as integers.

BinaryCrossentropy: use this crossentropy loss when there are only two label classes (assumed to be 0 and 1).

KLDivergence: computes the Kullback-Leibler divergence loss between y_true and y_pred.

MeanAbsoluteError: computes the mean absolute difference between labels and predictions.

MeanAbsolutePercentageError: computes the mean absolute percentage error between y_true and y_pred.

MeanSquaredError: computes the mean of squared errors between labels and predictions.

MeanSquaredLogarithmicError: computes the mean squared logarithmic error between y_true and y_pred.
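The two categorical crossentropy losses compute the same quantity and differ only in the label format they accept. A minimal sketch, assuming TensorFlow is installed:

```python
import tensorflow as tf

# Predicted class probabilities for two samples, three classes.
y_pred = tf.constant([[0.1, 0.8, 0.1],
                      [0.7, 0.2, 0.1]])

# One-hot labels -> CategoricalCrossentropy
cce = tf.keras.losses.CategoricalCrossentropy()
loss_onehot = cce(tf.constant([[0., 1., 0.], [1., 0., 0.]]), y_pred)

# The same labels as integer indices -> SparseCategoricalCrossentropy
scce = tf.keras.losses.SparseCategoricalCrossentropy()
loss_sparse = scce(tf.constant([1, 0]), y_pred)

# Both reduce to the mean of -log(probability of the true class):
# here -(ln 0.8 + ln 0.7) / 2, so the two results are equal.
print(float(loss_onehot), float(loss_sparse))
```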



2. Optimizers

class Adadelta: Optimizer that implements the Adadelta algorithm.
class Adagrad: Optimizer that implements the Adagrad algorithm.
class Adam: Optimizer that implements the Adam algorithm.
class Adamax: Optimizer that implements the Adamax algorithm.
class Ftrl: Optimizer that implements the FTRL algorithm.
class Nadam: Optimizer that implements the NAdam algorithm.
class RMSprop: Optimizer that implements the RMSprop algorithm.
class SGD: Gradient descent (with momentum) optimizer.
class Optimizer: Base class for Keras optimizers.
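All of these share the same apply_gradients interface. A minimal sketch of one SGD-with-momentum step on a scalar variable, assuming TensorFlow is installed:

```python
import tensorflow as tf

opt = tf.keras.optimizers.SGD(learning_rate=0.1, momentum=0.9)

w = tf.Variable(2.0)
with tf.GradientTape() as tape:
    loss = w ** 2          # d(loss)/dw = 2w = 4.0 at w = 2.0
grads = tape.gradient(loss, [w])
opt.apply_gradients(zip(grads, [w]))

# First step: velocity = momentum * 0 - lr * grad = -0.4,
# so w moves from 2.0 to 1.6 (momentum only kicks in on later steps).
print(float(w))
```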

3. Learning Rate Scheduling

Power scheduling
Exponential scheduling
Piecewise constant scheduling
Performance scheduling
1cycle scheduling
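Several of these map onto built-in schedule objects in tf.keras.optimizers.schedules (power scheduling ≈ InverseTimeDecay, piecewise constant ≈ PiecewiseConstantDecay); performance scheduling is covered by the ReduceLROnPlateau callback, while 1cycle requires a custom callback. A sketch of exponential scheduling, assuming TensorFlow is installed:

```python
import tensorflow as tf

# Exponential scheduling: lr(step) = lr0 * decay_rate ** (step / decay_steps)
schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=0.01,
    decay_steps=1000,
    decay_rate=0.5,
)

# A schedule can be passed directly in place of a fixed learning rate:
opt = tf.keras.optimizers.SGD(learning_rate=schedule)

# It is also a plain callable mapping step -> learning rate:
print(float(schedule(0)))     # 0.01
print(float(schedule(1000)))  # halved to 0.005
```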



4. Metrics

Accuracy metrics

Probabilistic metrics

Regression metrics

Classification metrics based on True/False positives & negatives
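The True/False-positive family of metrics is stateful: update_state accumulates counts across batches and result reads them out, which is how Keras averages metrics over an epoch. A sketch with Precision, assuming TensorFlow is installed:

```python
import tensorflow as tf

precision = tf.keras.metrics.Precision()

# One batch: true positives = 2 (last two), false positives = 1 (first),
# so precision = 2 / 3.
precision.update_state(y_true=[0, 1, 1, 1], y_pred=[1, 0, 1, 1])
p = float(precision.result())
print(p)

# Clear the accumulated counts, e.g. between epochs.
precision.reset_state()
```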




Further reading (PyTorch):

PyTorch Loss Functions: The Ultimate Guide

Ultimate Guide To Loss functions In PyTorch With Python Implementation

Ultimate guide to PyTorch Optimizers






AdamW, LAMB: optimizers commonly used for large pre-trained models
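AdamW decouples the weight-decay term from the gradient-based update (Loshchilov & Hutter). A sketch assuming a recent TensorFlow where tf.keras.optimizers.AdamW is available; LAMB is not in core Keras and lives in add-on packages such as TensorFlow Addons:

```python
import tensorflow as tf

opt = tf.keras.optimizers.AdamW(learning_rate=1e-3, weight_decay=0.01)

# With a zero gradient, the Adam update is zero and only the decoupled
# weight decay acts, shrinking the parameter slightly toward zero.
w = tf.Variable(1.0)
opt.apply_gradients([(tf.constant(0.0), w)])
print(float(w))  # slightly below 1.0
```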




