How do I define/change the accuracy for a non-classification convolutional neural network?

Background:

I'm using Keras to build a prediction model. It takes in two time series and outputs a number between 0 and 1. Currently, I am getting very low accuracy because the model is only considered "correct" if it predicts the exact number: for example, if the correct number is 0.34, a prediction of 0.35 is counted as incorrect. I want to consider every number within a range of the true value to be correct, for example within 0.05. Another option may be to round, but the model outputs numbers with 6 decimal places.

  1. How can I consider all numbers within a range to be "correct" for the accuracy?

  2. How can I round the output of the CNN?

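For the second question, one simple option is to round the predictions after inference rather than inside the model. A minimal NumPy sketch (the `y_pred` values here are made up for illustration):

```python
import numpy as np

# Hypothetical raw model outputs: sigmoid values with many decimal places
y_pred = np.array([0.348912, 0.351004, 0.702331])

# Round every prediction to 2 decimal places
rounded = np.round(y_pred, 2)
```
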
Here is my CNN code:

import tensorflow as tf

def networkModel():
    model = tf.keras.Sequential([
        tf.keras.layers.Conv2D(filters=16, kernel_size=(2, 2), activation='relu', padding='same'),
        tf.keras.layers.Conv2D(filters=9, kernel_size=(2, 2), activation='relu', padding='same'),
        tf.keras.layers.MaxPooling2D(pool_size=(2, 2)),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(256, activation='relu'),
        tf.keras.layers.Dense(1, activation='sigmoid')
    ])

    model.compile(optimizer='adam',
                  loss=tf.keras.losses.BinaryCrossentropy(),
                  metrics=['accuracy'])

    return model
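
As the question notes, an exact-match notion of accuracy is effectively useless for a continuous output: the sigmoid produces many decimal places, so a prediction almost never equals the target exactly, while a tolerance-based check behaves sensibly. A toy NumPy check illustrates this (all values made up):

```python
import numpy as np

y_true = np.array([0.34, 0.50, 0.90])
y_pred = np.array([0.351204, 0.498331, 0.900012])  # close, but never exactly equal

# Exact match: no prediction hits the target exactly
exact_match = np.mean(y_true == y_pred)

# Tolerance of 0.05: every prediction counts as correct
within_tol = np.mean(np.abs(y_true - y_pred) <= 0.05)
```
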

Solution:

For this specific case, you can define a custom accuracy function as a metric, and optionally a custom Callback for your Keras model to report it at the end of each epoch.

Custom Accuracy Metric:

import keras.backend as K

def custom_accuracy(y_true, y_pred, tolerance=0.05):
    # A prediction counts as correct when it is within `tolerance` of the target
    absolute_difference = K.abs(y_true - y_pred)
    correct_predictions = K.cast(absolute_difference <= tolerance, dtype='float32')
    return K.mean(correct_predictions)

# Use MSE as the loss for a continuous target and track the tolerance-based accuracy
model.compile(optimizer='adam', loss='mse', metrics=[custom_accuracy])
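
The tolerance logic in `custom_accuracy` can be sanity-checked outside Keras with plain NumPy (the values below are made up for illustration):

```python
import numpy as np

tolerance = 0.05
y_true = np.array([0.34, 0.50, 0.90])
y_pred = np.array([0.35, 0.60, 0.88])  # first and third are within 0.05

# Same rule as the Keras metric: correct if within the tolerance
correct = np.abs(y_true - y_pred) <= tolerance
accuracy = correct.mean()  # 2 of 3 predictions qualify
```
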

Custom Callback:

from keras.callbacks import Callback
import numpy as np

class CustomAccuracyCallback(Callback):
    def __init__(self, validation_data, tolerance=0.05):
        super(CustomAccuracyCallback, self).__init__()
        self.validation_data = validation_data
        self.tolerance = tolerance

    def on_epoch_end(self, epoch, logs=None):
        # Recompute the tolerance-based accuracy on the validation set
        x_val, y_val = self.validation_data
        # Flatten the (n, 1) predictions so they broadcast correctly against y_val
        y_pred = self.model.predict(x_val).flatten()
        accuracy = np.mean(np.abs(y_val - y_pred) <= self.tolerance)
        print(f"\nEpoch {epoch + 1}: Custom Accuracy: {accuracy:.4f}")
        if logs is not None:
            logs['custom_accuracy'] = accuracy

custom_callback = CustomAccuracyCallback((x_val, y_val))
model.fit(x_train, y_train, validation_data=(x_val, y_val), callbacks=[custom_callback])
