
Precision and Recall

Hello folks, greetings. So, maybe you are thinking: what's so hard about precision and recall? Why yet another article on this topic?


I recommend reading this article with patience, and with a notebook and pencil in hand. Also, concentrate… reread the same lines if needed.


I have a hard time remembering things. I tend to forget things that I haven't used for a while. I tend to forget the FORMULAS of precision and recall over time.


BUT, I have a tendency to reconstruct things in my mind. In high school, I had a hard time cramming. I couldn't remember formulas for long. So, what I did was understand them in natural language (for example: English). Then, during my exams, I would simply recreate the formula from my understanding. This ability also allowed me, at times, to invent new formulas. Actually, that wasn't really invention, it was specialization. But then, I was a kid at that time, right!! So, let's keep calling it "invention" ;)



Now, you might be thinking, "I am not here to hear your story". But I am here to make you hear my story XD. Just kidding! Let's start.


So, let's understand precision and recall in an intuitive manner. Then, you won't need to google what they mean and how they are formulated every single time.


Mostly, you might already be aware of the terms TP, FP, TN and FN. But I have a habit of explaining thoroughly. So, feel free to skip that section if you already know it.



TP, FP, TN and FN

Assume that you are performing a classification task. Let us keep it very simple. Suppose you are performing single-label image classification. This means that the image belongs to one and only one of the given classes. Also, let's make it even simpler: consider that there is only one class.


Now, if you don't know the difference between single-label and multi-label classification, just google it a bit.


So, you are now performing binary image classification. For example, the task of deciding whether an image contains a dog or not belongs to this category.


So, there are two target labels, depending on whether the predicted value is 1 or 0: dog and not dog. Consider being a dog as "positive" (1) and not being a dog as "negative" (0). In short, define positive as one of the two classes and negative as the other (leftover) class.


Now, you input an image to the model and the model predicts that the image is of a dog. This means that the model is "positive" that there is a dog. Now, suppose the image isn't actually of a dog. The image is of a person, not a dog. Hence, the output of the model is wrong. Wrong means "false". This is an example of a false positive.


Suppose, instead, that the image actually contained a dog. Then, the model was correct. Correct means "true". Now this becomes an example of a true positive.


So, true positive means that the model is positive and is correct. And false positive means that the model is positive but is wrong/incorrect.


Same goes for true negative and false negative. If the model predicts that there is no dog (i.e. negative) but, actually there is a dog, then the model is wrong. This becomes a case of false negative. Similarly, if the model predicted that there is no dog and the image actually doesn’t contain a dog, then the model is correct. This is a case of true negative.


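The four cases above can be sketched in a few lines of code. This is only a minimal illustration of the convention from the text (dog = "positive" = 1, not dog = "negative" = 0); the function name `outcome` is an arbitrary choice for this sketch, not from any library.

```python
# Map one (predicted, actual) pair to its confusion-matrix cell.
# Convention: dog = "positive" (1), not dog = "negative" (0).

def outcome(predicted: int, actual: int) -> str:
    if predicted == 1 and actual == 1:
        return "TP"  # model says dog, and it really is a dog
    if predicted == 1 and actual == 0:
        return "FP"  # model says dog, but it is not (e.g. the person image)
    if predicted == 0 and actual == 0:
        return "TN"  # model says not-dog, and there is indeed no dog
    return "FN"      # model says not-dog, but there is a dog

print(outcome(1, 0))  # FP
print(outcome(0, 1))  # FN
```

Note that the first letter (T/F) records whether the model was right, and the second (P/N) records what the model predicted, not what was actually there.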
So, now you have an idea of these terms. Let's extend this to the whole training data instead of a single image. Suppose you are classifying 100 images. The model classified 70 images correctly and 30 images incorrectly. Kudos! You now have a model that is 70% accurate.


Now, let’s focus on the correct images, i.e. TRUE classifications. Suppose, 20 of the 70 correctly classified images were not of dog, i.e. they were NEGATIVES. In this case, the value of TRUE NEGATIVES is 20. And hence, the value of TRUE POSITIVES is 50.


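This counting can be reproduced in code. The text only fixes the totals (100 images, 70 correct, TP = 50, TN = 20), so the illustrative labels below split the 30 incorrect predictions into FP and FN arbitrarily (15/15) just to make the sketch runnable; that split is an assumption, not something stated above.

```python
from collections import Counter

# Illustrative labels matching the scenario: 100 images, 70 correct.
# 50 TP + 20 TN come from the text; the 15/15 FP/FN split is assumed.
actuals = [1] * 50 + [0] * 20 + [0] * 15 + [1] * 15
preds   = [1] * 50 + [0] * 20 + [1] * 15 + [0] * 15

def cell(pred: int, actual: int) -> str:
    if pred == 1:
        return "TP" if actual == 1 else "FP"
    return "TN" if actual == 0 else "FN"

counts = Counter(cell(p, a) for p, a in zip(preds, actuals))
accuracy = (counts["TP"] + counts["TN"]) / len(actuals)

print(counts["TP"], counts["TN"])  # 50 20
print(accuracy)                    # 0.7
```

Accuracy is simply the correct classifications (TP + TN) divided by the total, which recovers the 70% figure above.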