evaluateDetectionPrecision
Evaluate precision metric for object detection
Syntax
averagePrecision = evaluateDetectionPrecision(detectionResults,groundTruthData)
[averagePrecision,recall,precision] = evaluateDetectionPrecision(___)
[___] = evaluateDetectionPrecision(___,threshold)
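The optional `threshold` argument is the overlap (intersection-over-union, IoU) a detection must reach against a ground-truth box to count as a true positive; it defaults to 0.5. As a rough, language-agnostic sketch of that overlap measure (boxes in MATLAB's `[x y width height]` form; this is an illustration, not the toolbox's internal code):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two boxes given as [x, y, width, height]."""
    ax1, ay1 = box_a[0], box_a[1]
    ax2, ay2 = ax1 + box_a[2], ay1 + box_a[3]
    bx1, by1 = box_b[0], box_b[1]
    bx2, by2 = bx1 + box_b[2], by1 + box_b[3]

    # Width and height of the intersection rectangle (0 if disjoint).
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih

    union = box_a[2] * box_a[3] + box_b[2] * box_b[3] - inter
    return inter / union if union > 0 else 0.0
```

A detection whose best IoU against any unclaimed ground-truth box of the same class is below `threshold` is scored as a false positive.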
I want to use this function to test my model's accuracy, but the result is not good.
So, how does this function work?
There are several labels per picture, in both my ground truth and the prediction results.
[Image: ground truth dataset]
[Image: prediction result]
[Image: accuracy]
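On the "how does it work" question: the function ranks all detections by score, greedily matches each one to a previously unclaimed ground-truth box of the same class at sufficient IoU, then walks down the ranked list accumulating precision (TP / (TP + FP)) and recall (TP / number of ground-truth boxes); average precision is the area under that precision-recall curve. A minimal sketch of the accumulation step (Python; `scored_hits` is a made-up pre-matched input, not the toolbox's actual internals):

```python
def average_precision(scored_hits, num_gt):
    """Compute AP from a list of (score, matched) detections.

    scored_hits: one entry per detection; matched is True when the
    detection overlapped an unclaimed ground-truth box of the same
    class at IoU >= threshold (the matching itself is done beforehand
    and is not shown here).
    num_gt: total number of ground-truth boxes for this class.
    """
    ranked = sorted(scored_hits, key=lambda h: -h[0])  # best score first
    tp = fp = 0
    precisions, recalls = [], []
    for _, matched in ranked:
        if matched:
            tp += 1
        else:
            fp += 1
        precisions.append(tp / (tp + fp))
        recalls.append(tp / num_gt)

    # AP = area under the precision-recall curve (simple rectangle sum).
    ap, prev_recall = 0.0, 0.0
    for p, r in zip(precisions, recalls):
        ap += p * (r - prev_recall)
        prev_recall = r
    return ap, recalls, precisions
```

A detector that finds every box with no false alarms gets AP = 1.0; every unmatched detection pulls precision, and with it AP, down.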
%% Evaluate Detector Using Test Set
% Create a table to hold the bounding boxes, scores, and labels output by
% the detector.
numImages = height(testData);
results = table('Size',[numImages 3],...
    'VariableTypes',{'cell','cell','cell'},...
    'VariableNames',{'Boxes','Scores','Labels'});

% Run the detector on each image in the test set and collect the results.
for i = 1:numImages
    % Read the image.
    testImage = imread(testData.filename{i});

    % Run the detector.
    [bboxes,scores,labels] = detect(rcnn,testImage,'MiniBatchSize',128);

    % Display the detections (optional; this slows the loop down).
    I = insertShape(testImage,'Rectangle',bboxes);
    imshow(I)

    % Collect the results.
    results.Boxes{i} = bboxes;
    results.Scores{i} = scores;
    results.Labels{i} = labels;
end

% Extract the expected bounding box locations from the test data. With
% multiple classes, the table needs one box column per class, and the
% variable names must match the detector's label names exactly.
expectedResults = testData(:, 2:end);

% Evaluate the object detector using the average precision metric.
[ap, recall, precision] = evaluateDetectionPrecision(results, expectedResults);

% Plot the precision/recall curve. With more than one class, ap is a
% vector and recall/precision are cell arrays with one entry per class.
if iscell(recall)
    plot(recall{1}, precision{1})
    title(sprintf('Average Precision (class 1) = %.2f', ap(1)))
else
    plot(recall, precision)
    title(sprintf('Average Precision = %.2f', ap))
end
xlabel('Recall')
ylabel('Precision')
grid on
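If the average precision comes out unexpectedly low with several classes, one common culprit is that the detector's label names do not exactly match the ground-truth table's variable names (case and spelling both matter): a detection whose label matches no ground-truth class can never count as a true positive. A quick sanity check, as a language-agnostic sketch (`label_mismatches` is a hypothetical helper, not a toolbox function):

```python
def label_mismatches(predicted_labels, gt_class_names):
    """Return predicted label names that match no ground-truth class.

    Such detections cannot be scored as true positives, so they drag
    precision, and with it average precision, toward zero.
    """
    return sorted(set(predicted_labels) - set(gt_class_names))
```

For example, `label_mismatches(['Car', 'car', 'bus'], ['car', 'bus'])` flags the capitalization mismatch `'Car'`.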