Feature Point Detection and Matching

For details see: https://mp.weixin.qq.com/s/3S8Myeh7cl8iTn5vLlMpwA
Some OpenCV classes can serve as both detector and descriptor, but certain detector/descriptor combinations do not work together.

Keypoint Detection

SIFT Detector/Descriptor

SIFT detector and ORB descriptor do not work together


int nfeatures = 0;               // The number of best features to retain (0 = keep all).
int nOctaveLayers = 3;           // The number of layers in each octave. 3 is the value used in D. Lowe's paper.
double contrastThreshold = 0.04; // The contrast threshold used to filter out weak features in semi-uniform (low-contrast) regions.
double edgeThreshold = 10;       // The threshold used to filter out edge-like features.
double sigma = 1.6;              // The sigma of the Gaussian applied to the input image at octave #0.
cv::Ptr<cv::Feature2D> detector = cv::xfeatures2d::SIFT::create(nfeatures, nOctaveLayers, contrastThreshold, edgeThreshold, sigma);
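
The create call above only constructs the detector/extractor object. A minimal usage sketch (assuming a grayscale input image named img, which is not declared in the snippet above) would look like this:

// Hypothetical usage sketch: img is assumed to be a grayscale cv::Mat.
std::vector<cv::KeyPoint> keypoints;
cv::Mat descriptors;
detector->detect(img, keypoints);               // run SIFT keypoint detection
detector->compute(img, keypoints, descriptors); // compute SIFT descriptors for the detected keypoints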

HARRIS Detector

// Detector parameters
int blockSize = 2;     // for every pixel, a blockSize x blockSize neighborhood is considered
int apertureSize = 3;  // aperture parameter for the Sobel operator (must be odd)
int minResponse = 100; // minimum value for a corner in the 8-bit scaled response matrix
double k = 0.04;       // Harris parameter (see equation for details)

// Detect Harris corners and normalize output (img is assumed to be a grayscale cv::Mat)
cv::Mat dst, dst_norm, dst_norm_scaled;
dst = cv::Mat::zeros(img.size(), CV_32FC1);
cv::cornerHarris(img, dst, blockSize, apertureSize, k, cv::BORDER_DEFAULT);
cv::normalize(dst, dst_norm, 0, 255, cv::NORM_MINMAX, CV_32FC1, cv::Mat());
cv::convertScaleAbs(dst_norm, dst_norm_scaled);

// Look for prominent corners and instantiate keypoints
std::vector<cv::KeyPoint> keypoints;
double maxOverlap = 0.0; // max. permissible overlap between two features in %, used during non-maxima suppression

for (int j = 0; j < dst_norm.rows; j++)
{
    for (int i = 0; i < dst_norm.cols; i++)
    {
        int response = (int)dst_norm.at<float>(j, i);
        if (response > minResponse)
        {
            // only store points above the threshold
            cv::KeyPoint newKeyPoint;
            newKeyPoint.pt = cv::Point2f(i, j);
            newKeyPoint.size = 2 * apertureSize;
            newKeyPoint.response = response;

            // perform non-maximum suppression (NMS) in the local neighbourhood around the new keypoint
            bool bOverlap = false;
            for (auto it = keypoints.begin(); it != keypoints.end(); ++it)
            {
                double kptOverlap = cv::KeyPoint::overlap(newKeyPoint, *it);
                if (kptOverlap > maxOverlap)
                {
                    bOverlap = true;
                    if (newKeyPoint.response > (*it).response)
                    {
                        // overlap exceeds the threshold AND the new keypoint has a higher response
                        *it = newKeyPoint; // replace the old keypoint with the new one
                        break;             // quit loop over keypoints
                    }
                }
            }
            if (!bOverlap)
            {
                // only add the new keypoint if no overlap was found during NMS
                keypoints.push_back(newKeyPoint);
            }
        }
    } // eof loop over cols
} // eof loop over rows
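
To sanity-check the result, the keypoints collected above can be drawn onto the image. This is an optional visualization sketch; the window title is arbitrary:

// Hypothetical visualization of the Harris keypoints collected above
cv::Mat visImage = img.clone();
cv::drawKeypoints(img, keypoints, visImage, cv::Scalar::all(-1), cv::DrawMatchesFlags::DRAW_RICH_KEYPOINTS);
cv::imshow("Harris Corner Detection Results", visImage);
cv::waitKey(0);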

SHI-TOMASI Detector


int blockSize = 6;       // size of an average block for computing the derivative covariation matrix over each pixel neighborhood
double maxOverlap = 0.0; // max. permissible overlap between two features in %
double minDistance = (1.0 - maxOverlap) * blockSize;
int maxCorners = img.rows * img.cols / std::max(1.0, minDistance); // max. num. of keypoints
double qualityLevel = 0.01; // minimal accepted quality of image corners
double k = 0.04;
bool useHarris = false; // use the Shi-Tomasi response instead of the Harris response

// Apply corner detection
std::vector<cv::Point2f> corners;
cv::goodFeaturesToTrack(img, corners, maxCorners, qualityLevel, minDistance, cv::Mat(), blockSize, useHarris, k);

// add corners to the result vector
std::vector<cv::KeyPoint> keypoints;
for (auto it = corners.begin(); it != corners.end(); ++it)
{
    cv::KeyPoint newKeyPoint;
    newKeyPoint.pt = cv::Point2f((*it).x, (*it).y);
    newKeyPoint.size = blockSize;
    keypoints.push_back(newKeyPoint);
}

BRISK Detector/Descriptor

int threshold = 30;        // FAST/AGAST detection threshold score
int octaves = 3;           // detection octaves (use 0 to do single scale)
float patternScale = 1.0f; // apply this scale to the pattern used for sampling the neighbourhood of a keypoint
cv::Ptr<cv::Feature2D> detector = cv::BRISK::create(threshold, octaves, patternScale);

FREAK Descriptor

bool orientationNormalized = true; // Enable orientation normalization.
bool scaleNormalized = true;       // Enable scale normalization.
float patternScale = 22.0f;        // Scaling of the description pattern.
int nOctaves = 4;                  // Number of octaves covered by the detected keypoints.
std::vector<int> selectedPairs;    // (Optional) user-defined selected pairs indexes
cv::Ptr<cv::DescriptorExtractor> extractor = cv::xfeatures2d::FREAK::create(orientationNormalized, scaleNormalized, patternScale, nOctaves, selectedPairs);

FAST Detector

int threshold = 30;            // Difference between the intensity of the central pixel and pixels on a circle around it
bool nonmaxSuppression = true; // perform non-maxima suppression on keypoints
cv::FastFeatureDetector::DetectorType type = cv::FastFeatureDetector::TYPE_9_16; // TYPE_9_16, TYPE_7_12, TYPE_5_8
cv::Ptr<cv::FeatureDetector> detector = cv::FastFeatureDetector::create(threshold, nonmaxSuppression, type);

ORB Detector/Descriptor

SIFT detector and ORB descriptor do not work together

int nfeatures = 500;      // The maximum number of features to retain.
float scaleFactor = 1.2f; // Pyramid decimation ratio, greater than 1.
int nlevels = 8;          // The number of pyramid levels.
int edgeThreshold = 31;   // Size of the border where features are not detected.
int firstLevel = 0;       // The level of the pyramid to put the source image into.
int WTA_K = 2;            // The number of points that produce each element of the oriented BRIEF descriptor.
auto scoreType = cv::ORB::HARRIS_SCORE; // The default HARRIS_SCORE means the Harris algorithm is used to rank features.
int patchSize = 31;       // Size of the patch used by the oriented BRIEF descriptor.
int fastThreshold = 20;   // The FAST threshold.
cv::Ptr<cv::Feature2D> detector = cv::ORB::create(nfeatures, scaleFactor, nlevels, edgeThreshold, firstLevel, WTA_K, scoreType, patchSize, fastThreshold);

AKAZE Detector/Descriptor

KAZE/AKAZE descriptors will only work with KAZE/AKAZE detectors.

auto descriptor_type = cv::AKAZE::DESCRIPTOR_MLDB; // Type of the extracted descriptor: DESCRIPTOR_KAZE, DESCRIPTOR_KAZE_UPRIGHT, DESCRIPTOR_MLDB or DESCRIPTOR_MLDB_UPRIGHT.
int descriptor_size = 0;     // Size of the descriptor in bits. 0 -> full size
int descriptor_channels = 3; // Number of channels in the descriptor (1, 2, 3)
float threshold = 0.001f;    // Detector response threshold to accept a point
int nOctaves = 4;            // Maximum octave evolution of the image
int nOctaveLayers = 4;       // Default number of sublevels per scale level
auto diffusivity = cv::KAZE::DIFF_PM_G2; // Diffusivity type: DIFF_PM_G1, DIFF_PM_G2, DIFF_WEICKERT or DIFF_CHARBONNIER
cv::Ptr<cv::Feature2D> detector = cv::AKAZE::create(descriptor_type, descriptor_size, descriptor_channels, threshold, nOctaves, nOctaveLayers, diffusivity);

BRIEF Descriptor

int bytes = 32;               // Length of the descriptor in bytes; valid values are 16, 32 (default) or 64.
bool use_orientation = false; // Sample patterns using keypoint orientation, disabled by default.
cv::Ptr<cv::DescriptorExtractor> extractor = cv::xfeatures2d::BriefDescriptorExtractor::create(bytes, use_orientation);

Keypoint Matching

Nearest neighbor (NN) / k-nearest-neighbor (KNN) matching
Another very effective way to reduce the number of false positives is to compute the nearest neighbor distance ratio for each keypoint. The difference between KNN and NN is that NN keeps only the single best match per keypoint, while KNN keeps the best k matches per keypoint; k is typically 2.

In practice, a ratio threshold of 0.8 has been shown to provide a good balance between true positives (TP) and false positives (FP). On the image sequences examined in the original SIFT paper, this setting eliminated 90% of the false matches while losing less than 5% of the correct matches. Note that the 0.8 threshold only applies to KNN; NN simply returns the single best match. The matching code is shown below:

void matchDescriptors(std::vector<cv::KeyPoint> &kPtsSource, std::vector<cv::KeyPoint> &kPtsRef, cv::Mat &descSource, cv::Mat &descRef,
                      std::vector<cv::DMatch> &matches, std::string descriptorclass, std::string matcherType, std::string selectorType)
{
    // configure matcher
    bool crossCheck = false;
    cv::Ptr<cv::DescriptorMatcher> matcher;

    if (matcherType.compare("MAT_BF") == 0)
    {
        int normType = descriptorclass.compare("DES_BINARY") == 0 ? cv::NORM_HAMMING : cv::NORM_L2;
        matcher = cv::BFMatcher::create(normType, crossCheck);
    }
    else if (matcherType.compare("MAT_FLANN") == 0)
    {
        // OpenCV bug workaround: convert binary descriptors to floating point due to a bug in the current OpenCV implementation
        if (descSource.type() != CV_32F)
        {
            descSource.convertTo(descSource, CV_32F);
        }
        if (descRef.type() != CV_32F)
        {
            descRef.convertTo(descRef, CV_32F);
        }
        matcher = cv::DescriptorMatcher::create(cv::DescriptorMatcher::FLANNBASED);
    }

    // perform matching task
    if (selectorType.compare("SEL_NN") == 0)
    {
        // nearest neighbor (best match)
        matcher->match(descSource, descRef, matches); // finds the best match for each descriptor in descSource
    }
    else if (selectorType.compare("SEL_KNN") == 0)
    {
        // k nearest neighbors (k=2)
        std::vector<std::vector<cv::DMatch>> knn_matches;
        matcher->knnMatch(descSource, descRef, knn_matches, 2);

        // filter matches using Lowe's ratio test
        double minDescDistRatio = 0.8;
        for (auto it = knn_matches.begin(); it != knn_matches.end(); ++it)
        {
            if ((*it)[0].distance < minDescDistRatio * (*it)[1].distance)
            {
                matches.push_back((*it)[0]);
            }
        }
    }
}
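
A hedged usage sketch of this function follows. The keypoint/descriptor variables are assumed to come from a previous detect/compute step (e.g. one of the blocks above); the string flags select brute-force Hamming matching of binary descriptors with the KNN ratio test:

// Hypothetical call: kptsSource/kptsRef and descSource/descRef are assumed to exist from a prior detect/compute step.
std::vector<cv::DMatch> matches;
matchDescriptors(kptsSource, kptsRef, descSource, descRef, matches,
                 "DES_BINARY",  // binary descriptors (e.g. BRISK, BRIEF, ORB) -> NORM_HAMMING for MAT_BF
                 "MAT_BF",      // brute-force matcher
                 "SEL_KNN");    // k-nearest-neighbor selection with Lowe's ratio test (0.8)
// the resulting matches can be visualized with cv::drawMatches if desired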

Taking all of these variations into account, the top three detector/descriptor combinations are (a minimal sketch of the first combination follows the list):
FAST + BRIEF (higher speed and relatively good accuracy)
BRISK + BRIEF (higher accuracy)
FAST + ORB (relatively good speed and accuracy)
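
As a rough sketch of the first recommended combination (FAST detector + BRIEF descriptor), assuming two grayscale input images imgSource and imgRef that are not part of the snippets above:

// Hypothetical FAST + BRIEF sketch; imgSource and imgRef are assumed grayscale cv::Mat inputs
cv::Ptr<cv::FeatureDetector> fast = cv::FastFeatureDetector::create(30, true, cv::FastFeatureDetector::TYPE_9_16);
cv::Ptr<cv::DescriptorExtractor> brief = cv::xfeatures2d::BriefDescriptorExtractor::create(32, false);

std::vector<cv::KeyPoint> kptsSource, kptsRef;
fast->detect(imgSource, kptsSource);
fast->detect(imgRef, kptsRef);

cv::Mat descSource, descRef;
brief->compute(imgSource, kptsSource, descSource);
brief->compute(imgRef, kptsRef, descRef);
// descSource/descRef can then be passed to matchDescriptors as shown above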

Quantitative comparisons show that:

The general ranking by the ability to detect a large number of features is: ORB > BRISK > SURF > SIFT > AKAZE > KAZE

The ranking by computational efficiency per keypoint (detection + description) is:
ORB > ORB (1000) > BRISK > BRISK (1000) > SURF (64D) > SURF (128D) > AKAZE > SIFT > KAZE

The ranking by effective feature matching per keypoint is:
ORB (1000) > BRISK (1000) > AKAZE > KAZE > SURF (64D) > ORB > BRISK > SIFT > SURF (128D)

The ranking by overall image matching speed is:
ORB (1000) > BRISK (1000) > AKAZE > KAZE > SURF (64D) > SIFT > ORB > BRISK > SURF (128D)
