Using the FREAK descriptor, based on opencv-3.2.0

If you search with the title above, the demos you find online all come from the paper's reference implementation (github-freak), but that interface is completely unusable with the latest 3.2.0; sure enough, FREAK now lives in contrib…
The official OpenCV code is at this link

The following fixes are needed

  • Headers and namespace (new code layout)

      #include <opencv2/features2d/features2d.hpp>    
      #include <opencv2/xfeatures2d.hpp>
      #include <opencv2/xfeatures2d/nonfree.hpp>
      
      using namespace cv::xfeatures2d; 
    
  • If the extractor is declared as FREAK extractor; you get error : The function/feature is not implemented () in detectAndCompute; use the following instead

    Ptr<FREAK> extractor = FREAK::create();
    

    For details see this link: stackoverflow; the official documentation also covers it: official link

    BFMatcher matcher; is a similar story… the 2.x-style BruteForceMatcher<HammingLUT> matcher_Freak; must be updated… the norm type passed to BFMatcher also needs attention: link

  • About the selectPairs function

    The 2.x docs describe it; the 3.0 docs only document the create function.

    Official link

    Select the 512 best description pair indexes from an input (grayscale) image set. FREAK is available with a set of pairs learned offline. Researchers can run a training process to learn their own set of pair.

    Because FREAK's receptive fields overlap one another, the raw descriptor inherently contains redundant information. The paper says the candidate pairs are ranked and only the best 512 kept! FREAK itself ships with a set of pairs trained offline… the purpose of the selectPairs function is to let developers retrain their own set of pairs…

    There is a similar answer on stackoverflow: link

  • Removing outliers

    Outliers can be rejected with RANSAC: see link

  • The final implementation

        #include <iostream>
        #include <string>
        #include <vector>
        
        #include <opencv2/core/core.hpp>
        #include <opencv2/highgui/highgui.hpp>
        // The three headers below are required by FREAK; their paths are completely different from 2.x
        #include <opencv2/features2d/features2d.hpp>    
        #include <opencv2/xfeatures2d.hpp>
        #include <opencv2/xfeatures2d/nonfree.hpp>
        
        using namespace cv;
        using namespace cv::xfeatures2d;        // this namespace is new in 3.0
        using namespace std;
        
        void help( char** argv )
        {
            std::cout << "\nUsage: " << argv[0] << " [path/to/image1] [path/to/image2] \n"
                      << "This is an example on how to use the keypoint descriptor presented in the following paper: \n"
                      << "A. Alahi, R. Ortiz, and P. Vandergheynst. FREAK: Fast Retina Keypoint. \n"
                      << "In IEEE Conference on Computer Vision and Pattern Recognition, 2012. CVPR 2012 Open Source Award winner \n"
                      << std::endl;
        }
        
        int main( int argc, char** argv ) {
            // check http://opencv.itseez.com/doc/tutorials/features2d/table_of_content_features2d/table_of_content_features2d.html
            // for OpenCV general detection/matching framework details
        
            if( argc != 3 ) {
                help(argv);
                return -1;
            }
        
            // Load images
            Mat imgA = imread(argv[1], IMREAD_GRAYSCALE );
            if( !imgA.data ) {
                std::cout<< " --(!) Error reading image " << argv[1] << std::endl;
                return -1;
            }
        
            Mat imgB = imread(argv[2], IMREAD_GRAYSCALE );
            if( !imgB.data ) {
                std::cout << " --(!) Error reading image " << argv[2] << std::endl;
                return -1;
            }
        
            std::vector<KeyPoint> keypointsA, keypointsB;
            Mat descriptorsA, descriptorsB;
            std::vector<DMatch> matches;
        
            // DETECTION
            // Any openCV detector such as
            Ptr<cv::xfeatures2d::SurfFeatureDetector> detector;
            detector = cv::xfeatures2d::SurfFeatureDetector::create(2000);
        
            // DESCRIPTOR
            Ptr<FREAK> extractor = FREAK::create();
        
            // MATCHER
            // The standard Hamming distance can be used such as
            BFMatcher matcher(NORM_HAMMING);        // match with the Hamming distance
            
            // detect
            double t = (double)getTickCount();
            detector->detect( imgA, keypointsA );
            detector->detect( imgB, keypointsB );
            t = ((double)getTickCount() - t)/getTickFrequency();
            std::cout << "detection time [s]: " << t << std::endl;
        
            // extract
            t = (double)getTickCount();
            extractor->compute( imgA, keypointsA, descriptorsA );
            extractor->compute( imgB, keypointsB, descriptorsB );
            t = ((double)getTickCount() - t)/getTickFrequency();
            std::cout << "extraction time [s]: " << t << std::endl;
        
            // match
            t = (double)getTickCount();
            matcher.match(descriptorsA, descriptorsB, matches);
            t = ((double)getTickCount() - t)/getTickFrequency();
            std::cout << "matching time [s]: " << t << std::endl;
        
            // Draw matches
            Mat imgMatch;
            drawMatches(imgA, keypointsA, imgB, keypointsB, matches, imgMatch); // built-in OpenCV helper
        
            namedWindow("matches", WINDOW_KEEPRATIO);
            imshow("matches", imgMatch);
            waitKey(0);
            return 0;
        }
    