Data processing: initial alignment of map2world


In GNSS + lidar fused localization there is the following problem: only with an accurate static extrinsic of the map frame relative to the world frame can the GNSS and lidar localization results from the same vehicle be mapped into one common frame, and only when the two positions agree to within roughly 20 cm is the condition for data fusion satisfied. The usual practice is to record the latitude, longitude, altitude and heading at the mapping start position and use them as the map-to-world static extrinsic during lidar global localization. However, if only this single point is used as the reference, the two trajectories come out like this:
[Figure: trajectories using only the single-point extrinsic. Green: GNSS localization; yellow: lidar localization.]
The deviation may come from errors in the mechanical mounting extrinsics between the lidar and the INS, or from the accuracy of the map itself. How can we keep the two trajectories as consistent as possible?

One option is to inspect and manually tune the map2world static extrinsic. The approach used here is instead to register the GNSS and lidar localization positions and add the resulting correction to the static extrinsic as a compensation. It is essentially a least-squares problem: pull the lidar-matched positions toward the GNSS positions, and solve for R and t with the ICP implementation in PCL.
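To make the least-squares step concrete, here is a minimal sketch using PCL's SVD-based estimator, which computes the same closed-form rigid transform that ICP solves once correspondences are fixed. The function name and the assumption that the two clouds are index-paired and of equal size are mine, not from the original code.

#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/registration/transformation_estimation_svd.h>

// Closed-form least-squares: given index-paired lidar/GNSS positions (same size, point i of
// one cloud corresponds to point i of the other), estimate the rigid transform T that
// minimizes sum_i || T * laser_i - gnss_i ||^2 via the Umeyama/SVD solution.
Eigen::Matrix4f estimateCorrection(const pcl::PointCloud<pcl::PointXYZ>& laser,
                                   const pcl::PointCloud<pcl::PointXYZ>& gnss)
{
    pcl::registration::TransformationEstimationSVD<pcl::PointXYZ, pcl::PointXYZ> est;
    Eigen::Matrix4f T = Eigen::Matrix4f::Identity();
    est.estimateRigidTransformation(laser, gnss, T);
    return T;
}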
[Figure: trajectories after registration. Green: GNSS localization; red: lidar positions after registration.]
The solved R and t are:
trans: -0.00119538, -0.00497306, 0.000654312
rpy: -171.906, -177.667, 177.856
Note that the Euler-angle decomposition is not unique: roll/pitch/yaw values that are all close to ±180°, as printed here, compose to a rotation close to identity, so the estimated correction is in fact small. Adding this compensation to the map2world static extrinsic keeps the lidar-matched pose consistent with the GNSS pose as far as possible.
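As a sketch of how that compensation might be applied, assuming the current map2world static extrinsic is available as an Eigen isometry and that both trajectories were expressed in the world frame when the ICP was run (the variable and function names here are illustrative):

#include <Eigen/Geometry>

// T_correction: the 4x4 matrix returned by ICP (it maps lidar positions onto GNSS positions
// in the world frame). T_map2world_old: the static extrinsic recorded at the mapping start.
// Because the registration was done on world-frame positions, the correction is applied on
// the world side: p_world = T_correction * T_map2world_old * p_map.
Eigen::Isometry3f compensateExtrinsic(const Eigen::Matrix4f& T_correction,
                                      const Eigen::Isometry3f& T_map2world_old)
{
    Eigen::Isometry3f corr = Eigen::Isometry3f::Identity();
    corr.matrix() = T_correction;     // wrap the ICP result as an isometry
    return corr * T_map2world_old;    // compensated map2world extrinsic
}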
Key source code:

ICP3D::ICP3D(ros::NodeHandle private_nh)
{
    /* Trajectory CSV files exported in advance; one "x,y,z,..." line per pose */
    gnss_read_.open("/trajectory_data/gnss_trajectory.csv", ios::out | ios::in);
    laser_read_.open("/trajectory_data/laser_trajectory.csv", ios::out | ios::in);
    gnss_point.reset(new pcl::PointCloud<pcl::PointXYZ>());
    laser_point.reset(new pcl::PointCloud<pcl::PointXYZ>());

    /* Loading parameters */
    private_nh.param("transformation_epsilon", _transformation_epsilon, 0.01);
    ROS_INFO("transformation_epsilon: %f", _transformation_epsilon);
    private_nh.param("max_iterations", _max_iters, 75);
    ROS_INFO("max_iterations: %d", _max_iters);
    private_nh.param("euclidean_fitness_epsilon", _euclidean_fitness_epsilon, 0.1);
    ROS_INFO("euclidean_fitness_epsilon: %f", _euclidean_fitness_epsilon);
    private_nh.param("max_correspondence_distance", _max_correspondence_distance, 1.0);
    ROS_INFO("max_correspondence_distance: %f", _max_correspondence_distance);

    /* Publishers for visualising the raw and registered trajectories */
    gnss_pub_ = private_nh.advertise<sensor_msgs::PointCloud2>("/gnss_points", 3);
    laser_pub_ = private_nh.advertise<sensor_msgs::PointCloud2>("/laser_points", 3);
    transformed_pub_ = private_nh.advertise<sensor_msgs::PointCloud2>("/transformed_point", 3);
}
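The four ICP parameters above can be overridden at launch time. A possible launch fragment is sketched below; the package and node names are placeholders, only the parameter names come from the code:

<launch>
  <!-- Package/node names are placeholders; only the parameter names match the code above. -->
  <node pkg="trajectory_alignment" type="icp3d_node" name="icp3d" output="screen">
    <param name="transformation_epsilon"      value="0.01"/>
    <param name="max_iterations"              value="75"/>
    <param name="euclidean_fitness_epsilon"   value="0.1"/>
    <param name="max_correspondence_distance" value="1.0"/>
  </node>
</launch>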
/* Split a string on any of the delimiter characters (used to parse CSV lines) */
inline std::vector<std::string> split(const std::string& str, const std::string& delim)
{
    std::vector<std::string> res;
    if (str.empty())
        return res;
    char* strs = new char[str.length() + 1];
    strcpy(strs, str.c_str());

    char* p = strtok(strs, delim.c_str());
    while (p) {
        res.push_back(std::string(p));
        p = strtok(NULL, delim.c_str());
    }
    delete[] strs;   // release the temporary buffer to avoid leaking it on every call
    return res;
}
void ICP3D::Filter()
{
    std::string gnss_line;
    std::string laser_line;
    /* Read each CSV line (expected format: x,y,z,... in the world frame) */
    while (getline(gnss_read_, gnss_line))
        gnsssubArray.push_back(gnss_line);
    while (getline(laser_read_, laser_line))
        lasersubArray.push_back(laser_line);

    size_t gnss_size = gnsssubArray.size();
    size_t laser_size = lasersubArray.size();
    std::cout << "gnss size:" << gnss_size << "," << "laser size:" << laser_size << std::endl;
    gnss_point->reserve(gnss_size);
    laser_point->reserve(laser_size);

    for (size_t i = 0; i < gnss_size; i++)
    {
        vector<string> v = split(gnsssubArray[i], ",");
        if (v.size() < 3)
            continue;   // skip malformed lines
        pcl::PointXYZ point;
        point.x = stof(v[0]);
        point.y = stof(v[1]);
        point.z = stof(v[2]);
        gnss_point->push_back(point);
    }
    for (size_t i = 0; i < laser_size; i++)
    {
        vector<string> v = split(lasersubArray[i], ",");
        if (v.size() < 3)
            continue;
        pcl::PointXYZ point;
        point.x = stof(v[0]);
        point.y = stof(v[1]);
        point.z = stof(v[2]);
        laser_point->push_back(point);
    }
}
void ICP3D::publishData()
{
    sensor_msgs::PointCloud2 laser_msg;
    pcl::toROSMsg(*laser_point,laser_msg);
    laser_msg.header.stamp=ros::Time::now();
    laser_msg.header.frame_id="map";
    laser_pub_.publish(laser_msg);

    sensor_msgs::PointCloud2 gnss_msg;
    pcl::toROSMsg(*gnss_point,gnss_msg);
    gnss_msg.header.stamp=ros::Time::now();
    gnss_msg.header.frame_id="map";
    gnss_pub_.publish(gnss_msg);
    icp_matching(gnss_point,laser_point);
}
void ICP3D::icp_matching(const pcl::PointCloud<pcl::PointXYZ>::Ptr& gnss, const pcl::PointCloud<pcl::PointXYZ>::Ptr& laser)
{
    pcl::PointCloud<pcl::PointXYZ>::Ptr transformed_points(new pcl::PointCloud<pcl::PointXYZ>());
    pcl::IterativeClosestPoint<pcl::PointXYZ, pcl::PointXYZ> icp;
    icp.setTransformationEpsilon(_transformation_epsilon);
    icp.setMaximumIterations(_max_iters);
    icp.setMaxCorrespondenceDistance(_max_correspondence_distance);
    icp.setEuclideanFitnessEpsilon(_euclidean_fitness_epsilon);
    // Source: lidar trajectory, target: GNSS trajectory, so the result maps lidar onto GNSS
    icp.setInputSource(laser);
    icp.setInputTarget(gnss);

    Eigen::Matrix4f init_guess = Eigen::Matrix4f::Identity();
    icp.align(*transformed_points, init_guess);
    Eigen::Matrix4f curr_transformation = icp.getFinalTransformation();
    Eigen::Matrix3f mat;    // rotation matrix
    Eigen::Vector3f trans;  // translation vector
    trans << curr_transformation(0,3), curr_transformation(1,3), curr_transformation(2,3);
    mat << curr_transformation(0,0), curr_transformation(0,1), curr_transformation(0,2),
        curr_transformation(1,0), curr_transformation(1,1), curr_transformation(1,2),
        curr_transformation(2,0), curr_transformation(2,1), curr_transformation(2,2);
    Eigen::Vector3f rpy = mat.eulerAngles(0,1,2) * (180/M_PI);
    // Signs are flipped before printing; these are the values reported above
    std::cout << "trans:" << -trans(0) << "," << -trans(1) << "," << -trans(2) << std::endl;
    std::cout << "rpy:" << -rpy(0) << "," << -rpy(1) << "," << -rpy(2) << std::endl;

    // Publish the lidar trajectory transformed by the ICP result for visual comparison
    pcl::transformPointCloud(*laser, *transformed_points, curr_transformation);
    sensor_msgs::PointCloud2 transformed_msg;
    pcl::toROSMsg(*transformed_points, transformed_msg);
    transformed_msg.header.stamp = ros::Time::now();
    transformed_msg.header.frame_id = "map";
    transformed_pub_.publish(transformed_msg);
}
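For completeness, a minimal sketch of a driver for the class is given below. The original post does not show main() or the class header, so the node name and the call order are assumptions:

int main(int argc, char** argv)
{
    ros::init(argc, argv, "icp3d_alignment");   // node name is a placeholder
    ros::NodeHandle private_nh("~");

    ICP3D icp3d(private_nh);
    icp3d.Filter();        // parse both trajectory CSVs into point clouds
    icp3d.publishData();   // publish the clouds and run the ICP registration once

    ros::spin();
    return 0;
}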