
ORB-SLAM Code Walkthrough: SLAM System Initialization

2019-11-10 16:52:47

Contents: System.h · System::System in System.cc

Please credit the original source when reposting: http://blog.csdn.net/c602273091/article/details/54933760

System.h

This header pulls together everything the SLAM system needs, so reading it gives a good first overview of what the ORB-SLAM system contains. Before diving in, it helps to know the basics of multithreading: mutexes [1] and threads [2].
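As a warm-up, here is a toy sketch (not ORB-SLAM2 code; the Worker class and its members are made up for illustration) of the pattern System relies on: a member function running in its own std::thread, polling a flag that the main thread sets under a std::mutex.

#include <iostream>
#include <thread>
#include <mutex>
#include <chrono>

class Worker
{
public:
    Worker() : mbFinishRequested(false) {}

    // Runs in its own thread, the way LocalMapping::Run or LoopClosing::Run do.
    void Run()
    {
        while(true)
        {
            {
                std::unique_lock<std::mutex> lock(mMutexFinish);
                if(mbFinishRequested)
                    break;
            }
            std::this_thread::sleep_for(std::chrono::milliseconds(5));
        }
        std::cout << "worker finished" << std::endl;
    }

    // Called from the main thread, the way System::Shutdown requests a finish.
    void RequestFinish()
    {
        std::unique_lock<std::mutex> lock(mMutexFinish);
        mbFinishRequested = true;
    }

private:
    std::mutex mMutexFinish;
    bool mbFinishRequested;
};

int main()
{
    Worker worker;
    // Same form as "new thread(&ORB_SLAM2::LocalMapping::Run, mpLocalMapper)" in System.
    std::thread t(&Worker::Run, &worker);
    std::this_thread::sleep_for(std::chrono::milliseconds(50));
    worker.RequestFinish();
    t.join();
    return 0;
}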

The annotated header follows:

/**
* This file is part of ORB-SLAM2.
*
* Copyright (C) 2014-2016 Raúl Mur-Artal <raulmur at unizar dot es> (University of Zaragoza)
* For more information see <https://github.com/raulmur/ORB_SLAM2>
*
* ORB-SLAM2 is free software: you can redistribute it and/or modify
* it under the terms of the GNU General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* ORB-SLAM2 is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with ORB-SLAM2. If not, see <http://www.gnu.org/licenses/>.
*/

#ifndef SYSTEM_H
#define SYSTEM_H

#include <string>
#include <thread>
#include <opencv2/core/core.hpp>

#include "Tracking.h"
#include "FrameDrawer.h"
#include "MapDrawer.h"
#include "Map.h"
#include "LocalMapping.h"
#include "LoopClosing.h"
#include "KeyFrameDatabase.h"
#include "ORBVocabulary.h"
#include "Viewer.h"

namespace ORB_SLAM2
{

class Viewer;       // map/pose visualization
class FrameDrawer;  // draws each frame
class Map;          // operations on the map
class Tracking;     // the tracking process
class LocalMapping; // local mapping
class LoopClosing;  // loop closure detection

class System
{
public:
    // Input sensor
    // Supported input devices: monocular, stereo, RGB-D
    enum eSensor{
        MONOCULAR=0,
        STEREO=1,
        RGBD=2
    };

public:

    // Initialize the SLAM system. It launches the Local Mapping, Loop Closing and Viewer threads.
    // Initializes the SLAM system and starts the local mapping, loop closing and viewer threads
    System(const string &strVocFile, const string &strSettingsFile, const eSensor sensor, const bool bUseViewer = true);

    // Process the given stereo frame. Images must be synchronized and rectified.
    // Input images: RGB (CV_8UC3) or grayscale (CV_8U). RGB is converted to grayscale.
    // Returns the camera pose (empty if tracking fails).
    cv::Mat TrackStereo(const cv::Mat &imLeft, const cv::Mat &imRight, const double &timestamp);

    // Process the given rgbd frame. Depthmap must be registered to the RGB frame.
    // Input image: RGB (CV_8UC3) or grayscale (CV_8U). RGB is converted to grayscale.
    // Input depthmap: Float (CV_32F).
    // Returns the camera pose (empty if tracking fails).
    cv::Mat TrackRGBD(const cv::Mat &im, const cv::Mat &depthmap, const double &timestamp);

    // Process the given monocular frame
    // Input images: RGB (CV_8UC3) or grayscale (CV_8U). RGB is converted to grayscale.
    // Returns the camera pose (empty if tracking fails).
    // Tracks a single monocular image
    cv::Mat TrackMonocular(const cv::Mat &im, const double &timestamp);

    // This stops local mapping thread (map building) and performs only camera tracking.
    // Pauses local map building and keeps only camera tracking running; the name is a bit misleading
    void ActivateLocalizationMode();

    // This resumes local mapping thread and performs SLAM again.
    // Resumes the local mapping thread
    // The mode switch is protected with a std::mutex
    void DeactivateLocalizationMode();

    // Reset the system (clear map)
    // Resets the system and clears the map
    void Reset();

    // All threads will be requested to finish.
    // It waits until all threads have finished.
    // This function must be called before saving the trajectory.
    // Waits until every thread has finished its work, then shuts them down
    void Shutdown();

    // Save camera trajectory in the TUM RGB-D dataset format.
    // Call first Shutdown()
    // See format details at: http://vision.in.tum.de/data/datasets/rgbd-dataset
    void SaveTrajectoryTUM(const string &filename);

    // Save keyframe poses in the TUM RGB-D dataset format.
    // Use this function in the monocular case.
    // Call first Shutdown()
    // See format details at: http://vision.in.tum.de/data/datasets/rgbd-dataset
    void SaveKeyFrameTrajectoryTUM(const string &filename);

    // Save camera trajectory in the KITTI dataset format.
    // Call first Shutdown()
    // See format details at: http://www.cvlibs.net/datasets/kitti/eval_odometry.php
    // Saves poses in the KITTI dataset format
    void SaveTrajectoryKITTI(const string &filename);

    // TODO: Save/Load functions
    // SaveMap(const string &filename);
    // LoadMap(const string &filename);

private:

    // Input sensor
    // The input sensor type
    eSensor mSensor;

    // ORB vocabulary used for place recognition and feature matching.
    // Vocabulary used for feature matching and loop detection
    ORBVocabulary* mpVocabulary;

    // KeyFrame database for place recognition (relocalization and loop detection).
    // Where keyframes are stored for place recognition
    KeyFrameDatabase* mpKeyFrameDatabase;

    // Map structure that stores the pointers to all KeyFrames and MapPoints.
    // Stores all keyframes and map points
    Map* mpMap;

    // Tracker. It receives a frame and computes the associated camera pose.
    // It also decides when to insert a new keyframe, create some new MapPoints and
    // performs relocalization if tracking fails.
    // The tracker: receives a frame, computes the camera pose, decides when to insert
    // keyframes and create map points, and relocalizes when tracking fails
    Tracking* mpTracker;

    // Local Mapper. It manages the local map and performs local bundle adjustment.
    // Builds the local map and runs local bundle adjustment on it
    LocalMapping* mpLocalMapper;

    // Loop Closer. It searches loops with every new keyframe. If there is a loop it performs
    // a pose graph optimization and full bundle adjustment (in a new thread) afterwards.
    // Every inserted keyframe is checked for a loop; on detection, pose-graph
    // optimization and global BA follow
    LoopClosing* mpLoopCloser;

    // The viewer draws the map and the current camera pose. It uses Pangolin.
    // Views the map and camera pose with the Pangolin library
    Viewer* mpViewer;

    FrameDrawer* mpFrameDrawer;
    MapDrawer* mpMapDrawer;

    // System threads: Local Mapping, Loop Closing, Viewer.
    // The Tracking thread "lives" in the main execution thread that creates the System object.
    // Tracking runs in the main thread; local mapping, loop closing and the viewer
    // run in the three threads spawned below
    std::thread* mptLocalMapping;
    std::thread* mptLoopClosing;
    std::thread* mptViewer;

    // Reset flag
    std::mutex mMutexReset;
    bool mbReset;

    // Change mode flags
    std::mutex mMutexMode;
    bool mbActivateLocalizationMode;
    bool mbDeactivateLocalizationMode;
};

}// namespace ORB_SLAM

#endif // SYSTEM_H

This gives us a rough picture of ORB-SLAM.
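With the public interface in view, here is a minimal sketch of how an application drives it, modeled on the monocular examples shipped with ORB_SLAM2. The vocabulary path, settings file and image names are placeholders, and the dataset would normally come from a proper loader.

#include <iostream>
#include <vector>
#include <string>
#include <opencv2/core/core.hpp>
#include <opencv2/imgcodecs.hpp>
#include "System.h"

int main()
{
    // The constructor spawns the Local Mapping, Loop Closing and Viewer threads.
    ORB_SLAM2::System SLAM("ORBvoc.txt", "camera.yaml",
                           ORB_SLAM2::System::MONOCULAR, true);

    // Placeholder image list and timestamps.
    std::vector<std::string> vstrImages = { "000000.png", "000001.png" };
    std::vector<double> vTimestamps    = { 0.0, 0.033 };

    for(std::size_t i = 0; i < vstrImages.size(); ++i)
    {
        cv::Mat im = cv::imread(vstrImages[i], cv::IMREAD_UNCHANGED);
        if(im.empty())
            continue;

        // Returns the camera pose Tcw, or an empty matrix if tracking failed.
        cv::Mat Tcw = SLAM.TrackMonocular(im, vTimestamps[i]);
        if(Tcw.empty())
            std::cout << "tracking failed at frame " << i << std::endl;
    }

    // Stop all threads before saving the trajectory.
    SLAM.Shutdown();
    SLAM.SaveKeyFrameTrajectoryTUM("KeyFrameTrajectory.txt");
    return 0;
}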

Next, let's look at System::System(const string &strVocFile, const string &strSettingsFile, const eSensor sensor, const bool bUseViewer).

System::System in System.cc

Using Understand's control-flow view cuts straight to the point and shows at a glance how the pieces connect; with the flow chart in hand, the code is much easier to follow. [Figure: control-flow graph of System::System; image not reproduced here.]

With that in mind, the full constructor is annotated below.

System::System(const string &strVocFile, const string &strSettingsFile, const eSensor sensor,
               const bool bUseViewer):
    mSensor(sensor), mbReset(false), mbActivateLocalizationMode(false),
    mbDeactivateLocalizationMode(false)
{
    // Output welcome message
    cout << endl <<
    "ORB-SLAM2 Copyright (C) 2014-2016 Raul Mur-Artal, University of Zaragoza." << endl <<
    "This program comes with ABSOLUTELY NO WARRANTY;" << endl <<
    "This is free software, and you are welcome to redistribute it" << endl <<
    "under certain conditions. See LICENSE.txt." << endl << endl;

    cout << "Input sensor was set to: ";

    // Report which sensor type was selected
    if(mSensor==MONOCULAR)
        cout << "Monocular" << endl;
    else if(mSensor==STEREO)
        cout << "Stereo" << endl;
    else if(mSensor==RGBD)
        cout << "RGB-D" << endl;

    //Check settings file
    // Check that the settings file exists and can be parsed.
    // cv::FileStorage reads XML/YAML configuration files;
    // after this call the YAML settings are loaded into fsSettings.
    cv::FileStorage fsSettings(strSettingsFile.c_str(), cv::FileStorage::READ);
    if(!fsSettings.isOpened())
    {
        cerr << "Failed to open settings file at: " << strSettingsFile << endl;
        exit(-1);
    }

    //Load ORB Vocabulary
    // Load the ORB vocabulary into mpVocabulary
    cout << endl << "Loading ORB Vocabulary. This could take a while..." << endl;

    mpVocabulary = new ORBVocabulary();
    bool bVocLoad = mpVocabulary->loadFromTextFile(strVocFile);
    if(!bVocLoad)
    {
        cerr << "Wrong path to vocabulary. " << endl;
        cerr << "Failed to open at: " << strVocFile << endl;
        exit(-1);
    }
    cout << "Vocabulary loaded!" << endl << endl;

    //Create KeyFrame Database
    // Create the keyframe database
    mpKeyFrameDatabase = new KeyFrameDatabase(*mpVocabulary);

    //Create the Map
    mpMap = new Map();

    //Create Drawers. These are used by the Viewer
    // Create the frame and map drawers used by the viewer
    mpFrameDrawer = new FrameDrawer(mpMap);
    mpMapDrawer = new MapDrawer(mpMap, strSettingsFile);

    //Initialize the Tracking thread
    //(it will live in the main thread of execution, the one that called this constructor)
    mpTracker = new Tracking(this, mpVocabulary, mpFrameDrawer, mpMapDrawer,
                             mpMap, mpKeyFrameDatabase, strSettingsFile, mSensor);

    //Initialize the Local Mapping thread and launch
    mpLocalMapper = new LocalMapping(mpMap, mSensor==MONOCULAR);
    mptLocalMapping = new thread(&ORB_SLAM2::LocalMapping::Run, mpLocalMapper);

    //Initialize the Loop Closing thread and launch
    mpLoopCloser = new LoopClosing(mpMap, mpKeyFrameDatabase, mpVocabulary, mSensor!=MONOCULAR);
    mptLoopClosing = new thread(&ORB_SLAM2::LoopClosing::Run, mpLoopCloser);

    //Initialize the Viewer thread and launch
    mpViewer = new Viewer(this, mpFrameDrawer, mpMapDrawer, mpTracker, strSettingsFile);
    if(bUseViewer)
        mptViewer = new thread(&Viewer::Run, mpViewer);

    mpTracker->SetViewer(mpViewer);

    //Set pointers between threads
    mpTracker->SetLocalMapper(mpLocalMapper);
    mpTracker->SetLoopClosing(mpLoopCloser);

    mpLocalMapper->SetTracker(mpTracker);
    mpLocalMapper->SetLoopCloser(mpLoopCloser);

    mpLoopCloser->SetTracker(mpTracker);
    mpLoopCloser->SetLocalMapper(mpLoopCloser ? mpLocalMapper : mpLocalMapper);
}
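The settings check at the top relies on cv::FileStorage. As an isolated sketch, the snippet below reads a few entries the same way the constructor (and later Tracking) does; "camera.yaml" and the key names follow the typical ORB-SLAM2 example configs and may differ for your setup.

#include <iostream>
#include <opencv2/core/core.hpp>

int main()
{
    // Open the YAML settings file for reading.
    cv::FileStorage fsSettings("camera.yaml", cv::FileStorage::READ);
    if(!fsSettings.isOpened())
    {
        std::cerr << "Failed to open settings file" << std::endl;
        return -1;
    }

    // FileNode converts implicitly to numeric types.
    float fx  = fsSettings["Camera.fx"];
    float fy  = fsSettings["Camera.fy"];
    float fps = fsSettings["Camera.fps"];

    std::cout << "fx=" << fx << " fy=" << fy << " fps=" << fps << std::endl;
    return 0;
}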

Next we will look at what initialization does for each class: the vocabulary, the keyframe database, the map, the local mapper, and so on.

References:
[1] mutex: http://www.cplusplus.com/reference/mutex/
[2] thread: http://www.cplusplus.com/reference/thread/thread/

