A Simulator for the Development of Autonomous Robots
A simulator for the development of vision-based navigation, guidance and control algorithms for an autonomous flying robot is implemented. The simulator is a drop-in replacement for all system input/output and interfaces with the rest of the system as is. Synthetic data is generated in the form of inertial measurements, camera images and scanning laser range measurements. Computer graphics are used to render image projections of a virtual scene, emulating the calibrated intrinsic parameters of a specific camera. The effects of optical distortion from a wide-angle lens are emulated using iterative methods, and the apparent duality between image distortion and undistortion methods is presented. Depth images are produced to calculate distance measurements to specific points in the virtual scene, and a depth-buffer sampling model is created to generate point-cloud scans for a simulated lidar sensor. The sensor data is used as input to a sensor fusion filter providing navigation feedback to guidance and control algorithms running in real time. General six-degree-of-freedom rigid-body dynamics is simulated with a quadrotor actuation model responding to control input, closing the loop between control and navigation and allowing full system integration and testing on the desktop with minimal overhead. Visualization tools are demonstrated as a graphical interface for interacting with the headless simulation, providing real-time visual verification of system status.
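The abstract notes that wide-angle lens distortion is emulated with iterative methods, exploiting the duality between distorting and undistorting a point. A minimal sketch of that idea, assuming a Brown radial distortion model with illustrative coefficients `k1`, `k2` (not the thesis's calibrated camera parameters), inverts the distortion by fixed-point iteration:

```python
def distort(x, y, k1, k2):
    """Apply Brown radial distortion to normalized image coordinates."""
    r2 = x * x + y * y
    f = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * f, y * f

def undistort(xd, yd, k1, k2, iters=10):
    """Invert the distortion by fixed-point iteration: start at the
    distorted point and repeatedly divide out the radial factor
    evaluated at the current estimate of the undistorted point."""
    x, y = xd, yd
    for _ in range(iters):
        r2 = x * x + y * y
        f = 1.0 + k1 * r2 + k2 * r2 * r2
        x, y = xd / f, yd / f
    return x, y

# Round trip: distort a point, then recover it iteratively.
# Coefficient values here are purely illustrative.
k1, k2 = -0.25, 0.05
xd, yd = distort(0.4, -0.3, k1, k2)
xu, yu = undistort(xd, yd, k1, k2)
```

Because the radial factor varies slowly for moderate distortion, the iteration is a contraction and converges to the undistorted point in a handful of steps; the same machinery can be run in either direction, which is the distortion/undistortion duality the abstract refers to.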