A Simulator for Autonomous Ground Vehicles
MAVS provides several example codes, which are described below.
An actor is any dynamic (movable) object in MAVS. This code provides an example of an actor in a MAVS simulation.
Usage:
>./actor_example scene.json actor.json
Where “scene.json” examples can be found in mavs/data/scenes and “actor.json” examples can be found in mavs/data/actors. The example will create two windows. One window is a camera simulation that automatically follows the HMMWV actor. The other is a free camera window that can be moved through the scene with the W-A-S-D, Page Up, Page Down, and arrow keys.
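For example, a typical invocation might look like the following (the file names are illustrative; use any scene and actor files from the directories above):
>./actor_example mavs/data/scenes/example_scene.json mavs/data/actors/example_actor.json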
This is an example of a MAVS batch simulation for generating labeled training data.
Usage:
>./batch_halo_simulation (hotstart_num)
“hotstart_num” is an optional parameter that restarts the batch simulation at a given frame. There are 10,725 frames of output, and each frame saves multiple image, lidar, and annotation files.
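For example, to restart the batch simulation at frame 5000 (an arbitrary frame index chosen for illustration):
>./batch_halo_simulation 5000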
An example of a MAVS RGB camera sensor simulation.
Usage:
>./camera_example scene.json env.json camera.json
Where “scene.json” examples can be found in mavs/data/scenes and “camera.json” examples can be found in mavs/data/sensors/cameras.
The output will be a camera window that moves through the scene in the X (east) direction.
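For example (the file names are illustrative; substitute any valid scene, environment, and camera input files):
>./camera_example mavs/data/scenes/example_scene.json example_env.json mavs/data/sensors/cameras/example_camera.json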
An example sensor placement analysis with MAVS. Finds the coverage statistics of two LIDAR sensors rotated through various mounting angles on the front of a vehicle.
Usage:
>./forester_example sensor_inputs.json
The file “sensor_inputs.json” is specific to this example, and a sample can be found in mavs/data/sims/misc_sims/forester_sim.json.
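For example, using the sample input mentioned above:
>./forester_example mavs/data/sims/misc_sims/forester_sim.json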
Similar to forester_example, but with simplified analysis.
Usage:
>./forester_example_simple sensor_inputs.json
The file “sensor_inputs.json” is specific to this example, and a sample can be found in mavs/data/sims/misc_sims/forester_sim.json.
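For example, the same sample input can be used here:
>./forester_example_simple mavs/data/sims/misc_sims/forester_sim.json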
Example that demonstrates how MAVS can be used to load and write out Applanix heightmap files (.hmf).
Usage:
>./hmf_read_write_example heightmap.hmf
The file “heightmap.hmf” is a binary input file in the Applanix .hmf format. An example can be found in mavs/data/hmf_files.
The code will load the hmf file, then write it back out. A window displaying the hmf file will pop up; close the window to finish the program.
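For example (the file name is illustrative; use any .hmf file from the directory above):
>./hmf_read_write_example mavs/data/hmf_files/example_heightmap.hmf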
Example of a user-controllable camera view of a MAVS scene.
Usage:
>./free_camera_example scene.json
Where “scene.json” examples can be found in mavs/data/scenes
Fly the camera around the scene with the W-A-S-D keys. Page Up and Page Down move the camera up and down. The arrow keys rotate the view (left, right, up, down). The Home and End keys roll the camera. Close the view window to finish the program.
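For example (the scene file name is illustrative):
>./free_camera_example mavs/data/scenes/example_scene.json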
Generates labeled lidar data in a given scene along a given path.
Usage:
>./halo_lidar_trainer scene.json anvel_replay.vprp
Where “scene.json” examples can be found in mavs/data/scenes and “anvel_replay.vprp” is an ANVEL replay file in text format, with examples in mavs/data/waypoints.
The output of the example is annotated lidar data.
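For example (the file names are illustrative; substitute any scene and replay files from the directories above):
>./halo_lidar_trainer mavs/data/scenes/example_scene.json mavs/data/waypoints/example_replay.vprp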
Demonstrates a MAVS lidar simulation.
Usage:
>./lidar_example scene.json (lidar_num) (rain_rate)
Where “scene.json” examples can be found in mavs/data/scenes.
lidar_num is optional and specifies the type of lidar sensor.
The default LIDAR is the SICK LMS-291.
rain_rate is an optional parameter and specifies the rain rate in mm/h. Typical values are 2.5-25.0.
The simulation will save several output files.
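For example, to run with a rain rate of 10.0 mm/h (the scene file name is illustrative and the lidar_num value is arbitrary):
>./lidar_example mavs/data/scenes/example_scene.json 1 10.0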
Demonstrates a MAVS radar simulation.
Usage:
>./radar_example mavs_scene_file.json
Where “mavs_scene_file.json” examples can be found in mavs/data/scenes.
The simulation will save two files, “camera.bmp” and “radar.bmp”. “camera.bmp” shows the view from the position of the radar. “radar.bmp” shows the radar output with detected targets in yellow.
Demonstrates a MAVS closed-loop autonomy simulation with vehicle, driver, and sensors in the loop.
Usage:
>./simulation_example scene.json (lidar_num)
or, if the code was built with MPI enabled:
>mpirun -np 6 ./simulation_example scene.json (lidar_num)
Where “scene.json” examples can be found in mavs/data/scenes. When using MPI, the simulation requires at least 6 processors to run.
lidar_num is optional and specifies the type of lidar sensor.
Demonstrates how to use MAVS MPI framework to simulate multiple vehicles.
Usage:
>./multi_vehicle_example scene.json
or, if the code was built with MPI enabled:
>mpiexec -np 8 ./multi_vehicle_example scene.json
Where “scene.json” examples can be found in mavs/data/scenes. When using the MPI version of the code, this example requires 8 processors to run.
The code will display two camera windows showing separate vehicles moving through the simulated scene.
Usage:
>./utest_mavs_camera
The correct output is a camera frame, rotating around a simple scene with a red sphere, a yellow box, and a green surface.
Press Ctrl+C to stop the simulation.
Unit test to evaluate the MAVS-Chrono vehicle interface.
Usage:
>./utest_mavs_chrono_vehicle chrono_vehicle_input_file.json
Where “chrono_vehicle_input_file.json” gives the path to the Chrono data directory and the different vehicle configurations to use. An example can be found in “mavs/data/vehicles/chrono_inputs/hmmwv_windows.json”.
The simulation will run for 15 seconds of simulated time, printing a state update every second. The vehicle drives in a straight line and should reach a maximum speed of around 38.9 m/s.
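For example, using the sample input mentioned above:
>./utest_mavs_chrono_vehicle mavs/data/vehicles/chrono_inputs/hmmwv_windows.json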
Usage:
>./utest_mavs_camera_distortion
Creates a camera distortion model, then prints the undistorted and distorted positions of a set of test pixels.
The correct output is:
Undistorted.u Undistorted.v Distorted.u Distorted.v
0 0 23.3065 18.3965
0 483 23.6183 466.037
603 0 581.996 18.3743
603 483 582.06 466.059
302 242 303.22 242.83
Usage:
>./utest_mavs_gps
Creates a differential and a dual-band GPS and runs two iterations of each sensor. In the first simulation, the GPS sensors are run in completely “open-field” conditions with no satellite occlusion or multipath errors. In the second, a simple scene with buildings surrounding the GPS units is created, causing multipath and dilution of precision errors. Correct errors should be on the order of 10-100 centimeters for the dual-band GPS and a few centimeters for the differential GPS.
Usage:
>./utest_mavs_lidar
If working correctly, the program should show a simple scene with a red sphere, a green floor, and a yellow box. Smoke will rise from the floor in front of the sphere. A top-down rendering of the lidar scan will appear in a second window.
Press Ctrl+C to kill the simulation.
Usage:
>./utest_mavs_radar
If working correctly, the program should show a simple scene with several columns. A camera and a radar will rotate through the scene, and the radar returns will show up in yellow.
Usage:
>./utest_mavs_spherical_camera scene.json
Where “scene.json” examples can be found in mavs/data/scenes.
The result is a rendering of a spherical projection of the input scene. Two files will be saved, “simple_render.bmp” and “spherical_projection.bmp”.