
VOXL Vision Hub

The voxl-vision-hub package contains the code necessary for vision and autonomy tasks such as autonomous flight, and ships with 3 examples: flying a figure 8, following an AprilTag, and following a spline trajectory. It subscribes to multiple services to obtain flight-critical sensor data and uses voxl-mavlink-server to send commands to a flight controller such as PX4.

voxl-vision-hub also contains code critical for many other services, such as voxl-calibrate-px4-horizon, MAVROS, and MAVSDK. For this reason, you will often find that voxl-vision-hub must be running in the background for a given service to work.

If you want to develop applications that involve flight control without using ROS, voxl-vision-hub is the recommended starting point and will save you thousands of lines of code.

How it Works

All of the source code for the project lives in the voxl-vision-hub GitLab repository. Here, we’ll walk through what each .c file does, which should give a good sense of how voxl-vision-hub works as a whole.
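
Most of these modules follow the same pattern: open a pipe published by another service and register a callback that fires when new data arrives. Below is a minimal client sketch using libmodal_pipe’s simple-helper API; the function names and flag follow recent SDK headers, but treat the exact signatures as assumptions and check modal_pipe_client.h on your image.

#include <stdio.h>
#include <unistd.h>
#include <modal_pipe_client.h>

#define CH 0 // client channel number, our choice

// run by the helper thread every time new data arrives on the pipe
static void data_cb(int ch, char* data, int bytes, void* context)
{
    printf("received %d bytes on channel %d\n", bytes, ch);
}

int main(void)
{
    pipe_client_set_simple_helper_cb(CH, data_cb, NULL);
    // "qvio" is the VIO pipe named in the factory config below
    if(pipe_client_open(CH, "qvio", "example-client",
                        CLIENT_FLAG_EN_SIMPLE_HELPER, 1024) < 0){
        fprintf(stderr, "failed to open pipe\n");
        return -1;
    }
    while(1) sleep(1); // the helper thread does the reading
    return 0;
}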

  • autopilot_monitor.c: subscribes to the mavlink_onboard pipe, through which voxl-mavlink-server publishes data sent by PX4, and saves the relevant state (e.g. attitude, flight mode, arming state)
  • config_file.c: reads/writes from config file /etc/modalai/voxl-vision-hub.conf
  • control_input.c: reads the control pipe
  • fixed_pose_input.c: accepts a secondary (non-VIO) pose input to correct for VIO drift; this should generally be a slow but accurate source (e.g. an occasional AprilTag observation)
  • geometry.c: contains helper functions for transforming between coordinate frames, used for VIO, VOA, and relocalization (see the sketch after this list)
  • horizon_cal.c: calibrates the IMU horizon (finds the orientation at which the drone hovers perfectly still). This is not actively used by any of the demos in voxl-vision-hub, but depends on multiple .c files here, so it is part of this project. To perform a horizon calibration, voxl-vision-hub must be running in the background while voxl-calibrate-px4-horizon is run.
  • horizon_cal_file.c: reads/writes the result of the horizon calibration to a JSON file on disk
  • main.c: starts every module in voxl-vision-hub, then starts offboard mode (which will cause the drone to fly a figure 8, follow an AprilTag, or fly a spline trajectory).
  • mavlink_for_ros.c: opens the UDP port for MAVROS and MAVSDK to talk to voxl-mavlink-server. This is not actively used by the demos in voxl-vision-hub, but is grouped with the voxl-vision-hub project (this may change in the future). voxl-vision-hub must be running in the background for MAVROS and MAVSDK to work.
  • mavlink_io.c: functions to read/write MAVLink commands to an autopilot (e.g. PX4) or GCS (e.g. QGC)
  • misc.c: assorted small helpers that don’t fit in any other module
  • obs_pc_filter.c: obstacle point cloud filter; stores a ring buffer holding 2 seconds of point cloud data, performs temporal filtering, and returns a final clean point cloud at a constant 30 Hz (regardless of sensor speed)
  • offboard_figure_eight.c: waits for the system to be armed and for offboard mode to be engaged, then commands the drone to fly a figure 8 around the position where offboard mode was engaged (it generates a path, then continually follows the waypoints along it) for as long as offboard mode remains engaged
  • offboard_follow_tag.c: (may be outdated and no longer functional) waits for the system to be armed and for offboard mode to be engaged, then detects and follows an AprilTag with the ID specified in the config file for as long as offboard mode remains engaged
  • offboard_mode.c: determines which offboard mode to initialize based on the offboard_mode setting in the config file
  • offboard_trajectory.c: reads the plan_msgs pipe, which accepts a polynomial trajectory as input, and commands the drone to follow that trajectory as long as the drone is armed and offboard mode is engaged. voxl-mapper is an example of a service that writes into this pipe; combined with voxl-mapper, the drone is capable of obstacle-avoiding trajectory generation and following. Note that offboard_trajectory.c expects the input path to begin at the drone’s current position.
  • pose_filter.c: takes multiple AprilTag localizations from tag_manager.c and applies a low-pass filter to produce a final pose (this is not used by the demos in voxl-vision-hub, but is used by the Relocalization tool)
  • tag_manager.c: subscribes to the tag_detections pipe, which is written to by voxl-tag-detector, to get pose data to feed to pose_filter.c
  • trajectory_monitor.c: for “trajectory” offboard mode only; predicts whether collisions will occur based on current pose, VOA data, and trajectory
  • transform_ringbuf.c: ring buffer of transforms (timestamp, vector, rotation mat), used for multiple types of transforms (this is more of a library file). TODO: migrate this file to librc_math
  • vio_manager.c: takes the VIO output and transforms it from the drone body (CoM) frame to the local frame (a fixed origin on the floor, aligned with gravity). Also runs a health check on VIO to determine whether it is good, lost, or has reset.
  • voa_manager.c: subscribes to a user-specified selection of pipes from obstacle sensors (e.g. ToF sensors) and converts obstacle distances (using the sensor FoV defined in the config file) into approximate point clouds stored in a buffer from transform_ringbuf.c
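
Several of these modules reduce to the same core operation implemented in geometry.c: rotating a point between coordinate frames and translating it. Below is a minimal sketch of the body-to-local transform; the names are illustrative, not actual voxl-vision-hub symbols, and it assumes the NED convention PX4 uses (z pointing down).

#include <stdio.h>

// p_local = R_body_to_local * p_body + T_body_wrt_local
static void transform_point(const double R[3][3], const double T[3],
                            const double p_body[3], double p_local[3])
{
    for(int i = 0; i < 3; i++){
        p_local[i] = T[i];
        for(int j = 0; j < 3; j++) p_local[i] += R[i][j] * p_body[j];
    }
}

int main(void)
{
    double R[3][3]   = {{1,0,0},{0,1,0},{0,0,1}}; // identity: level, facing north
    double T[3]      = {0, 0, -1};                // hovering 1 m up (NED: -z is up)
    double p_body[3] = {0.2, 0, 0};               // a point 20 cm ahead of the CoM
    double p_local[3];
    transform_point(R, T, p_body, p_local);
    printf("local: %.2f %.2f %.2f\n", p_local[0], p_local[1], p_local[2]);
    return 0;
}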

Testing VOXL Vision Hub

voxl-vision-hub has 3 demonstrations of autonomous flight that can run out of the box.

Configuration

Here is what the voxl-vision-hub configuration file (/etc/modalai/voxl-vision-hub.conf) looks like (this particular example is the factory state of a Starling):

{
        "config_file_version":  1,
        "en_localhost_mavlink_udp":     false,
        "localhost_udp_port_number":    14551,
        "en_vio":       true,
        "vio_pipe":     "qvio",
        "secondary_vio_pipe":   "mvio",
        "en_reset_vio_if_initialized_inverted": true,
        "vio_warmup_s": 3,
        "send_odom_while_failed":       false,
        "horizon_cal_tolerance":        0.5,
        "offboard_mode":        "off",
        "follow_tag_id":        0,
        "figure_eight_move_home":       true,
        "robot_radius": 0.300000011920929,
        "collision_sampling_dt":        0.1,
        "max_lookahead_distance":       1,
        "en_tag_fixed_frame":   false,
        "fixed_frame_filter_len":       5,
        "en_transform_mavlink_pos_setpoints_from_fixed_frame":  false,
        "en_voa":       true,
        "voa_upper_bound_m":    -0.15000000596046448,
        "voa_lower_bound_m":    0.15000000596046448,
        "voa_voa_memory_s":     1,
        "voa_max_pc_per_fusion":        100,
        "voa_pie_max_dist_m":   20,
        "voa_pie_min_dist_m":   0.25,
        "voa_pie_under_trim_m": 1,
        "voa_pie_threshold":    3,
        "voa_send_rate_hz":     20,
        "voa_pie_slices":       36,
        "voa_pie_bin_depth_m":  0.15000000596046448,
        "voa_inputs":   [{
                        "enabled":      true,
                        "type": "point_cloud",
                        "input_pipe":   "dfs_point_cloud",
                        "frame":        "stereo_l",
                        "max_depth":    8,
                        "min_depth":    0.300000011920929,
                        "cell_size":    0.079999998211860657,
                        "threshold":    4,
                        "x_fov_deg":    68,
                        "y_fov_deg":    56,
                        "conf_cutoff":  0
                }, {
                        "enabled":      true,
                        "type": "point_cloud",
                        "input_pipe":   "stereo_front_pc",
                        "frame":        "stereo_front_l",
                        "max_depth":    8,
                        "min_depth":    0.300000011920929,
                        "cell_size":    0.079999998211860657,
                        "threshold":    4,
                        "x_fov_deg":    68,
                        "y_fov_deg":    56,
                        "conf_cutoff":  0
                }, {
                        "enabled":      true,
                        "type": "point_cloud",
                        "input_pipe":   "stereo_rear_pc",
                        "frame":        "stereo_rear_l",
                        "max_depth":    8,
                        "min_depth":    0.300000011920929,
                        "cell_size":    0.079999998211860657,
                        "threshold":    4,
                        "x_fov_deg":    68,
                        "y_fov_deg":    56,
                        "conf_cutoffTo get the b   0.079999998211860657,
                        "threshold":    3,
                        "x_fov_deg":    106.5,
                        "y_fov_deg":    85.0999984741211,
                        "conf_cutoff":  125
                }, {
                        "enabled":      true,
                        "type": "rangefinder",
                        "input_pipe":   "rangefinders",
                        "frame":        "body",
                        "max_depth":    8,
                        "min_depth":    0.300000011920929,
                        "cell_size":    0.079999998211860657,
                        "threshold":    4,
                        "x_fov_deg":    68,
                        "y_fov_deg":    56,
                        "conf_cutoff":  0
                }]
}

This file may need to be edited depending on your setup and desired behavior. An extensive description of every parameter in this config file can be found on GitLab.

Ensure that every parameter is reasonable before attempting flight.

Importantly, to test one of the 3 example offboard modes (figure 8, follow AprilTag, follow trajectory), set offboard_mode to one of the following values:

  • “figure_eight”
  • “follow_tag”
  • “trajectory”
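
For example, to arm the figure 8 demo, the relevant line of /etc/modalai/voxl-vision-hub.conf would read:

        "offboard_mode":        "figure_eight",

voxl-vision-hub reads its config file at startup (via config_file.c described above), so restart the service after editing for the change to take effect.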

Other Requirements to Run

voxl-vision-hub itself subscribes to multiple pipes published by other services. These services must be running for voxl-vision-hub to work (a quick programmatic check is sketched after this list):

  • voxl-mavlink-server
  • voxl-imu-server
  • voxl-qvio-server
  • voxl-camera-server
  • voxl-tag-detector (if running “follow_tag” mode)
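
The voxl-inspect-services tool (if installed) gives a quick overview of what is running. For a scripted pre-flight check, the sketch below simply verifies that the expected pipes exist; it assumes the standard MPA pipe location /run/mpa/ used on current VOXL images, with pipe names taken from this page.

#include <stdio.h>
#include <unistd.h>

int main(void)
{
    // VIO, PX4 telemetry, and the tag detections needed for "follow_tag"
    const char* pipes[] = {
        "/run/mpa/qvio",
        "/run/mpa/mavlink_onboard",
        "/run/mpa/tag_detections",
    };
    int ok = 1;
    for(unsigned i = 0; i < sizeof(pipes)/sizeof(pipes[0]); i++){
        if(access(pipes[i], F_OK) != 0){
            printf("MISSING: %s\n", pipes[i]);
            ok = 0;
        }
    }
    if(ok) printf("all required pipes present\n");
    return ok ? 0 : -1;
}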

Note that, for “trajectory” mode, another service such as voxl-mapper (which does 3D mapping and trajectory generation) must be publishing trajectories to the plan_msgs pipe.
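
To make the trajectory input concrete: a polynomial trajectory is a list of segments, each with a duration and per-axis coefficient arrays, and the follower evaluates each axis as a polynomial in time. The struct below is illustrative only (the real wire format is defined in libmodal_pipe’s interface headers); evaluation uses Horner’s method.

#include <stdio.h>

#define MAX_COEF 12

typedef struct {
    int    n_coef;       // number of polynomial coefficients in use
    double duration_s;   // how long this segment lasts
    double cx[MAX_COEF]; // x-axis coefficients, lowest order first
} segment_t;             // real segments also carry y, z, and yaw arrays

// evaluate x(t) with Horner's method, t in [0, duration_s]
static double eval_x(const segment_t* s, double t)
{
    double x = 0.0;
    for(int i = s->n_coef - 1; i >= 0; i--) x = x * t + s->cx[i];
    return x;
}

int main(void)
{
    // straight line from x=0 to x=2 over 4 seconds: x(t) = 0.5*t
    segment_t s = {.n_coef = 2, .duration_s = 4.0, .cx = {0.0, 0.5}};
    for(double t = 0.0; t <= s.duration_s; t += 1.0)
        printf("t=%.1f  x=%.2f\n", t, eval_x(&s, t));
    return 0;
}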

Running

Power on your VOXL-equipped drone and ensure that all the required services, including voxl-vision-hub itself, are running. Using a radio transmitter in manual mode, bring your drone to a hover in a safe, open space. When you are ready, switch your drone into offboard mode with your radio transmitter, and the autonomous behavior will begin.

Note: voxl-vision-hub does not yet support autonomous takeoff and landing.