Programmatic Drone Control

The F-11 brings the power of Nvidia GPUs to the sky, enabling a full machine learning stack in flight. The onboard Jetson is capable of lightning-fast computation on large workloads, and that power can be used for automated drone flight, payload control, and anything in between.

AI/ML

Most AI/ML stacks can be loaded onto the drone, including CUDA, Python, cuDNN, TensorFlow, PyTorch, Keras, and OpenCV, along with supporting workflow tools such as Docker and Grafana. Pretrained models can be deployed easily, and training models directly on the drone is also possible, an approach we are excited to see employed.
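
As a quick sanity check once CUDA and a Jetson-compatible PyTorch build are installed, a short script like the minimal sketch below confirms that the GPU is visible to the framework:

```python
# Minimal sanity check: confirm PyTorch can see the Jetson's GPU.
import torch

print(torch.__version__)                   # installed PyTorch version
print(torch.cuda.is_available())           # True if the CUDA stack is usable
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))   # name of the onboard GPU
```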

Most Nvidia frameworks can come preloaded onto the drone by request; contact Jason@flybydev.com for more information.

To download and install CUDA, refer to this link from Nvidia.

Ingesting Live Video

Since every camera feed should publish to the RTSP server on the Jetson, getting video into a computer vision pipeline is straightforward. For example, OpenCV can read directly from an RTSP stream using VideoCapture.
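
The snippet below is a minimal sketch of that pattern; the stream URL is a placeholder, so substitute the address your camera actually publishes to:

```python
# Minimal sketch: pull frames from the Jetson's RTSP server with OpenCV.
import cv2

RTSP_URL = "rtsp://127.0.0.1:8554/stream"  # placeholder address

cap = cv2.VideoCapture(RTSP_URL)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # Hand `frame` to any downstream computer vision pipeline here.
    print(frame.shape)
cap.release()
```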

The YOLOv7 segmentation and object detection model can interface with the RTSP server and provide real-time object tracking and detection.
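
A possible detection loop is sketched below, assuming YOLOv7 is loaded through the WongKinYiu/yolov7 repository's torch.hub entry point with a local yolov7.pt checkpoint; the loading call, stream URL, and checkpoint path are assumptions about your deployment, so adapt them to however the model is packaged on your drone:

```python
# Hedged sketch: run YOLOv7 detection on frames pulled from the RTSP server.
# The stream URL, checkpoint path, and torch.hub entry point are assumptions.
import cv2
import torch

RTSP_URL = "rtsp://127.0.0.1:8554/stream"  # placeholder address

# Assumes the YOLOv7 repo exposes a 'custom' torch.hub entry point.
model = torch.hub.load("WongKinYiu/yolov7", "custom", "yolov7.pt")

cap = cv2.VideoCapture(RTSP_URL)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = model(frame[:, :, ::-1])  # OpenCV delivers BGR; the model expects RGB
    boxes = results.xyxy[0]             # one row per detection: x1, y1, x2, y2, conf, class
    print(len(boxes), "objects detected")
cap.release()
```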

MAVLink

MAVLink is a messaging protocol developed for communication across the drone flight stack, including the flight computer, the drone controller, onboard computers, and payloads. MAVLink uses dialects to differentiate between systems; every dialect is built on top of the common dialect, and some systems also implement custom messages. Popular systems have their own readily available dialects, such as the ardupilotmega dialect, which implements messages specific to ArduPilot.
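
For instance, in pymavlink (introduced below) each dialect is generated as a Python module, and the ardupilotmega module contains every common message alongside ArduPilot-specific ones. The class names in this small sketch follow pymavlink's generated naming convention:

```python
# Small sketch: dialects are generated Python modules in pymavlink.
from pymavlink.dialects.v20 import ardupilotmega, common

print(hasattr(common, "MAVLink_heartbeat_message"))         # defined in the common dialect
print(hasattr(ardupilotmega, "MAVLink_heartbeat_message"))  # inherited by ardupilotmega
print(hasattr(ardupilotmega, "MAVLink_meminfo_message"))    # ArduPilot-specific message
```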

MAVLink messages are sent over different channels, including UDP connections and serial UART links. At a lower level, each MAVLink message is defined in terms of its fields. For example, the HEARTBEAT message (defined here) carries the system type along with other identifying information.
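
As a concrete illustration, the pymavlink sketch below waits for a HEARTBEAT and prints some of its fields; the connection string is a placeholder for whatever endpoint your system exposes:

```python
# Minimal sketch: inspect the fields of an incoming HEARTBEAT message.
from pymavlink import mavutil

conn = mavutil.mavlink_connection("udpin:0.0.0.0:14550")  # placeholder endpoint
msg = conn.recv_match(type="HEARTBEAT", blocking=True)

# Every field defined for HEARTBEAT is exposed as an attribute of the message.
print(msg.type, msg.autopilot, msg.base_mode, msg.system_status)
print(msg.to_dict())  # all fields as a dictionary
```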

MAVLink systems rely on a heartbeat mechanism to know which components are still connected. A component that wants to be included in the system has to send a HEARTBEAT message, typically at 1 Hz; if it fails to do so, it may be considered offline.
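
The sketch below shows one way an onboard component might announce itself at 1 Hz with pymavlink; the connection string and component IDs are assumptions:

```python
# Minimal sketch: keep this component visible to the system by sending a
# HEARTBEAT once per second.
import time
from pymavlink import mavutil

conn = mavutil.mavlink_connection("udpout:127.0.0.1:14550",   # placeholder endpoint
                                  source_system=1,
                                  source_component=191)       # placeholder component ID

while True:
    conn.mav.heartbeat_send(
        mavutil.mavlink.MAV_TYPE_ONBOARD_CONTROLLER,  # what kind of component this is
        mavutil.mavlink.MAV_AUTOPILOT_INVALID,        # not an autopilot
        0, 0,                                         # base_mode, custom_mode
        mavutil.mavlink.MAV_STATE_ACTIVE,
    )
    time.sleep(1)  # 1 Hz keeps the component from being considered offline
```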

We use two main frameworks for communicating with MAVLink messages: MAVSDK and pymavlink. Both facilitate higher-level operations while still allowing lower-level control.

MAVSDK

MAVSDK is written in C++, though it exposes its full functionality to many other languages through a gRPC server. The core library implements basic MAVLink messaging and communication, while the plugins library implements common higher-level functionality such as gimbal control and mission planning. It is important to note that MAVSDK is primarily built with PX4 in mind, while our F-11 drones run ArduPilot; some differences may arise, such as less common messages behaving slightly differently or not being implemented at all.

For programmatic control of the drone movement using MAVSDK, refer here.
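
As a starting point, the sketch below uses MAVSDK-Python to connect, arm, take off, and land; the connection URL is an assumption and depends on how MAVLink is routed on your setup:

```python
# Hedged sketch: basic flight control with MAVSDK-Python.
import asyncio
from mavsdk import System


async def run():
    drone = System()
    await drone.connect(system_address="udp://:14540")  # placeholder endpoint

    # Wait until a vehicle is discovered on the connection.
    async for state in drone.core.connection_state():
        if state.is_connected:
            break

    await drone.action.arm()
    await drone.action.takeoff()
    await asyncio.sleep(10)       # hover briefly
    await drone.action.land()


asyncio.run(run())
```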

pymavlink

In contrast, pymavlink provides a low-level interface for MAVLink message processing. It is quick to get started with and is very useful for prototyping applications and for sending and receiving individual MAVLink messages. Compared to MAVSDK, however, pymavlink does not expose the higher-level operations found in MAVSDK's plugins library.

To use pymavlink to programmatically move the drone, refer here.
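
For orientation, here is a minimal sketch that arms the vehicle and commands a takeoff using raw COMMAND_LONG messages; the connection string and target altitude are assumptions, and mode handling (e.g. switching to GUIDED) will depend on your ArduPilot configuration:

```python
# Hedged sketch: arm and take off by sending raw MAVLink commands with pymavlink.
from pymavlink import mavutil

conn = mavutil.mavlink_connection("udpin:0.0.0.0:14550")  # placeholder endpoint
conn.wait_heartbeat()  # learn the autopilot's system and component IDs

# Arm the vehicle (param1 = 1 means arm).
conn.mav.command_long_send(
    conn.target_system, conn.target_component,
    mavutil.mavlink.MAV_CMD_COMPONENT_ARM_DISARM, 0,
    1, 0, 0, 0, 0, 0, 0)

# Command a takeoff; param7 is the target altitude in metres (placeholder value).
conn.mav.command_long_send(
    conn.target_system, conn.target_component,
    mavutil.mavlink.MAV_CMD_NAV_TAKEOFF, 0,
    0, 0, 0, 0, 0, 0, 10)
```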
