Control and analysis of quadcopter flight when setting a complex trajectory of motion

The article presents a solution to the problem of autonomous piloting of a quadcopter on a complex flight task, with the goal of determining the positioning error. To solve this problem, a mathematical model is proposed for the equations of angular-motion dynamics in the body-fixed coordinate system. The stages of elaborating and creating a flight task are considered, and a model based on directed graphs is presented that reflects various options for organizing the movement of a quadcopter. Clover quadcopters were taken as the object of research. The study was carried out on the airfield of the laboratory of unmanned aerial systems of the Saint Petersburg State University of Aerospace Instrumentation (SUAI). Fragments of the program code implementing autonomous piloting in Python are given. The flight mission consisted of forming the letters SUAI in space. The article discusses the necessary hardware, instrumental systems, and camera operating modes for recording the movement of the quadcopter. As a result of the research, long-exposure photographs of the quadcopter's movement were taken, and the permissible range of positioning errors for performing the flight task was determined.


Introduction
Today, there is a significant increase in the practical application of quadcopters in industry. Over the past five years, the aviation industry has developed a new ecosystem, Aeronet, which includes manufacturers and operators of unmanned aerial systems, software developers, integrators, service providers, industry associations, and specialized research laboratories. The major segments of the Aeronet application market [1,2] are: agriculture, logistics and transport processes, remote sensing of land, construction of digital models based on data collected by drones, and monitoring tasks. The range of tasks and new solutions is constantly expanding. New practical examples include inventory tasks, analysis and monitoring of the urban environment, and monitoring of passenger flows. Unmanned aerial systems can be applied to passenger-traffic research [3] or to monitoring waters in port areas, as they allow obtaining dynamic data on the state of complex technical systems. To perform such tasks, unmanned aerial vehicles must be used in autonomous mode.
Research in the field of autonomous micro aerial vehicles (MAVs) is currently divided into several approaches. One part of the community focuses on improving the accuracy of quadcopter control, which has led to notable advances such as aggressive flight maneuvers [4,5], ping pong [6], and even collaborative construction tasks [7]. While these developments are certainly impressive, such systems nonetheless require external motion-capture systems to perform these demanding tasks properly. Another part of the community focuses on outdoor flights where GPS-based pose estimation is possible. Although complete solutions in this field are already available [8], researchers continue to work on improvements in stabilization [9], photogrammetry, and obstacle detection and avoidance.
The positioning method using Aruco-markers was chosen for the laboratory tests. Having studied the works [10] and [11], we can conclude that the use of such a positioning system is permissible. As noted by the authors of [10] and [11], Aruco-markers make it possible to achieve sufficiently high accuracy of quadcopter flight and relatively high landing accuracy.
In this experiment, a dedicated camera is used to determine the positioning error; its images show how accurately the quadcopter moves along the specified coordinates.
The base for the research was the specialized laboratory of unmanned aviation systems of the Engineering School of Saint Petersburg State University of Aerospace Instrumentation (SUAI). The study was based on "Clover" quadcopters from the industrial company Copter Express Technologies LLC [12]. The list of quadcopters of this company is shown in table 1. The feature of the Clover quadcopter (figure 1) is that it is a training open-source kit for a programmable quadcopter, which can be used for solving various problems: cargo transfer, FPV piloting, and autonomous piloting. In this case, identification [12] of the quadcopter's movement is based on identification in the flight field by Aruco-markers [13,14]. A well-tested solution is to set the trajectory by reference points and then represent the trajectory in parametric form.
In this work, the quadcopter should draw the acronym "ГУАП" (SUAI), while the operator does not directly control the device; the flight takes place autonomously. In order to see the drawn letters, an LED strip is used, as well as a Nikon 3320 camera with a long-exposure mode.

Experimental part
The position of the copter in space is characterized by the coordinates x, y, z of the drone's center of mass in a stationary Cartesian coordinate system and by three rotation angles about the axes of the body-fixed coordinate system [15][16][17][18]. In the initial stages of development, the vehicle can be considered a rigid body, and the wind is considered only as an external disturbance. When performing a flight mission, the hovering of the quadcopter at a certain point (above the Aruco-marker) must be taken into account. Figure 2 shows the relative position of the body-fixed (xk, yk, zk) and the normal, marker-related (xm, ym, zm) coordinate systems during hovering, as well as the forces and moments acting on the quadcopter.
The total thrust created by the propellers can be written as

L = b Σ_{i=1..4} ω_i²,  b = (1/2) C_D ρ A_i R_i²,

where L is the total thrust, b is the thrust (traction) coefficient, ρ is the air density, C_D is the lift coefficient, A_i is the propeller disc area, R_i is the propeller radius, and ω_i is the angular rotation speed of the i-th propeller.
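As a quick numerical sanity check of the thrust relation L = b Σ ω_i² with b = (1/2) C_D ρ A R², the total thrust can be evaluated for sample propeller parameters. All numeric values below are illustrative assumptions, not measurements of the Clover:

```python
import math

def thrust_coefficient(c_d, rho, area, radius):
    """Thrust coefficient b = 0.5 * C_D * rho * A * R^2 for one propeller."""
    return 0.5 * c_d * rho * area * radius ** 2

def total_thrust(b, omegas):
    """Total thrust L = b * sum(omega_i^2) over all propellers (N)."""
    return b * sum(w ** 2 for w in omegas)

# Illustrative values: 5-inch propeller, sea-level air density.
R = 0.0635                      # propeller radius, m (assumed)
A = math.pi * R ** 2            # propeller disc area, m^2
b = thrust_coefficient(c_d=0.05, rho=1.225, area=A, radius=R)
L = total_thrust(b, omegas=[900.0] * 4)  # rad/s for each of 4 propellers
print(f"b = {b:.3e}, L = {L:.2f} N")
```

Raising any ω_i increases L quadratically, which is why small changes in rotor speed suffice for attitude control.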
The thrust force in the marker coordinate system is obtained by rotating the body-frame thrust vector [0, 0, L]^T with the rotation matrix whose elements are products of the cosines and sines of the roll, pitch, and yaw angles. To build a complete system of equations, it is necessary to also include the force of gravity and the force of air resistance (4).
F = [0, 0, −mg]^T − K_d [v_x, v_y, v_z]^T. (4)

Taking into account the symmetry of the quadcopter and considering that the center of mass is located at the origin of the body-fixed system, the equations of angular-motion dynamics in the body-fixed coordinate system can be written as

J_x ω̇_x = (J_y − J_z) ω_y ω_z + M_x,
J_y ω̇_y = (J_z − J_x) ω_z ω_x + M_y,
J_z ω̇_z = (J_x − J_y) ω_x ω_y + M_z,

where J_x, J_y, J_z are the moments of inertia about the body axes and M_x, M_y, M_z are the control moments. Based on the presented equations, the equation of the quadcopter dynamics is defined. Figure 3 shows the Aruco-marker field in the Unmanned Aerial Systems Laboratory of SUAI.

Methods and materials
The letters "ГУАП" were chosen as the model of the flight mission; this is the acronym of the Federal Autonomous Education Institution "St. Petersburg State University of Aerospace Instrumentation" (SUAI). The quadcopter must fly over the markers and draw a letter in space. The copter performs a one-letter mission per flight. A model of the quadcopter's movement must be presented. Different drawings of the letters depend on the given transitions and changes of height. Let's present these variants using graph theory (figure 4), taking the letter "Г" as an example. Figure 4. Model of moving a quadcopter to build a letter in space.
In the first variant, takeoff and landing were made at different points of the airfield. At motion-change points, the copter hovers briefly and then continues moving. In the second variant, a full round-trip flight is implemented, in which the quadcopter moves along the route both in the forward and in the reverse direction. In this case, a fully closed ring route is built through a system of defined points.
By analogy with transport-logistics tasks, this model corresponds to the model of pendulum routes [19] implementing one-way and two-way traffic with cargo. Let's take a look at the practical implementation of the flight missions.
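The two route variants can be sketched as a small directed graph over named waypoints. The waypoint names and coordinates below are illustrative assumptions for the letter "Г", not values from the experiment:

```python
# Waypoints of the letter "Г" as (x, y) positions over the marker field
# (coordinates are illustrative, in meters).
WAYPOINTS = {
    "A": (0.0, 0.0),   # bottom of the vertical stroke
    "B": (0.0, 2.0),   # top-left corner
    "C": (1.0, 2.0),   # end of the horizontal stroke
}

def one_way_route(nodes):
    """Variant 1: fly the nodes in order and land at the final point."""
    return list(nodes)

def ring_route(nodes):
    """Variant 2: fly the route forward, then back, landing at the start."""
    return list(nodes) + list(reversed(nodes[:-1]))

print(one_way_route(["A", "B", "C"]))  # ['A', 'B', 'C']
print(ring_route(["A", "B", "C"]))     # ['A', 'B', 'C', 'B', 'A']
```

The ring variant corresponds to the pendulum route: every edge of the graph is traversed once in each direction.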
To organize an autonomous flight of the quadcopter, a flight controller based on PX4 is needed. This can be a Pixhawk, Pixracer, or COEX Pix with special software for the flight controller, as well as a single-board Raspberry Pi computer with an RPi Camera (H) Fisheye Lens and a VL53L1X laser rangefinder.
The Clover platform allows a Raspberry Pi computer to be used for programming autonomous flights. The flight program is typically written using the Python programming language. The program may receive telemetry data (which includes battery data, attitude, position, and other parameters) and send commands like: fly to a point in space, set attitude, set angular rates, and others.
The platform utilizes the ROS framework, which allows the user program to communicate with the Clover services running as a system daemon. The MAVROS package is used to interact with the flight controller.
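A minimal sketch of this interaction, assuming the Clover ROS API (the get_telemetry service name follows the platform documentation; the node name is arbitrary), might look like this:

```python
# Minimal telemetry query through the Clover ROS services.
# Requires a running Clover system, so only the pure helper runs standalone.

def format_telemetry(battery, x, y, z):
    """Pure helper: human-readable summary of selected telemetry fields."""
    return f"battery={battery:.2f} V, position=({x:.2f}, {y:.2f}, {z:.2f}) m"

def main():
    import rospy
    from clover import srv

    rospy.init_node("telemetry_example")
    get_telemetry = rospy.ServiceProxy("get_telemetry", srv.GetTelemetry)
    t = get_telemetry()  # telemetry in the default frame
    print(format_telemetry(t.voltage, t.x, t.y, t.z))

if __name__ == "__main__":
    main()
```

The same ServiceProxy pattern is used for all other Clover commands (navigate, land, and so on).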
A drone has to use a positioning system to be able to hover still or to fly from point to point. The system computes the drone's position and feeds this data into the flight controller. Clover allows using multiple positioning systems, such as optical flow (requires a camera and a rangefinder), fiducial markers (requires a camera and markers), GPS, and others. Our experiment uses a positioning system based on Aruco-markers. An Aruco-marker is a synthetic square marker composed of a wide black border and an inner binary matrix which determines its identifier (id). Fiducial markers allow the drone to compute its position relative to these markers. This data may then be transferred to the flight controller.
Given an image in which some Aruco-markers are visible, the detection process returns a list of detected markers. Each detected marker includes the positions of its four corners in the image (in their original order) and the id of the marker. The marker-detection process comprises two main steps:
- Detection of marker candidates. The image is analyzed in order to find square shapes that are candidates to be markers. It begins with adaptive thresholding to segment the markers; then contours are extracted from the thresholded image, and those that are not convex or do not approximate a square shape are discarded. Some extra filtering is also applied (removing contours that are too small or too big, removing contours too close to each other, etc.).
- After candidate detection, it is necessary to determine whether the candidates are actually markers by analyzing their inner codification. This step starts by extracting the marker bits of each candidate.
To do so, a perspective transformation is first applied to obtain the marker in its canonical form. Then the canonical image is thresholded using Otsu's method to separate white and black bits. The image is divided into cells according to the marker size and border size, and the number of black or white pixels in each cell is counted to determine whether it represents a white or a black bit. Finally, the bits are analyzed to determine whether the marker belongs to the specific dictionary, and error-correction techniques are employed when necessary. To accurately determine the position of the drone, the camera must be calibrated. Camera calibration can significantly improve the quality of nodes related to computer vision: Aruco-marker detection and optical flow.
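The cell-counting step of the bit extraction can be sketched in pure numpy. The 4x4 payload, one-cell border, and cell size below are assumptions for illustration, not the dictionary actually used by the Clover:

```python
import numpy as np

def read_marker_bits(canonical, marker_size=4, border=1):
    """Read the inner bit matrix of a thresholded canonical marker image.

    `canonical` is a binary (0/255) square image; each cell is classified
    as 1 (white) or 0 (black) by majority vote over its pixels.
    """
    cells = marker_size + 2 * border           # total cells incl. border
    cell_px = canonical.shape[0] // cells      # pixels per cell
    bits = np.zeros((marker_size, marker_size), dtype=int)
    for i in range(marker_size):
        for j in range(marker_size):
            r0 = (i + border) * cell_px
            c0 = (j + border) * cell_px
            cell = canonical[r0:r0 + cell_px, c0:c0 + cell_px]
            bits[i, j] = int(cell.mean() > 127)  # majority white -> 1
    return bits

# Synthetic 6x6-cell marker (4x4 payload plus black border), 10 px per cell.
pattern = np.zeros((6, 6), dtype=np.uint8)
pattern[1:5, 1:5] = np.array([[1, 0, 1, 0],
                              [0, 1, 0, 1],
                              [1, 1, 0, 0],
                              [0, 0, 1, 1]])
image = np.kron(pattern, np.ones((10, 10), dtype=np.uint8)) * 255
print(read_marker_bits(image))
```

The recovered bit matrix is then matched against the dictionary; in practice this whole pipeline is provided by the OpenCV aruco module.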
The camera-calibration process defines the parameters of the specific lens installed. These parameters include the focal lengths, the principal point (which depends on the placement of the lens relative to the sensor centre), and the distortion coefficients D. There are several tools for calibrating the camera and storing the calculated parameters in the system. They usually use calibration images: "chessboards" or combinations of "chessboards" and Aruco-marker grids (ChArUco). The Clover 4 drone software already contains average calibration values for the Raspberry Pi Camera.
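The role of the focal lengths and principal point can be illustrated with a minimal pinhole projection; the numbers below are illustrative, not the Clover camera's actual calibration, and distortion is omitted for brevity:

```python
def project(point_3d, fx, fy, cx, cy):
    """Project a 3D point in the camera frame to pixel coordinates
    using the pinhole model: u = fx*x/z + cx, v = fy*y/z + cy."""
    x, y, z = point_3d
    u = fx * x / z + cx
    v = fy * y / z + cy
    return u, v

# A marker corner 1 m below the camera, offset 0.2 m to the side
# (camera parameters are assumed values).
u, v = project((0.2, 0.0, 1.0), fx=250.0, fy=250.0, cx=160.0, cy=120.0)
print(u, v)  # 210.0 120.0
```

Wrong focal lengths or principal point shift every projected marker corner, which is exactly why calibration errors translate into positioning errors.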

Results and discussion
To realize autonomous flight, the quadcopter must perform the following tasks:
- run autonomously and take off to a given altitude;
- show a specific color on the addressable LED strip;
- follow a given route;
- make an autonomous landing and disarm the motors.
The experiment requires: a single-board Raspberry Pi computer, a Nikon camera capable of long exposures, an LED strip, and Aruco-markers. The single-board computer is attached to the bottom deck of the quadcopter (figure 5) and connected to the flight controller with a micro-USB to USB cable. The Raspberry Pi is required to run the Python autonomous-piloting program and to navigate the drone using machine vision without GPS. A series of experiments with the Nikon 3320 camera showed that an exposure of up to 30 seconds is enough to capture the trajectory line of a letter.
The quadcopter uses the downward-facing camera to recognize markers and match the marker ID against the map file of all markers in the flight zone.
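Setting a color on the LED strip can be sketched via the Clover led/set_effect service; the per-letter color mapping below is an assumption made for illustration:

```python
# Color used while drawing each letter (an illustrative mapping).
LETTER_COLORS = {
    "Г": (255, 0, 0),
    "У": (0, 255, 0),
    "А": (0, 0, 255),
    "П": (255, 255, 0),
}

def color_for(letter):
    """Pure helper: RGB color for a letter, white by default."""
    return LETTER_COLORS.get(letter, (255, 255, 255))

def show_letter_color(letter):
    import rospy
    from clover import srv

    rospy.init_node("led_example")
    # led/set_effect is the Clover LED-strip control service
    set_effect = rospy.ServiceProxy("led/set_effect", srv.SetLEDEffect)
    r, g, b = color_for(letter)
    set_effect(effect="fill", r=r, g=g, b=b)

if __name__ == "__main__":
    show_letter_color("Г")
```

On the long-exposure photograph, each letter then appears as a continuous colored line traced by the strip.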
The code fragment below (figure 6) takes the drone off autonomously, hovers over the Aruco-marker at an altitude of 1 meter, and then flies to the marker with id 83. The Aruco map is pre-recorded in the memory of the quadcopter.
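Since the figure 6 fragment is not reproduced here, a minimal sketch of such a program, assuming the Clover ROS API (marker id 83 as in the text; the delays and altitude are assumptions), might look like this:

```python
# Autonomous mission sketch: take off 1 m, then fly to marker 83, then land.

def mission_waypoints():
    """Pure description of the mission as (frame_id, x, y, z) tuples."""
    return [
        ("body", 0.0, 0.0, 1.0),      # take off 1 m straight up
        ("aruco_83", 0.0, 0.0, 1.0),  # hover 1 m above marker 83
    ]

def run():
    import rospy
    from clover import srv
    from std_srvs.srv import Trigger

    rospy.init_node("flight")
    navigate = rospy.ServiceProxy("navigate", srv.Navigate)
    land = rospy.ServiceProxy("land", Trigger)

    first, *rest = mission_waypoints()
    # auto_arm=True switches the copter to OFFBOARD mode and arms it
    navigate(x=first[1], y=first[2], z=first[3],
             frame_id=first[0], auto_arm=True)
    rospy.sleep(5)
    for frame_id, x, y, z in rest:
        navigate(x=x, y=y, z=z, frame_id=frame_id)
        rospy.sleep(5)
    land()

if __name__ == "__main__":
    run()
```

Fixed sleeps are the simplest way to wait out a maneuver; a tolerance-based wait on telemetry is more robust.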
To indicate the coordinates of the point to which the drone should move, the navigate service is used.
Options:
- x, y, z: coordinates (m);
- yaw: yaw angle (rad);
- yaw_rate: angular yaw rate (used when yaw is set to NaN) (rad/s);
- speed: flight speed (setpoint speed) (m/s);
- auto_arm: switch the copter to OFFBOARD mode and arm automatically (the copter will take off);
- frame_id: the coordinate system in which x, y, z, and yaw are set.
The frame_id coordinate system is used to navigate the quadcopter during takeoff, landing, and the flight mission. For takeoff, the body coordinate system is used: coordinates relative to the quadcopter without taking pitch and roll into account; the aruco_N system gives coordinates relative to the marker with id N. Graphs of this kind, presented in figure 8, were built for each letter.
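A common pattern with the navigate service is to block until the target point is reached. The sketch below follows the approach suggested in the Clover documentation; the tolerance and speed values are assumptions:

```python
import math

def reached(x, y, z, tolerance=0.2):
    """True when the remaining offset to the target is within tolerance (m)."""
    return math.sqrt(x * x + y * y + z * z) < tolerance

def navigate_wait(x, y, z, frame_id="aruco_map", speed=0.5, tolerance=0.2):
    import rospy
    from clover import srv

    get_telemetry = rospy.ServiceProxy("get_telemetry", srv.GetTelemetry)
    navigate = rospy.ServiceProxy("navigate", srv.Navigate)

    navigate(x=x, y=y, z=z, speed=speed, frame_id=frame_id)
    while not rospy.is_shutdown():
        # telemetry in the navigate_target frame gives the remaining offset
        t = get_telemetry(frame_id="navigate_target")
        if reached(t.x, t.y, t.z, tolerance):
            break
        rospy.sleep(0.2)
```

The 0.2 m tolerance is consistent with the positioning-error range reported in the conclusion of this work.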

Conclusion
To test the proposed approach, experiments were carried out in which the quadcopter moved from its initial position onto the trajectory. The movement of the quadcopter was carried out along a piecewise-linear trajectory. The quadcopter's positioning error relative to the desired trajectory, measured during the autonomous flights for all four letters in the laboratory of unmanned aerial systems, was 0.1 to 0.2 m. From the results obtained, it can be concluded that the positioning of the quadcopter in space is sufficiently accurate. The positioning error is due to the average calibration values of the Raspberry Pi camera, as well as to the data from the flight-controller sensors. To improve positioning accuracy, it is worth checking the accuracy of placing the Aruco-markers on the surface, rechecking the camera and flight-controller calibration, and tuning the PID controller values for this quadcopter.