Design and implementation of firefighting reconnaissance robot control system based on master-slave control

To address the casualties and property damage caused by fires, this paper investigates a deformable wheeled firefighting reconnaissance robot with fire-source detection, obstacle avoidance, and map-transmission functions. The robot's control system is described in detail, with the main motion control system built on an STM32 microcontroller. The robot's two motion modes (wheeled and legged), its obstacle avoidance ability, and its reconnaissance capability were tested; the results meet the design requirements.


Introduction
Fires pose a significant threat to life: smoke and high temperatures hinder rapid evacuation, and the conditions inside a burning structure remain poorly understood, challenging rescuers both inside and outside the fire scene [1]. High temperatures and low visibility make evacuation routes difficult to find. While firefighters' personal protective equipment can withstand temperatures below 300°C, the actual fire often exceeds 500°C, endangering firefighter safety [2]. Thick smoke further complicates rescue efforts. Traditional firefighting robots, which use tracked or wheeled locomotion, have limited terrain adaptability, so there is a need for robots with improved mobility.
In recent years, Monica P. Suresh et al. designed and developed a firefighting robot system controlled by an Arduino Uno; the robot consists of hardware, electronic interface circuits, and software, is powered by four batteries, and is equipped with a flame sensor for fire detection [3]. Chen [4] designed a firefighting robot built around an STM32F103ZET6; the system uses a one-to-many communication mode for real-time monitoring of potential fire points. Chien et al. [5] developed a multisensor-based intelligent security system whose firefighting robot achieves autonomous obstacle avoidance with low-cost infrared and ultrasonic detection modules. Ikbar et al. [6] designed a firefighting robot using an Arduino MEGA 2560 as the microcontroller, with a KY-026 flame sensor mounted at the front for more stable movement during detection; experimental results demonstrated the correctness of its fuzzy-logic control. In this paper, a deformable wheeled firefighting reconnaissance robot is designed. The robot uses electromagnetic clutches to let the drive motors also actuate wheel deformation, providing terrain adaptability and better obstacle traversal.

Mechanical structure of firefighting reconnaissance robot
The firefighting reconnaissance robot uses four electromagnetic clutches for wheel deformation, as shown in Figure 1. During deformation, a clutch disengages, allowing the transmission shaft to drive the central gear; this rotates the sleeve relative to the external planetary wheel-legs, opening or closing them. In regular movement, the clutch engages so that the sleeve rotates synchronously with the drive shaft; the central gear and outer planetary wheel-legs then remain fixed relative to the wheel, which rolls in standard motion.

Control system architecture
The control system of the firefighting reconnaissance robot [7] consists of a host computer system and the robot system, as shown in Figure 2. The Intel NUC-based host computer processes data from the infrared sensor, LiDAR, and camera. The operator commands the robot through an interface or joystick [8] for tasks such as turning and obstacle traversal. Sensor data from the robot can be displayed on the interface or a mobile terminal for comprehensive monitoring [9]. An STM32 microcontroller at the core of the robot system serves as the main controller and outputs the control signals. Electromagnetic clutches work together with the servo-controlled brushless motors to drive the wheels. The robot's shell incorporates smoke, temperature, and humidity sensors, and an IMU on the body provides attitude information for motion commands.

Hardware design
The main controller is an STM32F407IGHx microcontroller, which integrates digital signal processing (DSP) and floating-point unit (FPU) instructions to greatly improve computation speed and algorithm execution efficiency [10]. As shown in Figure 3, the CAN1 channel of the STM32 outputs CAN frames to the motor controller, which converts them into corresponding current levels to control the motors and adjust speed and direction. Four channels of an STM32 timer output pulse-width modulation (PWM) signals to control the electromagnetic clutches. When the duty cycle of the PWM signal driving the MOS switch is 0%, the electromagnetic clutch disengages, so motor rotation deforms the wheel; when the duty cycle is 100%, the clutch engages, keeping the deformable wheel at a fixed size. The smoke sensors transmit data over an inter-integrated circuit (I2C) interface; to read all four smoke sensors, the STM32's GPIO ports emulate I2C in software. The LiDAR and HD camera connect to the host computer, which communicates with the microcontroller over a serial link at a baud rate of 921,600.

Autonomous navigation algorithm design. Upon powering on, the robot initializes external devices, creates tasks, assigns priorities, starts the task scheduler, and awaits commands from the host computer. Communication with the host computer involves both sending and receiving: outgoing data is written to a transmission buffer and sent through the serial port, while incoming commands are parsed into control instructions and the corresponding tasks are executed. Sensor data is periodically acquired, processed, and calculated on STM32 timer events and stored in a buffer for environmental perception and for monitoring the robot's status. Autonomous navigation relies on multiple sensors and forms a motion control subroutine. The algorithm, depicted in Figure 4, filters and processes LiDAR data to extract environmental parameters such as obstacle distances; motion planning then determines the control outputs. The inertial measurement unit (IMU) provides the robot's attitude information through averaging filtering and calibration, closing the feedback control loop.
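The averaging filter applied to the IMU readings can be sketched as follows. This is a minimal illustration in Python rather than the robot's actual STM32 firmware, and the window size of eight samples is an assumption.

```python
from collections import deque

class MovingAverageFilter:
    """Moving-average filter for a single IMU channel (e.g., pitch angle)."""

    def __init__(self, window: int = 8):  # window size is an assumed value
        self.samples = deque(maxlen=window)

    def update(self, raw: float) -> float:
        """Push a raw reading into the window and return the current average."""
        self.samples.append(raw)
        return sum(self.samples) / len(self.samples)
```

Each new IMU sample displaces the oldest one in the fixed-length window, and the windowed mean is what the motion controller consumes, smoothing sensor noise before it enters the closed loop.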

Experiment
After introducing the robot body and control system, the next step was to build and test a prototype. Figure 6 shows the firefighting reconnaissance robot prototype. To validate the effectiveness and reliability of the control system, experiments were conducted on this prototype.

Wheeled motion experiment
In the wheeled motion experiment (Figure 7), joystick commands drive the wheels to produce linear motion of the robot's body. The robot moved 9.3 m over the 0-9 s interval (an average of about 1.03 m/s); its trajectory was nearly linear within the allowable error and met the basic requirements.

Legged motion experiment
In the remote-control state, Figure 8 shows that the robot's radius increases by up to 4.7 cm in legged mode, resulting in faster motion. Figure 9 shows the state of the robot's legged motion during the 0-8 s interval.

Obstacle avoidance experiment
To verify the obstacle avoidance capability of the firefighting reconnaissance robot, the robot was placed in a laboratory room with an obstacle positioned in the center; the test scenario is shown in Figure 11. At 0.0 s, the robot initiates the obstacle avoidance program when it is only 2 m from the obstacle. From 0.45 s to 2.15 s, the robot successfully avoids the obstacle. When the distance measured by the LiDAR again exceeds 2 m, the robot ends the avoidance maneuver and returns to its normal path.
Figure 11. Obstacle avoidance trajectory of the robot from 0 to 2.15 s.
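The 2 m trigger distance described above suggests a simple threshold-based mode switch. The sketch below is an illustrative reconstruction of that logic, not the robot's actual firmware; the mode names are hypothetical.

```python
AVOID_TRIGGER_M = 2.0  # LiDAR distance that starts/ends avoidance (from the experiment)

def update_mode(current_mode: str, lidar_distance_m: float) -> str:
    """Switch between path following and avoidance based on LiDAR range."""
    if current_mode == "follow_path" and lidar_distance_m <= AVOID_TRIGGER_M:
        return "avoid_obstacle"  # obstacle within 2 m: begin avoidance
    if current_mode == "avoid_obstacle" and lidar_distance_m > AVOID_TRIGGER_M:
        return "follow_path"     # clear of the obstacle: resume normal path
    return current_mode
```

A single threshold on both transitions matches the behavior reported in the experiment; a real implementation would likely add hysteresis so the mode does not chatter right at the 2 m boundary.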

Reconnaissance experiment
The results of the robot reconnaissance experiment are shown in Figure 12, and the comparison metrics in Figure 13. The average precision, accuracy, and recall of the improved Yolov5 model are 0.943, 0.95, and 0.92, respectively, all higher than those of the original Yolov5 model. Using the improved Yolov5 model thus markedly improves the accuracy of fire-source detection.
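For reference, the precision, recall, and accuracy figures quoted above are standard quantities computed from detection counts. The helper below is a generic sketch of those definitions, not the authors' evaluation code.

```python
def detection_metrics(tp: int, fp: int, fn: int, tn: int = 0):
    """Return (precision, recall, accuracy) from detection counts.

    tp/fp/fn/tn = true positives, false positives, false negatives, true negatives.
    """
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return precision, recall, accuracy
```

Precision penalizes false alarms, recall penalizes missed fire sources, and accuracy combines both; reporting all three, as the paper does, guards against a detector that trades one for the other.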

Conclusion
This paper introduces a deformable wheeled firefighting reconnaissance robot designed for inspecting fire scenes. Using an STM32 as the main controller and implementing both lower-level and upper-level systems, the robot communicates through the SBUS protocol. Integration of sensors, including an inertial measurement unit, cameras, and LiDAR, enables motion control and environmental reconnaissance. Experimental results validate the robot's motion capability, its ability to navigate complex environments, and its obstacle avoidance and reconnaissance capabilities, meeting the mission profile of a firefighting reconnaissance robot.

Figure 3. Connection between the controller and peripherals.

Fire source identification algorithm design
Fire sources are identified using a Yolov5-based convolutional neural network model. The backbone network of Yolov5 combines CSP and Darknet. The CSP structure splits the input feature map into two paths: one path passes through additional convolutional layers to extract high-level features, while the other path is fused back in to retain low-level features for fire detection. Figure 5 shows the network structure of Yolov5.
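A typical post-processing step in a Yolov5-style pipeline keeps only detections above a confidence threshold before results are reported. The snippet below is a generic sketch of that step; the 0.5 threshold and the detection tuple layout are assumptions for illustration, not taken from the paper.

```python
def filter_detections(detections, conf_threshold: float = 0.5):
    """Keep detections whose confidence meets the threshold.

    Each detection is assumed to be a tuple (x, y, w, h, confidence, class_id),
    mirroring the common (box, score, class) layout of YOLO-style outputs.
    """
    return [d for d in detections if d[4] >= conf_threshold]
```

Raising the threshold trades recall for precision, which is exactly the balance reflected in the precision/recall comparison of the two models.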

Figure 6. Prototype images of the robot.

Figure 7. Wheeled motion experiment in the laboratory corridor.

Figure 8. Controlling the robot to switch between wheeled and legged modes.

Figure 10. Body height of the robot above the ground (vertical axis) versus displacement of the robot's movement (horizontal axis).

Figure 13. Comparison of the accuracy of the two models.