A mechanical arm based on image recognition and remote control

Mechanical arms, image recognition and wireless transmission have been hot topics and technologies in recent years. This paper introduces an object-recognition mechanical arm based on mechanical arm and wireless transmission technology. The system is modular: the whole equipment is divided by function into a remote control system and a mechanical arm system. The remote control system is upper computer software that integrates image recognition, data processing and wireless transmission modules. The remote end includes a wireless transmission module, a steering gear control module and an image acquisition module. Images are collected and sent to the remote control system; after calculation, the motion trajectory of the mechanical arm is obtained, the control signals for the stepper motors on the arm are derived, and these signals are finally sent to the mechanical arm system to achieve remote control. At the same time, during the movement of the mechanical arm, relevant data are collected by the sensors on the arm and sent back to the remote control terminal for display and judgment. In this project, the mechanical arm system can automatically recognize a target on a two-dimensional plane and move the arm according to the object's position, while the upper computer displays the relevant data.


Introduction
As a significant part of industrial systems, the mechanical arm undertakes a large number of industrial production tasks. The mechanical arm combines a large range of movement with high precision, which makes it highly responsive. Manually controlling a mechanical arm places high demands on the skill of the operator [1]. Installing a camera on the mechanical arm to collect, process and recognize images allows the arm to be controlled to grasp objects accurately, which greatly reduces the control demands on the operator [2][3]. Therefore, as industrial development continues to deepen and become more refined, the automated mechanical arm is an important link in realizing the modern automated industrial system and has broad application prospects [3].
Meanwhile, with the continuous development of electronic technology, terminal products that support WiFi are more and more popular, especially in the field of industrial wireless control. People have an increasingly urgent need for a unified management and control system that can connect all household electrical products to a wireless network in some form [3]. At present, the main technologies for wireless networking are Wi-Fi, HomeRF and Bluetooth. However, HomeRF is not very open and the technology itself has drawbacks such as weak anti-interference performance, while Bluetooth offers limited bandwidth and short transmission distance, so neither can fully meet everyday application needs [4][5]. By contrast, WiFi, with its wide radio coverage, fast transmission rate, wide frequency band and low radiation, is the most efficient choice of the three [4]. The continuous development of WiFi communication technology has also produced a variety of chips with excellent performance [6].
In 2016, Ren C. Luo et al. [7] designed an intelligent high-degree-of-freedom mechanical arm based on dynamic obstacle avoidance and 3D object recognition, and described the working principle and evaluation criteria of the mechanical arm servo system in detail. The team also proposed a robotic arm architecture that mainly includes object recognition based on 3D models, image capture and human-computer interaction [8].
To sum up, building on previous studies, the system designed in this paper provides image acquisition and recognition, wireless data transmission and stepper motor control of the mechanical arm through the construction of hardware and software. The ESP8266 wireless communication module and the three-degree-of-freedom mechanical arm are connected through an Arduino development board, which reduces the development difficulty and shortens the design cycle, completing the system at lower cost. The signal received by the ESP8266 is processed by the Arduino; the motion of the mechanical arm is obtained by calculation and converted into control signals for the arm's stepper motors, finally achieving remote control of the mechanical arm [9]. Meanwhile, the camera on the mechanical arm captures pictures, the upper computer selects the target to be grasped at the remote terminal, and the image recognition system automatically judges the distance and angle between the object and the mechanical arm, realizing automatic control of the arm.

System design and implementation
According to the functional requirements analysis of the project, combined with the current development of relevant technologies, a remote control terminal system and a mechanical arm system were designed, and communication between the two systems was realized. The specifics are as follows.
Firstly, according to the design requirements, the remote control terminal should be able to receive the image information transmitted by the mechanical arm system. After reception, image processing is carried out, which mainly involves judging the distance and rotation angle from the image information, obtaining the coordinates of the object on the two-dimensional plane, and sending the coordinate information to the main control board of the mechanical arm. Therefore, the remote end requires the development of an upper computer program.
Secondly, the mechanical arm needs to carry out motion control under multiple constraints. The arm is fitted with an ultrasonic ranging sensor, a temperature sensor and a camera. The ultrasonic ranging sensor measures the distance between the mechanical arm and the target object to ensure that the arm does not hit the target or other items while moving; the temperature sensor monitors the working environment temperature to ensure that the arm and the whole system operate at a suitable temperature; and the camera collects image information. The data obtained by the sensors and camera must be transmitted back to the remote control terminal, so the design covers the main board, the stepper motor system of the mechanical arm and the sensors.

Hardware description
2.1.1. Arduino main control board. Arduino is an embedded development platform comprising both hardware and software. The hardware part is the main control board, which includes the processor, GPIO pins, power management system and other modules; the software is mainly the Arduino IDE development environment, in which programs are written in the C/C++-based Arduino language [10].
The Arduino main control board used in this system is the Arduino Uno R3, a design and development platform commonly used among Arduino models. A photograph of the board is shown in Figure 1.

2.1.2. ESP8266 WiFi module.
A photograph of the ESP8266 module is shown in Figure 2. The ESP8266 is a complete system in itself. As an independent WiFi network connection module, the ESP8266 can act as a processor in its own right, or as a slave assisting another host to achieve the corresponding functions. The ESP8266 chip can also be used as a wireless network adapter, connecting to a CPU through the SPI interface or through the AHB (Advanced High-performance Bus) bridge interface in the CPU. By using the ESP8266 to connect hardware terminals to networks, devices can join local area networks or even wide area networks [6].

The output characteristic of the LM35 temperature sensor (Section 2.1.3) is given by equation (1):

V_out(t) = 10 mV/°C × T     (1)

where T is the temperature in degrees Celsius. The LM35 has two power supply modes: single power supply, and positive and negative dual power supply. The latter can measure negative temperatures, but the advantage of the single power supply mode is its very low power consumption, drawing a current of only about 50 µA at 25 °C. Therefore, this system adopts the single power supply mode.
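As an illustrative sketch (not the paper's code), equation (1) can be inverted to convert a raw ADC reading of the LM35 output into a temperature. The 5 V reference and 10-bit resolution assumed here match the Arduino Uno's analog input, but are assumptions, not values stated in the paper.

```python
def lm35_temperature_c(adc_reading, vref_mv=5000.0, adc_max=1023):
    """Convert a raw ADC reading of the LM35 output to degrees Celsius.

    The LM35 outputs 10 mV per degree Celsius (equation (1)), so the
    measured voltage divided by 10 mV gives the temperature directly.
    """
    vout_mv = adc_reading / adc_max * vref_mv  # measured output voltage in mV
    return vout_mv / 10.0                      # 10 mV per degree Celsius

# An ADC reading of 51 corresponds to about 0.25 V, i.e. roughly 25 degrees C.
print(round(lm35_temperature_c(51), 1))
```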
2.1.4. HC-SR04 ultrasonic distance sensor. The core of the HC-SR04 ultrasonic distance sensor consists of a transmitter and a receiver. The transmitter converts an electrical signal into a 40 kHz ultrasonic pulse and emits it, while the receiver listens for the reflected pulse. Once the reflected pulse is received, an output pulse is generated whose width can be used to determine the distance the pulse has travelled [13][14].
Ultrasonic waves travel through the air at a speed of about 340 m/s. With the round-trip time t recorded by the sensor's timer, the distance s between the sensor and the object is obtained from equation (2):

s = 340 × t / 2     (2)

2.1.5. A4988 stepper motor drive module. The A4988 is a DMOS microstepping driver with a translator and overcurrent protection, with an output drive capability of up to 35 V. The A4988 includes a fixed off-time current regulator that operates in slow or mixed decay modes. The translator is the key to the easy implementation of the A4988: each pulse applied to the STEP input drives the motor one microstep. No phase sequence tables, high-frequency control lines or complex interface programming are required [15]. The pin diagram of the A4988 module is shown in Figure 5.
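Equation (2) can be sketched directly in code. This is an illustrative example, not the paper's firmware; it simply halves the round-trip distance implied by the echo time.

```python
SPEED_OF_SOUND_M_S = 340.0  # approximate speed of sound in air

def echo_time_to_distance_m(echo_time_s):
    """Convert the HC-SR04 round-trip echo time (seconds) to distance (metres).

    The echo pulse width measures the out-and-back travel time, so the
    one-way distance is s = 340 * t / 2, as in equation (2).
    """
    return SPEED_OF_SOUND_M_S * echo_time_s / 2.0

# A 2 ms round trip corresponds to 0.34 m between sensor and object.
print(echo_time_to_distance_m(0.002))
```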

Mechanical arm system design
The mechanical arm system consists of a power supply, the Arduino main control board, the ESP8266 wireless transmission module, the LM35 temperature sensor, the ultrasonic distance sensor and the mechanical arm itself. In this project, the arm is controlled by driving three stepper motors. The whole system adopts a modular design and realizes its functions through software and hardware. The Arduino main control board serves as the core, integrating the related modules for data acquisition, processing and signal transmission.
The camera collects pictures, and the Arduino transmits the picture information through the ESP8266; the main control board likewise receives signal information from the ESP8266 module. After data processing, control signals for the stepper motors are sent to the stepper motor driver chips, thereby controlling the movement of the mechanical arm. Another function of the main control board is to receive data from the ultrasonic ranging sensor and temperature sensor on the arm, process the data and transmit it to the ESP8266 module.
The image data from the camera are transmitted over the I2C bus protocol, using the SDA and SCL interfaces on the Arduino development board. The MPU9250 realizes data acquisition and transmission through its internal registers; when the master needs to read its data, it must first enable I2C transfer mode, which is done by calling the Wire library in Arduino.
The main controller communicates with the ESP8266 module through a serial port for data transmission. The system uses Arduino digital I/O ports as a soft serial port; the I/O pins realize serial communication with the outside world through the built-in UART of the ATmega328. First, the Arduino main control board defines the serial communication and sets the baud rate, which depends on the transmission rate configured on the ESP8266 chip. In this project, the baud rate is 9600 bit/s.
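The paper does not specify the exact message format exchanged over this 9600 bit/s link, so the following is a hypothetical framing sketch: it assumes a simple comma-separated, newline-terminated ASCII frame ("x,y\n") for the target coordinates, just to illustrate how the two ends could agree on a format.

```python
def encode_coordinates(x, y):
    """Pack target coordinates into a line-terminated ASCII frame.

    Hypothetical frame layout: "x,y\n" in plain ASCII, chosen only for
    illustration; the paper's actual protocol is not documented.
    """
    return f"{x},{y}\n".encode("ascii")

def decode_coordinates(frame):
    """Parse an "x,y\n" frame back into integer coordinates."""
    x_str, y_str = frame.decode("ascii").strip().split(",")
    return int(x_str), int(y_str)

# Round trip: what the upper computer sends is what the Arduino would parse.
frame = encode_coordinates(120, 85)
print(decode_coordinates(frame))
```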
The flow chart of the mechanical arm system is shown in Figure 6. Since the image information collected by the arm's camera is only two-dimensional, the control of the mechanical arm in this project is carried out only on a two-dimensional plane. The kinematic analysis of the two-dimensional planar mechanical arm centres on the arm's two controlled joints, whose rotation angles must be determined from the coordinates of the target object. These two angles, denoted θ1 and θ2, can be determined by geometric analysis, as shown in Figure 7. From the geometric analysis, equations (3) and (4) give the motion control data of the mechanical arm:

θ2 = arccos((x² + y² − l1² − l2²) / (2·l1·l2))     (3)
θ1 = arctan2(y, x) − arctan2(l2·sin θ2, l1 + l2·cos θ2)     (4)

where (x, y) are the coordinates of the target object and l1 and l2 are the lengths of the two links.
Therefore, after receiving the coordinate data, the Arduino control board processes it according to equations (3) and (4), converts the result into step counts for the stepper motors, and thereby controls the movement of the mechanical arm.
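The two-joint geometry described above can be sketched as the standard planar two-link inverse kinematics. This is an illustrative implementation under assumed link lengths l1 and l2, not the paper's firmware; the paper's exact symbols and sign conventions may differ.

```python
import math

def planar_ik(x, y, l1, l2):
    """Return joint angles (theta1, theta2) in radians for target (x, y).

    theta2 comes from the law of cosines (cf. equation (3)); theta1 is the
    direction to the target minus the offset introduced by the second link
    (cf. equation (4)).
    """
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    theta2 = math.acos(max(-1.0, min(1.0, c2)))  # clamp against rounding error
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2

# Forward-check: the computed angles should reproduce the target position.
t1, t2 = planar_ik(1.0, 1.0, 1.0, 1.0)
fx = math.cos(t1) + math.cos(t1 + t2)  # forward kinematics, x
fy = math.sin(t1) + math.sin(t1 + t2)  # forward kinematics, y
print(round(fx, 6), round(fy, 6))
```

A forward-kinematics check like the one above is a cheap way to validate the angles before sending step counts to the motors.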
A simulation diagram of the mechanical arm system is shown in Figure 8.

Software system design
This project uses LabVIEW to develop the upper computer software and MATLAB to process the images. The work carried out with these two tools is introduced below.

LabVIEW.
LabVIEW uses the graphical editing language G to write programs, and the generated program takes the form of a block diagram [16]. In this project, LabVIEW serves as the link between the server and the ESP8266: it receives the transmitted image information and saves it locally. After the image is processed by MATLAB, the coordinate information of the object is read and sent to the Arduino. During the movement of the mechanical arm, LabVIEW sends read instructions to the Arduino to read and display the values of the temperature and ultrasonic sensors. After receiving the image data, the upper computer software saves the image locally; MATLAB then reads and processes the data and saves the obtained coordinates locally, so that the upper computer software can read and send them.

The following introduces the principle of the MATLAB image processing. Its main purpose is to determine the position coordinates of the object. The most direct way to locate an object is to identify its features, and within a given area the most easily recognized features are of two kinds: shape and colour [17]. In this project, the colour of the target is used for judgment, because MATLAB has clear advantages in binarizing images and analysing the connected domains of binary images.
The image information collected by the camera in this project is generally differentiated by colour. Since each colour contains different amounts of the three primary colours, regions of different colours present different grey values during grey-scale processing, and regions of different brightness appear in the picture. A binary image is one in which each pixel has only two possible values or grey-scale states; it is usually rendered in black and white and is also called a monochrome or B&W image. A binary image has only two grey levels: every pixel is either 0 or 1, with no transitional grey values [18].
The grey levels of an image can be regarded as a set R. The condition for a unique path to exist in R is that the pixels along the path and their neighbours satisfy a certain adjacency relation [19]. For example, if from point p to point q there is a sequence of pixels a1, a2, a3, … in which adjacent pixels satisfy the adjacency relation, then a path exists between p and q. Figure 14 shows the image before processing, and the binarized image is shown in Figure 15. The image processing in this project therefore amounts to identifying and marking the centre point of a specific connected domain; the coordinates of the target's centre point are the required coordinate data. Because each connected domain contains different amounts of the three primary colours, this project does not judge a connected domain strictly by colour, but sets a threshold that allows some colour difference and makes a fuzzy judgment on the image.
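The binarization and connected-domain steps above can be sketched in a few lines. This is an illustrative Python example, not the paper's MATLAB code: it thresholds a grey-scale image to 0/1 values and flood-fills one connected domain under 4-adjacency to find its centroid (the "centre point" used as the target coordinate).

```python
from collections import deque

def binarize(gray, threshold=128):
    """Map each pixel to 1 (foreground) or 0 (background)."""
    return [[1 if px >= threshold else 0 for px in row] for row in gray]

def connected_domain_centroid(binary, seed):
    """Flood-fill the connected domain containing `seed`; return its centroid.

    Uses 4-adjacency as the adjacency relation: two foreground pixels are
    connected if they are vertical or horizontal neighbours.
    """
    rows, cols = len(binary), len(binary[0])
    seen, queue, pixels = {seed}, deque([seed]), []
    while queue:
        r, c = queue.popleft()
        pixels.append((r, c))
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and binary[nr][nc] and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append((nr, nc))
    cy = sum(r for r, _ in pixels) / len(pixels)
    cx = sum(c for _, c in pixels) / len(pixels)
    return cy, cx

# A 2x2 bright blob in a dark image: its centroid lies at (1.5, 1.5).
gray = [[0,   0,   0,   0],
        [0, 200, 210,   0],
        [0, 220, 230,   0],
        [0,   0,   0,   0]]
print(connected_domain_centroid(binarize(gray), (1, 1)))
```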
The maximum colour difference is set to not exceed 1.5, and the relevant algorithm is used to obtain the approximate range of each connected domain. From the ratio of the three primary colours of the target colour and the obtained pixel information, the location of the target's connected domain can be obtained. Figure 16 shows the image before processing; according to the algorithm, the connected domains of the different objects are obtained and their centre points marked, as shown in Figure 17.

Conclusion
This paper has described the principles of motion control, wireless data transmission and image recognition for the mechanical arm, together with the related algorithms and implementation methods. Based on these principles and algorithms, the project preliminarily realizes target recognition on a two-dimensional plane, with data transmitted over the ESP8266 wireless network. In addition, the mechanical arm system moves according to the target's position information and stops near the target position, while the upper computer displays the relevant temperature and distance data, realizing data visualization. However, the project still has the following problems. First, images are collected from a single direction, so targets can be recognized only in two dimensions and the system cannot accurately locate a target's actual spatial position. Second, the upper computer still relies on MATLAB for image recognition; the relevant steps must be operated manually, and automatic recognition is not yet realized. In future work, this project will increase the number of cameras and improve the motion algorithm to realize target recognition in three-dimensional space, and will improve the image algorithm and use other software to achieve automatic image recognition.

2.1.3. LM35 temperature sensor. The LM35 is a temperature sensor manufactured by National Semiconductor. It has high operating accuracy and a wide linear operating range; its output voltage is linearly proportional to the Celsius temperature without external calibration or fine-tuning, providing a typical room-temperature accuracy of ±1/4 °C [11][12]. The pin diagram of the LM35 sensor is shown in Figure 3 [11].

Figure 6. Flow chart of mechanical arm control.

Figure 7. Geometric analysis.

Figures 9-12 respectively show the LabVIEW code for the send and receive string functions, the read temperature data function, the read distance data function, and the read and send local coordinate data function.

Figure 9. Send and receive string functions.

Figure 12. Read local coordinate data and send functions. The generated program interface is shown in Figure 13.

Figure 13. Upper computer program interface.

Table 1 describes the ESP8266 module parameters.