Motion and Navigation Control System of a Mobile Robot as a Prototype of an Autonomous Vehicle

An autonomous vehicle is a vehicle that can navigate automatically over both long and short ranges. In this paper, we design a motion and navigation control system for an autonomous vehicle, implemented on a mobile robot. Long-range navigation uses a global positioning system (GPS) sensor as a reference, while short-range navigation uses proximity and force sensors. The robot's task is to reach several predetermined coordinate points, each marked with a traffic cone. For short-range navigation, the robot uses five ultrasonic sensors and three switch sensors to read the environment and estimate the distance between the robot and nearby objects. The microcontroller then compares these readings against the robot's safe-distance rules for navigating among surrounding objects. The GPS sensor provides long-range navigation path information. To compensate for the limited accuracy of the GPS readings, several waypoints are designed to guide the robot to the specified traffic cone. In the implementation, the robot was able to traverse the navigation area, avoid short-range objects along its trajectory, and reach the traffic-cone point, with a success rate of reaching the final destination above 86% in both sunny and cloudy weather.


Introduction
An autonomous vehicle is a vehicle that can operate safely and effectively without the need to be controlled by humans [1,2]. It comprises a collection of coordinated systems that allow the vehicle to traverse its environment [3][4][5]. Autonomous vehicles can sense the environment and navigate themselves; humans choose the destination but are not required to carry out the mechanical operation of the vehicle. The control system identifies the correct navigation path and existing obstacles through input from the sensors [6,7]. An autonomous vehicle system has three major subsystems: (1) algorithms for localization, perception, and planning and control; (2) client systems, such as the robot operating system and hardware platform; and (3) the cloud platform, which includes data storage, simulation, high-definition (HD) mapping, and deep learning model training [8].
Autonomous vehicles have a long history. The first prototype that could function properly was created in 1980. Using a camera as a sensor, this prototype covered 100 km of empty roads without being driven by humans. Following this success, many projects emerged in the 1980s and 1990s that used similar systems to drive on highways. Other research developed and implemented a prototype autonomous vehicle that explores buildings. That car used an off-the-shelf remote-controlled car, an NVIDIA Jetson TX2 GPU with an onboard ARM processor, an infrared sensor for obstacle avoidance, and a monocular CSI camera for navigation and object detection. A combination of Google's TensorFlow, OpenCV, and the Robot Operating System (ROS) turned the remote-controlled car into a fully self-driving car [9,10].
In the development of autonomous cars, a problem arises particularly regarding safety while driving. Seeking to address driving safety and vehicle control, this research creates and implements an autonomous vehicle model that navigates at close range to avoid surrounding objects, using proximity and switch sensors, while reaching several specified long-distance coordinates. Figure 1 is a block diagram of the system, showing three main parts, i.e. input, process, and output, each with its respective functions integrated. The input section contains several components, i.e. ultrasonic sensors, switch sensors, an Android smartphone, and a remote, which serve as the input media. The process section contains several microcontrollers, comprising the sensor and remote controllers as well as the motor controller, each connected to the main microcontroller, which is the center of the robot control system. The output section contains the DC motor, servo steering motor, buzzer, and LED indicators, all of which work according to commands from the main microcontroller.

Figure. 1 Block diagram of the system
A sensor is a device that measures physical quantities such as motion, pressure, and speed and converts them into voltage and electric current data. The sensors in this research comprise five ultrasonic sensors and three switch sensors. The Android smartphone is used to send GPS coordinate data to be processed by the main microcontroller. The remote is used only to start the robot and as an emergency stop switch, not to navigate the robot.
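As a minimal sketch of how an ultrasonic sensor reading becomes a distance, the round-trip echo time can be converted as follows; the function name and the assumed speed of sound at room temperature are illustrative, not taken from the paper.

```python
SPEED_OF_SOUND_CM_PER_US = 0.0343  # approx. speed of sound at ~20 C, cm per microsecond

def echo_to_distance_cm(echo_us: float) -> float:
    """Convert a round-trip ultrasonic echo time (microseconds) to distance (cm).

    The pulse travels to the object and back, so the one-way distance is
    half the round-trip time multiplied by the speed of sound.
    """
    return echo_us * SPEED_OF_SOUND_CM_PER_US / 2.0

# Example: an echo of about 2915 us corresponds to roughly 50 cm.
print(round(echo_to_distance_cm(2915.0)))  # → 50
```

The microcontroller would compare such distances against the robot's safe-distance threshold before issuing motion commands.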
An ATMega8 is used to control the movement of the DC motor, and an Arduino Nano is used to receive data from the sensors. An HC-05 Bluetooth module receives data from the Android smartphone for the main microcontroller. An Arduino Mega 2560 Pro serves as the main controller, the part that functions as the control center of all robot systems. A DC motor drives the robot, and a Hitec HS-5645MG servo motor operates the steering system.
The robot measures about 75 cm x 50 cm. The navigation steering system uses a car steering mechanism in which the two front wheels change the direction of the robot. This system uses input data from the Android smartphone in the form of GPS coordinate points. The steering system is supported by several components such as proximity sensors, LED indicators, and buzzers that mark the progress of the program. Of the five ultrasonic sensors, one is placed facing directly forward and two pairs are placed diagonally at 45 degrees to each side; two ultrasonic sensors were placed on each side to widen the reading area. The processing equipment and other supporting components are placed inside the robot body to protect them from interference that could degrade performance. The concept of the robot steering path and the component layout are shown in Figure 2.
Figure 3 shows the object avoidance algorithm. The robot starts navigating when the start button on the remote is pressed. The system then checks the bumper before navigating. If the bumper is depressed, the robot has certainly crashed into something; in this condition, the robot retreats several steps to a safe distance before continuing. All front sensors are used to read the robot's distance from objects. When the bumper is not pressed, the system compares the distances read by the side ultrasonic sensors and moves forward while turning toward the clear area. The robot carries out this action after checking, through the readings of all ultrasonic sensors, that it is not about to hit something.

Figure. 3 Object avoidance algorithm

Figure. 2 Robot steering system and component layout
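The avoidance decision described for Figure 3 can be sketched as a single control-cycle function; the sensor names, the threshold value, and the command strings below are hypothetical placeholders, not identifiers from the paper's firmware.

```python
SAFE_DISTANCE_CM = 40.0  # assumed safe-distance threshold (illustrative)

def avoidance_step(bumper_pressed: bool, front_cm: float,
                   left_cm: float, right_cm: float) -> str:
    """Return a motion command for one control cycle of the avoidance loop."""
    if bumper_pressed:
        # The robot has hit something: back off to a safe distance first.
        return "reverse"
    if front_cm < SAFE_DISTANCE_CM:
        # Imminent frontal collision: turn toward whichever side is clearer.
        return "turn_left" if left_cm > right_cm else "turn_right"
    if left_cm < SAFE_DISTANCE_CM or right_cm < SAFE_DISTANCE_CM:
        # An object is close on one side: drift away while moving forward.
        return "forward_left" if left_cm > right_cm else "forward_right"
    # No nearby object: keep moving straight ahead.
    return "forward"

print(avoidance_step(False, 30.0, 80.0, 20.0))  # → turn_left
```

On the real robot, each returned command would map to DC motor and steering servo actions issued by the main microcontroller.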
The coordinates of the destination locations are set in an application implemented on an Android smartphone. The application allows more than one coordinate to be entered; when run, the robot navigates through the coordinates in the order they were entered, so the last coordinate entered is the robot's final destination. The coordinate data are sent to the main microcontroller over the Bluetooth communication link.
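To steer toward each entered waypoint, a controller needs the heading and distance from the current GPS fix to the next coordinate. The paper does not give its formulas, but a standard sketch uses the forward-azimuth and haversine equations on a spherical Earth:

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius, spherical approximation

def bearing_and_distance(lat1: float, lon1: float,
                         lat2: float, lon2: float) -> tuple[float, float]:
    """Initial bearing (degrees clockwise from north) and great-circle
    distance (meters) from the robot's GPS fix to the next waypoint."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    # Forward azimuth (initial bearing).
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    bearing = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
    # Haversine distance.
    dphi = phi2 - phi1
    a = (math.sin(dphi / 2.0) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2.0) ** 2)
    distance = 2.0 * EARTH_RADIUS_M * math.asin(math.sqrt(a))
    return bearing, distance
```

A waypoint would be considered reached once the returned distance falls below a tolerance matching the GPS accuracy, after which the robot advances to the next coordinate in the entered list.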

Results and Discussion
Control system testing is the testing phase of the navigation motion control application that has been built. Figure 5 shows the results of field testing the navigation motion control system in open areas at different locations. In Figure 5, the positions indicated by GPS coordinates are marked with red and yellow markers: red markers indicate the waypoints that must be traversed, and yellow cone-shaped markers are the final destination points.

Figure. 5 Results of testing
The red line connecting the markers is the path the robot should traverse, and the adjacent blue line is the actual route taken by the robot. From the test results, the robot is able to reach the final destination along the specified route (see Table 1).

Table 1. Navigation test results
No.   Weather   Cones targeted   Cones reached   Success (%)
...
9     Cloudy    3                3               100
10    Cloudy    3                3               100
11    Cloudy    3                3               100
12    Cloudy    3                3               100
13    Cloudy    3                3               100
14    Cloudy    3                0               0
15    Sunny     3                3               100
16    Sunny     3                3               100
17    Sunny     3                3               100
18    Sunny     3                3               100
19    Sunny     3                3               100
20    Sunny     3                3               100
21    Sunny     3                3               100
22    Sunny     3                3               100
23    Sunny     3                3               100
24    Sunny     3                3               100
Success average: 86.10%

Table 1 summarizes the experiments. The failure occurred because of changes in weather that caused the navigation motion control to work inaccurately, since the navigation motion control system relies on GPS, which depends on the weather conditions at the time of testing.
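The per-run success figures in Table 1 can be understood as the fraction of targeted cones reached, with the reported 86.10% being the mean over all runs. A minimal sketch of that computation, using illustrative run data rather than the paper's full data set:

```python
def success_percent(reached: int, total: int = 3) -> float:
    """Per-run success: percentage of targeted cones the robot reached."""
    return 100.0 * reached / total

# Illustrative runs only (not the paper's data): (cones reached, cones targeted).
runs = [(3, 3), (3, 3), (0, 3), (2, 3)]
average = sum(success_percent(r, t) for r, t in runs) / len(runs)
print(round(average, 2))  # → 66.67
```

Applying the same averaging over all 24 recorded runs yields the paper's overall figure of 86.10%.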

Conclusion
The robot can navigate autonomously according to the GPS coordinates specified in the long-range navigation tests, can find the final destination matching the entered coordinates, and can navigate at short range while passing obstacles on the way to the final destination point. The overall system success rate is above 86%. Failures occurred due to changes in weather conditions that prevented the navigation motion control from working optimally, because the navigation motion control system relies on GPS, which strongly depends on the weather conditions at the time of testing.