Airlight (A UAV-based light guidance and searchlight system)

Airlight, a UAV (Unmanned Aerial Vehicle) based real-time focus and searchlight system, is a proposed solution that makes safely guiding people along their path in dark environments one less thing to worry about. It is designed to identify a person through object detection algorithms, and it uses IoT and sensor hardware to operate its spotlight power-efficiently, autonomously guiding people along their path at night. Airlight is useful in a variety of sectors: helping farmers look after their fields at night, security, guiding construction workers lifting heavy objects, military applications, and more.


Introduction
Night-time and a lack of lighting facilities cause many difficulties. Poor lighting raises security concerns, as it can lead to accidents and crime; proper light guidance has been reported to reduce crime by 39% [1]. Consider farmers: due to the lack of electricity in villages, checking on farms and carrying out important tasks in the fields at night is very difficult. On construction sites, where workers must carry heavy loads up and down, carrying light sources as well is impractical, and low light increases the risk of accidents [2].
Beyond these examples, there are many other places where a lack of light causes problems at night, and that is where the inspiration for Airlight was born. An autonomous, smart, real-time focus and searchlight system, Airlight helps in these situations and ensures that a lack of light is one less thing to worry about at night. Airlight is completely autonomous and requires no contact, an added advantage during the Covid-19 pandemic. The use of UAVs across applications has grown rapidly: the estimated global research and development budget for UAVs in 2022 is 3.8 billion USD [3]. Given this vast scope for UAV research and development, Airlight adds to it by using UAVs for a smart light guidance system.
Airlight uses IoT along with object tracking and image processing to fulfil its purpose, supported by components such as relays, LDRs (Light Dependent Resistors) and servo motors that make it a smart, power-efficient module. It is a microcontroller-based module in which the microcontroller handles the computation for image processing and object tracking and controls the various peripherals according to the camera feed. The microcontroller used here is the Jetson Nano, chosen for its image processing capabilities. Two servo motors control the movement of the light and camera, working as a two-axis gimbal driven directly by the camera feed. The module uses object detection to find the feet of the person to be guided and aligns the camera and spotlight accordingly to light the path at night. The setup includes a relay to switch the light electronically based on the camera feed, and an LDR to sense the ambient light level and thus decide whether the spotlight is needed; the LDR also helps adjust the brightness of the light. The drone's movement is autonomous, following the person to guide them along their path. With its smart and efficient guidance, Airlight will go a long way in helping people in different fields tackle the problems associated with poor lighting at night.

Literature Review
To get the best results from the module, various parameters must be integrated, and an intelligent approach is required not only at the central level but in every small aspect of the module. A lot of research is being done on UAVs, gimbal structures, object tracking and image processing. Despite this, a smart light guidance system with real-time focus that can be made available to people at affordable rates has not been developed. The stability of the structure plays a key role in the development of the system: analyzing the vibration characteristics of a two-axis UAV gimbal is important for the stability of the camera and spotlight attached to the drone [4,5]. Object detection is the basis of a wide range of high-level computer vision applications, such as autonomous driving, face detection and recognition, and activity recognition [6]. The integration of computer vision with UAVs solves many real-life problems. To be useful in everyday environments, robots must be able to identify and locate real-world objects [7]. Moreover, the image processing algorithms that run on these devices must be fast enough to produce results in real time, while minimizing the resources used so the system stays compact. Airlight uses the pose estimation module developed by MediaPipe. Its single-person pose estimation model was developed to enable performance-demanding use cases such as sign language recognition, yoga/fitness tracking and AR (Augmented Reality) [8].
In this use case, it is essential that the model first detects and then tracks a subject, even if more subjects appear in the frame later. BlazePose (the pose estimation model offered by MediaPipe) uses a two-step technique that first detects and then tracks the person [9], which tackles this problem perfectly.
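As a rough illustration, a detect-then-track loop built on MediaPipe's Python API might look like the sketch below. The camera index, model complexity and confidence thresholds are assumptions for the example, not values taken from the Airlight prototype.

```python
"""Minimal sketch of MediaPipe's detect-then-track pose pipeline.
The camera index and the model settings below are illustrative
assumptions, not the prototype's actual configuration."""

def to_pixel(norm_x: float, norm_y: float, width: int, height: int):
    """Convert MediaPipe's normalized [0, 1] coordinates to pixels."""
    return int(norm_x * width), int(norm_y * height)

def run_tracking(camera_index: int = 0):
    # Heavy imports are kept local so the geometry helper above can be
    # used (and tested) without the vision stack installed.
    import cv2
    import mediapipe as mp

    pose = mp.solutions.pose.Pose(model_complexity=1,
                                  min_detection_confidence=0.5,
                                  min_tracking_confidence=0.5)
    cap = cv2.VideoCapture(camera_index)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # BlazePose runs its person detector once, then keeps tracking the
        # same subject across subsequent frames until the track is lost.
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            h, w = frame.shape[:2]
            nose = results.pose_landmarks.landmark[
                mp.solutions.pose.PoseLandmark.NOSE]
            print("nose at", to_pixel(nose.x, nose.y, w, h))
    cap.release()

# run_tracking()  # uncomment on a device with a camera attached
```

Because tracking reuses the initial detection, a second person entering the frame later does not steal the spotlight from the original subject.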
A solution should not just be technologically sound but also user-friendly. Airlight allows the user to easily access its data on the cloud and even take manual control when needed. It provides the correct software framework coupled with efficient hardware to perform guidance in the dark.

Proposed Methodology

System Architecture
The process flow of Airlight can be subdivided into a three-level architecture that involves fetching data, processing it and networking. Figure 1 shows the control flow that complements it, described below:
• Perception: The device collects real-time information about the physical surroundings using the sensors and equipment on board.
• Interfacing: Data processing happens at this stage. With the help of image processing algorithms, the microcontroller perceives the environment and controls the actuators based on the results.
• Communications: This phase comprises sharing processed data and information sent by the user via the cloud. Furthermore, the user can take over control manually to suit their application's needs.
The layers are separated so that reactions to changes in the external environment, and any manual control, can be handled asynchronously. Since the Interfacing stage takes automated on-device decisions using the processed data, it can be called the middleware layer.
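The three-layer flow can be sketched as a simple pipeline. The sensor payloads and the decision rule below are simplified placeholders chosen for the example, not the device's actual on-board logic:

```python
"""Sketch of the Perception -> Interfacing -> Communications flow.
The payload fields and the decision rule are illustrative assumptions."""

from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Perception:
    ldr_dark: bool                       # photoresistor reports darkness
    feet_xy: Optional[Tuple[int, int]]   # feet position, if a subject is seen

@dataclass
class Decision:
    spotlight_on: bool
    target_xy: Optional[Tuple[int, int]]

def interface(p: Perception, manual_spotlight: Optional[bool] = None) -> Decision:
    """Middleware layer: turn sensed data into actuator commands.
    A manual command from the Communications layer overrides automation."""
    if manual_spotlight is not None:
        return Decision(spotlight_on=manual_spotlight, target_xy=p.feet_xy)
    return Decision(spotlight_on=p.ldr_dark and p.feet_xy is not None,
                    target_xy=p.feet_xy)

# One automated cycle: dark scene with a visible subject turns the light on.
print(interface(Perception(ldr_dark=True, feet_xy=(320, 400))))
```

Keeping the manual override as an explicit argument mirrors the paper's point that user control can pre-empt the automated decision asynchronously.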

Sensors
Various components enable Airlight to analyze the environment around it and work as a guidance system. The components used are as follows:
• Microcontroller: A Jetson Nano handles the processing aspects of the system. It was chosen because it is well suited to the module's object detection and image processing workload.
• Servo motors: Two servo motors move the camera and spotlight.
• Relay: A 12V relay provides the electronic switching needed to turn the spotlight on.
• Photoresistors: These light-dependent resistors measure the amount of light in the environment.
• Camera: Used for object detection and capturing the live feed.
• Battery: A 12V external battery powers the spotlight, as the microcontroller cannot supply such high voltages.

Structure and Schematic of Airlight
• The system consists essentially of a lighting circuit with a photoresistor and relay, and a structure with two servo motors that move the camera and light.
• Both the circuit and the structure are controlled by the microcontroller, which is the core of the entire system and where all processing takes place.
• The camera senses the environment and provides the coordinates of the person's feet, which determine the servo motor movements that align the spotlight.
• The relay removes the need to switch the light on manually, and the photoresistor determines the brightness of the light, making the system smart.

Practical Development
• Airlight's development is being carried out in a modular manner to probe each functionality and design an easy-to-use user interface.
• Figure 2 shows the block diagram of the prototype. The camera and the photoresistor are interfaced with the microcontroller to fetch data from the surroundings.
• The data from the LDR is used to turn the spotlight on, via a relay, for better visibility. Meanwhile, the frames captured by the camera are processed to track the subject.
The description above shows how Airlight serves as a light guidance system and explains the various parts involved in actuating the UAV. Control of the device may be completely automatic or can be transferred to the user at any point. Sturdy hardware coupled with secure software ensures the integrity of shared commands and sensor data.
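The LDR-to-spotlight behaviour described above can be sketched as a small decision function. The ADC range, darkness threshold and brightness mapping below are assumed calibration values, since the paper does not give the prototype's numbers:

```python
"""Sketch of the LDR-driven spotlight logic. The threshold and the
linear duty-cycle mapping are illustrative assumptions, not the
prototype's calibration."""

def spotlight_command(ldr_reading: int, dark_threshold: int = 300):
    """Return (relay_on, pwm_duty_percent) for a raw LDR reading.
    Lower readings are taken to mean less ambient light (an assumed
    orientation of the voltage divider)."""
    relay_on = ldr_reading < dark_threshold
    if not relay_on:
        return False, 0
    # Darker surroundings -> higher duty cycle, saturating at 100 %.
    duty = min(100, round(100 * (dark_threshold - ldr_reading) / dark_threshold))
    return True, duty

print(spotlight_command(50))   # very dark -> (True, 83)
print(spotlight_command(800))  # bright    -> (False, 0)
```

This also captures the brightness-adjustment role the paper assigns to the LDR: the relay handles on/off, while the duty cycle scales the light with darkness.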

Circuit Development
• The various components, i.e., the servo motors, relay, photoresistor and camera, are all connected to the Jetson Nano microcontroller as shown in Figure 5. The ground pins of the servo motors are connected to the GND pins of the Jetson Nano. The 5V input for the motors must be provided by an external battery, as the Jetson Nano may not be able to handle the current spikes drawn by the servo motors.
• The signal pin, ground pin and 5V Vcc input pin of the relay are connected to a GPIO pin, a GND pin and a 5V pin of the Jetson Nano respectively. On the output side of the relay, where the spotlight is connected, the COM (Common) and NO (Normally Open) pins are used. One end of the spotlight is connected to the COM pin of the relay and the other to one terminal of the 12V external battery. The other terminal of this battery is connected to the NO pin of the relay.
• The ends of the photoresistor are connected to a 5V pin and a GPIO pin of the Jetson Nano. The end connected to the GPIO pin also has a 1 kΩ resistor attached, which connects to the battery terminal on the NO pin of the relay, completing the connection between the spotlight and the photoresistor.
• The camera is connected directly to the CSI (Camera Serial Interface) slot on the Jetson Nano.
The system uses object detection to determine the coordinates of the feet of the person being guided and adjusts the position of the light accordingly to guide them at night. The photoresistor and relay make the lighting system smart as well as power efficient. Figure 6 shows the circuit diagram and development of the system.
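On the software side, the relay and photoresistor wiring described above could be driven with the Jetson.GPIO library roughly as follows. The pin numbers and the LOW-means-dark polarity are placeholders chosen for the example; they are not the prototype's actual assignments:

```python
"""Wiring sketch for the relay and photoresistor using Jetson.GPIO.
Pin numbers and the input polarity are assumptions for illustration."""

RELAY_PIN = 18  # BOARD pin driving the relay signal input (assumed)
LDR_PIN = 22    # BOARD pin reading the photoresistor divider (assumed)

def run():
    import Jetson.GPIO as GPIO  # hardware-only import, kept local

    GPIO.setmode(GPIO.BOARD)
    GPIO.setup(RELAY_PIN, GPIO.OUT, initial=GPIO.LOW)
    GPIO.setup(LDR_PIN, GPIO.IN)
    try:
        while True:
            # With the divider wired as described, a LOW read is taken to
            # mean darkness (this polarity is an assumption).
            dark = GPIO.input(LDR_PIN) == GPIO.LOW
            GPIO.output(RELAY_PIN, GPIO.HIGH if dark else GPIO.LOW)
    finally:
        GPIO.cleanup()

# run()  # uncomment on a Jetson Nano with the circuit wired up
```

The relay's NO contact means the spotlight circuit stays open, and the lamp off, whenever the signal pin is LOW or the board is unpowered.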

Results and Discussion
A miniature implementation, shown in Figure 7, tracks the feet of a subject. A Python script processes the camera footage to find the coordinates of the feet, which are then sent to the microcontroller. Using these, the microcontroller actuates the dual-axis servos to point at the subject. The essential points of the feet tracking module are highlighted below.

Feet Detection and Tracking
Image segmentation and object detection are essential tasks in all applications of computer vision. They address the problem of partitioning an image into disjoint regions of interest according to their specific features (gray levels, texture, etc.) [12].
• The MediaPipe library was used for pose estimation, which offers live ML solutions.
• It does not require a graphics processing unit (GPU) to create a model for pose estimation.
• Even without a GPU, tracking runs very fast, at 20-30 frames per second (fps). Figure 8 and Figure 9 show the fps in the top-left corner of the results. It is crucial that the model produce results in real time while using few resources. The code that finds the coordinates of the feet has been explained previously in section 3.4. Both figures indicate landmarks with red circles connected by green lines; the coordinates of the feet are indicated by a purple circle.
• Furthermore, the model predicts the position of the subject's feet even if the feet are not visible in the frame; the accuracy of this is shown in Figure 9. To support the prediction of invisible points, the model simulated occlusions (random rectangles filled with various colours) during training and introduced a per-point visibility classifier that indicates whether a particular point is occluded and whether the position prediction is deemed inaccurate. This allows a person to be tracked continuously even under significant occlusion, such as when only the upper body is visible or the majority of the body is out of the scene.

Servo Actuation
Visual servoing is a technique that uses feedback information extracted from a visual sensor to control the motion of a robot [13]. In the dark, the microcontroller decides to switch the spotlight on based on the input from the photoresistor. This outlines how the module achieves Airlight's purpose in an optimized manner that handles edge cases.
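A minimal form of such visual servoing maps the pixel offset between the tracked feet and the frame centre to a servo angle correction. The proportional gain and the 0-180° limits below are assumed tuning values, not measurements from the prototype:

```python
"""Sketch of one proportional visual-servoing step per gimbal axis.
The gain and the servo travel limits are illustrative assumptions."""

def servo_step(angle_deg: float, target_px: float, centre_px: float,
               gain: float = 0.02, lo: float = 0.0, hi: float = 180.0) -> float:
    """Nudge one gimbal axis toward the target and clamp to servo limits."""
    error = target_px - centre_px  # pixels the subject is off-centre
    return max(lo, min(hi, angle_deg + gain * error))

# Feet 100 px right of centre -> pan angle increases by 2 degrees.
print(servo_step(90.0, 420.0, 320.0))  # -> 92.0
```

Running one such step per frame for each axis makes the spotlight converge smoothly on the subject rather than jumping, and the clamp keeps commands within the servo's physical range.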

Conclusion
The sections above outline the blueprint of the Airlight system design along with its image analytics capabilities. A close analysis of the system and its development shows that the device is highly capable of achieving its purpose, i.e., guidance in the dark, and is beneficial in situations that require exploration and navigation guidance. With its comprehensive and robust design and its demand across various domains, it will go a long way towards providing the modern world with a low-cost, reliable and intelligent light guidance system.

Future Scope and Development
More modules are currently being added to Airlight for testing. Each module is configured and modified to handle all cases to minimize the user's efforts. The next phase is to test out all the individual modules together to achieve flawless and automated operation. The device will be tested in all environments, varying the light brightness specifically. It will also be interfaced with a mobile application that the user can use to gain manual control.
Airlight has more applications than first come to mind, which makes it easy to integrate into industrial settings. One of the major challenges for rescue teams in situations like earthquakes, tornadoes and building collapses is finding and locating survivors within a specific time [14]. According to the Intergovernmental Panel on Climate Change (IPCC), these events will become even more frequent and intense due to the growing concentration of greenhouse gases in the atmosphere (IPCC 2018) [15]. The need for advances in search and rescue is greater than ever. Airlight can support rescue operations quickly and efficiently. It can check spaces (like narrow trenches and cavities) that a rescue team cannot reach, and it can alert the rescue team to victims of the disaster. Moreover, if a victim is stuck and rescue is taking a long time, Airlight can be used to transport supplies. Airlight can also be used to conduct searches and manhunts: it can run any recognition/detection model instead of pose estimation. Using face recognition, Airlight can conduct extensive manhunts for criminals at large or search for survivors after accidents such as plane crashes.
Airlight's applications are not limited to search and rescue. It can also be put to use in the entertainment industry. Airlight can track the performers on-stage and replace the stage spotlights with its own. Hence, Airlight is quite an adaptable device and provides a smart and airborne solution to a lot of problems and serves as a great IoT assistant.