Real-time Environmental Control System for Disability Access on the Node-MCU Platform

An Environmental Control Unit (ECU) is an electronic device that helps a person perform daily tasks and control the surrounding environment. Hand gestures, captured via a special wearable glove, can be employed for this purpose. In this paper, we propose a low-cost Human-Computer Interaction (HCI) method using the NodeMCU platform. Finger and palm movements are captured from the outputs of a flex sensor and an accelerometer. The output voltage of the flex sensor represents the degree of a finger’s bending, while the accelerometer output captures palm movement along three axes in order to operate up to three devices. The glove is connected to an Internet-of-Things (IoT) platform that supports wireless communication through a web server, making the ECU wire-free and hence increasing the portability of the system. Comparisons with related work confirm that the proposed system is simple, accurate and easy to implement.


Introduction
In health care organizations such as hospitals, a large number of patients are treated at the same time, making it almost impossible to fulfill every patient's daily requirements. These needs include tasks such as turning off a light or TV, or adjusting a volume. This is a significant problem for partially disabled patients and consumes a large amount of support staff time, including that of nurses and technicians. Therefore, it is necessary to find a system that can monitor and respond to an individual's requirements continuously [1]. Hand gestures can be used to translate an individual's needs with minimal effort. For example, a palm or finger movement can be interpreted to turn off a light or switch a fan on. This eliminates the need to flip a switch or turn a knob, which can be challenging for some individuals, such as stroke patients. In addition, hand gestures can be used easily by anyone [2]. A low-cost Human-Computer Interface (HCI) device can be implemented using a glove fitted with a flex sensor and an accelerometer. A flex sensor changes its resistance when bent; this change in resistance can be either increasing or decreasing depending on the type of flex sensor used. Hence, if flex sensors are placed at the finger joints, they can be used to determine whether the fingers are bent or not [3]. On the other hand, accelerometers can be employed to monitor palm movement in three directions: X, Y and Z. Based on these movements, different actions can be implemented, allowing the user to define a specific finger/hand gesture for device control or virtual simulation. The smart glove, which includes a flex sensor and an accelerometer, can be interfaced with a microcontroller to realize an Environmental Control Unit (ECU). Control commands can also be transferred to another microcontroller over a wireless network to achieve a wireless ECU.
Creating a wireless ECU allows the user to define gestures for device control commands, and hence various technological applications for device control can be implemented. Using hand gestures for remote control has been an active research topic in recent years. For instance, Mardiyanto et al. [4] proposed a method to control an underwater vehicle equipped with an arm. The hand gesture is captured using accelerometers and a gyroscope; their system consists of four microcontrollers, three accelerometers, a gyroscope and a flex sensor. Although good results are reported, this system is rather complex and expensive, as it involves four microcontrollers and several sensors. Che-Ani et al. [5] proposed a method for capturing finger movement via a glove containing a flex sensor. Finger movement is divided into five positions ranging from straight to fully bent. The output voltage represents the degree of the finger's bending, as the resistance of the flex sensor increases proportionally with flexing. Similarly, Ruslan et al. [6] presented a method to control a wireless car using a flex sensor attached to a finger; seven degrees of bending are monitored in their system. However, the aforementioned methods are prone to error, as it is difficult to control a finger precisely enough to distinguish five to seven bending degrees, especially for physically impaired patients. Other works utilized a camera to record the hand gesture, employing image processing and computer vision techniques for this purpose. For example, the works in [7] and [8] proposed methods to replace a remote control with camera-based hand-gesture recognition, where hand gestures are monitored and translated into actions using image processing techniques. Unfortunately, these systems are complex and power-consuming due to the presence of a camera; in addition, a laptop or PC is needed to process the camera input.
In this paper, a low-cost HCI system is proposed using the NodeMCU platform together with a flex sensor and an accelerometer. The flex sensor is used to enable the system, while the control actions are derived from the movement of the hand along three axes: X, Y and Z. Only one flex sensor is used to reduce the cost. The rest of this paper is organized as follows. Section 2 introduces the proposed method, Section 3 presents the results and comparisons, and Section 4 concludes the paper.

Proposed Method
A real-time hand movement system has been designed in which the movement of the hand is combined with that of the index finger in order to operate an ECU. The proposed system consists of two parts. The first part acts as a transmitter, sending hand-gesture signals wirelessly to a local server with a predefined IP address. The transmitter part (shown in Figure 1 (a)) consists of a smart glove carrying a flex sensor and a GY-85 sensor connected to a NodeMCU-32 via three analog input pins: two inputs from the GY-85 and one input from the flex sensor. The developed prototype of the smart glove is shown in Figure 2. The GY-85 is a combined module containing a three-axis gyroscope, a three-axis accelerometer and a three-axis magnetometer [9]; only the ADXL345 accelerometer on this module is utilized, due to its accuracy and simplicity. The receiver part consists of a NodeMCU-32 device connected to a four-channel relay module, as shown in Figure 1 (b). Only three channels of the relay module are used, since only three devices are to be controlled. The transmitter and receiver communicate with each other using the built-in Wi-Fi of the NodeMCU-32 platform over a distance of up to 100 meters; however, an access point can be used to cover a larger area, such as hospital blocks. As mentioned before, the proposed system can operate three devices. To enable the system, the index finger has to be slightly bent. Then, to operate device (1), the output of the accelerometer along the x-axis is used; similarly, the outputs along the y-axis and z-axis operate device (2) and device (3), respectively. To turn a device off, the finger is bent again and the same operation is reversed. Hence, the system behaves like a push-button switch: one movement with the finger bent turns a device on, and another movement with the finger bent turns it off.
The flow chart of the proposed system is shown in Figure 3.
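The gesture-to-command logic described above can be sketched in plain C++ as follows. This is an illustrative model, not the authors' firmware: the per-axis acceleration threshold and the function names are assumptions, since the paper states only that a threshold is used for each axis without publishing its value.

```cpp
#include <cassert>
#include <cmath>

// Hypothetical per-axis acceleration threshold; the paper uses a threshold
// per axis but does not publish its value.
const float AXIS_THRESHOLD = 1.5f;

// Relay states toggled like a push-button switch: one bent-finger movement
// turns a device on, the next bent-finger movement on the same axis turns it off.
bool deviceState[3] = {false, false, false};

// Returns the index of the device to toggle (0..2), or -1 for no command.
// The system is enabled only while the index finger is bent; the dominant
// axis of palm movement then selects the device.
int decodeGesture(bool fingerBent, float ax, float ay, float az) {
    if (!fingerBent) return -1;                   // system disabled
    if (std::fabs(ax) > AXIS_THRESHOLD) return 0; // X movement -> device (1)
    if (std::fabs(ay) > AXIS_THRESHOLD) return 1; // Y movement -> device (2)
    if (std::fabs(az) > AXIS_THRESHOLD) return 2; // Z movement -> device (3)
    return -1;                                    // bent, but no decisive movement
}

// Toggle the selected relay channel, mimicking the push-button behaviour.
void applyGesture(bool fingerBent, float ax, float ay, float az) {
    int device = decodeGesture(fingerBent, ax, ay, az);
    if (device >= 0) deviceState[device] = !deviceState[device];
}
```

In this sketch, two successive Y-axis movements with the finger bent would switch device (2) on and then off again, matching the push-button behaviour described above.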

Experimental Results and Comparisons
The flex sensor is an analog resistor that works as one half of a variable voltage divider. When the sensor is bent, its resistance changes in proportion to the bend radius: the greater the degree of bending, the higher the electrical resistance [3]. The flex sensor is connected in series with a 10 kΩ resistor to form a voltage divider, as shown in Figure 4, and the resulting voltage is fed to the microcontroller through an analog input pin whose reading ranges from 0 to 1023. A threshold value of 150, set empirically, is used to decide whether the flex sensor is bent. As for the accelerometer, its output is fed to the NodeMCU-32, and a threshold is likewise chosen to detect movement along each of the three axes. The developed system proves to be simple and easy to implement, and it supports the individual's free mobility thanks to wireless communication. In addition, the proposed system operates in real time and consumes little power with low delay, as it does not involve complex operations.
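The divider relationship behind this threshold test can be sketched numerically. The 10 kΩ resistor, the 10-bit ADC range and the 150-count threshold come from the text; the 3.3 V supply, the divider orientation (flex sensor on the high side, ADC across the fixed resistor) and the example flat/bent resistances are assumptions for illustration only.

```cpp
#include <cassert>

// Voltage divider: the flex sensor (rFlex) in series with a fixed 10 kOhm
// resistor, with the ADC reading taken across the fixed resistor. The 3.3 V
// supply and this orientation are assumptions; the 10 kOhm value is from the text.
const float VCC = 3.3f;
const float R_FIXED = 10000.0f;

// Junction voltage seen by the analog input pin.
float dividerVoltage(float rFlex) {
    return VCC * R_FIXED / (rFlex + R_FIXED);
}

// Map the voltage onto the 10-bit ADC range (0-1023) mentioned in the text.
int adcCount(float voltage) {
    return static_cast<int>(voltage / VCC * 1023.0f);
}

// With this orientation, more bending raises rFlex and lowers the count,
// so a reading below the empirical threshold of 150 signals a bent finger.
bool isBent(int count) {
    return count < 150;
}
```

For instance, with an assumed flat resistance of about 25 kΩ the count is near 290 (not bent), while an assumed bent resistance of about 100 kΩ drops the count to near 90, below the 150-count threshold.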
To evaluate the efficiency of the proposed system, our method is compared with the available work in the literature [4], [5], [6], [7], [8] in terms of complexity, cost and power consumption. Table 1 shows a comparison with related work in the field of environmental control via hand gestures. Solanki et al. [7] and Baht et al. [8] proposed methods to control devices by monitoring hand gestures with a camera, using image processing techniques to translate hand gestures into control actions. Unfortunately, these methods are time- and power-consuming, since a camera together with a PC or laptop must be kept on to decode the camera output; in addition, the cost and weight of such systems are high. On the other hand, Che-Ani et al. [5] and Ruslan et al. [6] presented methods for device control using a flex sensor and a microcontroller, where five to seven degrees of finger bending are monitored with the flex sensor. Although the cost of these methods is low, the error when using a single flex sensor is high, because the bending degrees can overlap as the margin between them is very small; moreover, it is very difficult for a physically impaired patient to fully control his or her fingers. The work in [4] presented a method to control an underwater vehicle with multiple inertial sensors connected to four microcontrollers. Clearly, the latter method requires many hardware components, and hence its cost and power consumption are high compared to the simple method proposed in this paper.

Conclusions
A low-cost Human-Computer Interface (HCI) method was proposed based on hand gestures, utilizing the NodeMCU platform with a smart glove. The smart glove included a flex sensor and an accelerometer. The flex sensor was used to enable the control system, while the control actions were translated from the hand movement along three axes to control up to three devices. The actions were transmitted wirelessly over a local network to form an Environmental Control Unit (ECU). Comparisons with related work confirmed that the proposed method is simple, low-cost and low-power. The proposed method will greatly assist not only patients with physical impairment but also individuals with limited mobility.