Arduino-based mobile robot controlled by voluntary eye-blinks using LabVIEW GUI & NeuroSky Mindwave Mobile headset

This paper proposes an experimental brain-computer interface system that uses voluntary eye-blinks, detected in the EEG signal acquired by a NeuroSky headset, to control the movement direction of a mobile robot. The Bluetooth protocol is used for communication between a LabVIEW application and a program uploaded to an Arduino Mega board, which controls the movement direction of the mobile robot. Three parallel algorithms were implemented in LabVIEW: one for counting voluntary eye-blinks, one for switching between buttons, and one for selecting a certain command. The transition between commands (stop, go forward, go backward, turn left, turn right) is executed by one voluntary eye-blink, while two voluntary eye-blinks select a specific command.


Introduction
Brain-Computer Interface (BCI) systems have seen rapid growth across various research and professional fields, including both biomedical applications (assistance, recovery, and regaining independence) and non-clinical applications (game development, psychological analysis, and cognitive enhancement). A BCI system uses neuronal biopotentials to control mechatronic devices that assist people with neuromotor disabilities caused by cerebral stroke, spinal cord injuries, tetraplegia, locked-in syndrome, or amyotrophic lateral sclerosis. The electroencephalographic (EEG) signal is acquired from the embedded sensor of a portable headset (NeuroSky Mindwave Mobile [1], Muse [2], Emotiv Epoc [3], Emotiv Insight [4]), then filtered and prepared for the processing stage, which consists of feature extraction and classification phases. In this way, significant EEG signal patterns (P300 [5], SSVEP [6], slow cortical potentials [7], and sensorimotor rhythms [8]) can be identified, and the levels of certain psychological and emotional states (motor imagery [9], attention [10], meditation [11], stress, excitement, interest) can be measured. The eye-blink artefact [12] has been used to develop BCI applications aimed at helping disabled people communicate with the outside environment and control various mechatronic systems needed in everyday life.
Few scientific papers [13] have focused on Bluetooth communication between an Arduino board and a NeuroSky headset using a LabVIEW application. Their main reported achievements involved using attention and meditation levels for control tasks. Other research projects [14] highlighted the possibility of using LabVIEW to acquire, analyze, and display the variations of EEG biopotentials.

Hardware system
The hardware system consists of the NeuroSky Mindwave Mobile headset and the mobile robot (Figure 1).

Figure 1
The NeuroSky Mindwave Mobile headset (left) and the Arduino-based mobile robot (right)

NeuroSky headset
NeuroSky [15], a Silicon Valley-based company founded in 2004, produces versatile biomedical sensors embedded in portable devices for monitoring electrocardiographic and electroencephalographic signals. NeuroSky not only provides user-friendly software applications, especially educational games, for end-users, but also discloses its communication protocol and offers basic software examples in various programming languages, so that researchers and professionals can take full advantage of it. As a result, both novice and experienced users have harnessed the 'power of thought' to develop both simple applications aimed at entertainment [16], focus enhancement, and attention monitoring [17], and advanced proofs of concept for robust mechatronic systems [18] aimed at EEG signal analysis, the implementation of brain-computer interfaces [19,20], and recovery from cerebral stroke.
Although it has only a single dry sensor, which is placed on the forehead over the frontal lobe (position FP1 according to the 10-20 system), the NeuroSky headset enables the following tasks: acquisition of the raw EEG signal, measurement of the attention and meditation levels, estimation of the eye-blink strength, and extraction of the EEG frequency bands (delta, theta, alpha, beta, gamma). Moreover, the official website provides basic frameworks for various programming languages (C, C#, or Python) and development platforms (Arduino or Raspberry Pi).
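For readers who want to work with the headset's output directly, the data arrives as packets in NeuroSky's published ThinkGear serial stream format: two 0xAA sync bytes, a payload length, the payload, and a checksum equal to the bitwise inverse of the low byte of the payload sum, with code 0x16 carrying the eye-blink strength. The following C++ parser is an illustrative sketch based on that published format, not code from the paper:

```cpp
#include <cstddef>
#include <cstdint>
#include <optional>
#include <vector>

// Scan a byte stream for a valid ThinkGear packet and return the eye-blink
// strength (code 0x16) if one is present. Layout per NeuroSky's ThinkGear
// stream guide: 0xAA 0xAA <len> <payload...> <checksum>, where the checksum
// is the bitwise NOT of the low 8 bits of the payload sum.
std::optional<int> extractBlinkStrength(const std::vector<uint8_t>& stream) {
    for (size_t i = 0; i + 3 < stream.size(); ++i) {
        if (stream[i] != 0xAA || stream[i + 1] != 0xAA) continue;   // sync
        uint8_t len = stream[i + 2];
        if (i + 3 + len >= stream.size()) break;        // incomplete packet
        uint32_t sum = 0;
        for (size_t j = 0; j < len; ++j) sum += stream[i + 3 + j];
        if ((uint8_t)(~sum) != stream[i + 3 + len]) continue;  // bad checksum
        // Walk the payload: codes below 0x80 carry a single value byte,
        // codes of 0x80 and above are followed by a value-length byte.
        for (size_t j = 0; j < len; ) {
            uint8_t code = stream[i + 3 + j];
            if (code == 0x16) return stream[i + 3 + j + 1];  // blink strength
            j += (code < 0x80) ? 2 : 2 + stream[i + 3 + j + 1];
        }
    }
    return std::nullopt;   // no valid blink value found
}
```

The LabVIEW toolkit used in the paper hides this parsing behind its driver functions; the sketch is only meant to show what those functions decode.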

Mobile robot
The hardware structure of the mobile robot includes the following components: an Arduino Mega development board, two DC motors controlled by an L298N motor driver, an HC-05 Bluetooth module used for communication between the NeuroSky headset and the Arduino board, an 8x8 LED matrix for displaying command feedback, and two 3.6 V rechargeable Li-Ion batteries.
The mobile robot is controlled by the Arduino Mega [21], an open-source board that is extremely popular among students in mechatronics and related fields, who use it to turn their ideas into experimental prototypes whose design and functionality approach the technical specifications of industrial systems.
Based on the ATmega2560 microcontroller, the Arduino Mega provides 54 digital input/output pins (15 of them are PWM outputs), 16 analog inputs, 4 UARTs, a USB connection, and 256 KB of flash memory. Different shields can be connected to the board.
Researchers have developed versatile applications by integrating the Arduino board with other acquisition devices or command-and-control systems. An advantage of using Arduino is the wide range of free online code examples and tutorials covering basic functionality such as reading values from sensors or controlling actuators. The software development environment, the Arduino IDE, is based on the C/C++ programming language. Figure 2 shows the connection diagram, created with the Fritzing software.

Figure 2
The Fritzing Diagram displaying the hardware structure of the mobile robot

Software system
The software system consists of a LabVIEW application running on a personal computer and an Arduino program running on the Arduino development board. Communication between the PC and the Arduino is implemented over the Bluetooth protocol. The LabVIEW graphical user interface (Figure 3) was developed to acquire the EEG signal from the embedded sensor of the NeuroSky Mindwave Mobile headset, to analyze the EEG data, and to decode the commands that will be sent to the mobile robot.
Regarding the working principle of the brain-computer interface proposed in this paper, one voluntary eye-blink enables the transition between buttons, i.e. switches between commands indicating a movement direction: stop, forward, backward, left, and right. Two voluntary eye-blinks then select the currently blue-highlighted button, indicating the chosen movement direction, which is sent to the Arduino and executed by the robot. Further, the Arduino program is responsible for controlling the two DC motors and for displaying a graphical effect on the 8x8 LED matrix according to the received command. The speed of the two DC motors is a pre-defined constant value in the Arduino program.
The LabVIEW virtual instrument consists of a configuration sequence in which the initial values of the variables are set, the ThinkGear driver of the NeuroSky headset is configured, and the Bluetooth communication protocol is specified. A LabVIEW toolkit [22] was used to call the functions necessary to establish the Bluetooth connection between the Windows 10 operating system and the NeuroSky headset and to measure the numerical value characterizing the strength of an eye-blink. This value ranges from 0 to 255 and can be compared with a given threshold to detect a voluntary eye-blink.
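As an illustration of the threshold comparison, the check can be made edge-triggered so that a single sustained blink is not counted more than once. The paper does not specify this debouncing step, and the class and names below are hypothetical:

```cpp
// Edge-triggered threshold check: a voluntary blink is reported only when
// the strength value (0-255) crosses the user-set threshold after having
// been below it. The threshold is adjustable at run time, mirroring the
// run-time control exposed in the paper's LabVIEW GUI.
class BlinkDetector {
public:
    explicit BlinkDetector(int threshold) : threshold_(threshold) {}

    // Feed one strength sample; returns true exactly once per crossing.
    bool update(int strength) {
        bool above = strength > threshold_;
        bool fired = above && !wasAbove_;   // rising edge only
        wasAbove_ = above;
        return fired;
    }

    void setThreshold(int threshold) { threshold_ = threshold; }

private:
    int threshold_;
    bool wasAbove_ = false;
};
```

A lower threshold suits users who can only produce soft blinks; a higher one rejects more involuntary blinks, which matches the per-user tuning described in the next section.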

Voluntary eye-blinks counting algorithm
A state-machine paradigm was used in the algorithm that counts voluntary eye-blinks (Figure 5). These are characterized by a strength higher than the threshold set in the LabVIEW graphical user interface. The threshold can be adjusted while the LabVIEW application is running, taking into account each user's ability to execute softer or stronger eye-blinks. The voluntary eye-blink counting algorithm consists of four states: Init, Switch, Select, and Ready. Each of the four states includes a similar sequence of instructions based on a conditional construct (called a case structure in the LabVIEW graphical programming environment) that checks whether the eye-blink strength is higher than the threshold.
On the one hand, a false result means that no voluntary eye-blink was executed, so the LabVIEW application remains in the current state of the state machine. On the other hand, a true result causes the voluntary eye-blink strength value to be stored in an array, whose size is incremented, after which the next state of the state machine is activated. All consecutive voluntary eye-blink strength values are recorded in this array, whose size therefore equals the number of intentionally executed eye-blinks. If the user executes an eye-blink whose strength is lower than the threshold, the LabVIEW application enters the Ready state, where the size of the eye-blink strength array is checked.
If the array size is equal to 1, then one voluntary eye-blink was executed; if it is equal to 2, then two voluntary eye-blinks were performed. Several if-else conditions were defined to check the Boolean variables SelectElement and Trigger and the numerical variable IndexElement. If SelectElement = true and Trigger = false are simultaneously fulfilled, then the 'Switch' command, i.e. the transition between buttons based on the numerical value of IndexElement, is enabled. Otherwise, if SelectElement = false and Trigger = true are simultaneously fulfilled, then the 'Select' command is enabled, so that a certain movement direction, depending on the numerical value of IndexElement, is performed by the mobile robot.
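Because the counting logic is implemented graphically in LabVIEW, it can only be approximated in text. The following C++ sketch mirrors the four states and the array-size check described above; the weak-blink condition that moves the machine to Ready is simplified to a boolean pause flag, and all names are illustrative:

```cpp
#include <string>
#include <vector>

// Simplified model of the blink-counting state machine
// (Init -> Switch -> Select -> Ready) described in the paper.
enum class State { Init, Switch, Select, Ready };

class BlinkCounter {
public:
    // Feed one strength sample. "pause" models a below-threshold blink,
    // which moves the machine to Ready. Returns the decoded command:
    // "switch" (one blink), "select" (two blinks), or "" (nothing yet).
    std::string update(int strength, int threshold, bool pause) {
        if (strength > threshold) {
            blinks_.push_back(strength);          // store the voluntary blink
            if (state_ == State::Init)        state_ = State::Switch;
            else if (state_ == State::Switch) state_ = State::Select;
        } else if (pause) {
            state_ = State::Ready;                // user stopped blinking
        }
        if (state_ == State::Ready) {
            size_t n = blinks_.size();            // array size = blink count
            blinks_.clear();
            state_ = State::Init;                 // re-arm for the next gesture
            if (n == 1) return "switch";          // one blink: move highlight
            if (n == 2) return "select";          // two blinks: pick command
        }
        return "";
    }

private:
    State state_ = State::Init;
    std::vector<int> blinks_;                     // strengths of counted blinks
};
```

Three or more blinks before a pause simply reset the machine without issuing a command, which is one reasonable way to reject spurious blink bursts.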

Command selection algorithm
According to Figure 6, if the Boolean variable SelectElement was set to true, then the 'Switch' command was enabled, so that the transition between buttons was activated.
According to Figure 6, if the Boolean variable SelectElement was set to false, then the 'Select' command was enabled and the Boolean variable Send was set to true, indicating that a certain character will be sent to the Arduino, which controls the movement direction of the mobile robot: '1' stop; '2' go forward; '3' go backward; '4' go forward-right; '5' go forward-left; '6' go backward-left; '7' go backward-right. A Boolean variable called 'Flag' was used to store the previously set direction: go forward or go backward. Therefore, if the user blinks twice, the command given by the previously blue-highlighted button is executed by the Arduino and the mobile robot is remotely controlled by the EEG signal. After sending the character to the Arduino, a re-initialization takes place: SelectElement = true (to re-enable the 'Switch' command), Trigger = true (to indicate that a movement command was executed by the mobile robot), and IndexElement = 1 (to re-enable the transition between buttons).
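On the Arduino side, the character-to-motion mapping above can be sketched as a small decoding function. The per-wheel representation (+1 forward, 0 stop, -1 backward) and the turning conventions are assumptions for illustration only; the paper does not show the pin-level L298N control:

```cpp
// Hypothetical decoding of the command character received over Bluetooth
// into the two DC-motor directions (left and right wheel).
struct MotorCommand {
    int left;    // +1 forward, 0 stop, -1 backward
    int right;
};

MotorCommand decodeCommand(char c) {
    switch (c) {
        case '1': return {0, 0};     // stop
        case '2': return {1, 1};     // go forward
        case '3': return {-1, -1};   // go backward
        case '4': return {1, 0};     // go forward-right (right wheel held)
        case '5': return {0, 1};     // go forward-left (left wheel held)
        case '6': return {0, -1};    // go backward-left
        case '7': return {-1, 0};    // go backward-right
        default:  return {0, 0};     // unknown character: stop for safety
    }
}
```

Defaulting unknown characters to stop is a safety choice consistent with the paper's concern about collision risk, though the original sketch's behavior for invalid input is not described.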

Command sending algorithm
The NI-VISA (Virtual Instrument Software Architecture) LabVIEW toolkit provided the functions needed to configure the serial port corresponding to the HC-05 Bluetooth module and to write the character associated with the movement command sent to the Arduino board and executed by the mobile robot. The Boolean variable 'Send' is continuously checked to avoid overloading the data buffer sent over Bluetooth to the Arduino board; this variable is reset after a command is sent. The command is then executed and a graphical effect is displayed on the LED matrix connected to the Arduino. The graphical animation corresponds to the performed command: a square
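The send-once behaviour described above (check 'Send', transmit, reset) can be sketched as follows. CommandSender and writeSerial are hypothetical names standing in for the LabVIEW loop and the NI-VISA 'VISA Write' call:

```cpp
#include <functional>
#include <utility>

// Guard ensuring each selected command character is transmitted exactly
// once, modelling the paper's 'Send' flag: the serial write happens only
// while the flag is set, and the flag is cleared immediately afterwards
// so the Bluetooth buffer is not flooded with repeats.
class CommandSender {
public:
    explicit CommandSender(std::function<void(char)> writeSerial)
        : writeSerial_(std::move(writeSerial)) {}

    // Called when two voluntary blinks select a command.
    void requestSend(char command) {
        pending_ = command;
        send_ = true;
    }

    // Called on every loop iteration; transmits at most once per request.
    void poll() {
        if (send_) {
            writeSerial_(pending_);
            send_ = false;   // reset the flag so the command is not re-sent
        }
    }

private:
    std::function<void(char)> writeSerial_;
    char pending_ = '1';     // '1' = stop, a safe default
    bool send_ = false;
};
```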

Conclusions
This paper proposed an experimental brain-computer-interface system that can be used to provide training sessions for people with neuromotor disabilities, so that they can improve their ability to perform voluntary eye-blinks strong enough to precisely control a mobile robot substituting for a motorized wheelchair. The LabVIEW application implements an algorithm for the detection of 'Switch' and 'Select' commands. The user blinks once to enable the transition between the buttons defining commands, and blinks twice to select a certain command and send it to the Arduino board, which controls the movement direction of the mobile robot.
It follows that the integration of software development environments (LabVIEW, the Arduino IDE) and hardware design (an Arduino Mega-based mechatronic assistive device, remotely controlled over Bluetooth) is a recommended solution for implementing a brain-computer interface, thanks to the following factors: low cost, portability, a friendly graphical user interface, interactive feedback, quick real-time response, and convenience for both users and researchers. The only observed disadvantage is the varying accuracy of the eye-blink strength detection function provided by NeuroSky, which can be influenced by unintentional body movements during EEG signal acquisition. Therefore, to avoid the risk of collision, it is necessary to set a medium rotation speed for the DC motors of the mobile robot.
As a future research endeavor, an algorithm based on machine learning techniques will be applied to the raw EEG signal to provide better accuracy regarding the detection of voluntary eye-blinks.
The control protocol will be improved with supplementary commands for switching between eye-blink control and natural eye-blinking, increasing and decreasing the speed, and switching between eye-blink control and autonomous control. These commands could be associated with specific EEG patterns triggered by the execution of certain cognitive tasks, for example focusing and relaxing, although such patterns are not as easy to elicit as an eye-blink.
The proposed BCI system will also be improved so that a powered wheelchair can be successfully controlled.