The objective of this project is to explore the possibility of developing a portable, robust device that can serve as an assistant for a paralyzed person or a person with amyotrophic lateral sclerosis (ALS), also known as motor neurone disease (MND). It is a completely stand-alone device with its own processing unit, making it portable and efficient.
Amyotrophic lateral sclerosis (ALS), also known as Lou Gehrig's disease, is a degenerative disorder of specific nerve cells of the spinal cord, brain stem and brain. It belongs to a group of disorders known as motor neuron diseases and results in the gradual loss of voluntary muscle control, leading to paralysis. A person with motor neuron disease faces limited control of the hands and arms and has difficulty speaking. The disease leaves the person physically disabled, unable even to enjoy the benefits of modern-day technology such as smartphones and tablets. One consolation amid these difficulties is that ALS affects only motor neurons; the disease does not impair a person's mind, personality, intelligence, or memory. Additionally, it does not affect a person's ability to see, smell, taste, hear, or recognize touch. Patients usually maintain control of their eye muscles.
For many people who have difficulty physically using a computer, eye gaze technology can offer a quick and easy-to-understand way of accessing their favourite software. Eye gaze, or eye tracking, is a way of accessing a computer or communication aid using a pointer that the user controls with their eyes. Our technique is not exactly the same as eye gaze tracking, but it is very similar.
I built a real-time deep learning gaze region estimation model with four classes, based on a convolutional neural network, which I trained using the Keras library.
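A small four-class gaze CNN of this kind might be sketched in Keras as follows. The input size, layer widths, and class count of four are assumptions for illustration; the report does not specify the exact architecture used.

```python
# Hypothetical sketch of a small 4-class gaze-region CNN in Keras.
# Input shape (64x64 grayscale eye image) and layer sizes are assumed,
# not the exact architecture trained in this project.
from tensorflow import keras
from tensorflow.keras import layers

def build_gaze_model(input_shape=(64, 64, 1), num_classes=4):
    model = keras.Sequential([
        keras.Input(shape=input_shape),
        layers.Conv2D(16, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        # One probability per gaze region (e.g. left/right/up/centre).
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

Such a model would be trained on labelled eye images with `model.fit(...)` and, at run time, each camera frame's crop would be passed to `model.predict(...)` to obtain the gaze class.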
Prediction runs on the Raspberry Pi, and the results are sent to the NodeMCU wirelessly using the MQTT protocol to avoid overcrowding of wires. The NodeMCU acts as a subscriber to the Raspberry Pi's messages and receives the gaze position in the required format.
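On the Pi side, each prediction could be packaged into an MQTT topic/payload pair like the sketch below. The topic name, label order, and JSON payload format are assumptions for illustration, not the project's actual wire format; the resulting pair would then be handed to an MQTT client such as paho-mqtt via `client.publish(topic, payload)`.

```python
# Sketch: turn a CNN class index into an MQTT message for the NodeMCU.
# Topic name and payload layout are hypothetical.
import json

GAZE_CLASSES = ["left", "right", "up", "centre"]   # assumed label order
TOPIC = "smartglasses/gaze"                        # hypothetical topic

def make_gaze_message(class_index):
    """Convert a predicted class index into a (topic, payload) pair."""
    if not 0 <= class_index < len(GAZE_CLASSES):
        raise ValueError(f"unknown class index: {class_index}")
    payload = json.dumps({"gaze": GAZE_CLASSES[class_index]})
    return TOPIC, payload
```

Keeping the payload small matters here: the NodeMCU is a constrained microcontroller, so a short JSON string (or even a single character) per prediction keeps parsing on the subscriber side trivial.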
As the NodeMCU is connected to the Raspberry Pi through MQTT, it can receive commands to display a specific OLED interface page for each specific eye position. We used a 0.96-inch SSD1306 OLED display for our project.
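The page-selection logic on the display side can be modelled as a small state machine driven by gaze classes. The sketch below is a minimal illustration in Python; the page names and the mapping of gaze directions to scroll/select actions are assumptions, and on the actual NodeMCU this logic would live in the microcontroller firmware alongside the SSD1306 driver.

```python
# Sketch: gaze-driven menu navigation for the OLED interface.
# Page names and gaze-to-action mapping are hypothetical.
PAGES = ["Water", "Food", "Medicine", "Emergency"]

class PageMenu:
    def __init__(self, pages):
        self.pages = pages
        self.index = 0        # page currently shown on the OLED
        self.selected = None  # page confirmed by the user, if any

    def on_gaze(self, gaze):
        """Update the menu from one gaze prediction; return the visible page."""
        if gaze == "right":                              # scroll forward
            self.index = (self.index + 1) % len(self.pages)
        elif gaze == "left":                             # scroll backward
            self.index = (self.index - 1) % len(self.pages)
        elif gaze == "up":                               # select current page
            self.selected = self.pages[self.index]
        return self.pages[self.index]
```

A confirmed `selected` page is what would then be forwarded as a notification to the caregiver's phone or tablet.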
As the user can scroll through the pages on the OLED, the user can also select whichever page is required at that time, and the selection can be pushed as a notification to a phone or tablet through a mobile application.
The user can even convert the messages to speech using the Google Text-to-Speech API. These messages can be played over a Bluetooth speaker to inform a person currently in the room of the user's current requirements.
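One common way to reach Google's text-to-speech service from Python is the gTTS library; a minimal sketch is below. The announcement text and output file name are assumptions, and the call requires network access. (No automated check is attached, since synthesis depends on the external service.)

```python
# Sketch: render a selected message as speech with gTTS (a Python
# wrapper around Google's text-to-speech service). Network required.
from gtts import gTTS

def speak_message(text, out_file="message.mp3"):
    """Synthesize `text` to an MP3 file for playback."""
    tts = gTTS(text=text, lang="en")
    tts.save(out_file)   # the saved audio can then be played,
    return out_file      # e.g. over a paired Bluetooth speaker
```

For example, `speak_message("The patient is requesting water.")` would produce an audio file that the system plays through the room's Bluetooth speaker.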
Some companies, such as Philips and Syska, have developed smart devices like smart bulbs and smart tube lights that a user can control from a smartphone. These devices can be paired with our smart glasses and controlled through them, helping a motor neuron patient operate different appliances with eye gaze alone.