Human Computer Interaction through Hand Gestures for Home Automation using Microsoft Kinect

Published in: Advances in Intelligent Systems and Computing, Vol. 508, Proceedings of the International Conference on Communication and Networks, ISBN 978-981-10-2749-9


For my final-year undergraduate thesis, I worked on developing a NUI (Natural User Interface) for home automation. This was particularly challenging: not only did I have to devise an algorithm for identifying body gestures, but I also had to develop a transparent UI. Little work had been done in this area at the time, so it was exciting to chart unknown territory.

After successfully developing the home automation system, I also performed user studies and found that senior citizens (people above the age of 65) preferred gesture-based automation over the other available options. The study also showed that the transparent UI was easy to understand and served its purpose well.

Working on this project was one of the most fulfilling experiences of my life. As this was an individual thesis, I did both the design and the coding, completing the project over the span of a year.


Gesture recognition has been an attractive area of research for a long time. With the introduction of the Microsoft Kinect, hand and body gesture recognition has become far more accessible to researchers. This paper presents an application that controls electrical home appliances through hand gestures; it is an assistive application particularly useful for the physically challenged and for senior citizens. I use the Microsoft Kinect for image capture, along with computer vision (CV) and digital image processing (DIP) techniques for hand gesture recognition. An Arduino Uno microcontroller and relay circuits are used to control the electrical devices. The algorithm presented achieves an accuracy of 88%.

Keywords: Microsoft Kinect, Human Computer Interaction (HCI), Computer Vision (CV), Depth Sensor, Feature Extraction, Feature Classification, Arduino Uno

Voice & Far Mode Software Platform specifications

Operating System : Windows

Application IDE : Microsoft Visual Studio V 10.0

Language : C#

SDK : Microsoft Kinect SDK v1.8, Arduino SDK

Near Mode Software Platform specifications

Operating System: Windows

Application IDE: Eclipse Luna, Processing

Language: Java

Libraries: OpenCV 2.4.1, OpenNI, Processing.core, JSS.serial, Arduino SDK

Feature Set : Hu’s Invariant Moments

Classifier : kNN
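
The feature/classifier pairing above (Hu's invariant moments fed to a kNN classifier) can be sketched in pure Java. The actual pipeline uses OpenCV's built-in moment functions on real hand masks; this self-contained illustration computes only the first two Hu moments of a binary mask and votes with a hypothetical two-gesture training set, so the class name, labels, and feature count are assumptions for illustration.

```java
import java.util.*;

public class HuKnnSketch {
    // First two Hu invariant moments of a binary mask (1 = hand pixel).
    // Illustrative only; OpenCV's moments()/HuMoments() compute all seven.
    static double[] huMoments(int[][] img) {
        double m00 = 0, m10 = 0, m01 = 0;
        for (int y = 0; y < img.length; y++)
            for (int x = 0; x < img[0].length; x++)
                if (img[y][x] == 1) { m00++; m10 += x; m01 += y; }
        double cx = m10 / m00, cy = m01 / m00;          // centroid
        double mu20 = 0, mu02 = 0, mu11 = 0;            // central moments
        for (int y = 0; y < img.length; y++)
            for (int x = 0; x < img[0].length; x++)
                if (img[y][x] == 1) {
                    double dx = x - cx, dy = y - cy;
                    mu20 += dx * dx; mu02 += dy * dy; mu11 += dx * dy;
                }
        // Scale-normalized moments: eta_pq = mu_pq / m00^((p+q)/2 + 1).
        double n20 = mu20 / (m00 * m00), n02 = mu02 / (m00 * m00),
               n11 = mu11 / (m00 * m00);
        double h1 = n20 + n02;                                    // Hu moment 1
        double h2 = (n20 - n02) * (n20 - n02) + 4 * n11 * n11;    // Hu moment 2
        return new double[]{h1, h2};
    }

    // kNN: majority vote among the k training vectors nearest to the query.
    static String knn(List<double[]> feats, List<String> labels, double[] q, int k) {
        Integer[] idx = new Integer[feats.size()];
        for (int i = 0; i < idx.length; i++) idx[i] = i;
        Arrays.sort(idx, Comparator.comparingDouble(i -> dist(feats.get(i), q)));
        Map<String, Integer> votes = new HashMap<>();
        for (int i = 0; i < k; i++) votes.merge(labels.get(idx[i]), 1, Integer::sum);
        return Collections.max(votes.entrySet(), Map.Entry.comparingByValue()).getKey();
    }

    static double dist(double[] a, double[] b) {
        double s = 0;
        for (int i = 0; i < a.length; i++) s += (a[i] - b[i]) * (a[i] - b[i]);
        return Math.sqrt(s);
    }
}
```

Because Hu moments are invariant to translation, scale, and rotation, the same hand shape maps to nearby feature vectors regardless of where the hand appears in the frame, which is what makes a simple distance-based classifier like kNN workable here.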

Project in a nutshell

Block diagram of the project

In this project, I built a home automation system that could be controlled through skeleton movement, hand gestures, and voice commands. I used the Microsoft Kinect because of its depth-sensing capabilities. The standard Kinect SDK handled skeleton tracking, and the Microsoft Speech SDK handled voice recognition. For hand gesture recognition, however, I developed my own algorithm capable of identifying different hand gestures and responding accordingly. A depth-based segmentation method identified and tracked the user's hand, and the incoming video stream was compared against my training set using a kNN classifier to recognize the command given. An Arduino Uno and relay circuits controlled the home appliances.
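The depth-based segmentation step can be illustrated with a short sketch. A common approach with the Kinect is to assume the gesturing hand is the object closest to the sensor and keep only the pixels in a narrow depth band behind that nearest point. The 10 cm band and millimetre units below are assumptions for illustration, not the thesis's exact parameters; the real pipeline reads Kinect depth frames via OpenNI.

```java
// Depth-based hand segmentation sketch: treat the closest valid depth
// reading as the hand's front surface and keep pixels within BAND_MM of it.
public class DepthSegmentation {
    static final int BAND_MM = 100; // hypothetical 10 cm depth band

    static boolean[][] segmentHand(int[][] depthMm) {
        // Find the nearest valid reading (Kinect reports 0 for "no data").
        int nearest = Integer.MAX_VALUE;
        for (int[] row : depthMm)
            for (int d : row)
                if (d > 0 && d < nearest) nearest = d;

        // Keep pixels inside the band [nearest, nearest + BAND_MM].
        boolean[][] mask = new boolean[depthMm.length][depthMm[0].length];
        for (int y = 0; y < depthMm.length; y++)
            for (int x = 0; x < depthMm[0].length; x++) {
                int d = depthMm[y][x];
                mask[y][x] = d > 0 && d <= nearest + BAND_MM;
            }
        return mask;
    }
}
```

The resulting binary mask is exactly the kind of input the feature-extraction stage (Hu moments) expects, and segmenting in depth rather than color makes the method robust to lighting and skin-tone variation.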

The poster for this presentation can be found here

Full Research Paper: Home Automation