
Hand Gesture Recognition for Sign Language using CNN


Sayali Vijay Samgiskar , MGM College of Engineering and Technology; Sakharkar Dhanashree Vivek, MGM College of Engineering and Technology; Sonawane Priyanka Chandrakant, MGM College of Engineering and Technology; Jadhav Rahul Ananda, MGM College of Engineering and Technology; Prof. Yogesh Shahare, MGM College of Engineering and Technology


Keywords: Convolutional Neural Network (CNN), Python, Raspberry Pi 3B+, Gesture Recognition


In this hand gesture recognition for sign language project, a real-time vision-based system is proposed to monitor objects (hand and fingers). It is built on a Raspberry Pi with a camera module and programmed in the Python programming language, supported by the Open Source Computer Vision (OpenCV) library. The Raspberry Pi runs a hand-gesture image-processing algorithm that monitors an object (the hand and fingers) through its extracted features. The essential aim of a hand gesture recognition system is to establish communication between humans and computerized systems for the sake of control. The mobile system is built and tested to prove the effectiveness of the proposed approach, which has many applications in traffic control, human-computer interaction, gesture recognition, augmented reality, and surveillance, and yields a system capable of detecting and monitoring a known object. The Raspberry Pi is a small PC board suitable for real-time projects. The main purpose of the work presented in this project is to build a system capable of detecting and monitoring specified features of objects according to image-processing algorithms, using a Raspberry Pi and camera module. The feature-extraction algorithm is programmed in Python with OpenCV libraries and executed on the Raspberry Pi with an attached external camera.
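The abstract does not specify the CNN architecture, so as a hedged illustration only, the sketch below shows in plain NumPy the convolution → ReLU → max-pooling feature-extraction stage that a CNN layer performs on a hand image. The kernel, image, and function names are hypothetical, not taken from the paper; a real deployment would use a trained network (e.g. via TensorFlow or PyTorch) on frames captured with OpenCV.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation of a grayscale image with a kernel."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Element-wise rectified linear activation."""
    return np.maximum(x, 0)

def max_pool(x, size=2):
    """Non-overlapping max pooling; trims edges that don't fit the window."""
    h, w = x.shape
    h, w = h - h % size, w - w % size
    x = x[:h, :w]
    return x.reshape(h // size, size, w // size, size).max(axis=(1, 3))

# Hypothetical input: a synthetic silhouette, bright "hand" on a dark background.
img = np.zeros((8, 8))
img[:, 4:] = 1.0

# A vertical-edge kernel: responds where pixel intensity rises left-to-right.
edge_kernel = np.array([[-1.0, 0.0, 1.0]] * 3)

features = max_pool(relu(conv2d(img, edge_kernel)))
print(features.shape)  # (3, 3); the middle column peaks at the silhouette edge
```

In a full CNN, many such kernels are learned from labeled gesture images rather than hand-designed, and the pooled feature maps feed a classifier that outputs the recognized sign.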

Other Details

Paper ID: IJSRDV7I30209
Published in: Volume : 7, Issue : 3
Publication Date: 01/06/2019
Page(s): 153-155
