ISSN: 2278-1323
International Journal of Advanced Research in Computer Engineering & Technology (IJARCET)
Volume 2, Issue 3, March 2013, pp. 947-950
© 2013 IJARCET. All Rights Reserved.
FGEST: FINGER TRACKING AND GESTURE RECOGNITION IN SMARTPHONES
Pratham Parikh, Shubham Gupta, Swatil Patel, Varun Nimbalkar
Department of Computer Science,
Sinhgad College of Engineering,
University of Pune, India
Abstract
Recent advances in mobile processors have made complex computation feasible on smartphones. Taking advantage of these developments, we aim to develop a gesture recognition application that can recognize finger movements reliably under a variety of lighting conditions and perform corresponding actions. This enables a new form of interaction on mobile devices without extra hardware, since it uses the front-facing camera already present in most mobile phones. Here we present a framework that describes our application, which will extend the functionality of existing devices.
Keywords: video analysis, edge and feature detection, motion, shape, tracking
I. INTRODUCTION
Smartphones have completely revolutionized the mobile phone industry in the last five years. Mostly touch-screen devices, they capitalize on human beings' natural instinct to touch and feel. Smartphones have brought the world closer, providing one-touch access to all the information the user desires. These phones are usually characterized by interactive interfaces and a superior user experience.
Developers across the world are trying to further enhance this user experience. From numeric keypads to QWERTY keypads to touch-screens, the technology is ever-evolving. Research is underway to make the interface touch-independent, so that the device can be accessed even from a distance.
A lot of work has been done in this field on the Windows platform, but the Android platform remains largely unexplored. Recently, Qualcomm demonstrated gesture recognition on Android using a prototype tablet running a quad-core processor. The Sony Xperia Sola uses capacitive sensing to implement "floating touch", which recognizes touch actions from 1-2 cm above the screen.
According to a 2012 survey, 59% of smartphones run on the Android platform. There is therefore both a need and an opportunity for an application that further enhances the user experience on Android smartphones and makes it possible for users to operate their devices from a distance.
The purpose of this system is to develop an Android application for tracking the movement of a fingertip. Using the front camera already present in most of today's smartphones, a fingertip recognition algorithm based on contour analysis will detect the fingertip and its motion. Finger gestures can then be associated with accepting and rejecting calls, which would aid a driver. The gestures will also be used to shuffle songs and to move to the next image in the gallery.
Fingertip gesture recognition is both convenient for users and largely unexplored on mobile devices, so it is the main focus of this project. Reducing the processing overhead of background subtraction, color detection and contour recognition, and avoiding the problems faced by users, are the other driving forces behind this work.
The user moves a finger in front of the smartphone's front camera. Background subtraction, color detection and contour analysis are performed on the video frames to detect the fingertip and then recognize its movement. Depending on the movement detected and the synchronized third-party application, the necessary action is initiated.
Our application can be easily integrated with system applications such as the phone, gallery and music player, while also opening new avenues for application developers.
II. PROBLEM FORMULATION
A. Problem Definition – Given a deliberate gesture performed by the user, the system should interpret that gesture and send a command to a third-party application to perform an associated task. Gesture recognition should be robust, i.e. it should work under a variety of lighting conditions and with a wide array of gestures.
B. Problem Output – The main objective of FGEST is to enable a new form of interaction on smartphones without adding hardware. The added functionality will enable safer phone calls in cars, intuitive web browsing and navigation, hands-free gameplay, and more.
Real-time input will be taken from the front camera. The frames will be processed and any motion will be detected. The resulting data will be classified as gestures, and this information will be sent to the application selected by the user.
III. MATHEMATICAL MODELLING
We now model the system using set theory.
1. Let 'S' be the finger tracking and gesture recognition system.
S = { … }
2. Identify the inputs as V and L.
S = {V, L, … }
V = {i | 'i' is a raw video frame captured by the front camera} = {*.mp4}
L = {t | 't' is the location of the input file on the storage device}
3. Identify the outputs as O and R.
S = {V, L, O, R, … }
O = {o | 'o' is the metadata of a processed video frame}
R = {r | 'r' is the desired response of the system}
4. Identify the processes as P.
S = {V, L, O, R, P, … }
P = {F, M, D}
5. F is the set of finger extraction module activities and associated data.
F = {Fip, Fp, Fop}
Fip = {f | 'f' is a valid input to the finger extraction module}
Fp = {f | 'f' is the finger extraction function that converts Fip to Fop}
Fp(Fip) = Fop
Fop = {f | 'f' is the output generated by the finger extraction module, i.e. the detected contour}
6. M is the set of motion tracking activities and associated data.
M = {Mip, Mi, MOp}
Mip = {t | 't' is a valid input for motion tracking} = Fop
Mi = {p | 'p' is the function that tracks Mip}
Mi(Mip) = MOp
MOp = {t | 't' is the metafile that stores information about the tracked motion}
7. D is the set of finger motion direction detection activities and associated data.
D = {Dip, Di, DOp}
Dip = {t | 't' is a valid input for direction detection} = MOp
Di = {p | 'p' is the function that detects the direction of motion in Dip}
Di(Dip) = DOp
DOp = {j | 'j' is the metafile that stores information about the detected direction, i.e. its x and y coordinates} (a sketch of one such direction function follows the model)
8. Identify the failure case as F'.
S = {V, L, O, R, P, F', … }
Failure occurs when
F' = ∅, or
F' = {p | 'p' matches the input in fewer than 70% of cases}
9. Identify the success (terminating) case as e.
S = {V, L, O, R, P, F', e, … }
Success is defined as e = {p | 'p' matches the input with more than 70% similarity}
10. Identify the initial conditions as S0.
S = {V, L, O, R, P, F', e, S0}
The initial condition for finger detection is that there should be a considerable distinction between the finger colour and the background colour, i.e. Fop ≠ ∅.
There is no initial condition for motion tracking.
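As a concrete illustration of the D module, the following sketch shows one way the direction-detection function could map the tracked fingertip coordinates stored in MOp to a swipe direction. Python is used for illustration although the target platform is Android; the function name, the displacement threshold and the four-direction gesture vocabulary are illustrative assumptions rather than part of the model above.

```python
# Illustrative sketch of the direction-detection step D (Di applied to MOp).
# The threshold and the four-gesture vocabulary are assumptions of this example.

def detect_direction(track, min_displacement=40):
    """Classify a list of (x, y) fingertip positions into a swipe direction.

    track            -- sequence of (x, y) tuples, oldest first (the MOp metafile)
    min_displacement -- minimum pixel movement needed to report a gesture
    Returns 'left', 'right', 'up', 'down', or None if the motion is too small.
    """
    if len(track) < 2:
        return None
    dx = track[-1][0] - track[0][0]
    dy = track[-1][1] - track[0][1]
    if max(abs(dx), abs(dy)) < min_displacement:
        return None                      # failure case F': movement too small
    if abs(dx) >= abs(dy):               # horizontal motion dominates
        return 'right' if dx > 0 else 'left'
    return 'down' if dy > 0 else 'up'    # image y grows downwards

# Example: a roughly rightward swipe recorded over several frames.
print(detect_direction([(50, 100), (90, 104), (140, 108)]))   # -> 'right'
```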
IV. FRAMEWORK
The basic framework for Finger Tracking and Gesture Recognition consists of three main phases (Figure 1):
Phase 1 (Receive Input Frame) receives the raw video frames from the front camera and decodes them before sending them to the next phase.
Phase 2 (Process Frames) applies background subtraction, skin color segmentation, contour analysis and motion tracking to these frames to recognize and interpret gestures from the given input.
Phase 3 (Perform Action) receives the gesture information and commands a third-party application to take the necessary steps to complete the request.
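To make the data flow concrete, the following minimal Python sketch outlines the three phases as a single loop. The helper names (decode_frame, recognize_gesture, dispatch_to_application) are placeholders for the phases described above, not an actual API, and a real implementation would run on the Android camera pipeline rather than on an iterable of frames.

```python
# Minimal sketch of the three-phase FGEST pipeline. The helpers below are
# placeholders for the phases in the text, not an actual API.

def decode_frame(raw_frame):
    # Phase 1: a real system would decode the camera buffer here; in this
    # sketch the frames are assumed to arrive already usable.
    return raw_frame

def recognize_gesture(frame):
    # Phase 2: background subtraction, skin segmentation, contour analysis and
    # motion tracking would run here (see Section V); this stub detects nothing.
    return None

def run_pipeline(frames, dispatch_to_application):
    """Drive the three phases over an iterable of raw camera frames."""
    for raw_frame in frames:                      # Phase 1: receive input frame
        frame = decode_frame(raw_frame)
        gesture = recognize_gesture(frame)        # Phase 2: process frames
        if gesture is not None:
            dispatch_to_application(gesture)      # Phase 3: perform action

# Example: an empty frame source dispatches no actions.
run_pipeline([], dispatch_to_application=print)
```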
V. IMAGE PROCESSING FILTERS
The following filters shall be used to process the frames:
A. Skin Color Segmentation
The first phase of the processing detects the skin region in the input frame. Camera frames normally arrive in the RGB color system, which is poorly suited to skin color detection: under varying lighting conditions the apparent skin color changes too much for consistent detection. To resolve this difficulty we propose to convert the RGB frames to the YCbCr color space, which separates luminance from chrominance and thereby neutralizes lighting and exposure variations.
The resulting thresholding rule on the chrominance components will be the basis of skin color segmentation and will be used to differentiate between skin and non-skin pixels.
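The sketch below illustrates the kind of YCbCr skin segmentation described above, using OpenCV in Python. Since the paper's own thresholding formula is not reproduced here, the Cb/Cr bounds are commonly quoted illustrative values, not the authors' parameters.

```python
# Sketch of skin color segmentation via RGB -> YCbCr conversion, using OpenCV.
# The Cb/Cr bounds below are commonly used illustrative values, not the
# thresholds proposed in the paper.
import cv2
import numpy as np

def skin_mask_ycbcr(frame_bgr):
    """Return a binary mask of likely skin pixels for a BGR frame."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)    # OpenCV orders Y, Cr, Cb
    lower = np.array([0, 133, 77], dtype=np.uint8)          # Y, Cr, Cb lower bounds
    upper = np.array([255, 173, 127], dtype=np.uint8)       # Y, Cr, Cb upper bounds
    return cv2.inRange(ycrcb, lower, upper)                 # 255 = skin, 0 = non-skin

# Example on a synthetic frame; real input would be a camera frame.
dummy = np.zeros((120, 160, 3), dtype=np.uint8)
print(skin_mask_ycbcr(dummy).shape)   # (120, 160)
```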
B. Background Subtraction
To improve the accuracy of finger detection we shall, in parallel, employ a background subtraction filter. This filter shall perform foreground/background differentiation and then remove the background parts. The frames shall remain in the RGB system. The outputs of background subtraction and skin color segmentation shall then be combined to obtain the most accurate detection of the fingertip.
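A minimal sketch of this step, assuming a simple frame-differencing background model: the foreground mask is computed against a reference background frame and then combined with the skin mask. Interpreting the combination as a logical AND of the two masks, and the 30-level difference threshold, are assumptions of this example.

```python
# Sketch of frame-differencing background subtraction and its combination with
# the skin mask. The AND combination and the threshold value are assumptions.
import cv2

def foreground_mask(frame_bgr, background_bgr, thresh=30):
    """Mark pixels that differ noticeably from a reference background frame."""
    diff = cv2.absdiff(frame_bgr, background_bgr)            # per-channel difference
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY)
    return mask

def fingertip_candidate_mask(frame_bgr, background_bgr, skin_mask):
    """Keep only pixels that are both moving foreground and skin colored."""
    return cv2.bitwise_and(foreground_mask(frame_bgr, background_bgr), skin_mask)
```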
C. Contour Analysis
After the hand has been detected, contour analysis will be used to detect fingertips. In an open hand the fingertip regions have sharper curvature than the other parts of the hand. Using the k-cosine measure, the curvature along the hand contour will be calculated, and the points with the sharpest curvature will be identified as fingertips.
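The following sketch shows one way to compute the k-cosine measure along a hand contour and pick out sharp points. The span k and the cosine threshold are illustrative assumptions; in practice a convexity test would also be needed to separate fingertips from the equally sharp valleys between fingers.

```python
# Sketch of fingertip detection with the k-cosine measure on a hand contour.
# k and the angle threshold are illustrative; the contour itself could come
# from cv2.findContours applied to the combined mask above.
import numpy as np

def k_cosine(contour, k=16):
    """Return the k-cosine at every contour point (larger = sharper corner).

    contour -- (N, 2) array of (x, y) points along the closed hand contour.
    """
    pts = np.asarray(contour, dtype=float)
    prev = np.roll(pts, k, axis=0) - pts       # vector to the point k steps back
    nxt = np.roll(pts, -k, axis=0) - pts       # vector to the point k steps ahead
    dot = (prev * nxt).sum(axis=1)
    norm = np.linalg.norm(prev, axis=1) * np.linalg.norm(nxt, axis=1) + 1e-9
    return dot / norm                          # cosine of the angle at each point

def fingertip_indices(contour, k=16, cos_thresh=0.7):
    """Indices of contour points whose angle is sharp enough to be a fingertip."""
    return np.nonzero(k_cosine(contour, k) > cos_thresh)[0]
```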
D. Finger Tracking
A predictive algorithm will be used for finger tracking. Using previously calculated positions, an area of the image will be identified as most likely to contain the finger, and this area will be processed first. If no fingertip is detected within that area, then the entire frame will be processed. The extra calculations required in this worst case are offset by the reduction in calculations whenever the fingertip is found in the predicted area.
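A sketch of this predictive search, assuming a detect_in_region helper that runs the Section V filters on a sub-rectangle of the frame; the window size is an illustrative choice.

```python
# Sketch of predictive finger tracking: search a window around the last known
# position first, and fall back to the full frame only if nothing is found.

def track_fingertip(frame, detect_in_region, last_pos, window=80):
    """detect_in_region(frame, (x0, y0, x1, y1)) -> (x, y) or None (assumed helper)."""
    h, w = frame.shape[:2]
    if last_pos is not None:
        x, y = last_pos
        roi = (max(0, x - window), max(0, y - window),
               min(w, x + window), min(h, y + window))
        hit = detect_in_region(frame, roi)        # cheap predicted-area search
        if hit is not None:
            return hit
    return detect_in_region(frame, (0, 0, w, h))  # worst case: whole frame
```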
VI. OPTIMIZATION FOR MOBILE INTERFACES
To counter the limited processing power and battery capacity of mobile platforms, we plan to include the following modifications in our framework:
A. Variable Frame Processing
During frame capture we plan to introduce a variable capture rate, which will depend on factors such as whether a supported application is running and whether a valid artifact was detected in the previously captured frame.
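A small sketch of how such a variable capture rate might be chosen; the specific intervals are illustrative assumptions, not measured values.

```python
# Sketch of variable frame processing: pick the delay before the next frame
# grab from the application state. The interval values are illustrative.

def next_capture_delay_ms(app_active, artifact_seen):
    if not app_active:
        return 500          # no supported app running: poll very rarely
    if artifact_seen:
        return 33           # finger in view: capture at roughly 30 fps
    return 125              # app active but idle: roughly 8 fps
```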
B. Situational Frame Crop
To use the processor efficiently, we intend to run our algorithms only on the interesting area, the part of the frame in which the valid artifact was detected, ignoring the rest of the frame and omitting unnecessary computation.
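A sketch of the crop step, assuming the artifact's last bounding box is known; the margin is an illustrative choice.

```python
# Sketch of the situational frame crop: keep only a margin around the bounding
# box in which the artifact was last detected.

def crop_to_artifact(frame, bbox, margin=20):
    """bbox = (x, y, w, h) of the previously detected artifact."""
    x, y, w, h = bbox
    H, W = frame.shape[:2]
    x0, y0 = max(0, x - margin), max(0, y - margin)
    x1, y1 = min(W, x + w + margin), min(H, y + h + margin)
    return frame[y0:y1, x0:x1], (x0, y0)   # offset maps results back to the full frame
```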
C. Early Termination
To reduce unnecessary processor usage we introduce variable process termination. At every stage we check for negative detections; if any are encountered, the frame is immediately dropped and its processing stops.
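A sketch of this early-exit behaviour, with the processing stages modelled as an ordered list of functions that return None on a negative detection; the stage list itself is a placeholder for the filters of Section V.

```python
# Sketch of early termination: run the per-frame stages in order and drop the
# frame as soon as any stage reports a negative detection.

def process_frame(frame, stages):
    """stages is an ordered list of functions; each returns None on a negative
    detection, otherwise the data passed on to the next stage."""
    data = frame
    for stage in stages:
        data = stage(data)
        if data is None:       # negative detection: drop the frame immediately
            return None
    return data                # fully processed frame (gesture information)
```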
VII. EXPECTED RESULT
The expected result is the successful detection of gestures with relatively high probability under a variety of lighting conditions and at variable distances between the user and the camera.
VIII. CONCLUSION
We have presented a framework for finger tracking and gesture recognition on smartphones and described a mathematical model of the system.
ACKNOWLEDGMENT
This paper would not be complete without mentioning Professor P. R. Futane. We wish to express our sincere gratitude for his valuable contribution, and we are grateful for his constant encouragement and guidance throughout the project.
Figure 1: Basic framework for Finger Tracking and Gesture Recognition. Raw frames from the front camera enter the Receive Frame Input phase, decoded frames pass to the Process Frames phase, and the resulting gesture interpretation drives the Perform Action phase.
AUTHORS
Pratham Parikh
Dept. of Computer Engineering, Sinhgad College of Engineering, Pune University.
Shubham Gupta
Dept. of Computer Engineering, Sinhgad College of Engineering, Pune University.
Varun Nimbalkar
Dept. of Computer Engineering, Sinhgad College of Engineering, Pune University.
Swatil Patel
Dept. of Computer Engineering, Sinhgad College of Engineering, Pune University.