controling-mouse-movement-and-clicks-with-camera
The purpose of this project is to use the TensorFlow and dlib libraries in Python, along with some computer vision techniques, to create a human interface device that replaces the functionality of a mouse using a webcam, face tracking, and separate convolutional neural networks that detect specific changes in facial expression.

More specifically, the program detects when the user's head is turned in a certain direction and moves the mouse cursor in the corresponding direction, regardless of the webcam's orientation relative to the user. It also lets the user perform left and right mouse clicks by closing either eye: when the user closes an eye, the corresponding click-down event is triggered, and when the user reopens that eye, the corresponding click-up event is triggered. The end result is a series of models that run in real time, giving the user smooth control of both mouse movement and clicks.

Further development of the methods and techniques used in this project could one day help individuals with certain physical disabilities interact more easily with their computer devices.
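The project itself uses convolutional neural networks to detect eye closure; as a simpler illustration of the click-down/click-up logic described above, the sketch below substitutes the classic eye-aspect-ratio (EAR) heuristic computed from dlib-style six-point eye landmarks. The `EyeClickTracker` class and the EAR threshold value are hypothetical names and parameters chosen for this example, not part of the project's actual code.

```python
import math

def eye_aspect_ratio(eye):
    """Compute the eye aspect ratio from six (x, y) landmark points,
    ordered as in dlib's 68-point face model: the ratio of vertical
    eye openness to horizontal eye width drops sharply when the eye closes."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    vertical = dist(eye[1], eye[5]) + dist(eye[2], eye[4])
    horizontal = dist(eye[0], eye[3])
    return vertical / (2.0 * horizontal)

class EyeClickTracker:
    """State machine for one eye: emits a click-down event the frame the
    eye first closes and a click-up event the frame it reopens, matching
    the behavior described above. The threshold is an assumed tuning value."""

    def __init__(self, threshold=0.2):
        self.threshold = threshold
        self.closed = False

    def update(self, ear):
        """Feed one frame's EAR value; returns 'down', 'up', or None."""
        if not self.closed and ear < self.threshold:
            self.closed = True
            return "down"
        if self.closed and ear >= self.threshold:
            self.closed = False
            return "up"
        return None
```

In a full pipeline, one tracker per eye would run on each webcam frame, and the `'down'`/`'up'` events would be forwarded to an OS-level mouse API to trigger the actual left or right click.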