• Stars: 343
• Rank: 123,371 (Top 3%)
• Language: Swift
• License: MIT License
• Created: about 7 years ago
• Updated: almost 7 years ago


Repository Details

Simple project to recognize hands in realtime. 👋 Serves as an example for building your own object recognizer.

Hand Gesture Recognition

This simple sample project recognizes hands in realtime. 👋 It serves as a basic example for recognizing your own objects. Suitable for AR 🤓. Written for the tutorial "Create your own Object Recognizer".

[GIF: a fist and a spread hand appearing and disappearing on screen, being recognized on an iPhone]

Demo Video (on YouTube)

Tech: iOS 11, ARKit, CoreML, iPhone 7 Plus, Xcode 9.1, Swift 4.0

Notes:

This demonstrates basic object recognition (for spread hand 🖐, fist 👊, and no hands ❎). It serves as a building block for object detection, localization, gesture recognition, and hand tracking.
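
To make that concrete, here is a minimal sketch of the recognition step using Vision and Core ML. The model class name (`HandModel`), the callback, and the label strings are assumptions for illustration, standing in for whatever class Xcode generates from your exported .mlmodel:

    import Vision
    import CoreML

    // Minimal sketch: classify one camera frame against the trained classes.
    // "HandModel" is a hypothetical name for the class Xcode generates from
    // your exported .mlmodel; the label strings depend on your own classes.
    func classifyHand(in pixelBuffer: CVPixelBuffer,
                      onResult: @escaping (String, Float) -> Void) {
        guard let model = try? VNCoreMLModel(for: HandModel().model) else { return }

        let request = VNCoreMLRequest(model: model) { request, _ in
            // Vision returns one observation per class, sorted by confidence;
            // the top result is enough for a three-class recognizer.
            guard let top = (request.results as? [VNClassificationObservation])?.first else { return }
            onResult(top.identifier, top.confidence)
        }
        request.imageCropAndScaleOption = .centerCrop // roughly match training crops

        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
        try? handler.perform([request])
    }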

Disclaimer:

The sample model provided here was captured in 1 hour and is biased toward one human hand 👋🏼. It's intended as a placeholder for your own models. (See Tutorial)


Steps Taken (Overview)

Here's an overview of the steps taken. (You can also view my commit history to see the steps involved.)

  1. Build an intuition by playing with Google Creative Lab's Teachable Machine.
  2. Build dataset.
  3. Create a Core ML Model using Microsoft's CustomVision.ai.
  4. Run the model in realtime with ARKit (a minimal sketch follows this list).
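
As a rough illustration of step 4 (not the project's exact code), an ARSessionDelegate can feed each captured camera frame into a classifier like the one sketched above:

    import ARKit

    // Rough sketch: run an ARKit session and classify each incoming frame.
    // Classifying every frame on the delegate queue can drop frames; a real
    // app would throttle this loop or move the work to a background queue.
    class GestureViewController: UIViewController, ARSessionDelegate {
        let sceneView = ARSCNView()

        override func viewDidLoad() {
            super.viewDidLoad()
            sceneView.frame = view.bounds
            view.addSubview(sceneView)
            sceneView.session.delegate = self
            sceneView.session.run(ARWorldTrackingConfiguration())
        }

        func session(_ session: ARSession, didUpdate frame: ARFrame) {
            // frame.capturedImage is the raw CVPixelBuffer ARKit already
            // holds, so no separate AVCapture pipeline is needed.
            classifyHand(in: frame.capturedImage) { label, confidence in
                print("\(label): \(confidence)")
            }
        }
    }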

Full Tutorial here

P.S. A few well-selected images are sufficient for CustomVision.ai. For the sample model here, I did 3 rounds of data collection (adding 63, 38, and 21 images per round). Alternating classes during data collection also appeared to work better than gathering all of one class's images at once.

[Image: the dataset]

License

MIT Open Source License. 🧞 Use as you wish. Have fun! 😁