Empowering people to communicate with caretakers and loved ones.
Pre-release builds are available through Firebase here.
Vocable AAC allows people with conditions such as MS, stroke, ALS, or spinal cord injuries to communicate using an app that tracks head movements, without needing to spend tens of thousands of dollars on dedicated assistive technology.
Vocable uses ARCore to track the user's head movements and determine where on the screen the user is looking. This allows the app to be used completely hands-free: users can look around the screen and make selections by lingering their gaze on a particular element.
For users with more mobility, the app can be operated by touch.
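To give a rough idea of how this kind of head-tracked, dwell-to-select interaction can be built on ARCore's Augmented Faces API, here is a minimal sketch. It is not Vocable's actual implementation: the `HeadGazeTracker` class, the 1-second dwell threshold, and the choice of the face pose's -Z axis as the "forward" direction are illustrative assumptions.

```kotlin
import com.google.ar.core.AugmentedFace
import com.google.ar.core.Frame
import com.google.ar.core.TrackingState

// Sketch of dwell-based selection driven by ARCore face tracking.
// Assumes a front-camera ARCore session configured with AugmentedFaceMode.MESH3D.
class HeadGazeTracker(private val dwellMillis: Long = 1000L) {

    private var dwellTarget: Any? = null
    private var dwellStartMillis: Long = 0L

    // Returns the head's forward direction from the first tracked face,
    // or null if no face is currently tracked.
    fun gazeDirection(frame: Frame): FloatArray? {
        val face = frame.getUpdatedTrackables(AugmentedFace::class.java)
            .firstOrNull { it.trackingState == TrackingState.TRACKING }
            ?: return null
        // Assumption: treat the center pose's -Z axis as pointing out of the face;
        // flip the sign if ARCore's face axis convention differs in practice.
        val z = face.centerPose.zAxis
        return floatArrayOf(-z[0], -z[1], -z[2])
    }

    // Call once per frame with whichever on-screen element the gaze ray currently
    // hits (null if none). Returns true when the gaze has lingered on the same
    // element for at least dwellMillis, i.e. the user has made a selection.
    fun onGazeHit(target: Any?, nowMillis: Long = System.currentTimeMillis()): Boolean {
        if (target == null || target != dwellTarget) {
            dwellTarget = target
            dwellStartMillis = nowMillis
            return false
        }
        if (nowMillis - dwellStartMillis >= dwellMillis) {
            dwellStartMillis = nowMillis // avoid firing repeatedly while the gaze stays put
            return true
        }
        return false
    }
}
```

In practice the gaze direction would be projected onto the screen plane each frame and hit-tested against the UI before being fed to `onGazeHit`.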
Use a list of common phrases provided by speech language pathologists, or create and save your own.
Type with your head or your hands.
For the current progress on features, please visit the project board.
For a high-level roadmap, see the Vocable Roadmap.
We'd love to translate Vocable into as many languages as possible. If you'd like to help translate, please visit our Crowdin project. Thanks for helping people communicate all around the world!
We love contributions! To get started, please see our Contributing Guidelines.
Matt Kubota, Kyle Ohanian, Duncan Lewis, Ameir Al-Zoubi, and many more from WillowTree.
vocable-android is released under the MIT license. See LICENSE for details.
vocable-ios is available on the Apple App Store and is also open-source.