This repository is for the iASL Android application. The app will use the Android device's front-facing camera to record the user performing ASL and will then attempt to convert the ASL to text using machine learning. The application will also support a speech-to-text feature.
Users will be able to save the resulting text as a note. The application will also contain a messaging feature so that users can perform ASL in front of the camera, or speak into the microphone, and have the resulting text sent to someone else.
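For illustration only, the sketch below shows one way the planned speech-to-text feature could be built on Android's built-in SpeechRecognizer API. The helper class and callback names (SpeechToTextHelper, OnTextListener) are hypothetical and are not taken from this project's code; the app's actual implementation may differ.

```java
import android.content.Context;
import android.content.Intent;
import android.os.Bundle;
import android.speech.RecognitionListener;
import android.speech.RecognizerIntent;
import android.speech.SpeechRecognizer;
import java.util.ArrayList;

/** Hypothetical helper that turns one microphone utterance into text. */
public class SpeechToTextHelper {

    /** Illustrative callback for handing the transcription back to the caller. */
    public interface OnTextListener {
        void onText(String text);
    }

    /** Requires the RECORD_AUDIO permission; call from the main thread. */
    public static void listenOnce(Context context, OnTextListener listener) {
        SpeechRecognizer recognizer = SpeechRecognizer.createSpeechRecognizer(context);
        recognizer.setRecognitionListener(new RecognitionListener() {
            @Override public void onResults(Bundle results) {
                ArrayList<String> matches =
                        results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION);
                if (matches != null && !matches.isEmpty()) {
                    listener.onText(matches.get(0)); // most confident transcription
                }
                recognizer.destroy();
            }
            // Remaining callbacks are required by the interface but unused in this sketch.
            @Override public void onReadyForSpeech(Bundle params) { }
            @Override public void onBeginningOfSpeech() { }
            @Override public void onRmsChanged(float rmsdB) { }
            @Override public void onBufferReceived(byte[] buffer) { }
            @Override public void onEndOfSpeech() { }
            @Override public void onError(int error) { recognizer.destroy(); }
            @Override public void onPartialResults(Bundle partialResults) { }
            @Override public void onEvent(int eventType, Bundle params) { }
        });

        Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
        recognizer.startListening(intent);
    }
}
```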
The repository containing the machine learning backend for converting ASL to text can be found at: https://github.com/Capstone-Projects-2020-Spring/iASL-Backend
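As a rough sketch of the ASL-to-text capture path (not the app's actual pipeline), front-facing-camera frames could be streamed to an analyzer with the AndroidX CameraX library and each frame then handed to the recognition model or to the backend linked above. The FrontCameraFrames class below is hypothetical, and using it would require adding the androidx.camera dependencies.

```java
import android.content.Context;
import androidx.camera.core.CameraSelector;
import androidx.camera.core.ImageAnalysis;
import androidx.camera.lifecycle.ProcessCameraProvider;
import androidx.core.content.ContextCompat;
import androidx.lifecycle.LifecycleOwner;
import com.google.common.util.concurrent.ListenableFuture;
import java.util.concurrent.ExecutionException;

/** Hypothetical sketch: stream front-camera frames for ASL recognition. */
public class FrontCameraFrames {

    public static void start(Context context, LifecycleOwner owner) {
        ListenableFuture<ProcessCameraProvider> providerFuture =
                ProcessCameraProvider.getInstance(context);
        providerFuture.addListener(() -> {
            try {
                ProcessCameraProvider provider = providerFuture.get();
                ImageAnalysis analysis = new ImageAnalysis.Builder()
                        .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
                        .build();
                analysis.setAnalyzer(ContextCompat.getMainExecutor(context), imageProxy -> {
                    // This is where a frame would be handed to the ASL model or backend.
                    imageProxy.close(); // always release the frame when done
                });
                // Bind the front camera to the given lifecycle (e.g. an Activity).
                provider.bindToLifecycle(owner, CameraSelector.DEFAULT_FRONT_CAMERA, analysis);
            } catch (ExecutionException | InterruptedException e) {
                // Camera provider could not be obtained; handle or log as appropriate.
            }
        }, ContextCompat.getMainExecutor(context));
    }
}
```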
- Android device (or emulator) running Android 9.0 Pie (API Level 28) or later
- Front-facing camera (for ASL-to-text functionality)
- Microphone (for speech-to-text functionality)
Note: Some features, particularly those that rely on the camera and microphone, may not work correctly when the application runs on an emulator instead of a physical device.
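Because the front-facing camera and microphone are required, the app also needs the CAMERA and RECORD_AUDIO runtime permissions on Android 9.0 and later. The helper below is a minimal, hypothetical sketch of the standard runtime-permission check; the app's own permission handling may differ.

```java
import android.Manifest;
import android.app.Activity;
import android.content.pm.PackageManager;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;

/** Hypothetical helper for requesting the camera and microphone permissions. */
public class PermissionHelper {
    private static final int REQUEST_CODE = 42; // arbitrary request code for this example

    public static void ensureCameraAndMicPermissions(Activity activity) {
        boolean cameraGranted = ContextCompat.checkSelfPermission(
                activity, Manifest.permission.CAMERA) == PackageManager.PERMISSION_GRANTED;
        boolean micGranted = ContextCompat.checkSelfPermission(
                activity, Manifest.permission.RECORD_AUDIO) == PackageManager.PERMISSION_GRANTED;

        // Ask for both permissions in one dialog flow if either is missing.
        if (!cameraGranted || !micGranted) {
            ActivityCompat.requestPermissions(activity,
                    new String[] { Manifest.permission.CAMERA, Manifest.permission.RECORD_AUDIO },
                    REQUEST_CODE);
        }
    }
}
```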
To clone this project, enter the following command in Bash: `git clone https://github.com/Capstone-Projects-2020-Spring/iASL-Android.git`
After cloning the repository, you can open the project in Android Studio.
Alternatively, you may open Android Studio and navigate to File > New > Project from Version Control, then enter the URL of this repository (https://github.com/Capstone-Projects-2020-Spring/iASL-Android.git) and choose a directory in which to save the project.
Code documentation, generated using Javadoc, can be found here: https://capstone-projects-2020-spring.github.io/iASL-Android/