This repository has been archived by the owner on Aug 28, 2024. It is now read-only.
Conversation
This reverts commit 5a65775.
IvanKobzarev reviewed on Aug 3, 2021
@@ -0,0 +1,21 @@
# Add project specific ProGuard rules here.
File can be removed
IvanKobzarev reviewed on Aug 3, 2021
public AnalysisResult(String results) {
    mResults = results;
}
needs formatting
IvanKobzarev reviewed on Aug 3, 2021
Comment on lines 121 to 123
if (maxScoreIdx == DELETE) result = "DELETE";
else if (maxScoreIdx == NOTHING) result = "NOTHING";
else if (maxScoreIdx == SPACE) result = "SPACE";
nit: Imo using blocks for every case will be more readable:
if (maxScoreIdx == DELETE) {
result = "DELETE";
} else if (maxScoreIdx == NOTHING) {
result = "NOTHING";
} else if (maxScoreIdx == SPACE) {
result = "SPACE";
}
IvanKobzarev reviewed on Aug 3, 2021
btnNext.setOnClickListener(new View.OnClickListener() {
    public void onClick(View v) {
        mStartLetterPos = (mStartLetterPos + 1) % 26;
        if (mStartLetterPos == 0) mStartLetterPos = 26;
nit:
if (mStartLetterPos == 0) {
mStartLetterPos = 26;
}
IvanKobzarev approved these changes on Aug 3, 2021
American Sign Language Recognition on Android
Introduction
American Sign Language (ASL) is a natural language used by deaf communities in many countries around the world. It has 26 signs corresponding to the 26 letters of the English alphabet. This repo contains Python scripts that train a deep learning model to recognize the 26 ASL signs (plus 3 additional classes for deletion, space, and nothing) and convert and optimize the model to the Mobile Interpreter format, along with an Android app that uses the model to recognize the 26 ASL signs.
Prerequisites
Quick Start
To test-run the ASL recognition Android app, follow the steps below:
1. Train and Prepare the Model
If you don't have PyTorch 1.9.0 and torchvision 0.10.0 installed, or don't want to install them, you can skip this step. The trained, scripted, and optimized model is already included in the repo, located at `ASLRecognition/app/src/main/assets`.

Otherwise, open a terminal window, make sure you have torch 1.9.0 and torchvision 0.10.0 installed (check with a command such as `pip list | grep torch`, or install them with `pip install torch torchvision`), then run the following commands:

Download the ASL alphabet dataset here and unzip it into the `ASLRecognition/scripts` folder. Then run the scripts below, which are based on this tutorial, to pre-process the training images, train the model, and convert and optimize the trained model to the mobile interpreter format:

If all goes well, the model `asl.ptl` will be generated and you can copy it to `ASLRecognition/app/src/main/assets`.
You can also run `python test.py` to see the result of a test image located at `../app/src/main/assets/C1.jpg`:

For more information on how to use a test script like the one above to find out the expected model input and output and use them in an Android app, see Step 2 of the tutorial Image Segmentation DeepLabV3 on Android.
2. Use Android Studio
Open the ASLRecognition project in Android Studio. Note that the app's `build.gradle` file has the following lines:

and in `MainActivity.java`, the code below is used to load the model:
3. Run the App
Select an Android emulator or device, then build and run the app. Some of the 26 test images of the ASL alphabet and their recognition results are shown below:
To test the live ASL alphabet gesture recognition, after you get familiar with the 26 ASL signs by tapping Next and Recognize, select the LIVE button and make some ASL gestures in front of the camera. A screencast of the app running is available here.
4. What's Next
With a different sign language dataset, such as the RWTH-PHOENIX-Weather 2014 MS Public Hand Shape Dataset or the Continuous Sign Language Recognition Dataset, and a state-of-the-art transformer-based sign language model, a more powerful sign language recognition Android app can be developed based on the app here.