The main objective is to develop an Android application for emotion recognition using a Machine Learning model optimized for mobile devices via TensorFlow Lite. The app allows users to:
- Take a photo using the device's camera.
- Process the image through a TFLite model to determine the predominant emotion.
- Display the result directly on the user interface.
The model was trained on the FER2013 dataset using a convolutional neural network, and full-integer quantization keeps it fast and compact on mobile devices.
- Image Capture: Utilizes CameraX to take photos using the front-facing camera.
- Real-Time Inference: Processes images using a convolutional neural network model (see the interpreter sketch after this list).
- User-Friendly Interface: Displays results directly in the app's text area.
- Mobile Optimization: Model optimized through quantization.
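The inference step can also be exercised outside the app. Below is a minimal Python sketch (not the app's actual Java code) that runs `emotion_model.tflite` through the TensorFlow Lite interpreter; the image path is a placeholder, and the int8 rescaling branch assumes the full-integer quantization described in the model section.

```python
# Minimal sketch: run emotion_model.tflite on a desktop to sanity-check
# inference. "face.jpg" is a placeholder path.
import numpy as np
import tensorflow as tf
from PIL import Image

LABELS = ["angry", "disgust", "fear", "happy", "neutral", "sad", "surprise"]

interpreter = tf.lite.Interpreter(model_path="models/emotion_model.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Preprocess the same way as training: grayscale, 48x48, normalized to [0, 1].
img = Image.open("face.jpg").convert("L").resize((48, 48))
x = np.asarray(img, dtype=np.float32)[None, :, :, None] / 255.0

# A fully integer-quantized model expects int8 input; rescale accordingly.
if inp["dtype"] != np.float32:
    scale, zero_point = inp["quantization"]
    x = (x / scale + zero_point).astype(inp["dtype"])

interpreter.set_tensor(inp["index"], x)
interpreter.invoke()
scores = interpreter.get_tensor(out["index"])[0]
print("Predicted emotion:", LABELS[int(np.argmax(scores))])
```

In the app itself, `MainActivity.java` presumably performs the equivalent preprocessing and invocation through the TFLite Android API.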
- Android Studio (version >= 2022.1.1)
- Gradle (version >= 7.4)
- Device with Android 6.0 (API 23) or higher
- Python >= 3.8
- TensorFlow >= 2.9
- Additional libraries:
  - `numpy`
  - `scikit-learn`
  - `pillow`
- `app/`: Source code for the Android app.
  - `src/main/assets`: Optimized model in TFLite format (`emotion_model.tflite`).
  - `res/layout`: XML layouts for the user interface.
  - `java/com/example/emotionalfaces`: Application logic.
- `models/`: Files generated during training, including:
  - `best_emotion_model.keras`: Trained model.
  - `emotion_model.tflite`: Optimized model for mobile devices.
- `datasets/`: Dataset used for training (FER2013).
- `MainActivity.java`: Android app logic.
- `TFmodel.py`: Script for training and converting the model.
- `FEdataset.py`: Script for loading and preprocessing the FER2013 dataset.
- Open the project in Android Studio.
- Synchronize the Gradle files.
- Add the `emotion_model.tflite` file to the `app/src/main/assets` directory (already included in the project).
- Connect an Android device or use an emulator.
- Press the Run button (green triangle) in Android Studio.
- Launch the app on the device and explore its features.
Ensure the following Python dependencies are installed:
```bash
pip install numpy scikit-learn pillow tensorflow
```
- Download the FER2013 dataset and place it at `datasets/fer2013.zip`.
- Run the `TFmodel.py` script:

```bash
python TFmodel.py
```

- The trained model will be saved in `.keras` and `.tflite` formats in the `models/` directory.
- Architecture: Convolutional Neural Network (CNN); a minimal sketch follows this list.
- 3 convolutional blocks with batch normalization and dropout.
- Fully connected layers with `ReLU` and `softmax` activations.
- Input: Grayscale images, size 48x48.
- Output: 7 emotion classes (`angry`, `disgust`, `fear`, `happy`, `neutral`, `sad`, `surprise`).
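A minimal Keras sketch of the architecture described above is shown here; the filter counts, dropout rates, and dense-layer width are illustrative assumptions, not necessarily the exact values used in `TFmodel.py`.

```python
# Illustrative sketch of the described CNN; hyperparameters are assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_emotion_model(num_classes: int = 7) -> tf.keras.Model:
    model = models.Sequential([layers.Input(shape=(48, 48, 1))])
    # Three convolutional blocks, each with batch normalization and dropout.
    for filters in (32, 64, 128):
        model.add(layers.Conv2D(filters, 3, padding="same", activation="relu"))
        model.add(layers.BatchNormalization())
        model.add(layers.MaxPooling2D())
        model.add(layers.Dropout(0.25))
    # Fully connected head: ReLU hidden layer, softmax over the 7 emotions.
    model.add(layers.Flatten())
    model.add(layers.Dense(256, activation="relu"))
    model.add(layers.Dropout(0.5))
    model.add(layers.Dense(num_classes, activation="softmax"))
    return model

model = build_emotion_model()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```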
- Full-integer quantization for optimized performance.
- Representative dataset configured during conversion to preserve accuracy (see the conversion sketch below).
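For reference, full-integer quantization with a representative dataset typically looks like the following sketch; the random calibration samples are placeholders (real preprocessed training images should be used), and the exact converter settings in `TFmodel.py` may differ.

```python
# Sketch of full-integer quantization with a representative dataset.
import numpy as np
import tensorflow as tf

def representative_dataset():
    # Placeholder calibration data; in practice, yield a few hundred real
    # preprocessed training images so activation ranges are calibrated well.
    for _ in range(100):
        yield [np.random.rand(1, 48, 48, 1).astype(np.float32)]

model = tf.keras.models.load_model("models/best_emotion_model.keras")
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
# Force integer-only ops and integer input/output tensors.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

with open("models/emotion_model.tflite", "wb") as f:
    f.write(converter.convert())
```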
- Name: FER2013 (Facial Expression Recognition 2013)
- Format: ZIP file organized into directories for emotion categories.
- Preprocessing:
- Conversion to grayscale.
- Pixel normalization (0-1).
- Split into training (80%) and testing (20%) sets (a loading sketch follows this list).
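A loading-and-preprocessing sketch under these conventions might look as follows; the extracted directory layout (one subfolder per emotion) and the `.png` extension are assumptions based on the ZIP format described above, and the project's actual loader is `FEdataset.py`.

```python
# Sketch of the described preprocessing; directory layout is an assumption.
from pathlib import Path
import numpy as np
from PIL import Image
from sklearn.model_selection import train_test_split

LABELS = ["angry", "disgust", "fear", "happy", "neutral", "sad", "surprise"]

def load_fer2013(root: str = "datasets/fer2013"):
    images, targets = [], []
    for idx, label in enumerate(LABELS):
        for path in sorted(Path(root, label).glob("*.png")):
            img = Image.open(path).convert("L").resize((48, 48))  # grayscale 48x48
            images.append(np.asarray(img, dtype=np.float32) / 255.0)  # normalize 0-1
            targets.append(idx)
    X = np.stack(images)[..., None]  # shape (N, 48, 48, 1)
    y = np.array(targets)
    # 80% training / 20% testing, stratified so class balance is preserved.
    return train_test_split(X, y, test_size=0.2, stratify=y, random_state=42)

X_train, X_test, y_train, y_test = load_fer2013()
```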
This project demonstrates the effectiveness of convolutional neural networks for emotion recognition and their integration into mobile devices. The combination of an intuitive Android app and an optimized TFLite model is a practical step toward real-world, on-device AI applications.
This project is distributed under the MIT License. See the `LICENSE` file for more details.