EventDrivenArchitecture

Overview

This repository contains prototypes and components of an AI event-driven architecture for controlling an 8-bit LED screen simulation. The project includes several modes, such as animations, drawing, games, and interaction with large language models (LLMs).

Folder Structure

  1. C: Various animations, movement detection, and pose/hand recognition on the Jetson Nano. Includes Snake and Brick Pong games implemented in C/C++.

  2. Ellie_connected: MobileNet for user recognition, Mediapipe as a hand controller for Brick Pong, and autolaunch when a user is detected.

  3. Ellie_connected_v2: A more robust version featuring a resting animation, movement detection with OpenCV, mode selection via hand-gesture recognition, Brick Pong with the hand controller, and automatic relaunch when no movement is detected.

  4. flaskServerWith3DEffects: YOLOv9 object detection integrated to drive video effects and animations.

  5. soundAndPersonRecognition: Scripts for detecting sound levels and recognizing people using a simple OpenCV model.
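The OpenCV-based movement detection mentioned above (used to wake the screen from its resting animation) is commonly built on frame differencing. As a hedged illustration — the function name, thresholds, and pure-NumPy formulation here are assumptions, not this repository's actual code — the core idea can be sketched as:

```python
import numpy as np

def detect_movement(prev_frame, curr_frame, pixel_thresh=25, ratio_thresh=0.01):
    """Return True if enough pixels changed between two grayscale frames.

    pixel_thresh: minimum per-pixel intensity change to count as "changed".
    ratio_thresh: fraction of changed pixels needed to report movement.
    (Both thresholds are illustrative values, not ones from this project.)
    """
    # Widen dtype before subtracting so uint8 values do not wrap around.
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    changed = np.count_nonzero(diff > pixel_thresh)
    return changed / diff.size > ratio_thresh

# Synthetic 8x8 "frames": a bright block shifts one pixel to the right.
prev = np.zeros((8, 8), dtype=np.uint8)
prev[2:5, 2:5] = 255
curr = np.zeros((8, 8), dtype=np.uint8)
curr[2:5, 3:6] = 255

print(detect_movement(prev, curr))  # True  (the block moved)
print(detect_movement(prev, prev))  # False (identical frames)
```

In a live pipeline the frames would come from `cv2.VideoCapture` and be grayscaled first; the thresholding logic stays the same.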

Technologies Used

  • Python
  • C/C++
  • YOLOv9
  • Mediapipe
  • MobileNet
  • OpenCV
  • PyGame
  • NumPy
  • Matplotlib
  • Whisper
  • Jetson Nano

License

This project is licensed under the MIT License.