## Table of Contents

- Overview
- Features
- Demo
- Installation
- Usage
- Project Structure
- Dependencies
- Contributing
- License
- Contact
## Overview

This project is a Multi-Layer Perceptron (MLP) neural network implemented in C++ using the Eigen library for matrix operations. The neural network is designed to perform classification tasks, with flexibility in configuring the number of layers, hidden units, learning rate, and training iterations.
## Features

- Configurable Architecture: Specify the number of layers, hidden units, and output classes.
- Activation Functions: Utilizes ReLU for hidden layers and Softmax for the output layer (see the forward-pass sketch after this list).
- Training: Implements forward and backward propagation with gradient descent optimization.
- Accuracy Calculation: Evaluates the model's performance on training data.
- Utility Functions: Includes functions for one-hot encoding and activation computations.
- Command-Line Interface: Interactive prompts to input training parameters and dataset.
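The actual implementation lives in `NeuralNetwork.cpp` and `utils.cpp`. As a rough illustration of what a forward pass with ReLU hidden activations and a Softmax output looks like with Eigen, here is a minimal, self-contained sketch; the function names, the single hidden layer, and the column-per-sample layout are assumptions made for the example, not the project's actual interface.

```cpp
#include <iostream>
#include <Eigen/Dense>

// Illustrative sketch only: the project's real code lives in NeuralNetwork.cpp
// and utils.cpp. Names, shapes, and the single hidden layer are assumptions.

// ReLU: element-wise max(0, z).
Eigen::MatrixXd relu(const Eigen::MatrixXd& z) {
    return z.cwiseMax(0.0);
}

// Softmax applied column by column (one column = one sample).
Eigen::MatrixXd softmax(const Eigen::MatrixXd& z) {
    Eigen::MatrixXd out(z.rows(), z.cols());
    for (int c = 0; c < z.cols(); ++c) {
        // Subtract the column max before exponentiating for numerical stability.
        Eigen::ArrayXd e = (z.col(c).array() - z.col(c).maxCoeff()).exp();
        out.col(c) = (e / e.sum()).matrix();
    }
    return out;
}

int main() {
    const int inputs = 4, hidden = 8, classes = 3, samples = 5;

    // Randomly initialized parameters for a single hidden layer.
    Eigen::MatrixXd W1 = Eigen::MatrixXd::Random(hidden, inputs);
    Eigen::VectorXd b1 = Eigen::VectorXd::Zero(hidden);
    Eigen::MatrixXd W2 = Eigen::MatrixXd::Random(classes, hidden);
    Eigen::VectorXd b2 = Eigen::VectorXd::Zero(classes);
    Eigen::MatrixXd X  = Eigen::MatrixXd::Random(inputs, samples);

    // Forward pass: ReLU on the hidden layer, Softmax on the output layer.
    Eigen::MatrixXd a1    = relu((W1 * X).colwise() + b1);
    Eigen::MatrixXd probs = softmax((W2 * a1).colwise() + b2);

    std::cout << "Class probabilities (each column sums to 1):\n" << probs << "\n";
}
```

Subtracting the per-column maximum before exponentiating is the usual trick for keeping Softmax numerically stable.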
## Demo

Figure: Training progress showing iteration counts and accuracy metrics.
## Installation

Prerequisites:

- C++ Compiler: Supports the C++17 standard.
- CMake: Version 3.10 or higher.
- Eigen Library: Installed on your system.
1. Clone the repository:

   ```bash
   git clone https://github.com/DanTheCoderMan06/neural-network-cpp.git
   cd neural-network-cpp
   ```
2. Install Eigen:

   - Ubuntu:

     ```bash
     sudo apt-get update
     sudo apt-get install libeigen3-dev
     ```
   - macOS (using Homebrew):

     ```bash
     brew install eigen
     ```

   - Windows:

     - Download Eigen from the official Eigen downloads page.
     - Extract and place it in a known directory.
3. Build the project:

   ```bash
   mkdir build
   cd build
   cmake ..
   cmake --build .
   ```

   Note: If Eigen is installed in a non-standard directory, you may need to point CMake at it during configuration (for example, via `cmake .. -DCMAKE_PREFIX_PATH=/path/to/eigen`, assuming the project locates Eigen through CMake's package search).
## Usage

After building the project, run the executable to train the neural network on your dataset:

```bash
./rooster
```

The program will interactively prompt you for the following:
1. Dataset Filename:
   - Provide the path to your training CSV file (e.g., `train.csv`).
   - CSV Format: Each row should start with a label followed by pixel values (e.g., for MNIST, a label from `0` to `9` followed by `784` pixel values). A minimal loading sketch follows this list.
2. Number of Iterations:
   - Specify how many training iterations to perform.
   - Must be a multiple of 25; if not, it is rounded down to the nearest multiple.
3. Number of Layers:
   - Define the number of layers in the Multi-Layer Perceptron.
   - Minimum of 2 layers (1 hidden layer + 1 output layer).
4. Learning Rate:
   - Set the learning rate (e.g., `0.1`).
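The README specifies only the CSV layout (a label, then the pixel values). As a sketch of how such a file could be read into Eigen matrices and how labels could be one-hot encoded (one of the utilities mentioned under Features), the snippet below may help; the `Dataset` struct, the `loadCsv` name, the column-per-sample layout, the `/255.0` scaling, and the assumption that there is no header row are all illustrative choices, not the project's actual code.

```cpp
#include <fstream>
#include <iostream>
#include <sstream>
#include <stdexcept>
#include <string>
#include <utility>
#include <vector>
#include <Eigen/Dense>

// Illustrative sketch: reads "label,pixel,pixel,..." rows into X (one column
// per sample) and a one-hot label matrix Y. Names, layout, and the /255.0
// scaling are assumptions, not the project's actual code.
struct Dataset {
    Eigen::MatrixXd X;  // features, shape (num_pixels, num_samples)
    Eigen::MatrixXd Y;  // one-hot labels, shape (num_classes, num_samples)
};

Dataset loadCsv(const std::string& path, int numClasses) {
    std::ifstream file(path);
    if (!file) throw std::runtime_error("could not open " + path);

    std::vector<std::vector<double>> rows;
    std::string line;
    while (std::getline(file, line)) {
        if (line.empty()) continue;
        std::vector<double> values;
        std::stringstream ss(line);
        std::string cell;
        while (std::getline(ss, cell, ','))
            values.push_back(std::stod(cell));   // assumes no header row
        rows.push_back(std::move(values));
    }
    if (rows.empty()) throw std::runtime_error("no data in " + path);

    const int samples = static_cast<int>(rows.size());
    const int pixels  = static_cast<int>(rows.front().size()) - 1;  // first value is the label

    Dataset d;
    d.X = Eigen::MatrixXd(pixels, samples);
    d.Y = Eigen::MatrixXd::Zero(numClasses, samples);
    for (int s = 0; s < samples; ++s) {
        const int label = static_cast<int>(rows[s][0]);
        d.Y(label, s) = 1.0;                         // one-hot encode the label
        for (int p = 0; p < pixels; ++p)
            d.X(p, s) = rows[s][p + 1] / 255.0;      // scale pixels to [0, 1]
    }
    return d;
}

int main() {
    Dataset d = loadCsv("train.csv", 10);  // 10 classes for MNIST-style digits
    std::cout << "X: " << d.X.rows() << "x" << d.X.cols()
              << ", Y: " << d.Y.rows() << "x" << d.Y.cols() << "\n";
}
```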
Example session:

```text
Enter filename (e.g., train.csv): train.csv
How many iterations shall happen? (Must be multiple of 25, if not is rounded): 500
How many layers in the Multi-Layer Perceptron? 3
Learning rate? 0.1
Starting training for 500 iterations...
Iteration: 0
Accuracy: 0.12
Iteration: 20
Accuracy: 0.45
...
Iteration: 500
Accuracy: 0.85
Training completed.
Final training accuracy: 0.85
```
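For context on the accuracy figures printed above, here is a rough sketch of the two ingredients in their simplest form: a plain gradient-descent parameter update scaled by the learning rate, and a training-accuracy check that compares the arg-max of the output probabilities with the true labels. It reuses the column-per-sample convention from the earlier sketches and is illustrative only; the project's real update and evaluation code is in `NeuralNetwork.cpp`.

```cpp
#include <iostream>
#include <Eigen/Dense>

// Illustrative sketch: a plain gradient-descent step and a training-accuracy
// check. Shapes follow the column-per-sample convention used above and are
// assumptions, not the project's actual interface.

// Vanilla gradient descent: move each parameter against its gradient.
void gradientDescentStep(Eigen::MatrixXd& W, Eigen::VectorXd& b,
                         const Eigen::MatrixXd& dW, const Eigen::VectorXd& db,
                         double learningRate) {
    W -= learningRate * dW;
    b -= learningRate * db;
}

// Fraction of samples whose highest-probability class matches the true label.
// probs: (num_classes, num_samples); labels: true class index per sample.
double accuracy(const Eigen::MatrixXd& probs, const Eigen::VectorXi& labels) {
    int correct = 0;
    for (int s = 0; s < probs.cols(); ++s) {
        Eigen::Index predicted;
        probs.col(s).maxCoeff(&predicted);           // arg-max over the classes
        if (static_cast<int>(predicted) == labels(s)) ++correct;
    }
    return static_cast<double>(correct) / probs.cols();
}

int main() {
    // Toy check: 3 classes, 4 samples; the last sample is deliberately wrong.
    Eigen::MatrixXd probs(3, 4);
    probs << 0.90, 0.10, 0.20, 0.10,
             0.05, 0.80, 0.10, 0.20,
             0.05, 0.10, 0.70, 0.70;
    Eigen::VectorXi labels(4);
    labels << 0, 1, 2, 0;
    std::cout << "Accuracy: " << accuracy(probs, labels) << "\n";  // prints 0.75
}
```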
## Project Structure

```text
neural-network-cpp/
├── CMakeLists.txt      # CMake configuration file
├── README.md           # Project documentation
├── main.cpp            # Entry point of the application
├── NeuralNetwork.hpp   # NeuralNetwork class declaration
├── NeuralNetwork.cpp   # NeuralNetwork class implementation
├── utils.hpp           # Utility function declarations
├── utils.cpp           # Utility function implementations
├── data/               # Directory for datasets
│   └── train.csv
└── build/              # Directory for build files
```
## Dependencies

- Eigen: A high-performance C++ library for linear algebra (a quick smoke-test snippet follows this list).
- C++17 Standard: Ensure your compiler supports C++17.
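Before building the project, you can verify that Eigen and a C++17 compiler are available with a tiny standalone program such as the one below (the include path is an assumption; on Ubuntu, `libeigen3-dev` installs the headers under `/usr/include/eigen3`).

```cpp
#include <iostream>
#include <Eigen/Dense>

int main() {
    // Quick smoke test: multiply a 2x2 matrix by a vector and print the result.
    Eigen::Matrix2d A;
    A << 1, 2,
         3, 4;
    Eigen::Vector2d v(1, 1);

    std::cout << "A * v =\n" << A * v << "\n";   // expect 3 and 7
    std::cout << "Eigen " << EIGEN_WORLD_VERSION << "."
              << EIGEN_MAJOR_VERSION << "." << EIGEN_MINOR_VERSION << "\n";
}
```

Compile it with something like `g++ -std=c++17 -I/usr/include/eigen3 eigen_check.cpp -o eigen_check` (the file name is arbitrary).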
## License

This project is licensed under the MIT License.
## Contact

- Author: Daniil Novak
- Email: [email protected]
- GitHub: @DanTheCoderMan06

Feel free to reach out with any questions or suggestions!
## Acknowledgments

- Inspired by various neural network implementations and educational resources.
- Thanks to the Eigen community for providing a robust linear algebra library.