This project implements a neural network from scratch in Python, using only NumPy, to recognize handwritten digits from the MNIST dataset. It's an educational project designed to show how neural networks work internally, without relying on high-level libraries like TensorFlow or PyTorch.
- Builds a neural network from scratch (no external ML frameworks)
- Implements forward propagation and backpropagation
- Uses ReLU and Softmax activations
- Trains on the MNIST dataset
- Achieves ~85% accuracy on the development set
- Visualizes predictions for test images
- Language: Python
- Libraries: NumPy, Pandas, Matplotlib
- Dataset: MNIST (handwritten digits)
```shell
git clone https://github.com/chetlasrijith/neural-net-mnist.git
cd neural-net-mnist
pip install numpy pandas matplotlib
```

- Download the MNIST training dataset (`mnist_train.csv`) from Kaggle or any other source.
- Place the file in the `datasets/` folder.
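Once the CSV is in place, the loading step might look like the sketch below. The column layout (first column = label, remaining 784 columns = pixel values in 0-255) follows the standard Kaggle MNIST CSV; the function name, dev-set size, and split order are illustrative, not the repository's exact code.

```python
import numpy as np
import pandas as pd

def load_mnist_csv(path, dev_size=1000):
    """Load a Kaggle-style MNIST CSV (first column = label,
    next 784 columns = pixels) and return normalized splits."""
    data = pd.read_csv(path).to_numpy()
    np.random.shuffle(data)            # shuffle rows before splitting
    X = data[:, 1:].T / 255.0          # pixels scaled to [0, 1]; shape (784, m)
    Y = data[:, 0].astype(int)         # digit labels 0-9; shape (m,)
    # Hold out the first dev_size examples as a development set.
    return (X[:, dev_size:], Y[dev_size:]), (X[:, :dev_size], Y[:dev_size])
```

Columns (one example per column) make the later matrix math `W @ X + b` broadcast cleanly across the whole batch.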
- Data Loading: Loads `mnist_train.csv` and normalizes pixel values to the 0-1 range.
- Network Architecture:
  - Input layer: 784 neurons (28x28 pixels)
  - Hidden layer: 10 neurons with ReLU activation
  - Output layer: 10 neurons with Softmax activation
- Training:
  - Uses forward propagation to compute predictions.
  - Applies backpropagation to update weights and biases.
  - Trains over multiple iterations (epochs).
- Prediction:
  - Predicts the digit for unseen images and visualizes them using Matplotlib.
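The architecture and training loop above can be sketched as follows. The shapes follow the 784 → 10 → 10 design; the function names, initialization scale, learning rate, and iteration count are illustrative assumptions, not the repository's exact code.

```python
import numpy as np

def init_params():
    # Small random weights; shapes follow the 784 -> 10 -> 10 architecture.
    W1 = np.random.randn(10, 784) * 0.01
    b1 = np.zeros((10, 1))
    W2 = np.random.randn(10, 10) * 0.01
    b2 = np.zeros((10, 1))
    return W1, b1, W2, b2

def relu(Z):
    return np.maximum(Z, 0)

def softmax(Z):
    e = np.exp(Z - Z.max(axis=0, keepdims=True))  # shift for numerical stability
    return e / e.sum(axis=0, keepdims=True)

def forward(W1, b1, W2, b2, X):
    Z1 = W1 @ X + b1          # X has one example per column, shape (784, m)
    A1 = relu(Z1)
    Z2 = W2 @ A1 + b2
    A2 = softmax(Z2)          # class probabilities, shape (10, m)
    return Z1, A1, Z2, A2

def one_hot(Y):
    out = np.zeros((10, Y.size))
    out[Y, np.arange(Y.size)] = 1
    return out

def backward(Z1, A1, A2, W2, X, Y):
    m = Y.size
    dZ2 = A2 - one_hot(Y)                  # softmax + cross-entropy gradient
    dW2 = dZ2 @ A1.T / m
    db2 = dZ2.sum(axis=1, keepdims=True) / m
    dZ1 = (W2.T @ dZ2) * (Z1 > 0)          # (Z1 > 0) is the ReLU derivative
    dW1 = dZ1 @ X.T / m
    db1 = dZ1.sum(axis=1, keepdims=True) / m
    return dW1, db1, dW2, db2

def train(X, Y, alpha=0.1, iterations=500):
    W1, b1, W2, b2 = init_params()
    for _ in range(iterations):
        Z1, A1, _, A2 = forward(W1, b1, W2, b2, X)
        dW1, db1, dW2, db2 = backward(Z1, A1, A2, W2, X, Y)
        # Gradient-descent update on every parameter.
        W1 -= alpha * dW1; b1 -= alpha * db1
        W2 -= alpha * dW2; b2 -= alpha * db2
    return W1, b1, W2, b2
```

Each pass computes predictions (forward), measures how each parameter contributed to the error (backward), and nudges the parameters against their gradients.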
| Metric | Value |
|---|---|
| Training Accuracy | 85% |
| Test Accuracy | ~84-85% |
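Accuracy figures like those above come from taking the argmax over the softmax output for each example and comparing against the labels; the helper names here are illustrative.

```python
import numpy as np

def get_predictions(A2):
    # A2: softmax probabilities, shape (10, m); pick the most likely digit.
    return np.argmax(A2, axis=0)

def get_accuracy(predictions, Y):
    # Fraction of examples whose predicted digit matches the label.
    return np.mean(predictions == Y)
```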
Contributions are welcome! Follow these steps:
- Fork the repo
- Create your branch (`git checkout -b feature/new-feature`)
- Commit your changes (`git commit -m 'Add new feature'`)
- Push to the branch (`git push origin feature/new-feature`)
- Open a Pull Request
Created with ❤️ by Chetla Srijith. For queries, raise an issue in the repository or connect on LinkedIn.