A minimal deep learning framework that implements a computational graph with automatic differentiation, inspired by PyTorch's Tensor and autograd features. This project wraps NumPy's ndarray to provide gradient tracking and backpropagation capabilities. It also includes a simple dataset and dataloader system for efficient data loading during training.
- Tensor-like API: Core data structure wraps NumPy arrays, supporting basic tensor operations and autograd.
- Automatic Differentiation: Enables gradient computation for building and training neural networks.
- Dataset & DataLoader: Utilities for loading and batching data, similar to PyTorch's `Dataset` and `DataLoader`.
- Matrix Operations: Implements essential matrix operations such as multiplication, addition, and broadcasting.
- Backpropagation: Supports backpropagation through computational graphs, allowing for efficient gradient updates.
- Neural Network Support: Basic neural network components (e.g., layers, activation functions) are included for building models.
- Extensible Design: Future features (e.g., more operations, GPU support) are planned for further PyTorch-like functionality.
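The core ideas behind the feature list above — a NumPy-wrapping tensor, a computational graph, and backpropagation — can be sketched in a few dozen lines. This is an illustrative sketch, not this project's actual API; the class and method names (`Tensor`, `backward_fn`) are assumptions:

```python
import numpy as np

class Tensor:
    """Minimal sketch of a NumPy-backed tensor with reverse-mode autograd."""

    def __init__(self, data, parents=()):
        self.data = np.asarray(data, dtype=float)
        self.grad = np.zeros_like(self.data)
        self.parents = parents
        self.backward_fn = None  # propagates self.grad to the parents

    def __add__(self, other):
        out = Tensor(self.data + other.data, (self, other))
        def backward_fn():
            self.grad += out.grad
            other.grad += out.grad
        out.backward_fn = backward_fn
        return out

    def __matmul__(self, other):
        out = Tensor(self.data @ other.data, (self, other))
        def backward_fn():
            # standard matmul gradients: dL/dA = G @ B.T, dL/dB = A.T @ G
            self.grad += out.grad @ other.data.T
            other.grad += self.data.T @ out.grad
        out.backward_fn = backward_fn
        return out

    def backward(self):
        # topologically order the graph, then propagate gradients in reverse
        order, seen = [], set()
        def visit(node):
            if id(node) not in seen:
                seen.add(id(node))
                for p in node.parents:
                    visit(p)
                order.append(node)
        visit(self)
        self.grad = np.ones_like(self.data)
        for node in reversed(order):
            if node.backward_fn:
                node.backward_fn()

# usage: build a small graph and backpropagate through it
x = Tensor([[1., 2.], [3., 4.]])
w = Tensor([[5., 6.], [7., 8.]])
y = x @ w
y.backward()  # x.grad and w.grad are now populated
```

Each operation records how to push its output gradient back to its inputs; `backward()` just replays those closures in reverse topological order.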
- Clone the repository.
- Install the dependencies (e.g. NumPy) listed in `requirements.txt`.
- Explore the `Node` in `Graph`, `Dataset`, and `DataLoader` modules in `examples`; for example, run `python -m examples.matrix_operations`.
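A PyTorch-style `Dataset`/`DataLoader` pair of the kind described above might look roughly like the following. This is a sketch under assumed names, not the project's actual modules:

```python
import numpy as np

class Dataset:
    """Index-addressable collection of (input, target) pairs."""
    def __init__(self, xs, ys):
        self.xs, self.ys = xs, ys

    def __len__(self):
        return len(self.xs)

    def __getitem__(self, i):
        return self.xs[i], self.ys[i]

class DataLoader:
    """Iterates over a Dataset in (optionally shuffled) mini-batches."""
    def __init__(self, dataset, batch_size=2, shuffle=False):
        self.dataset = dataset
        self.batch_size = batch_size
        self.shuffle = shuffle

    def __iter__(self):
        idx = np.arange(len(self.dataset))
        if self.shuffle:
            np.random.shuffle(idx)
        for start in range(0, len(idx), self.batch_size):
            batch = [self.dataset[i] for i in idx[start:start + self.batch_size]]
            xs, ys = zip(*batch)
            yield np.stack(xs), np.stack(ys)

# usage: 5 samples in batches of 2 yields batches of size 2, 2, 1
data = DataLoader(Dataset(np.arange(10.).reshape(5, 2), np.arange(5.)), batch_size=2)
batches = list(data)
```

Implementing `__len__`/`__getitem__` on the dataset and `__iter__` on the loader keeps both compatible with plain Python iteration, mirroring PyTorch's design.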
- Recreating PyTorch from Scratch with GPU Support and Automatic Differentiation: A comprehensive guide to building a PyTorch-like framework, covering computational graphs, autograd, and GPU support.
- PyNorch Tensor Implementation: An open-source project that implements a minimal PyTorch-like tensor with autograd, serving as a reference for design and implementation.
- CS231n: Vector Calculus Review and Reference: A concise summary of vector and matrix calculus, essential for understanding backpropagation and gradient computation.
- Matrix Calculus for Deep Learning: A concise document explaining matrix calculus rules, useful for implementing gradients in computational graphs.
- Explained.ai: Matrix Calculus: An intuitive explanation of matrix calculus concepts, with visualizations and practical examples.
- Matrix Calculus (for Machine Learning and Beyond): A comprehensive, modern treatment of calculus involving matrices, designed to bridge the gap between traditional calculus and the differentiation needed for machine learning, optimization, and scientific computing.
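As a concrete instance of the matrix-calculus rules these references cover: for a scalar loss `L = sum(A @ B)`, the standard results are `dL/dA = G @ B.T` and `dL/dB = A.T @ G`, where `G = dL/dC` for `C = A @ B`. The rule can be sanity-checked numerically with a finite difference:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 2))

# L = sum(A @ B); G = dL/dC is all ones since L sums every entry of C.
G = np.ones((3, 2))
dA = G @ B.T   # analytic dL/dA
dB = A.T @ G   # analytic dL/dB

# finite-difference check on a single entry of A
eps = 1e-6
A_pert = A.copy()
A_pert[1, 2] += eps
numeric = ((A_pert @ B).sum() - (A @ B).sum()) / eps
```

Because `L` is linear in each entry of `A`, the finite difference here recovers the analytic gradient almost exactly.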
- PyTorch Documentation: torch.matmul: Official documentation for matrix multiplication in PyTorch, describing broadcasting and operation semantics.
- PEP 465: A dedicated infix operator for matrix multiplication: The Python Enhancement Proposal that introduced the `@` operator for matrix multiplication, relevant for designing intuitive APIs.
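PEP 465 routes the `@` operator to a class's `__matmul__` method, which is how a tensor wrapper can offer PyTorch-style syntax. A minimal illustration (the `Matrix` name is hypothetical, not this project's class):

```python
import numpy as np

class Matrix:
    """Toy wrapper showing how `@` dispatches to __matmul__ (PEP 465)."""
    def __init__(self, data):
        self.data = np.asarray(data, dtype=float)

    def __matmul__(self, other):
        return Matrix(self.data @ other.data)

# identity @ X leaves X unchanged
m = Matrix([[1., 0.], [0., 1.]]) @ Matrix([[2., 3.], [4., 5.]])
```

Defining `__matmul__` (and, if needed, `__rmatmul__` and `__imatmul__` for reflected and in-place forms) is all that is required for a custom type to participate in `@` expressions.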
This project is a work in progress. Contributions and suggestions are welcome!