nhphucqt/ComputationalGraph


ComputationalGraph

A minimal deep learning framework that implements a computational graph with automatic differentiation, inspired by PyTorch's Tensor and autograd features. This project wraps NumPy's ndarray to provide gradient tracking and backpropagation capabilities. It also includes a simple dataset and dataloader system for efficient data loading during training.

Features

  • Tensor-like API: Core data structure wraps NumPy arrays, supporting basic tensor operations and autograd.
  • Automatic Differentiation: Enables gradient computation for building and training neural networks.
  • Dataset & DataLoader: Utilities for loading and batching data, similar to PyTorch's Dataset and DataLoader.
  • Matrix Operations: Implements essential matrix operations like multiplication, addition, and broadcasting.
  • Backpropagation: Supports backpropagation through computational graphs, allowing for efficient gradient updates.
  • Neural Network Support: Basic neural network components (e.g., layers, activation functions) are included for building models.
  • Extensible Design: Additional operations and features (e.g., GPU support) are planned to bring the framework closer to PyTorch's functionality.
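To illustrate the first two features, here is a minimal sketch of a node that wraps a NumPy array and supports reverse-mode automatic differentiation. The class and method names are illustrative only (broadcasting is ignored for brevity); see the project's modules for its actual API.

```python
import numpy as np

class Node:
    """Illustrative graph node: wraps a NumPy array and tracks gradients."""
    def __init__(self, value, parents=(), backward_fns=()):
        self.value = np.asarray(value, dtype=float)
        self.grad = np.zeros_like(self.value)
        self._parents = parents            # upstream nodes in the graph
        self._backward_fns = backward_fns  # one gradient function per parent

    def __add__(self, other):
        # d(a+b)/da = d(a+b)/db = 1, so both parents receive the gradient as-is
        return Node(self.value + other.value,
                    parents=(self, other),
                    backward_fns=(lambda g: g, lambda g: g))

    def __mul__(self, other):
        # d(a*b)/da = b and d(a*b)/db = a
        return Node(self.value * other.value,
                    parents=(self, other),
                    backward_fns=(lambda g: g * other.value,
                                  lambda g: g * self.value))

    def backward(self):
        # Seed the output gradient with ones, then accumulate gradients
        # into parents in reverse topological order.
        order, seen = [], set()
        def visit(node):
            if id(node) not in seen:
                seen.add(id(node))
                for p in node._parents:
                    visit(p)
                order.append(node)
        visit(self)
        self.grad = np.ones_like(self.value)
        for node in reversed(order):
            for parent, fn in zip(node._parents, node._backward_fns):
                parent.grad = parent.grad + fn(node.grad)
```

For z = x * y + x, calling z.backward() gives x.grad = y + 1 and y.grad = x by the chain rule, with the gradient from both uses of x accumulated correctly.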

Getting Started

  1. Clone the repository.
  2. Install the dependencies listed in requirements.txt (e.g., NumPy).
  3. Explore the Node (computational graph), Dataset, and DataLoader modules through the scripts in examples, for instance: python -m examples.matrix_operations.
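As a rough sketch of what dataset/dataloader utilities in this style provide, the pair below offers index-based access plus shuffled mini-batching, similar to PyTorch's Dataset and DataLoader. The names and signatures here are hypothetical; consult the project's modules for the real interfaces.

```python
import numpy as np

class Dataset:
    """Illustrative dataset: index-based access over paired arrays."""
    def __init__(self, inputs, targets):
        self.inputs, self.targets = inputs, targets

    def __len__(self):
        return len(self.inputs)

    def __getitem__(self, i):
        return self.inputs[i], self.targets[i]

class DataLoader:
    """Illustrative loader: yields (optionally shuffled) mini-batches."""
    def __init__(self, dataset, batch_size=2, shuffle=True):
        self.dataset = dataset
        self.batch_size = batch_size
        self.shuffle = shuffle

    def __iter__(self):
        idx = np.arange(len(self.dataset))
        if self.shuffle:
            np.random.shuffle(idx)
        for start in range(0, len(idx), self.batch_size):
            batch = [self.dataset[i] for i in idx[start:start + self.batch_size]]
            xs, ys = zip(*batch)
            yield np.stack(xs), np.stack(ys)
```

Iterating over the loader then drives a training loop: each iteration yields one stacked input batch and its matching target batch, with a smaller final batch when the dataset size is not a multiple of batch_size.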

References

  • Core Concepts
  • Matrix Calculus
  • Matrix Multiplication
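The matrix-calculus references above underpin backpropagation through matrix multiplication: for C = A @ B with upstream gradient G = ∂L/∂C, the parameter gradients are ∂L/∂A = G @ Bᵀ and ∂L/∂B = Aᵀ @ G. A quick NumPy check of the analytic gradient against a finite-difference estimate (variable names here are for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 2))

def loss(A, B):
    # L = sum(A @ B), so the upstream gradient G = dL/dC is all ones
    return np.sum(A @ B)

G = np.ones((3, 2))
dA = G @ B.T   # analytic gradient of L w.r.t. A
dB = A.T @ G   # analytic gradient of L w.r.t. B

# Finite-difference check on one entry of A
eps = 1e-6
A_pert = A.copy()
A_pert[0, 0] += eps
numeric = (loss(A_pert, B) - loss(A, B)) / eps
assert abs(numeric - dA[0, 0]) < 1e-4
```

The same two formulas are all a computational-graph node for matmul needs in its backward pass.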


This project is a work in progress. Contributions and suggestions are welcome!

About

Just a dumb project for a dumb purpose
