Name:           micrograd
Version:        0.1.0
Release:        1%{?dist}
Summary:        A tiny scalar-valued autograd engine and a neural net library on top
License:        MIT
URL:            https://github.com/karpathy/micrograd
Source0:        %url/archive/%{version}.tar.gz#/%{name}-%{version}.tar.gz
# https://files.pythonhosted.org/packages/1c/dc/9354ac4d39b589f2f3bea4b51195936673c211484fc069c147b1cc1196f0/micrograd-0.1.0.tar.gz

BuildArch:      noarch
BuildRequires:  python3-devel
BuildRequires:  python3-setuptools

%description
A tiny Autograd engine (with a bite! :)). Implements backpropagation
(reverse-mode autodiff) over a dynamically built DAG and a small neural
networks library on top of it with a PyTorch-like API. Both are tiny, with
about 100 and 50 lines of code respectively. The DAG only operates over
scalar values, so e.g. we chop up each neuron into all of its individual
tiny adds and multiplies. However, this is enough to build up entire deep
neural nets doing binary classification, as the demo notebook shows.
Potentially useful for educational purposes.

%prep
%autosetup -p1

%generate_buildrequires
%pyproject_buildrequires

%build
%pyproject_wheel

%install
%pyproject_install
%pyproject_save_files micrograd

%files -f %{pyproject_files}
#%%license LICENSE
#%%doc README.md

%changelog
* Tue Jun 20 2023 Peter Robinson - 0.1.0-1
- Initial package
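
%check
# A minimal smoke-test sketch: upstream 0.1.0 ships no test suite in the
# tarball, so this only verifies that the packaged modules import cleanly.
# %%pyproject_check_import is provided by pyproject-rpm-macros and walks the
# module list recorded by %%pyproject_save_files above.
%pyproject_check_import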