%global debug_package %{nil}

Name:           ramalama
Version:        0.1
Release:        1%{?dist}
Summary:        Make working with AI boring through the use of OCI containers

License:        MIT
URL:            https://github.com/containers/%{name}
Source0:        %{url}/archive/%{name}-%{version}.tar.gz
Source1:        https://raw.githubusercontent.com/containers/%{name}/s/%{name}.py

Requires:       podman
Requires:       python3-devel
BuildRequires:  make
BuildRequires:  go-md2man

%description
The Ramalama project's goal is to make working with AI boring through
the use of OCI containers.

On first run, Ramalama inspects your system for GPU support, falling
back to CPU support if no GPUs are present. It then uses a container
engine such as Podman to pull the appropriate OCI image, which contains
all of the software necessary to run an AI model for your system's
setup. This eliminates the need for the user to configure the system
for AI themselves. After initialization, Ramalama runs the AI models
within a container based on that OCI image.

%prep
%autosetup -p1
sed -i "/install -m 0644 links\/ramalama/d" docs/Makefile

%build

%install
mkdir -p %{buildroot}%{_prefix}/local/bin/
mkdir -p %{buildroot}%{_prefix}/local/share/man/man1/
make -C docs install DESTDIR=%{buildroot}
install -m755 %{SOURCE1} %{buildroot}%{_prefix}/local/bin/ramalama

%files
%license LICENSE
%{_prefix}/local/bin/ramalama
%{_prefix}/local/share/man/man1/*

%changelog
* Mon Sep 02 2024 Ian Mullins - 0.1-1
- Initial version.