%global pypi_name ramalama
%global desc %{pypi_name} is a command line tool for working with AI LLM models.

%define _python_dist_allow_version_zero 1

%global forgeurl https://github.com/containers/ramalama
# see ramalama/version.py
%global version0 0.0.18
%forgemeta

Name:           python-%{pypi_name}
Version:        %{forgeversion}
License:        Apache-2.0
Release:        1%{?dist}
Summary:        Command line tool for working with AI LLM models
URL:            %{forgeurl}
# Tarball fetched from upstream
Source0:        %{forgesource}

BuildArch:      noarch

BuildRequires:  git-core
BuildRequires:  golang
BuildRequires:  golang-github-cpuguy83-md2man
BuildRequires:  make
BuildRequires:  pyproject-rpm-macros
BuildRequires:  python%{python3_pkgversion}-argcomplete
BuildRequires:  python%{python3_pkgversion}-devel
BuildRequires:  python%{python3_pkgversion}-pip
BuildRequires:  python%{python3_pkgversion}-setuptools
BuildRequires:  python%{python3_pkgversion}-wheel

%description
%desc

On first run, RamaLama inspects your system for GPU support, falling back to
CPU support if no GPUs are present. It then uses a container engine such as
Podman to pull the appropriate OCI image with all of the software necessary to
run an AI model for your system's setup, eliminating the need for users to
configure the system for AI themselves. After initialization, RamaLama runs AI
models within a container based on that OCI image.
%package -n python%{python3_pkgversion}-%{pypi_name}
Summary:        %{summary}
Recommends:     podman
Provides:       %{pypi_name} = %{version}-%{release}
%{?python_provide:%python_provide python%{python3_pkgversion}-%{pypi_name}}

%description -n python%{python3_pkgversion}-%{pypi_name}
%desc

%prep
%forgesetup

%build
%pyproject_wheel

%install
%pyproject_install
%pyproject_save_files %{pypi_name}
%{__make} DESTDIR=%{buildroot} PREFIX=%{_prefix} install-docs install-shortnames
# older argcomplete does not support zsh
%if 0%{?fedora} >= 40
%{__make} DESTDIR=%{buildroot} PREFIX=%{_prefix} install-completions
%endif

%files -n python%{python3_pkgversion}-%{pypi_name}
%license LICENSE
%doc README.md
%{_bindir}/ramalama
%if 0%{?fedora} >= 40
%{bash_completions_dir}/%{pypi_name}
%{_datadir}/fish/vendor_completions.d/ramalama.fish
%{_datadir}/zsh/site-functions/_ramalama
%endif
%{python3_sitelib}/%{pypi_name}
%{python3_sitelib}/%{pypi_name}-%{version}.dist-info
%dir %{_datadir}/%{pypi_name}
%{_datadir}/%{pypi_name}/shortnames.conf
%{_mandir}/man1/ramalama*.1*

%changelog
* Wed Oct 16 2024 Jens Petersen - 0.0.18-1
- update to 0.0.18 release

* Sun Sep 22 2024 Jens Petersen
- initial package with program (was 0.1.0-0.1)