%bcond_without openmpi

%if 0%{?rhel} && 0%{?rhel} < 7
%ifarch ppc ppc64 s390 s390x
# No mpich in RHEL < 7 for these arches
%bcond_with mpich
%else
%bcond_without mpich
%endif
%else
# Enable mpich on RHEL >= 7 and on Fedora
%bcond_without mpich
%endif

%{!?openblas_arches:%global openblas_arches x86_64 %{ix86} armv7hl %{power64} aarch64}

%ifarch %{openblas_arches}
# matches openblas ExclusiveArch
%bcond_without openblas
%endif

Summary: A subset of LAPACK routines redesigned for heterogeneous computing
Name: scalapack
Version: 2.0.2
Release: 27%{?dist}
# This is freely distributable without any restrictions.
License: Public Domain
Group: Development/Libraries
URL: http://www.netlib.org/scalapack/
Source0: http://www.netlib.org/scalapack/scalapack-%{version}.tgz
BuildRequires: lapack-devel
%if %{with openblas}
BuildRequires: openblas-devel
%else
BuildRequires: blas-devel
%endif
BuildRequires: gcc-gfortran, glibc-devel
%if %{with mpich}
%if 0%{?rhel} && 0%{?rhel} <= 7
BuildRequires: mpich-devel
%else
BuildRequires: mpich-devel-static
%endif
%endif
%if %{with openmpi}
BuildRequires: openmpi-devel
%endif

# Build shared library for ScaLAPACK
Patch0: scalapack-2.0.2-fedora.patch
# Build shared library for BLACS
Patch1: scalapack-2.0.2-shared-blacs.patch
# Add missing functions
Patch2: scalapack-2.0.2-missing-functions.patch
# Still use blas if openblas is not available
Patch3: scalapack-2.0.2-fedora-blas.patch

%description
The ScaLAPACK (or Scalable LAPACK) library includes a subset of LAPACK
routines redesigned for distributed memory MIMD parallel computers. It is
currently written in a Single-Program-Multiple-Data style using explicit
message passing for inter-processor communication. It assumes matrices are
laid out in a two-dimensional block cyclic decomposition. ScaLAPACK is
designed for heterogeneous computing and is portable on any computer that
supports MPI or PVM.
Like LAPACK, the ScaLAPACK routines are based on block-partitioned algorithms
in order to minimize the frequency of data movement between different levels
of the memory hierarchy. (For such machines, the memory hierarchy includes
the off-processor memory of other processors, in addition to the hierarchy
of registers, cache, and local memory on each processor.) The fundamental
building blocks of the ScaLAPACK library are distributed memory versions
(PBLAS) of the Level 1, 2 and 3 BLAS, and a set of Basic Linear Algebra
Communication Subprograms (BLACS) for communication tasks that arise
frequently in parallel linear algebra computations. In the ScaLAPACK
routines, all inter-processor communication occurs within the PBLAS and the
BLACS. One of the design goals of ScaLAPACK was to have the ScaLAPACK
routines resemble their LAPACK equivalents as much as possible.

%package -n blacs-common
Summary: Common files for blacs
Group: Development/Libraries

%description -n blacs-common
The BLACS (Basic Linear Algebra Communication Subprograms) project is an
ongoing investigation whose purpose is to create a linear algebra oriented
message passing interface that may be implemented efficiently and uniformly
across a large range of distributed memory platforms. The length of time
required to implement efficient distributed memory algorithms makes it
impractical to rewrite programs for every new parallel machine. The BLACS
exist in order to make linear algebra applications both easier to program
and more portable.

This package contains common files which are not specific to any MPI
implementation.

%package common
Summary: Common files for scalapack
Group: Development/Libraries

%description common
The ScaLAPACK (or Scalable LAPACK) library includes a subset of LAPACK
routines redesigned for distributed memory MIMD parallel computers. It is
currently written in a Single-Program-Multiple-Data style using explicit
message passing for inter-processor communication.
It assumes matrices are laid out in a two-dimensional block cyclic
decomposition. ScaLAPACK is designed for heterogeneous computing and is
portable on any computer that supports MPI or PVM.

Like LAPACK, the ScaLAPACK routines are based on block-partitioned algorithms
in order to minimize the frequency of data movement between different levels
of the memory hierarchy. (For such machines, the memory hierarchy includes
the off-processor memory of other processors, in addition to the hierarchy
of registers, cache, and local memory on each processor.) The fundamental
building blocks of the ScaLAPACK library are distributed memory versions
(PBLAS) of the Level 1, 2 and 3 BLAS, and a set of Basic Linear Algebra
Communication Subprograms (BLACS) for communication tasks that arise
frequently in parallel linear algebra computations. In the ScaLAPACK
routines, all inter-processor communication occurs within the PBLAS and the
BLACS. One of the design goals of ScaLAPACK was to have the ScaLAPACK
routines resemble their LAPACK equivalents as much as possible.

This package contains common files which are not specific to any MPI
implementation.

%if %{with mpich}
%package -n blacs-mpich
Summary: BLACS libraries compiled against mpich
Group: Development/Libraries
Requires: blacs-common = %{version}-%{release}
Provides: blacs-mpich2 = %{version}-%{release}
Obsoletes: blacs-mpich2 < 1.1-50
Requires: mpich

%description -n blacs-mpich
The BLACS (Basic Linear Algebra Communication Subprograms) project is an
ongoing investigation whose purpose is to create a linear algebra oriented
message passing interface that may be implemented efficiently and uniformly
across a large range of distributed memory platforms. The length of time
required to implement efficient distributed memory algorithms makes it
impractical to rewrite programs for every new parallel machine. The BLACS
exist in order to make linear algebra applications both easier to program
and more portable.
This package contains BLACS libraries compiled with mpich.

%package -n blacs-mpich-devel
Summary: Development libraries for blacs (mpich)
Group: Development/Libraries
Requires: blacs-mpich = %{version}-%{release}
Provides: blacs-mpich2-devel = %{version}-%{release}
Obsoletes: blacs-mpich2-devel < 1.1-50
Requires: mpich-devel

%description -n blacs-mpich-devel
This package contains development libraries for blacs, compiled against
mpich.

%package -n blacs-mpich-static
Summary: Static libraries for blacs (mpich)
Group: Development/Libraries
Requires: blacs-mpich-devel = %{version}-%{release}
Provides: blacs-mpich2-static = %{version}-%{release}
Obsoletes: blacs-mpich2-static < 1.1-50

%description -n blacs-mpich-static
This package contains static libraries for blacs, compiled against mpich.

%package mpich
Summary: ScaLAPACK libraries compiled against mpich
Group: Development/Libraries
Requires: %{name}-common = %{version}-%{release}
Requires: mpich
Provides: %{name}-mpich2 = %{version}-%{release}
Obsoletes: %{name}-mpich2 < 1.7.5-19
# This is a lie, but something needs to obsolete it.
Provides: %{name}-lam = %{version}-%{release}
Obsoletes: %{name}-lam <= 1.7.5-7

%description mpich
The ScaLAPACK (or Scalable LAPACK) library includes a subset of LAPACK
routines redesigned for distributed memory MIMD parallel computers. It is
currently written in a Single-Program-Multiple-Data style using explicit
message passing for inter-processor communication. It assumes matrices are
laid out in a two-dimensional block cyclic decomposition. ScaLAPACK is
designed for heterogeneous computing and is portable on any computer that
supports MPI or PVM.

Like LAPACK, the ScaLAPACK routines are based on block-partitioned algorithms
in order to minimize the frequency of data movement between different levels
of the memory hierarchy.
(For such machines, the memory hierarchy includes the off-processor memory
of other processors, in addition to the hierarchy of registers, cache, and
local memory on each processor.) The fundamental building blocks of the
ScaLAPACK library are distributed memory versions (PBLAS) of the Level 1, 2
and 3 BLAS, and a set of Basic Linear Algebra Communication Subprograms
(BLACS) for communication tasks that arise frequently in parallel linear
algebra computations. In the ScaLAPACK routines, all inter-processor
communication occurs within the PBLAS and the BLACS. One of the design goals
of ScaLAPACK was to have the ScaLAPACK routines resemble their LAPACK
equivalents as much as possible.

This package contains ScaLAPACK libraries compiled with mpich.

%package mpich-devel
Summary: Development libraries for ScaLAPACK (mpich)
Group: Development/Libraries
Requires: %{name}-mpich = %{version}-%{release}
Requires: mpich-devel
Provides: %{name}-lam-devel = %{version}-%{release}
Obsoletes: %{name}-lam-devel <= 1.7.5-7
Provides: %{name}-mpich2-devel = %{version}-%{release}
Obsoletes: %{name}-mpich2-devel < 1.7.5-19

%description mpich-devel
This package contains development libraries for ScaLAPACK, compiled against
mpich.

%package mpich-static
Summary: Static libraries for ScaLAPACK (mpich)
Group: Development/Libraries
Provides: %{name}-lam-static = %{version}-%{release}
Obsoletes: %{name}-lam-static <= 1.7.5-7
Requires: %{name}-mpich-devel = %{version}-%{release}
Provides: %{name}-mpich2-static = %{version}-%{release}
Obsoletes: %{name}-mpich2-static < 1.7.5-19

%description mpich-static
This package contains static libraries for ScaLAPACK, compiled against
mpich.
%endif

%if %{with openmpi}
%package -n blacs-openmpi
Summary: BLACS libraries compiled against openmpi
Group: Development/Libraries
Requires: blacs-common = %{version}-%{release}
Requires: openmpi

%description -n blacs-openmpi
The BLACS (Basic Linear Algebra Communication Subprograms) project is an
ongoing investigation whose purpose is to create a linear algebra oriented
message passing interface that may be implemented efficiently and uniformly
across a large range of distributed memory platforms. The length of time
required to implement efficient distributed memory algorithms makes it
impractical to rewrite programs for every new parallel machine. The BLACS
exist in order to make linear algebra applications both easier to program
and more portable.

This package contains BLACS libraries compiled with openmpi.

%package -n blacs-openmpi-devel
Summary: Development libraries for blacs (openmpi)
Group: Development/Libraries
Requires: blacs-openmpi = %{version}-%{release}
Requires: openmpi-devel

%description -n blacs-openmpi-devel
This package contains development libraries for blacs, compiled against
openmpi.

%package -n blacs-openmpi-static
Summary: Static libraries for blacs (openmpi)
Group: Development/Libraries
Requires: blacs-openmpi-devel = %{version}-%{release}

%description -n blacs-openmpi-static
This package contains static libraries for blacs, compiled against openmpi.

%package openmpi
Summary: ScaLAPACK libraries compiled against openmpi
Group: Development/Libraries
Requires: %{name}-common = %{version}-%{release}
Requires: openmpi

%description openmpi
The ScaLAPACK (or Scalable LAPACK) library includes a subset of LAPACK
routines redesigned for distributed memory MIMD parallel computers. It is
currently written in a Single-Program-Multiple-Data style using explicit
message passing for inter-processor communication. It assumes matrices are
laid out in a two-dimensional block cyclic decomposition.
ScaLAPACK is designed for heterogeneous computing and is portable on any
computer that supports MPI or PVM.

Like LAPACK, the ScaLAPACK routines are based on block-partitioned algorithms
in order to minimize the frequency of data movement between different levels
of the memory hierarchy. (For such machines, the memory hierarchy includes
the off-processor memory of other processors, in addition to the hierarchy
of registers, cache, and local memory on each processor.) The fundamental
building blocks of the ScaLAPACK library are distributed memory versions
(PBLAS) of the Level 1, 2 and 3 BLAS, and a set of Basic Linear Algebra
Communication Subprograms (BLACS) for communication tasks that arise
frequently in parallel linear algebra computations. In the ScaLAPACK
routines, all inter-processor communication occurs within the PBLAS and the
BLACS. One of the design goals of ScaLAPACK was to have the ScaLAPACK
routines resemble their LAPACK equivalents as much as possible.

This package contains ScaLAPACK libraries compiled with openmpi.

%package openmpi-devel
Summary: Development libraries for ScaLAPACK (openmpi)
Group: Development/Libraries
Requires: %{name}-openmpi = %{version}-%{release}
Requires: openmpi-devel

%description openmpi-devel
This package contains development libraries for ScaLAPACK, compiled against
openmpi.

%package openmpi-static
Summary: Static libraries for ScaLAPACK (openmpi)
Group: Development/Libraries
Requires: %{name}-openmpi-devel = %{version}-%{release}

%description openmpi-static
This package contains static libraries for ScaLAPACK, compiled against
openmpi.
%endif

%prep
%setup -q -c -n %{name}-%{version}
%patch0 -p1
%patch1 -p1
%patch2 -p1
%if %{without openblas}
%patch3 -p1
%endif

for i in %{?with_mpich:mpich} %{?with_openmpi:openmpi}; do
  cp -a %{name}-%{version} %{name}-%{version}-$i
done

%build
%global dobuild() \
cd %{name}-%{version}-$MPI_COMPILER_NAME ; \
make lib ; \
cd ..
%if %{with mpich}
# Build mpich version
export MPI_COMPILER_NAME=mpich
%{_mpich_load}
RPM_OPT_FLAGS=`echo $CFLAGS`
%dobuild
%{_mpich_unload}
%endif

%if %{with openmpi}
# Build OpenMPI version
export MPI_COMPILER_NAME=openmpi
%{_openmpi_load}
RPM_OPT_FLAGS=`echo $CFLAGS`
%dobuild
%{_openmpi_unload}
%endif

%install
mkdir -p ${RPM_BUILD_ROOT}%{_libdir}
mkdir -p ${RPM_BUILD_ROOT}%{_bindir}

for i in %{?with_mpich:mpich} %{?with_openmpi:openmpi}; do
  mkdir -p %{buildroot}%{_libdir}/$i/lib/
  pushd %{name}-%{version}-$i
  for f in *.a *.so*; do
    cp -f $f %{buildroot}%{_libdir}/$i/lib/$f
  done
  popd
  mkdir -p %{buildroot}%{_includedir}/$i-%{_arch}/
  # This file is independent of the MPI compiler used, but it is poorly
  # named, so we'll put it in %%{_includedir}/blacs/
  mkdir -p %{buildroot}%{_includedir}/blacs/
  install -p %{name}-%{version}-$i/BLACS/SRC/Bdef.h %{buildroot}%{_includedir}/blacs/
  pushd %{buildroot}%{_libdir}/$i/lib/
  ln -fs libscalapack.so.2.0.0 libscalapack.so.2
  ln -fs libscalapack.so.2.0.0 libscalapack.so
  # ln -fs libmpiblacs.so.2.0.0 libmpiblacs.so.2
  # ln -fs libmpiblacs.so.2.0.0 libmpiblacs.so
  popd
done

# Copy docs
cd %{name}-%{version}
cp -f README ../

%files common
%doc README

%files -n blacs-common
%{_includedir}/blacs/

%if %{with mpich}
%files -n blacs-mpich
%{_libdir}/mpich/lib/libmpiblacs*.so.*

%files -n blacs-mpich-devel
%{_includedir}/mpich-%{_arch}/
%{_libdir}/mpich/lib/libmpiblacs*.so

%files -n blacs-mpich-static
%{_libdir}/mpich/lib/libmpiblacs*.a

%files mpich
%{_libdir}/mpich/lib/libscalapack.so.*

%files mpich-devel
%{_libdir}/mpich/lib/libscalapack.so

%files mpich-static
%{_libdir}/mpich/lib/libscalapack.a
%endif

%if %{with openmpi}
%files -n blacs-openmpi
%{_libdir}/openmpi/lib/libmpiblacs*.so.*

%files -n blacs-openmpi-devel
%{_includedir}/openmpi-%{_arch}/
%{_libdir}/openmpi/lib/libmpiblacs*.so

%files -n blacs-openmpi-static
%{_libdir}/openmpi/lib/libmpiblacs*.a

%files openmpi
%{_libdir}/openmpi/lib/libscalapack.so.*

%files openmpi-devel
%{_libdir}/openmpi/lib/libscalapack.so

%files openmpi-static
%{_libdir}/openmpi/lib/libscalapack.a
%endif

%changelog
* Mon Nov 26 2018 Orion Poplawski - 2.0.2-27
- Rebuild for openmpi 4.0

* Sat Jul 14 2018 Fedora Release Engineering - 2.0.2-26
- Rebuilt for https://fedoraproject.org/wiki/Fedora_29_Mass_Rebuild

* Fri Feb 09 2018 Fedora Release Engineering - 2.0.2-25
- Rebuilt for https://fedoraproject.org/wiki/Fedora_28_Mass_Rebuild

* Thu Feb 01 2018 Ralf Corsépius - 2.0.2-24
- Rebuilt for GCC-8.0.1.

* Thu Aug 03 2017 Fedora Release Engineering - 2.0.2-23
- Rebuilt for https://fedoraproject.org/wiki/Fedora_27_Binutils_Mass_Rebuild

* Thu Jul 27 2017 Fedora Release Engineering - 2.0.2-22
- Rebuilt for https://fedoraproject.org/wiki/Fedora_27_Mass_Rebuild

* Mon Feb 06 2017 Zbigniew Jędrzejewski-Szmek - 2.0.2-21
- Rebuild for libgfortran.so.4

* Tue Nov 29 2016 Dan Horák - 2.0.2-20
- still use blas if openblas is not available

* Mon Nov 28 2016 Tom Callaway - 2.0.2-19
- build against openblas

* Tue Oct 25 2016 Dan Horák - 2.0.2-18
- enable build on s390(x) and fix the logic for alt-arches in Fedora

* Fri Oct 21 2016 Orion Poplawski - 2.0.2-17
- Rebuild for openmpi 2.0

* Fri Aug 26 2016 Tom Callaway - 2.0.2-16
- conditionalize mpich cases on old rhel arches

* Thu Jul 28 2016 Tom Callaway - 2.0.2-15
- fix scalapack shared library to properly have blacs inside it
- add explicit openmpi/mpich requires

* Thu Feb 04 2016 Fedora Release Engineering - 2.0.2-14
- Rebuilt for https://fedoraproject.org/wiki/Fedora_24_Mass_Rebuild

* Tue Jan 26 2016 Tom Callaway - 2.0.2-13
- use global instead of define

* Tue Jan 19 2016 Orion Poplawski - 2.0.2-12
- Make blacs-openmpi require blacs-common (bug #1299939)

* Tue Sep 15 2015 Orion Poplawski - 2.0.2-11
- Rebuild for openmpi 1.10.0

* Sat Aug 15 2015 Zbigniew Jędrzejewski-Szmek - 2.0.2-10
- Rebuild for MPI provides

* Sun Jul 26 2015 Sandro Mani - 2.0.2-9
- Rebuild for RPM MPI Requires Provides Change

* Fri Jun 19 2015 Fedora Release Engineering - 2.0.2-8
- Rebuilt for https://fedoraproject.org/wiki/Fedora_23_Mass_Rebuild

* Sat Mar 14 2015 Thomas Spura - 2.0.2-7
- Rebuild for changed mpich libraries

* Thu Dec 18 2014 Tom Callaway - 2.0.2-6
- add missing functions (thanks to d.loveliverpool.ac.uk)

* Thu Sep 4 2014 Thomas Spura - 2.0.2-5
- rebuild for changed library inside openmpi (#1135728)

* Mon Aug 18 2014 Fedora Release Engineering - 2.0.2-4
- Rebuilt for https://fedoraproject.org/wiki/Fedora_21_22_Mass_Rebuild

* Tue Jul 1 2014 Tom Callaway - 2.0.2-3
- explictly link to other dependent libs

* Sun Jun 08 2014 Fedora Release Engineering - 2.0.2-2
- Rebuilt for https://fedoraproject.org/wiki/Fedora_21_Mass_Rebuild

* Sat May 3 2014 Tom Callaway - 2.0.2-1
- update to 2.0.2
- subpackage blacs

* Sun Aug 04 2013 Fedora Release Engineering - 1.7.5-20
- Rebuilt for https://fedoraproject.org/wiki/Fedora_20_Mass_Rebuild

* Sat Jul 20 2013 Deji Akingunola - 1.7.5-19
- Rename mpich2 sub-packages to mpich and rebuild for mpich-3.0

* Thu Feb 14 2013 Fedora Release Engineering - 1.7.5-18
- Rebuilt for https://fedoraproject.org/wiki/Fedora_19_Mass_Rebuild

* Thu Nov 15 2012 Tom Callaway - 1.7.5-17
- rebuild for new mpich2

* Sat Jul 21 2012 Fedora Release Engineering - 1.7.5-16
- Rebuilt for https://fedoraproject.org/wiki/Fedora_18_Mass_Rebuild

* Sat Jan 14 2012 Fedora Release Engineering - 1.7.5-15
- Rebuilt for https://fedoraproject.org/wiki/Fedora_17_Mass_Rebuild

* Wed Aug 03 2011 Jussi Lehtola - 1.7.5-14
- Bump spec.

* Wed Aug 03 2011 Jussi Lehtola - 1.7.5-13
- Honor MPI guidelines wrt placement of libraries.
- Drop unnecessary module file.
- A few rpmlint fixes.

* Tue Mar 29 2011 Deji Akingunola - 1.7.5-12
- Rebuild for mpich2 soname bump

* Wed Feb 09 2011 Fedora Release Engineering - 1.7.5-12
- Rebuilt for https://fedoraproject.org/wiki/Fedora_15_Mass_Rebuild

* Tue Oct 19 2010 Deji Akingunola - 1.7.5-11
- Rebuild for both mpich2 and openmpi updates

* Wed Jul 7 2010 Tom "spot" Callaway - 1.7.5-10
- Move all BuildRequires to the top of the spec file
- -static packages now Require matching -devel package, they're not very
  useful otherwise

* Tue Dec 15 2009 Deji Akingunola - 1.7.5-9
- Buildrequire mpich2-devel-static

* Wed Dec 9 2009 Tom "spot" Callaway - 1.7.5-8
- drop lam support (Provides/Obsoletes by mpich2, which is a hack, but
  something's gotta do it)
- move static libs to static subpackages (resolves bz 545150)

* Thu Aug 6 2009 Tom "spot" Callaway - 1.7.5-7
- rework package to handle all supported MPI environments in Fedora

* Sun Jul 26 2009 Fedora Release Engineering - 1.7.5-6
- Rebuilt for https://fedoraproject.org/wiki/Fedora_12_Mass_Rebuild

* Wed Feb 25 2009 Fedora Release Engineering - 1.7.5-5
- Rebuilt for https://fedoraproject.org/wiki/Fedora_11_Mass_Rebuild

* Tue Sep 23 2008 Tom "spot" Callaway 1.7.5-4
- incorporate Deji Akingunola's changes (bz 462424)
- build against openmpi instead of lam

* Tue Jul 8 2008 Tom "spot" Callaway 1.7.5-3
- fix compile against new lam paths

* Wed Feb 13 2008 Tom "spot" Callaway 1.7.5-2
- rebuild for new gcc

* Mon Aug 27 2007 Tom "spot" Callaway 1.7.5-1.1
- rebuild for BuildID

* Thu Jan 18 2007 Tom "spot" Callaway 1.7.5-1
- bump to 1.7.5

* Fri Sep 15 2006 Tom "spot" Callaway 1.7.4-4
- I said "BR" not "R". Stupid packager.

* Fri Sep 15 2006 Tom "spot" Callaway 1.7.4-3
- fix BR: lam-devel

* Fri Sep 15 2006 Tom "spot" Callaway 1.7.4-2
- fix 64bit patch

* Fri Sep 15 2006 Tom "spot" Callaway 1.7.4-1
- bump to 1.7.4

* Wed Mar 8 2006 Tom "spot" Callaway 1.7-13
- lam moved into _libdir/lam... need to fix patches

* Wed Mar 8 2006 Tom "spot" Callaway 1.7-12
- set -fPIC as NOOPT

* Sun Feb 26 2006 Tom "spot" Callaway 1.7-11
- fix 64 bit builds
- enable shared libraries
- split package into base and devel

* Tue Feb 14 2006 Tom "spot" Callaway 1.7-10
- Incorporate Andrew Gormanly's fixes

* Mon Jan 9 2006 Tom "spot" Callaway 1.7-9
- fix BR

* Mon Dec 19 2005 Tom "spot" Callaway 1.7-8
- rebuild for gcc4.1

* Sun May 15 2005 Tom "spot" Callaway 1.7-7
- 64 bit library fix

* Mon May 9 2005 Tom "spot" Callaway 1.7-6
- remove hardcoded dist tags

* Sun May 8 2005 Tom "spot" Callaway 1.7-4
- fix broken patch for fc-3 branch

* Sun Apr 24 2005 Tom "spot" Callaway 1.7-3
- use dist tag
- fix fc3 BuildRequires

* Tue Apr 19 2005 Tom "spot" Callaway 1.7-2
- fix buildroot
- add gcc-gfortran to BuildRequires (gcc-g77 for fc3)

* Mon Apr 18 2005 Tom "spot" Callaway 1.7-1
- initial package creation