Molcas Forum

Support and discussions for Molcas and OpenMolcas users and developers


#1 2022-06-02 07:52:43

shaoh
Member
Registered: 2019-07-18
Posts: 10

Problem with parallelization

I tried to install OpenMolcas on a new cluster. Everything went well until I received an error message in the middle of "make". Below is some more information that might be useful:

****************** successful (or maybe not?) cmake *****************************************

(huiling) [huilings@login4 OpenMolcas]$ mkdir build
(huiling) [huilings@login4 OpenMolcas]$ cd build
(huiling) [huilings@login4 build]$ CC=mpicc FC=mpifort cmake -DMPI=ON -DGA=ON -DGA_BUILD=ON ..
-- The Fortran compiler identification is GNU 7.5.0
-- The C compiler identification is GNU 7.5.0
-- Detecting Fortran compiler ABI info
-- Detecting Fortran compiler ABI info - done
-- Check for working Fortran compiler: /u/local/mpi/openmpi/1.10.6/gcc-7.5.0/bin/mpifort - skipped
-- Checking whether /u/local/mpi/openmpi/1.10.6/gcc-7.5.0/bin/mpifort supports Fortran 90
-- Checking whether /u/local/mpi/openmpi/1.10.6/gcc-7.5.0/bin/mpifort supports Fortran 90 - yes
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /u/local/mpi/openmpi/1.10.6/gcc-7.5.0/bin/mpicc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Found Python: /u/home/h/huilings/.conda/envs/huiling/bin/python3.7 (found version "3.7.10") found components: Interpreter
Configuring compilers:
Detecting Molcas version info:
-- OPENMOLCAS_VERSION: v22.02-324-gf8fe719
Detecting system info:
-- OS: Linux-x86_64
-- ADDRMODE: 64
-- PLATFORM: LINUX64
Configuring with MPI parallellization:
-- Found MPI_C: /u/local/mpi/openmpi/1.10.6/gcc-7.5.0/bin/mpicc (found version "3.0")
-- Found MPI_Fortran: /u/local/mpi/openmpi/1.10.6/gcc-7.5.0/bin/mpifort (found version "3.0")
-- Found MPI: TRUE (found version "3.0")
-- MPI_C_INCLUDE_PATH:
-- MPI_Fortran_INCLUDE_PATH:
-- MPI_C_LIBRARIES:
-- MPI_Fortran_LIBRARIES:
-- MPIEXEC: /u/local/mpi/openmpi/1.10.6/gcc-7.5.0/bin/mpiexec
-- MPI_IMPLEMENTATION: openmpi
Configuring HDF5 support:
-- HDF5 C compiler wrapper is unable to compile a minimal HDF5 program.
-- Found HDF5: /usr/lib64/libhdf5.so (found version "1.8.12") found components: C
-- HDF5_INCLUDE_DIRS: /usr/include
-- HDF5_C_LIBRARIES: /usr/lib64/libhdf5.so
Configuring linear algebra libraries:
-- Using internal LAPACK+BLAS libraries (SLOW!)
-- LINALG_LIBRARIES: lapack;blas
Configuring Libxc:
-- Configuring built-in Libxc
-- Libxc_INCLUDE_DIRS: /u/home/h/huilings/software/OpenMolcas/build/External/Libxc_install/include
-- Libxc_LIBRARIES: xcf03;xc
Configuring built-in Global Arrays library:
-- GA_INCLUDE_PATH: /u/home/h/huilings/software/OpenMolcas/build/External/GlobalArrays_install/include/ga
-- GA_LIBRARIES: ga;armci
Gromacs interface DISABLED
BLOCK interface DISABLED
CHEMPS2 interface DISABLED
MSYM support DISABLED
QCMaquis DMRG support DISABLED
NECI interface DISABLED
EFP interface DISABLED
GEN1INT support DISABLED
libwfa support DISABLED
NEVPT2 support DISABLED
MolGUI DISABLED
Configuring runtime environment settings:
-- DEFMOLCASMEM:  2048
-- DEFMOLCASDISK: 20000
-- RUNSCRIPT:    $program $input
-- RUNBINARY:    /u/local/mpi/openmpi/1.10.6/gcc-7.5.0/bin/mpiexec -n $MOLCAS_NPROCS  $program
-- RUNBINARYSER: $program
Build type: Release
-- C compiler: /u/local/mpi/openmpi/1.10.6/gcc-7.5.0/bin/mpicc
-- C compiler flags:  -std=gnu99  -O2
-- Fortran compiler: /u/local/mpi/openmpi/1.10.6/gcc-7.5.0/bin/mpifort
-- Fortran compiler flags:  -fno-aggressive-loop-optimizations -cpp -fdefault-integer-8 -fmax-stack-var-size=1048576 -O2
-- Definitions: _MOLCAS_;_I8_;_LINUX_;_MOLCAS_MPP_;_GA_
-- Debug definitions:
-- pymolcas: added to targets
Definitions: -D_MOLCAS_;-D_I8_;-D_LINUX_;-D_MOLCAS_MPP_;-D_GA_
Configuring documentation
-- Sphinx compiler: no sphinx-build available, documentation disabled
Install directory: /opt/OpenMolcas
-- Configuring done
-- Generating done
-- Build files have been written to: /u/home/h/huilings/software/OpenMolcas/build

************************make in progress for about 30 mins *************************************

(huiling) [huilings@login4 build]$ make
Scanning dependencies of target lapack
[  0%] Building Fortran object CMakeFiles/lapack.dir/src/LinAlg_internal/la_constants_.F90.o
[  0%] Building Fortran object CMakeFiles/lapack.dir/src/LinAlg_internal/la_xisnan_.F90.o
[  0%] Building Fortran object CMakeFiles/lapack.dir/src/LinAlg_internal/lapack.f.o

*******************************Error approaching *********************************************

[ 63%] Building Fortran object CMakeFiles/alaska/CMakeFiles/alaska_obj.dir/drvn1_emb.F90.o
[ 63%] Building Fortran object CMakeFiles/alaska/CMakeFiles/alaska_obj.dir/drvn1.F90.o
[ 63%] Building Fortran object CMakeFiles/alaska/CMakeFiles/alaska_obj.dir/inputg.F90.o
[ 63%] Built target alaska_obj
Scanning dependencies of target alaska
[ 63%] Linking Fortran static library ../../lib/libalaska.a
[ 63%] Built target alaska
Scanning dependencies of target alaska.exe
[ 63%] Building Fortran object CMakeFiles/alaska/CMakeFiles/alaska.exe.dir/main.F90.o
make[2]: *** No rule to make target `External/GlobalArrays_install/lib/libga.a', needed by `bin/alaska.exe'.  Stop.
make[1]: *** [CMakeFiles/alaska/CMakeFiles/alaska.exe.dir/all] Error 2
make: *** [all] Error 2


#2 2022-06-02 09:59:33

Ignacio
Administrator
From: Uppsala
Registered: 2015-11-03
Posts: 1,011

Re: Problem with parallelization

You could compare with https://gitlab.com/Molcas/OpenMolcas/-/ … 388571/raw

GlobalArrays should be built after Libxc:

[  0%] Completed 'Libxc'
[  0%] Built target Libxc
[  0%] Creating directories for 'GlobalArrays'
[  0%] Performing download step (git clone) for 'GlobalArrays'
Cloning into 'GlobalArrays'...
HEAD is now at da7a53ff Merge branch 'develop'
patching file cmake/linalg-modules/util/CommonFunctions.cmake
[  0%] Performing update step for 'GlobalArrays'
[  0%] No patch step for 'GlobalArrays'
[  0%] Performing configure step for 'GlobalArrays'

...

[100%] Built target armci
[  0%] Performing update_hash step for 'GlobalArrays'
[  0%] Performing install step for 'GlobalArrays'

...

[  0%] Completed 'GlobalArrays'
[  0%] Built target GlobalArrays

I guess something failed or is missing there.


#3 2022-06-02 18:01:46

shaoh
Member
Registered: 2019-07-18
Posts: 10

Re: Problem with parallelization

Based on your response, I think the error may come from the fact that I had installed an external GlobalArrays prior to my OpenMolcas installation. So I deleted my previous GlobalArrays installation and started fresh. However, this time I ended up with a different error.

**************************Obtain a copy of the code**************

(huiling) [huilings@login3 software]$ git clone https://gitlab.com/Molcas/OpenMolcas.git
Cloning into 'OpenMolcas'...
remote: Enumerating objects: 99457, done.
remote: Counting objects: 100% (1766/1766), done.
remote: Compressing objects: 100% (153/153), done.
remote: Total 99457 (delta 1711), reused 1613 (delta 1613), pack-reused 97691
Receiving objects: 100% (99457/99457), 92.52 MiB | 33.00 MiB/s, done.
Resolving deltas: 100% (82062/82062), done.
Checking out files: 100% (8217/8217), done.

***********************Install submodule *************************

(huiling) [huilings@login3 software]$ cd OpenMolcas/
(huiling) [huilings@login3 OpenMolcas]$ git submodule update --init External/lapack
Submodule 'External/lapack' (https://github.com/Reference-LAPACK/lapack.git) registered for path 'External/lapack'
Cloning into 'External/lapack'...
remote: Enumerating objects: 73957, done.
remote: Counting objects: 100% (81/81), done.
remote: Compressing objects: 100% (76/76), done.
remote: Total 73957 (delta 51), reused 17 (delta 5), pack-reused 73876
Receiving objects: 100% (73957/73957), 22.55 MiB | 18.01 MiB/s, done.
Resolving deltas: 100% (69859/69859), done.
Submodule path 'External/lapack': checked out 'aa631b4b4bd13f6ae2dbab9ae9da209e1e05b0fc'

***************************Compilation - load modules *******************************

(huiling) [huilings@login3 OpenMolcas]$ ls
CMakeLists.txt   CONTRIBUTORS.md  LICENSE    Tools          cmake            data  sbin  test
CONTRIBUTING.md  External         README.md  basis_library  configure-cmake  doc   src   unit_tests
(huiling) [huilings@login3 OpenMolcas]$ mkdir build
(huiling) [huilings@login3 OpenMolcas]$ cd build
(huiling) [huilings@login3 build]$ module load intel
(huiling) [huilings@login3 build]$ module load gcc
(huiling) [huilings@login3 build]$ module load cmake/3.19.5
(huiling) [huilings@login3 build]$ export FC=ifort
(huiling) [huilings@login3 build]$ export CC=icc
(huiling) [huilings@login3 build]$ export CXX=icpc

***************************Compilation - cmake without mpi works *******************************

(huiling) [huilings@login3 build]$ cmake ../
-- The Fortran compiler identification is GNU 7.5.0
-- The C compiler identification is GNU 7.5.0
-- Detecting Fortran compiler ABI info
-- Detecting Fortran compiler ABI info - done
-- Check for working Fortran compiler: /u/local/mpi/openmpi/4.1.0/gcc8.3.0-ucx/bin/mpifort - skipped
-- Checking whether /u/local/mpi/openmpi/4.1.0/gcc8.3.0-ucx/bin/mpifort supports Fortran 90
-- Checking whether /u/local/mpi/openmpi/4.1.0/gcc8.3.0-ucx/bin/mpifort supports Fortran 90 - yes
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /u/local/mpi/openmpi/4.1.0/gcc8.3.0-ucx/bin/mpicc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Found Python: /u/home/h/huilings/.conda/envs/huiling/bin/python3.7 (found version "3.7.10") found components: Interpreter
Configuring compilers:
Detecting Molcas version info:
-- OPENMOLCAS_VERSION: v22.02-324-gf8fe719
Detecting system info:
-- OS: Linux-x86_64
-- ADDRMODE: 64
-- PLATFORM: LINUX64
Configuring HDF5 support:
-- HDF5 C compiler wrapper is unable to compile a minimal HDF5 program.
-- Found HDF5: /usr/lib64/libhdf5.so (found version "1.8.12") found components: C
-- HDF5_INCLUDE_DIRS: /usr/include
-- HDF5_C_LIBRARIES: /usr/lib64/libhdf5.so
Configuring linear algebra libraries:
-- Using internal LAPACK+BLAS libraries (SLOW!)
-- LINALG_LIBRARIES: lapack;blas
Configuring Libxc:
-- Configuring built-in Libxc
-- Libxc_INCLUDE_DIRS: /u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/Libxc_install/include
-- Libxc_LIBRARIES: xcf03;xc
Gromacs interface DISABLED
BLOCK interface DISABLED
CHEMPS2 interface DISABLED
MSYM support DISABLED
QCMaquis DMRG support DISABLED
NECI interface DISABLED
EFP interface DISABLED
GEN1INT support DISABLED
libwfa support DISABLED
NEVPT2 support DISABLED
MolGUI DISABLED
Configuring runtime environment settings:
-- DEFMOLCASMEM:  2048
-- DEFMOLCASDISK: 20000
-- RUNSCRIPT:    $program $input
-- RUNBINARY:    $program
-- RUNBINARYSER: $program
Build type: Release
-- C compiler: /u/local/mpi/openmpi/4.1.0/gcc8.3.0-ucx/bin/mpicc
-- C compiler flags:  -std=gnu99  -O2
-- Fortran compiler: /u/local/mpi/openmpi/4.1.0/gcc8.3.0-ucx/bin/mpifort
-- Fortran compiler flags:  -fno-aggressive-loop-optimizations -cpp -fdefault-integer-8 -fmax-stack-var-size=1048576 -O2
-- Definitions: _MOLCAS_;_I8_;_LINUX_
-- Debug definitions:
-- pymolcas: added to targets
Copying hook "/u/home/h/huilings/software/OpenMolcas/OpenMolcas/sbin/pre-commit" into "/u/home/h/huilings/software/OpenMolcas/OpenMolcas/.git/hooks/pre-commit"
Definitions: -D_MOLCAS_;-D_I8_;-D_LINUX_
Configuring documentation
-- Sphinx compiler: no sphinx-build available, documentation disabled
Install directory: /opt/OpenMolcas
-- Configuring done
-- Generating done
-- Build files have been written to: /u/home/h/huilings/software/OpenMolcas/OpenMolcas/build

*********************************Cmake with mpi***********************************
*********************************checking mpi compilers***************************

(huiling) [huilings@login3 build]$ module load openmpi/4.1.1

(huiling) [huilings@login3 build]$ mpicc -show
gcc -I/u/local/mpi/openmpi/4.1.2/include -pthread -L/u/local/mpi/openmpi/4.1.2/lib -Wl,-rpath -Wl,/u/local/mpi/openmpi/4.1.2/lib -Wl,--enable-new-dtags -lmpi
(huiling) [huilings@login3 build]$ mpifort -show
gfortran -I/u/local/mpi/openmpi/4.1.2/include -pthread -I/u/local/mpi/openmpi/4.1.2/lib -L/u/local/mpi/openmpi/4.1.2/lib -Wl,-rpath -Wl,/u/local/mpi/openmpi/4.1.2/lib -Wl,--enable-new-dtags -lmpi_usempi -lmpi_mpifh -lmpi

*********************************closed the session, reran cmake, and MPI parallelization worked***************************
******************************the earlier problem seems associated with the exported compiler variables*******************************

(huiling) [huilings@login2 ~]$ module ava cmake
--------------------------------- /u/local/Modules/modulefiles ---------------------------------
cmake/3.17.2  cmake/3.19.5
(huiling) [huilings@login2 ~]$ module load cmake/3.19.5

(huiling) [huilings@login2 build]$ cmake ..
-- The Fortran compiler identification is GNU 8.3.0
-- The C compiler identification is GNU 8.3.0
-- Check for working Fortran compiler: /u/local/mpi/openmpi/4.1.0/gcc8.3.0-ucx/bin/mpifort
-- Check for working Fortran compiler: /u/local/mpi/openmpi/4.1.0/gcc8.3.0-ucx/bin/mpifort  -- works
-- Detecting Fortran compiler ABI info
-- Detecting Fortran compiler ABI info - done
-- Checking whether /u/local/mpi/openmpi/4.1.0/gcc8.3.0-ucx/bin/mpifort supports Fortran 90
-- Checking whether /u/local/mpi/openmpi/4.1.0/gcc8.3.0-ucx/bin/mpifort supports Fortran 90 -- yes
-- Check for working C compiler: /u/local/mpi/openmpi/4.1.0/gcc8.3.0-ucx/bin/mpicc
-- Check for working C compiler: /u/local/mpi/openmpi/4.1.0/gcc8.3.0-ucx/bin/mpicc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Found Python: /u/home/h/huilings/.conda/envs/huiling/bin/python3.7 (found version "3.7.10") found components:  Interpreter
Configuring compilers:
Detecting Molcas version info:
-- OPENMOLCAS_VERSION: v22.02-324-gf8fe719
Detecting system info:
-- OS: Linux-x86_64
-- ADDRMODE: 64
-- PLATFORM: LINUX64
Configuring with MPI parallellization:
-- Found MPI_Fortran: /u/local/mpi/openmpi/4.1.0/gcc8.3.0-ucx/lib/libmpi_usempif08.so (found version "3.1")
-- Found MPI: TRUE (found version "3.1")
-- MPI_C_INCLUDE_PATH:
-- MPI_Fortran_INCLUDE_PATH: /u/local/mpi/openmpi/4.1.0/gcc8.3.0-ucx/include;/u/local/mpi/openmpi/4.1.0/gcc8.3.0-ucx/lib
-- MPI_C_LIBRARIES:
-- MPI_Fortran_LIBRARIES: /u/local/mpi/openmpi/4.1.0/gcc8.3.0-ucx/lib/libmpi_usempif08.so;/u/local/mpi/openmpi/4.1.0/gcc8.3.0-ucx/lib/libmpi_usempi_ignore_tkr.so;/u/local/mpi/openmpi/4.1.0/gcc8.3.0-ucx/lib/libmpi_mpifh.so;/u/local/mpi/openmpi/4.1.0/gcc8.3.0-ucx/lib/libmpi.so
-- MPIEXEC: /u/local/mpi/openmpi/4.1.0/gcc8.3.0-ucx/bin/mpiexec
-- MPI_IMPLEMENTATION: openmpi
Configuring HDF5 support:
-- HDF5: Using hdf5 compiler wrapper to determine C configuration
-- Found HDF5: /usr/lib64/libhdf5.so;/usr/lib64/libsz.so;/usr/lib64/libz.so;/usr/lib64/libdl.so;/usr/lib64/libm.so (found version "1.8.12") found components:  C
-- HDF5_INCLUDE_DIRS: /usr/include
-- HDF5_C_LIBRARIES: /usr/lib64/libhdf5.so;/usr/lib64/libsz.so;/usr/lib64/libz.so;/usr/lib64/libdl.so;/usr/lib64/libm.so
Configuring linear algebra libraries:
-- Using internal LAPACK+BLAS libraries (SLOW!)
-- LINALG_LIBRARIES: lapack;blas
Configuring Libxc:
-- Configuring built-in Libxc
-- Libxc_INCLUDE_DIRS: /u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/Libxc_install/include
-- Libxc_LIBRARIES: xcf03;xc
Configuring built-in Global Arrays library:
-- GA_INCLUDE_PATH: /u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/GlobalArrays_install/include/ga
-- GA_LIBRARIES: ga;armci
Gromacs interface DISABLED
BLOCK interface DISABLED
CHEMPS2 interface DISABLED
MSYM support DISABLED
QCMaquis DMRG support DISABLED
NECI interface DISABLED
EFP interface DISABLED
GEN1INT support DISABLED
libwfa support DISABLED
NEVPT2 support DISABLED
MolGUI DISABLED
Configuring runtime environment settings:
-- DEFMOLCASMEM:  2048
-- DEFMOLCASDISK: 20000
-- RUNSCRIPT:    $program $input
-- RUNBINARY:    /u/local/mpi/openmpi/4.1.0/gcc8.3.0-ucx/bin/mpiexec -n $MOLCAS_NPROCS  $program
-- RUNBINARYSER: $program
Build type: Release
-- C compiler: /u/local/mpi/openmpi/4.1.0/gcc8.3.0-ucx/bin/mpicc
-- C compiler flags:  -std=gnu99  -O2
-- Fortran compiler: /u/local/mpi/openmpi/4.1.0/gcc8.3.0-ucx/bin/mpifort
-- Fortran compiler flags:  -fno-aggressive-loop-optimizations -cpp -fdefault-integer-8 -fmax-stack-var-size=1048576 -O2
-- Definitions: _MOLCAS_;_I8_;_LINUX_;_MOLCAS_MPP_;_GA_
-- Debug definitions:
-- pymolcas: added to targets
Definitions: -D_MOLCAS_;-D_I8_;-D_LINUX_;-D_MOLCAS_MPP_;-D_GA_
Configuring documentation
-- Sphinx compiler: no sphinx-build available, documentation disabled
Install directory: /opt/OpenMolcas
-- Configuring done
-- Generating done
-- Build files have been written to: /u/home/h/huilings/software/OpenMolcas/OpenMolcas/build

***************************************make********************************************************

(huiling) [huilings@login2 build]$ make -j 16
Scanning dependencies of target blas
Scanning dependencies of target Libxc
Scanning dependencies of target prgms
Scanning dependencies of target lapack
Scanning dependencies of target parnell.exe
Scanning dependencies of target pymolcas_target
Scanning dependencies of target fruit_molcas
[  0%] Creating directories for 'Libxc'
[  0%] Generating data/MolGUI.prgm
[  0%] Generating data/caspt2.prgm
[  0%] Generating data/alaska.prgm
[  0%] Built target pymolcas_target
[  0%] Generating data/averd.prgm
[  0%] Building Fortran object unit_tests/linalg_mod/bin/CMakeFiles/fruit_molcas.dir/fruit.f90.o
[  0%] Generating data/chcc.prgm
[  0%] Generating data/casvb.prgm
[  0%] Generating data/check.prgm
[  0%] Generating data/ccsdt.prgm
[  0%] Generating data/cpf.prgm
[  1%] Generating data/dmrgscf.prgm
[  1%] Generating data/cht3.prgm

******************************build continues until the error comes out *************************

[ 98%] Built target xcf03
[ 99%] Built target xc-info
[100%] Built target xcf90
Install the project...
-- Install configuration: "Release"
-- Installing: /u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/Libxc_install/include/xc.h
-- Installing: /u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/Libxc_install/include/xc_funcs.h
-- Installing: /u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/Libxc_install/include/xc_funcs_worker.h
-- Installing: /u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/Libxc_install/include/xc_funcs_removed.h
-- Up-to-date: /u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/Libxc_install/include/xc_version.h
-- Installing: /u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/Libxc_install/bin/xc-info
-- Installing: /u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/Libxc_install/lib/libxc.a
-- Up-to-date: /u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/Libxc_install/lib/libxc.a
-- Installing: /u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/Libxc_install/lib/libxcf03.a
-- Installing: /u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/Libxc_install/lib/libxcf90.a
-- Up-to-date: /u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/Libxc_install/include/xc_f03_lib_m.mod
-- Up-to-date: /u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/Libxc_install/include/xc_f90_lib_m.mod
-- Up-to-date: /u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/Libxc_install/lib/pkgconfig/libxcf03.pc
-- Up-to-date: /u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/Libxc_install/lib/pkgconfig/libxcf90.pc
-- Up-to-date: /u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/Libxc_install/share/cmake/Libxc/LibxcConfig.cmake
-- Installing: /u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/Libxc_install/share/cmake/Libxc/LibxcConfigVersion.cmake
-- Old export file "/u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/Libxc_install/share/cmake/Libxc/LibxcTargets-C.cmake" will be replaced.  Removing files [/u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/Libxc_install/share/cmake/Libxc/LibxcTargets-C-release.cmake].
-- Installing: /u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/Libxc_install/share/cmake/Libxc/LibxcTargets-C.cmake
-- Installing: /u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/Libxc_install/share/cmake/Libxc/LibxcTargets-C-release.cmake
-- Old export file "/u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/Libxc_install/share/cmake/Libxc/LibxcTargets-Fortran.cmake" will be replaced.  Removing files [/u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/Libxc_install/share/cmake/Libxc/LibxcTargets-Fortran-release.cmake].
-- Installing: /u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/Libxc_install/share/cmake/Libxc/LibxcTargets-Fortran.cmake
-- Installing: /u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/Libxc_install/share/cmake/Libxc/LibxcTargets-Fortran-release.cmake
-- Up-to-date: /u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/Libxc_install/lib/pkgconfig/libxc.pc
[  1%] Completed 'Libxc'
[  1%] Built target Libxc
make: *** [all] Error 2

Last edited by shaoh (2022-06-02 18:26:09)


#4 2022-06-02 18:45:51

shaoh
Member
Registered: 2019-07-18
Posts: 10

Re: Problem with parallelization

While trying to solve the problem, I updated the CMake configuration. The current cache values are:

 BIGOT                            OFF
 BLOCK                            OFF
 BOUNDS                           OFF
 BUILD_SHARED_LIBS                OFF
 BUILD_TESTING                    ON
 CHEMPS2                          OFF
 CMAKE_BUILD_TYPE                 Release
 CMAKE_INSTALL_PREFIX             /opt/OpenMolcas
 CUBLAS                           OFF
 DEBUG_DEFS
 DEFMOLCASDISK                    20000
 DEFMOLCASMEM                     2048
 DMRG                             OFF
 EFPLIB                           OFF
 EXPERT                           OFF
 EXTERNAL_LIBXC
 EXTRA
 FDE                              OFF
 GA                               ON
 GA_BUILD                         ON
 GCOV                             OFF
 GEN1INT                          OFF
 GPERFTOOLS                       OFF
 GROMACS                          OFF
 HDF5                             OFF
 HDF5_C_LIBRARY_dl                /usr/lib64/libdl.so
 HDF5_C_LIBRARY_hdf5              /usr/lib64/libhdf5.so
 HDF5_C_LIBRARY_m                 /usr/lib64/libm.so
 HDF5_C_LIBRARY_sz                /usr/lib64/libsz.so
 HDF5_C_LIBRARY_z                 /usr/lib64/libz.so
 INSTALL_TESTS                    OFF
 IPO                              OFF
 LIBMKL_BLACS                     /u/local/compilers/intel/2020.4/compilers_and_libraries_2020.4.304/linux/mkl
 LIBMKL_CORE                      /u/local/compilers/intel/2020.4/compilers_and_libraries_2020.4.304/linux/mkl
 LIBMKL_INTERFACE                 /u/local/compilers/intel/2020.4/compilers_and_libraries_2020.4.304/linux/mkl
 LIBMKL_SCALAPACK                 /u/local/compilers/intel/2020.4/compilers_and_libraries_2020.4.304/linux/mkl
 LIBMKL_SEQUENTIAL                /u/local/compilers/intel/2020.4/compilers_and_libraries_2020.4.304/linux/mkl
 LIBMKL_THREADING                 /u/local/compilers/intel/2020.4/compilers_and_libraries_2020.4.304/linux/mkl
 LINALG                           MKL
 MKLROOT                          /u/local/compilers/intel/2020.4/compilers_and_libraries_2020.4.304/linux/mkl
 MKL_INCLUDE_PATH                 /u/local/compilers/intel/2020.4/compilers_and_libraries_2020.4.304/linux/mkl
 MKL_LIBRARY_PATH                 /u/local/compilers/intel/2020.4/compilers_and_libraries_2020.4.304/linux/mkl
 MPI                              ON
 MPI_LAUNCHER                     /u/local/mpi/openmpi/4.1.0/gcc8.3.0-ucx/bin/mpiexec -n $MOLCAS_NPROCS
 MSYM                             OFF
 MolGUI                           OFF
 NECI                             OFF
 NEVPT2                           OFF
 NVBLAS                           OFF
 OPENMP                           OFF
 RUNSCRIPT                        $program $input
 SER_LAUNCHER
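
For reference, the same configuration could also be set non-interactively instead of through ccmake, e.g. with something like this (a sketch using the option names from the cache above):

cmake -DMPI=ON -DGA=ON -DGA_BUILD=ON -DLINALG=MKL ..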

*************I then ran make again and got the following error*****************************

(huiling) [huilings@login2 build]$ make -j 16
[  0%] Built target Libxc
[  0%] Performing update step for 'GlobalArrays'
[  0%] Built target pymolcas_target
[  1%] Built target parnell.exe
[  1%] Built target fruit_molcas
[  2%] Built target prgms
[  2%] No patch step for 'GlobalArrays'
[  2%] Performing configure step for 'GlobalArrays'
-- CMAKE_MODULE_PATH: /u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/GlobalArrays/src/GlobalArrays/cmake/linalg-modules;/u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/GlobalArrays/src/GlobalArrays/cmake
-- Value of ENABLE_CXX was set by user to : OFF
-- Value of ENABLE_FORTRAN was set by user to : ON
-- Setting value of CMAKE_CXX_EXTENSIONS to default : OFF
-- Setting value of CMAKE_BUILD_TYPE to default : Release
-- Setting value of LINALG_VENDOR to default : BLIS
-- Value of ENABLE_TESTS was set by user to : OFF
-- Setting value of ENABLE_PROFILING to default : OFF
-- Value of ENABLE_BLAS was set by user to : ON
-- Setting value of ENABLE_SCALAPACK to default : OFF
-- Setting value of ENABLE_EISPACK to default : OFF
-- Setting value of ENABLE_DPCPP to default : OFF
-- Setting value of GA_RUNTIME to default : MPI_2SIDED
-- Checking MPI ...
-- Could NOT find MPI_Fortran (missing: MPI_Fortran_WORKS)
CMake Error at /u/local/apps/cmake/3.19.5/gcc-4.8.5/share/cmake-3.19/Modules/FindPackageHandleStandardArgs.cmake:218 (message):
  Could NOT find MPI (missing: MPI_Fortran_FOUND) (found version "3.1")

      Reason given by package: MPI component 'CXX' was requested, but language CXX is not enabled.

Call Stack (most recent call first):
  /u/local/apps/cmake/3.19.5/gcc-4.8.5/share/cmake-3.19/Modules/FindPackageHandleStandardArgs.cmake:582 (_FPHSA_FAILURE_MESSAGE)
  /u/local/apps/cmake/3.19.5/gcc-4.8.5/share/cmake-3.19/Modules/FindMPI.cmake:1722 (find_package_handle_standard_args)
  CMakeLists.txt:114 (find_package)


-- Configuring incomplete, errors occurred!
See also "/u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/GlobalArrays/src/GlobalArrays-build/CMakeFiles/CMakeOutput.log".
See also "/u/home/h/huilings/software/OpenMolcas/OpenMolcas/build/External/GlobalArrays/src/GlobalArrays-build/CMakeFiles/CMakeError.log".
make[2]: *** [External/GlobalArrays/src/GlobalArrays-stamp/GlobalArrays-configure] Error 1
make[1]: *** [CMakeFiles/GlobalArrays.dir/all] Error 2
make: *** [all] Error 2


#5 2022-06-03 08:33:35

Ignacio
Administrator
From: Uppsala
Registered: 2015-11-03
Posts: 1,011

Re: Problem with parallelization

If you want to debug "make" problems, do not use the -j flag, or you will not get the error messages where they belong. I believe the relevant error is:

-- Could NOT find MPI_Fortran (missing: MPI_Fortran_WORKS)
CMake Error at /u/local/apps/cmake/3.19.5/gcc-4.8.5/share/cmake-3.19/Modules/FindPackageHandleStandardArgs.cmake:218 (message):
  Could NOT find MPI (missing: MPI_Fortran_FOUND) (found version "3.1")

      Reason given by package: MPI component 'CXX' was requested, but language CXX is not enabled.

But I don't know why it's happening. CXX is not needed for OpenMolcas.

It may have to do with (https://gitlab.com/Molcas/OpenMolcas/-/ … lelization):

Known problems: Sometimes CMake may use wrong MPI wrappers to detect the configuration, which may result in a faulty compilation. To override the MPI wrappers used you can specify them with the options MPI_Fortran_COMPILER, MPI_C_COMPILER and MPI_CXX_COMPILER.
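
For instance, something along these lines might help (untested; the mpicc/mpifort paths are taken from your log, and mpicxx is assumed to sit in the same bin directory):

cmake -DMPI=ON -DGA=ON -DGA_BUILD=ON \
  -DMPI_C_COMPILER=/u/local/mpi/openmpi/4.1.0/gcc8.3.0-ucx/bin/mpicc \
  -DMPI_Fortran_COMPILER=/u/local/mpi/openmpi/4.1.0/gcc8.3.0-ucx/bin/mpifort \
  -DMPI_CXX_COMPILER=/u/local/mpi/openmpi/4.1.0/gcc8.3.0-ucx/bin/mpicxx \
  ..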


#6 2022-06-03 08:35:23

Ignacio
Administrator
From: Uppsala
Registered: 2015-11-03
Posts: 1,011

Re: Problem with parallelization

By the way, if you have an external GlobalArrays, you can use it by setting GA_BUILD=OFF and GAROOT=/path/to/ga/installation
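
For example (a sketch with a placeholder path; point GAROOT at your actual GA install prefix):

export GAROOT=/path/to/ga/installation
cmake -DMPI=ON -DGA=ON -DGA_BUILD=OFF ..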


#7 2022-06-03 20:55:45

shaoh
Member
Registered: 2019-07-18
Posts: 10

Re: Problem with parallelization

I cleared everything and started a new installation in a new conda environment. I still end up with a similar linking error involving the GA library. Here are the steps I have taken and the errors I see:

In a fresh conda environment (molcas, python=3.7):
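Roughly, the environment was set up like this (exact commands are approximate):

conda create -n molcas python=3.7
conda activate molcas
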
**************************Obtain a copy of the OpenMolcas code**************

(huiling) [huilings@login3 software]$ git clone https://gitlab.com/Molcas/OpenMolcas.git 

***********************Install submodule**************************
************************https://github.com/Reference-LAPACK/lapack.git**********

(huiling) [huilings@login3 software]$ cd OpenMolcas/
(huiling) [huilings@login3 OpenMolcas]$ git submodule update --init External/lapack

**************************Compilation - load modules *******************************

(molcas) [huilings@n6675 build]$ cmake ../OpenMolcas/
CMake Error at CMakeLists.txt:22 (cmake_minimum_required):
  CMake 3.12 or higher is required.  You are running version 2.8.12.2
-- Configuring incomplete, errors occurred!
(molcas) [huilings@n6675 build]$ module load cmake/3.19.5

(molcas) [huilings@n6675 build]$ cmake ../OpenMolcas/
...
-- Debug definitions:
CMake Warning at Tools/pymolcas/CMakeLists.txt:37 (message):
  Some Python modules are not available: pyparsing
(molcas) [huilings@n6675 build]$ conda install -c anaconda pyparsing

(molcas) [huilings@n6675 build]$ cmake ../OpenMolcas/

(molcas) [huilings@n6675 build]$ ccmake . 

**************change GA, GA_BUILD, MPI to ON; LINALG = MKL*********************

(molcas) [huilings@n6675 build]$ make

CMake Error at cmake/ga-linalg.cmake:159 (message):
  ENABLE_BLAS=ON, but a LAPACK library was not found
Call Stack (most recent call first):
  CMakeLists.txt:154 (include)
  
(molcas) [huilings@n6675 build]$ conda install -c conda-forge lapack
(molcas) [huilings@n6675 build]$ conda install -c anaconda openblas
(molcas) [huilings@n6675 build]$ conda install -c conda-forge blas

(molcas) [huilings@n6675 build]$ cmake ../OpenMolcas/

(molcas) [huilings@n6675 build]$ make

-- Configuring incomplete, errors occurred!
See also "/u/home/h/huilings/software/OpenMolcas/build/External/GlobalArrays/src/GlobalArrays-build/CMakeFiles/CMakeOutput.log".
See also "/u/home/h/huilings/software/OpenMolcas/build/External/GlobalArrays/src/GlobalArrays-build/CMakeFiles/CMakeError.log".
make[2]: *** [External/GlobalArrays/src/GlobalArrays-stamp/GlobalArrays-configure] Error 1
make[1]: *** [CMakeFiles/GlobalArrays.dir/all] Error 2
make: *** [all] Error 2

***************************change LINALG = INTERNAL**************************

(molcas) [huilings@n6675 build]$ cmake ../OpenMolcas/

(molcas) [huilings@n6675 build]$ make

[ 63%] Building Fortran object CMakeFiles/alaska/CMakeFiles/alaska.exe.dir/main.F90.o
make[2]: *** No rule to make target `External/GlobalArrays_install/lib/libga.a', needed by `bin/alaska.exe'.  Stop.
make[1]: *** [CMakeFiles/alaska/CMakeFiles/alaska.exe.dir/all] Error 2
make: *** [all] Error 2

*********************************Install external GA**************************

********************************Install GA on Hoffman2**********************

https://hpc.pnl.gov/globalarrays/papers … l-Main.pdf
https://github.com/GlobalArrays/ga

*********************************download Global Arrays********************

git clone https://github.com/GlobalArrays/ga.git

**********************************create configure ********************
cd ga
./autogen.sh

****************************** configure 1 in conda molcas *****************************

(huiling) [huilings@n6675 ga]$ module load  openmpi/4.1
(molcas) [huilings@n6675 ga]$ ./configure --prefix=/u/home/h/huilings/software/GA-build
(molcas) [huilings@n6675 ga]$ make check

make[5]: Entering directory `/u/home/h/huilings/software/ga/comex'
FAIL: testing/perf
FAIL: testing/perf_contig
FAIL: testing/perf_strided
FAIL: testing/perf_amo
FAIL: testing/shift
FAIL: testing/test
=================================
6 of 6 tests failed
See ./test-suite.log
Please report to hpctools@pnl.gov
=================================

****************************** configure 2 with different compiler(s) *****************************

(molcas) [huilings@n6675 ga]$ which mpiicc
/u/local/compilers/intel/2020.4/compilers_and_libraries_2020.4.304/linux/mpi/intel64/bin/mpiicc
(molcas) [huilings@n6675 ga]$ which mpiicpc
/u/local/compilers/intel/2020.4/compilers_and_libraries_2020.4.304/linux/mpi/intel64/bin/mpiicpc
(molcas) [huilings@n6675 ga]$ which mpiifort
/u/local/compilers/intel/2020.4/compilers_and_libraries_2020.4.304/linux/mpi/intel64/bin/mpiifort

(molcas) [huilings@n6675 ga]$ make clean
(molcas) [huilings@n6675 ga]$ ./configure MPICXX=mpiicpc MPIF77=mpiifort MPICC=mpiicc --prefix=/u/home/h/huilings/software/GA-build

(molcas) [huilings@n6675 ga]$ make
(molcas) [huilings@n6675 ga]$ make check

...
PASS: global/testing/nga-util.x
PASS: global/testing/ngatest.x
PASS: global/examples/lennard-jones/lennard.x
PASS: global/examples/boltzmann/boltz.x
PASS: global/testing/thread_perf_contig.x
PASS: global/testing/thread_perf_strided.x
PASS: global/testing/threadsafec.x
=====================================================
All 74 tests behaved as expected (1 expected failure)
=====================================================
make[4]: Leaving directory `/u/home/h/huilings/software/ga'
make[3]: Leaving directory `/u/home/h/huilings/software/ga'
make[2]: Leaving directory `/u/home/h/huilings/software/ga'
make[1]: Leaving directory `/u/home/h/huilings/software/ga'

(molcas) [huilings@n6675 ga]$ make install
...
Libraries have been installed in:
   /u/home/h/huilings/software/GA-build/lib

*********************************external GA install complete **************************

(molcas) [huilings@n6675 software]$ cd GA-build
(molcas) [huilings@n6675 GA-build]$ ls
bin  include  lib
(molcas) [huilings@n6675 GA-build]$ pwd
/u/home/h/huilings/software/GA-build

*********************************back to OpenMolcas install **************************

(huiling) [huilings@login1 build]$ cmake GA_BUILD=OFF GAROOT=/u/home/h/huilings/software/GA-build ../OpenMolcas/

Configuring compilers:
Detecting Molcas version info:
-- OPENMOLCAS_VERSION: v22.02-324-gf8fe719
Detecting system info:
-- OS: Linux-x86_64
-- ADDRMODE: 64
-- PLATFORM: LINUX64
Configuring with MPI parallellization:
-- Could NOT find MPI_C (missing: MPI_C_WORKS)
CMake Error at /u/local/apps/cmake/3.19.5/gcc-4.8.5/share/cmake-3.19/Modules/FindPackageHandleStandardArgs.cmake:218 (message):
  Could NOT find MPI (missing: MPI_C_FOUND) (found version "3.1")

      Reason given by package: MPI component 'CXX' was requested, but language CXX is not enabled.

Call Stack (most recent call first):
  /u/local/apps/cmake/3.19.5/gcc-4.8.5/share/cmake-3.19/Modules/FindPackageHandleStandardArgs.cmake:582 (_FPHSA_FAILURE_MESSAGE)
  /u/local/apps/cmake/3.19.5/gcc-4.8.5/share/cmake-3.19/Modules/FindMPI.cmake:1722 (find_package_handle_standard_args)
  CMakeLists.txt:1042 (find_package)


-- Configuring incomplete, errors occurred!
See also "/u/home/h/huilings/software/OpenMolcas/build/CMakeFiles/CMakeOutput.log".
See also "/u/home/h/huilings/software/OpenMolcas/build/CMakeFiles/CMakeError.log".

(huiling) [huilings@login1 build]$ export CXX=icpc
(huiling) [huilings@login1 build]$ module load intel
(huiling) [huilings@login1 build]$ CC=mpicc FC=mpifort CXX=mpicxx cmake -DMPI=ON -DGA=ON GA_BUILD=OFF GAROOT=/u/home/h/huilings/software/GA-build ../OpenMolcas/

Could NOT find MPI (missing: MPI_C_FOUND) (found version "3.1")

      Reason given by package: MPI component 'CXX' was requested, but language CXX is not enabled.

(huiling) [huilings@login1 build]$ CC=mpicc FC=mpifort CXX=mpicxx cmake -DMPI_Fortran_COMPILER=mpifort \
-DMPI_C_COMPILER=mpicc -DMPI_CXX_COMPILER=mpicxx -DMPI=ON -DGA=ON GA_BUILD=OFF GAROOT=/u/home/h/huilings/software/GA-build ../OpenMolcas/

  Could NOT find MPI (missing: MPI_C_FOUND) (found version "3.1")

      Reason given by package: MPI component 'CXX' was requested, but language CXX is not enabled.

(huiling) [huilings@login1 build]$ export FC=mpifort
(huiling) [huilings@login1 build]$ export CC=mpicc
(huiling) [huilings@login1 build]$ export CXX=mpicxx

(huiling) [huilings@login1 build]$ module load openmpi/4.1 
(huiling) [huilings@login1 build]$ CC=mpicc FC=mpifort CXX=mpicxx cmake -DMPI_Fortran_COMPILER=mpifort \
-DMPI_C_COMPILER=mpicc -DMPI_CXX_COMPILER=mpicxx -DMPI=ON -DGA=ON GA_BUILD=OFF GAROOT=/u/home/h/huilings/software/GA-build ../OpenMolcas/

Configuring external Global Arrays library:
CMake Error at CMakeLists.txt:1780 (message):
  GA_INCLUDE_PATH not found, make sure GAROOT environment variable is set.

(huiling) [huilings@login1 build]$ export GAROOT=/u/home/h/huilings/software/GA-build    
(huiling) [huilings@login1 build]$ echo $GAROOT
/u/home/h/huilings/software/GA-build

(huiling) [huilings@login1 build]$ CC=mpicc FC=mpifort CXX=mpicxx cmake -DMPI_Fortran_COMPILER=mpifort \
-DMPI_C_COMPILER=mpicc -DMPI_CXX_COMPILER=mpicxx -DMPI=ON -DGA=ON GA_BUILD=OFF  ../OpenMolcas/

*********************************OpenMolcas configuration finished **************************

(huiling) [huilings@login1 build]$ make

*********************************GA linking error **************************

Scanning dependencies of target alaska.exe
[ 61%] Building Fortran object CMakeFiles/alaska/CMakeFiles/alaska.exe.dir/main.F90.o
[ 61%] Linking Fortran executable ../../bin/alaska.exe
/u/home/h/huilings/software/GA-build/lib/libga.a(elem_alg.o): In function `pnga_abs_value_patch':
elem_alg.c:(.text+0x109): undefined reference to `_intel_fast_memcpy'
elem_alg.c:(.text+0x5aa): undefined reference to `__svml_i64rem2'

**errors**********

/u/home/h/huilings/software/ga/comex/src-mpi/comex.c:1396: undefined reference to `_intel_fast_memset'
/u/home/h/huilings/software/GA-build/lib/libarmci.a(comex.o):/u/home/h/huilings/software/ga/comex/src-mpi/comex.c:1167: more undefined references to `_intel_fast_memset' follow
collect2: error: ld returned 1 exit status
make[2]: *** [bin/alaska.exe] Error 1
make[1]: *** [CMakeFiles/alaska/CMakeFiles/alaska.exe.dir/all] Error 2
make: *** [all] Error 2


#8 2022-06-04 05:12:37

shaoh
Member
Registered: 2019-07-18
Posts: 10

Re: Problem with parallelization

I tried again with GA_BUILD=ON and got the following error. I assume there must be some problem linking OpenMolcas to the GlobalArrays library. I feel I am really stuck and cannot solve this problem on my own.

(molcas) [hus21@exp-9-55 build]$ cmake -DMPI=ON -DGA=ON -DGA_BUILD=ON ../OpenMolcas/
(molcas) [hus21@exp-9-55 build]$ make

****************************build continues *****************************

[ 64%] Built target transform_util_obj
Scanning dependencies of target wfn_util_obj
[ 64%] Building Fortran object CMakeFiles/wfn_util/CMakeFiles/wfn_util_obj.dir/dummy.f90.o
[ 64%] Built target wfn_util_obj
Scanning dependencies of target libmolcas
[ 64%] Linking Fortran static library lib/libmolcas.a
[ 64%] Built target libmolcas
Scanning dependencies of target alaska_obj
[ 64%] Building Fortran object CMakeFiles/alaska/CMakeFiles/alaska_obj.dir/alaska_info.F90.o
[ 64%] Building Fortran object CMakeFiles/alaska/CMakeFiles/alaska_obj.dir/alaska_banner.F90.o
[ 64%] Building Fortran object CMakeFiles/alaska/CMakeFiles/alaska_obj.dir/alaska.F90.o
[ 64%] Building Fortran object CMakeFiles/alaska/CMakeFiles/alaska_obj.dir/alaska_super_driver.F90.o
[ 64%] Building Fortran object CMakeFiles/alaska/CMakeFiles/alaska_obj.dir/annihil_rho.F90.o
[ 64%] Building Fortran object CMakeFiles/alaska/CMakeFiles/alaska_obj.dir/chk_numerical.F90.o
[ 64%] Building Fortran object CMakeFiles/alaska/CMakeFiles/alaska_obj.dir/cho_alaska_rdinp.F90.o
[ 64%] Building Fortran object CMakeFiles/alaska/CMakeFiles/alaska_obj.dir/drvdftg.F90.o
[ 64%] Building Fortran object CMakeFiles/alaska/CMakeFiles/alaska_obj.dir/drvemb_.F90.o
[ 64%] Building Fortran object CMakeFiles/alaska/CMakeFiles/alaska_obj.dir/drvembg.F90.o
[ 64%] Building Fortran object CMakeFiles/alaska/CMakeFiles/alaska_obj.dir/drvg1.F90.o
[ 64%] Building Fortran object CMakeFiles/alaska/CMakeFiles/alaska_obj.dir/drvh1_emb.F90.o
[ 64%] Building Fortran object CMakeFiles/alaska/CMakeFiles/alaska_obj.dir/drvn1_emb.F90.o
[ 64%] Building Fortran object CMakeFiles/alaska/CMakeFiles/alaska_obj.dir/drvn1.F90.o
[ 64%] Building Fortran object CMakeFiles/alaska/CMakeFiles/alaska_obj.dir/inputg.F90.o
[ 64%] Built target alaska_obj
Scanning dependencies of target alaska
[ 64%] Linking Fortran static library ../../lib/libalaska.a
[ 64%] Built target alaska
Scanning dependencies of target alaska.exe
[ 64%] Building Fortran object CMakeFiles/alaska/CMakeFiles/alaska.exe.dir/main.F90.o
make[2]: *** No rule to make target 'External/GlobalArrays_install/lib/libga.a', needed by 'bin/alaska.exe'.  Stop.
make[1]: *** [CMakeFiles/Makefile2:4269: CMakeFiles/alaska/CMakeFiles/alaska.exe.dir/all] Error 2
make: *** [Makefile:160: all] Error 2


#9 2023-12-11 10:52:49

thevro
Member
Registered: 2023-12-11
Posts: 1

Re: Problem with parallelization

I got the error message mentioned in the original post. I looked inside External/GlobalArrays_install/ and saw that it contains both lib/ and lib64/. The build system (under -DGA_BUILD) puts libga.a in lib64/ but later looks for the same file in lib/. After creating a hard link I reran make, and compilation completed successfully.
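
The workaround amounts to something like the following, run from the build directory (a sketch; the libarmci.a line is an assumption, since GA_LIBRARIES lists both ga and armci):

ln External/GlobalArrays_install/lib64/libga.a External/GlobalArrays_install/lib/libga.a
ln External/GlobalArrays_install/lib64/libarmci.a External/GlobalArrays_install/lib/libarmci.a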

Cf. issue: https://gitlab.com/Molcas/OpenMolcas/-/issues/431

