Hello everyone,
Hope you guys are doing well.
I recently got OpenMolcas installed, but I'm having issues running calculations in parallel. For the OpenMolcas requirements I used the following versions (a sketch of the corresponding CMake configuration is included below the list):
- OpenMPI v4.1.4
- OpenBLAS v0.3.17
- GlobalArrays v5.8.1
- CMake v3.24.0-rc1
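For context, a typical CMake configuration for this combination of libraries looks roughly like the following; the paths are placeholders and the option names are as I understand them from the OpenMolcas install guide, so treat this as a sketch rather than my literal command:
# paths are placeholders; OpenBLAS built with 8-byte integers (INTERFACE64=1)
export GAROOT=$HOME/opt/ga    # external Global Arrays install (variable name from the install guide, if I recall correctly)
FC=mpifort CC=mpicc cmake \
    -DMPI=ON \
    -DGA=ON \
    -DLINALG=OpenBLAS \
    -DOPENBLASROOT=$HOME/opt/openblas \
    /path/to/OpenMolcas
make -j4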
Everything went smoothly: I didn't get any warnings or issues compiling or installing the program. When I ran a calculation with just one process it worked fine. However, when I specified more than one process, I got this error: "Program received signal SIGSEGV: Segmentation fault - invalid memory reference."
This is my input (really simple, I was just testing):
&GATEWAY
Basis set
H.3-21G.....
H1 0.0 0.0 0.0
End of basis
&SEWARD
Grid input
RQuad= Log3; nGrid= 50000; GGL; lMax= 26; Global
End of Grid Input
&SCF; Occupations=1; KSDFT=LDA5; Iterations= 1 1
This is the job file:
#!/bin/bash
#SBATCH -p intel
#SBATCH --job-name=h2o
#SBATCH -N 1
#SBATCH --tasks-per-node=2
#SBATCH --mem-per-cpu=16000
#SBATCH -o job-%j.stdout
#SBATCH -e job-%j.stderr
#module purge
#module load openmolcas/pymolcas
# OpenMPI and OpenMP settings
export PATH="$PATH:/home/mariab/opt/openmpi/bin"
export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:/home/mariab/opt/openmpi/lib/"
export OMP_NUM_THREADS=1
# Molcas environment
export MOLCAS_WORKDIR=/home/mariab/scratch/$SLURM_JOB_ID
export MOLCAS_MEM=4000
export MOLCAS_NNODES=1
export MOLCAS_NPROCS=2
/home/mariab/build/pymolcas h2o.inp -oe h2o.log -b 1
This is the output:
--- Start Module: gateway at Tue Jun 28 14:08:29 2022 ---
Program received signal SIGSEGV: Segmentation fault - invalid memory reference.
Backtrace for this error:
Program received signal SIGSEGV: Segmentation fault - invalid memory reference.
Backtrace for this error:
#0 0x7f73ce15b2ed in ???
#1 0x7f73ce15a503 in ???
#2 0x7f73cb4ddf0f in ???
#3 0x7f73cb901ab3 in ???
#0 0x7fe0bd1f82ed in ???
#1 0x7fe0bd1f7503 in ???
#2 0x7fe0ba57af0f in ???
#3 0x7fe0ba99eab3 in ???
#4 0x55b314f41e38 in comex_init
at src-mpi/comex.c:1376
#5 0x55b314f41fef in comex_init_args
at src-mpi/comex.c:1429
#4 0x55f8b49b3e38 in comex_init
at src-mpi/comex.c:1376
#5 0x55f8b49b3fef in comex_init_args
at src-mpi/comex.c:1429
#6 0x55b314f3c148 in PARMCI_Init_args
at src-armci/armci.c:416
#7 0x55b314f14cda in ???
#8 0x55b314e48362 in ???
#9 0x55b314d7f456 in ???
#10 0x55b314d62e92 in ???
#11 0x55b314cf8170 in ???
#12 0x7fe0ba55dc86 in ???
#13 0x55b314cf81b9 in ???
#14 0xffffffffffffffff in ???
#6 0x55f8b49ae148 in PARMCI_Init_args
at src-armci/armci.c:416
#7 0x55f8b4986cda in ???
#8 0x55f8b48ba362 in ???
#9 0x55f8b47f1456 in ???
#10 0x55f8b47d4e92 in ???
#11 0x55f8b476a170 in ???
#12 0x7f73cb4c0c86 in ???
#13 0x55f8b476a1b9 in ???
#14 0xffffffffffffffff in ???
--------------------------------------------------------------------------
Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
Any help with this issue would be appreciated.
Thanks
Did you use an externally compiled Global Arrays? These problems are typically caused by a mismatch between the configurations of the different libraries. Note that OpenMolcas needs an 8-byte-integer LAPACK, but a 4-byte-integer MPI. You could try using the "built-in" Global Arrays (GA_BUILD=ON).
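For example, rerunning CMake in the existing build directory with something like this (a sketch; other options are kept as they are):
cmake -DGA=ON -DGA_BUILD=ON .
make -j4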
To rule out possible bugs in seldom-used features, try also without the "Grid input" block.
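That is, with a stripped-down input along these lines:
&GATEWAY
Basis set
H.3-21G.....
H1 0.0 0.0 0.0
End of basis
&SEWARD
&SCF; Occupations=1; KSDFT=LDA5; Iterations= 1 1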
Hi Ignacio, thanks for your response. Yes, I used an externally compiled GA. I just tried with "GA_BUILD=ON" but I got the same issue. I don't know what the problem could be.
Hi Ignacio,
The problem was in fact the GA compilation. I downloaded the source code and recompiled it. That solved the problem!
Thank you very much
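For anyone who runs into the same error: a GA build consistent with Ignacio's advice (8-byte integers, same MPI as used for OpenMolcas) would look roughly like the sketch below. The flag names are from the GA configure options as I remember them and the paths are placeholders, so double-check against ./configure --help:
# build GA from source against the same OpenMPI used for OpenMolcas
./configure --prefix=$HOME/opt/ga \
    --enable-i8 \
    --with-blas8="-L$HOME/opt/openblas/lib -lopenblas" \
    CC=mpicc F77=mpifort
make -j4 && make install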