Hello,
Has anyone seen this error and does anyone know of a way to fix it?
*** An error occurred in MPI_Allreduce
*** reported by process [139974332776449,0]
*** on communicator MPI_COMM_WORLD
*** MPI_ERR_IN_STATUS: error code in status
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and potentially your MPI job)
--- Stop Module: scf at Fri Mar 24 13:56:25 2017 /rc= -1 (Unknown) ---
I am trying to run a QMMM job linked with Tinker to model TIP3P waters. The job crashes after about an hour, during the slapaf optimization step.
Thanks
Does it consistently crash at the same spot? MPI errors are often difficult to track down; they could be due to a bug in the MPI libraries, or to a hardware/software glitch on one of the nodes that brings down the whole calculation. If possible, try different compiler/MPI versions too.
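One quick check that can help narrow it down is confirming which MPI library the pre-built binaries actually resolve to at run time, and whether it matches the launcher on your PATH. A minimal sketch (the binary name and path are hypothetical; adjust to your install):

    # show which libmpi the Molcas SCF module binary picks up at run time
    ldd $MOLCAS/bin/scf.exe | grep -i mpi

    # confirm the mpirun on PATH belongs to the same OpenMPI version
    mpirun --version

If ldd reports "not found" for libmpi, or the versions disagree, that mismatch alone can produce fatal MPI_Allreduce errors like the one above.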
This did turn out to be a problem with OpenMPI, and we solved it by creating some symlinks and using openmpi/1.10.0. When we installed Molcas we used the pre-compiled version; we are hoping that if we get the source code and compile it locally, we can link against the appropriate OpenMPI version. But it is working for now.
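In case it helps anyone else, the workaround looked roughly like this (a sketch only; the install paths and library soname below are specific to our system and will differ elsewhere):

    # make the libmpi soname the pre-compiled binaries expect resolve to the 1.10.0 build
    ln -s /opt/openmpi-1.10.0/lib/libmpi.so.12 /usr/local/lib/libmpi.so.12

    # and put the matching 1.10.0 runtime first
    export PATH=/opt/openmpi-1.10.0/bin:$PATH
    export LD_LIBRARY_PATH=/opt/openmpi-1.10.0/lib:$LD_LIBRARY_PATH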