January 4, 2021, 01:38
Problem with Running OpenFOAM Parallel Case
#1
New Member
Homer Bacanto
Join Date: Dec 2020
Posts: 13
Rep Power: 5
Hey guys, I'm trying to run an OpenFOAM case in parallel on 56 cores. When I run the Allrun script, everything goes fine until the solver starts. It runs for a couple of minutes and then terminates with the following error:
Code:
[amd64-TRX40-AORUS-PRO-WIFI:11125] *** An error occurred in MPI_Bsend
[amd64-TRX40-AORUS-PRO-WIFI:11125] *** reported by process [1304035329,1]
[amd64-TRX40-AORUS-PRO-WIFI:11125] *** on communicator MPI COMMUNICATOR 3 SPLIT FROM 0
[amd64-TRX40-AORUS-PRO-WIFI:11125] *** MPI_ERR_BUFFER: invalid buffer pointer
[amd64-TRX40-AORUS-PRO-WIFI:11125] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[amd64-TRX40-AORUS-PRO-WIFI:11125] *** and potentially your MPI job)
[amd64-TRX40-AORUS-PRO-WIFI:11112] PMIX ERROR: UNREACHABLE in file ../../../src/server/pmix_server.c at line 2193
[amd64-TRX40-AORUS-PRO-WIFI:11112] PMIX ERROR: UNREACHABLE in file ../../../src/server/pmix_server.c at line 2193
[amd64-TRX40-AORUS-PRO-WIFI:11112] 2 more processes have sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[amd64-TRX40-AORUS-PRO-WIFI:11112] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
I'm not sure what I'm doing wrong. I believe MPI is properly installed, since I can see the processor0-x folders being created.
If anyone can point me in the right direction, I'd be very grateful.
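For context, a minimal sketch of the workflow being described, as I understand it. The solver name and the MPI_BUFFER_SIZE value are assumptions on my part, not from my actual case; MPI_ERR_BUFFER from MPI_Bsend is often associated with the buffered-send pool being too small, and OpenFOAM reads the MPI_BUFFER_SIZE environment variable at startup to size that pool:

```shell
# Sketch of an OpenFOAM parallel run (56 subdomains, per the post).
# Assumption: raising MPI_BUFFER_SIZE enlarges the pool OpenFOAM
# attaches for MPI_Bsend; the value below is illustrative only.
export MPI_BUFFER_SIZE=200000000
echo "MPI_BUFFER_SIZE=$MPI_BUFFER_SIZE"

# Decompose the mesh per system/decomposeParDict, then launch the
# solver (solver name is a placeholder -- commented out here):
# decomposePar
# mpirun -np 56 simpleFoam -parallel > log.solver 2>&1
```

The number of subdomains in system/decomposeParDict must match the `-np` count passed to mpirun, or the run will abort at startup.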