
OpenFOAM Parallel Computing: Sudden Exit with No Solution

May 29, 2021, 10:15   #1
bigoqwerty (New Member, Join Date: May 2021, Posts: 6)
Hello All,


I am trying to test parallel computing with the OpenFOAM tutorials. I have made sure that OpenMPI is installed correctly, so there is no problem there. I have also checked the MPI selector and related tools (omni_run) to see whether they make any difference. (I have openmpi1 and openmpi2 installed, together with OpenFOAM v1912, v2006 and v2012.)



When I run various cases in parallel to test this, the simulation exits either in the recompose step or when pimpleFoam (or any other solver) is launched after meshing, without producing any solution.
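For reference, each case follows the standard parallel workflow; the sketch below assumes 4 subdomains and pimpleFoam, but the solver and core count vary per tutorial:

Code:
# standard OpenFOAM parallel run (example with 4 cores)
blockMesh                              # build the mesh
decomposePar                           # split the case per system/decomposeParDict
mpirun -np 4 pimpleFoam -parallel      # run the solver on 4 processes
reconstructPar                         # merge the processor* directories afterwards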


There are no errors; the solution simply never starts and the run exits suddenly.



It is obviously due to my system (openSUSE Leap 15.2). Do you have any suggestions?
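For reference, having two Open MPI versions plus a selector makes it easy to end up with a mismatch between the MPI that the OpenFOAM build expects and the mpirun actually on the PATH. These are the environment details that decide which one gets used (a sketch, assuming the OpenFOAM environment is sourced):

Code:
# check which MPI the current environment actually points at
which mpirun pimpleFoam            # both should come from the installs you expect
mpirun --version                   # Open MPI version found on the PATH
echo $FOAM_MPI $MPI_ARCH_PATH      # MPI that the sourced OpenFOAM build was configured for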


Thanks

May 29, 2021, 15:41   #2
bigoqwerty (New Member, Join Date: May 2021, Posts: 6)
This is the log file from pimpleFoam. It appears MPI_INIT never completes.



Code:
--------------------------------------------------------------------------
It looks like MPI_INIT failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during MPI_INIT; some of which are due to configuration or environment
problems.  This failure appears to be an internal failure; here's some
additional information (which may only be relevant to an Open MPI
developer):

  ompi_mpi_init: ompi_rte_init failed
  --> Returned "(null)" (-43) instead of "Success" (0)
--------------------------------------------------------------------------
--------------------------------------------------------------------------
It looks like MPI_INIT failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during MPI_INIT; some of which are due to configuration or environment
problems.  This failure appears to be an internal failure; here's some
additional information (which may only be relevant to an Open MPI
developer):

  ompi_mpi_init: ompi_rte_init failed
  --> Returned "(null)" (-43) instead of "Success" (0)
--------------------------------------------------------------------------
[localhost.localdomain:04442] *** An error occurred in MPI_Init_thread
[localhost.localdomain:04442] *** on a NULL communicator
[localhost.localdomain:04442] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[localhost.localdomain:04442] ***    and potentially your MPI job)
[localhost.localdomain:04441] *** An error occurred in MPI_Init_thread
[localhost.localdomain:04441] *** on a NULL communicator
[localhost.localdomain:04441] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[localhost.localdomain:04441] ***    and potentially your MPI job)
[localhost.localdomain:04441] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
[localhost.localdomain:04442] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
It looks like MPI_INIT failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during MPI_INIT; some of which are due to configuration or environment
problems.  This failure appears to be an internal failure; here's some
additional information (which may only be relevant to an Open MPI
developer):

  ompi_mpi_init: ompi_rte_init failed
  --> Returned "(null)" (-43) instead of "Success" (0)
--------------------------------------------------------------------------
[localhost.localdomain:04440] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
[localhost.localdomain:04440] *** An error occurred in MPI_Init_thread
[localhost.localdomain:04440] *** on a NULL communicator
[localhost.localdomain:04440] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[localhost.localdomain:04440] ***    and potentially your MPI job)
[localhost.localdomain:04443] *** An error occurred in MPI_Init_thread
[localhost.localdomain:04443] *** on a NULL communicator
[localhost.localdomain:04443] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[localhost.localdomain:04443] ***    and potentially your MPI job)
--------------------------------------------------------------------------
It looks like MPI_INIT failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during MPI_INIT; some of which are due to configuration or environment
problems.  This failure appears to be an internal failure; here's some
additional information (which may only be relevant to an Open MPI
developer):

  ompi_mpi_init: ompi_rte_init failed
  --> Returned "(null)" (-43) instead of "Success" (0)
--------------------------------------------------------------------------
[localhost.localdomain:04443] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
-------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code.. Per user-direction, the job has been aborted.
-------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:

  Process name: [[49625,1],1]
  Exit code:    1
--------------------------------------------------------------------------
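For a sanity check independent of OpenFOAM, a bare mpirun call exercises the same runtime; a minimal sketch, assuming the Open MPI mpirun is the one first on the PATH:

Code:
# minimal check of the Open MPI runtime itself, without OpenFOAM
ompi_info | head                   # shows which Open MPI build is being picked up
mpirun -np 2 hostname              # should print the hostname twice if the runtime works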

May 29, 2021, 17:14   #3
bigoqwerty (New Member, Join Date: May 2021, Posts: 6)
I removed all versions (v1912, v2006 and v2012) together with openmpi and openmpi2, then reinstalled v2012. I am now running the parallel simpleFoam turbineSiting example. I see no CPU usage, and the "Running simpleFoam (4 processes)" message just sits there.
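That "Running simpleFoam (4 processes)" line presumably comes from the tutorial's Allrun/runParallel wrapper rather than from the solver itself, and the wrapper redirects the solver output to a log file, so any progress or MPI errors have to be checked there (a sketch, assuming the standard tutorial layout):

Code:
# the Allrun script hides solver output in log files
./Allrun                           # meshes the case and runs simpleFoam in parallel
tail -f log.simpleFoam             # actual solver output while it "runs"
grep -i error log.*                # MPI_INIT failures end up here, not on the terminal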



If you have any ideas, please share. Thanks.


Tags
openfoam, parallel, problem



