
Problem with mpirun over an SSH connection

April 22, 2017, 06:29   #1
Problem with mpirun over an SSH connection

blttkgl
Member

Join Date: Oct 2015
Location: Finland
Posts: 39
Hey,

I am not sure if the CFD Online forums are the correct place to ask this question, but here goes.

I am trying to simulate my case on multiple cores using my faculty's remote server, which I connect to over SSH. The case decomposes without any problems, but the parallel run crashes as soon as it starts, with the error shown further below.
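For reference, these are the steps I follow on the server. The decomposeParDict settings shown here are only a minimal sketch (4 subdomains to match -np 4; the scotch method is just an example), not necessarily the exact dictionary I use:

    # in the case directory, with system/decomposeParDict containing e.g.
    #   numberOfSubdomains  4;
    #   method              scotch;
    decomposePar                      # finishes without errors
    mpirun -np 4 XiFoam -parallel     # this is the step that crashes

The mpirun step is the one that fails; its output follows: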

mpirun -np 4 XiFoam -parallel
NVIDIA: no NVIDIA devices found
[force:26426] *** Process received signal ***
[force:26426] Signal: Bus error (7)
[force:26426] Signal code: Non-existant physical address (2)
[force:26426] Failing at address: 0x7fbc3ad85780
[force:26427] *** Process received signal ***
[force:26427] Signal: Bus error (7)
[force:26427] Signal code: Non-existant physical address (2)
[force:26427] Failing at address: 0x7f338d01fe50
[force:26428] *** Process received signal ***
[force:26428] Signal: Bus error (7)
[force:26428] Signal code: Non-existant physical address (2)
[force:26428] Failing at address: 0x7fd155806898
[force:26429] *** Process received signal ***
[force:26429] Signal: Bus error (7)
[force:26429] Signal code: Non-existant physical address (2)
[force:26429] Failing at address: 0x7fdcc8c5bd50
[force:26429] [ 0] /lib/x86_64-linux-gnu/libc.so.6(+0x354b0)[0x7fdcd99e24b0]
[force:26429] [ 1] /usr/lib/openmpi/lib/openmpi/mca_btl_vader.so(mca_btl_vader_frag_init+0x8e)[0x7fdccb1e329e]
[force:26429] [ 2] /usr/lib/libmpi.so.12(ompi_free_list_grow+0x1a9)[0x7fdcd7544cf9]
[force:26429] [ 3] /usr/lib/openmpi/lib/openmpi/mca_btl_vader.so(+0x1ea4)[0x7fdccb1e0ea4]
[force:26429] [ 4] /usr/lib/openmpi/lib/openmpi/mca_bml_r2.so(+0x16b3)[0x7fdccb5ec6b3]
[force:26429] [ 5] /usr/lib/openmpi/lib/openmpi/mca_pml_ob1.so(mca_pml_ob1_add_procs+0xca)[0x7fdcca2dac1a]
[force:26429] [force:26426] [ 0] /lib/x86_64-linux-gnu/libc.so.6(+0x354b0)[0x7fbc53bd54b0]
[force:26426] [ 1] /usr/lib/openmpi/lib/openmpi/mca_btl_sm.so(+0x4c4a)[0x7fbc44cecc4a]
[force:26426] [ 2] /usr/lib/libmpi.so.12(ompi_free_list_grow+0x189)[0x7fbc51737cd9]
[force:26426] [ 3] /usr/lib/openmpi/lib/openmpi/mca_btl_sm.so(mca_btl_sm_add_procs+0x689)[0x7fbc44cea9c9]
[force:26426] [ 4] /usr/lib/openmpi/lib/openmpi/mca_bml_r2.so(+0x16b3)[0x7fbc4572f6b3]
[force:26426] [ 5] /usr/lib/openmpi/lib/openmpi/mca_pml_ob1.so(mca_pml_ob1_add_procs+0xca)[0x7fbc4441dc1a]
[force:26426] [ 6] /usr/lib/libmpi.so.12(ompi_mpi_init+0x859)[0x7fbc51753109]
[force:26426] [ 7] /usr/lib/libmpi.so.12(MPI_Init+0x15d)[0x7fbc5177154d]
[force:26426] [ 8] /opt/openfoam4/platforms/linux64GccDPInt32Opt/lib/openmpi-system/libPstream.so(_ZN4Foam8UPstream4initERiRPPc+0x1f)[0x7fbc539964cf]
[force:26426] [ 9] [force:26427] [ 0] /lib/x86_64-linux-gnu/libc.so.6(+0x354b0)[0x7f339de464b0]
[force:26427] [ 1] /usr/lib/openmpi/lib/openmpi/mca_btl_vader.so(mca_btl_vader_frag_init+0x8e)[0x7f338f58729e]
[force:26427] [ 2] /usr/lib/libmpi.so.12(ompi_free_list_grow+0x1a9)[0x7f339b9a8cf9]
[force:26427] [ 3] /usr/lib/openmpi/lib/openmpi/mca_btl_vader.so(+0x1ea4)[0x7f338f584ea4]
[force:26427] [ 4] /usr/lib/openmpi/lib/openmpi/mca_bml_r2.so(+0x16b3)[0x7f338f9906b3]
[force:26427] [ 5] /usr/lib/openmpi/lib/openmpi/mca_pml_ob1.so(mca_pml_ob1_add_procs+0xca)[0x7f338e67ec1a]
[force:26427] [ 6] [force:26428] [ 0] /lib/x86_64-linux-gnu/libc.so.6(+0x354b0)[0x7fd16ee0d4b0]
[force:26428] [ 1] /usr/lib/openmpi/lib/openmpi/mca_allocator_bucket.so(mca_allocator_bucket_alloc_align+0x98)[0x7fd1615c80e8]
[force:26428] [ 2] /usr/lib/openmpi/lib/openmpi/mca_mpool_sm.so(mca_mpool_sm_alloc+0x28)[0x7fd160fbce58]
[force:26428] [ 3] /usr/lib/openmpi/lib/openmpi/mca_btl_sm.so(mca_btl_sm_add_procs+0x5d4)[0x7fd15ff6b914]
[force:26428] [ 4] /usr/lib/openmpi/lib/openmpi/mca_bml_r2.so(+0x16b3)[0x7fd1609b06b3]
[force:26428] [ 5] /usr/lib/openmpi/lib/openmpi/mca_pml_ob1.so(mca_pml_ob1_add_procs+0xca)[0x7fd15f69ec1a]
[force:26428] [ 6] /usr/lib/libmpi.so.12(ompi_mpi_init+0x859)[0x7fd16c98b109]
[force:26428] [ 7] /usr/lib/libmpi.so.12(MPI_Init+0x15d)[0x7fd16c9a954d]
[force:26428] [ 8] /opt/openfoam4/platforms/linux64GccDPInt32Opt/lib/openmpi-system/libPstream.so(_ZN4Foam8UPstream4initERiRPPc+0x1f)[0x7fd16ebce4cf]
[force:26428] [ 9] /usr/lib/libmpi.so.12(ompi_mpi_init+0x859)[0x7f339b9c4109]
[force:26427] [ 7] [ 6] /usr/lib/libmpi.so.12(ompi_mpi_init+0x859)[0x7fdcd7560109]
[force:26429] [ 7] /usr/lib/libmpi.so.12(MPI_Init+0x15d)[0x7fdcd757e54d]
[force:26429] [ 8] /opt/openfoam4/platforms/linux64GccDPInt32Opt/lib/openmpi-system/libPstream.so(_ZN4Foam8UPstream4initERiRPPc+0x1f)[0x7fdcd97a34cf]
[force:26429] [ 9] /opt/openfoam4/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so(_ZN4Foam7argListC1ERiRPPcbbb+0x47a)[0x7fdcdaad87ca]
[force:26429] [10] XiFoam[0x42ddff]
[force:26429] [11] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xf0)[0x7fdcd99cd830]
[force:26429] [12] XiFoam[0x43d3b9]
[force:26429] *** End of error message ***
/usr/lib/libmpi.so.12(MPI_Init+0x15d)[0x7f339b9e254d]
[force:26427] [ 8] /opt/openfoam4/platforms/linux64GccDPInt32Opt/lib/openmpi-system/libPstream.so(_ZN4Foam8UPstream4initERiRPPc+0x1f)[0x7f339dc074cf]
[force:26427] [ 9] /opt/openfoam4/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so(_ZN4Foam7argListC1ERiRPPcbbb+0x47a)[0x7f339ef3c7ca]
[force:26427] [10] XiFoam[0x42ddff]
[force:26427] [11] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xf0)[0x7f339de31830]
[force:26427] [12] XiFoam[0x43d3b9]
[force:26427] *** End of error message ***
/opt/openfoam4/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so(_ZN4Foam7argListC1ERiRPPcbbb+0x47a)[0x7fbc54ccb7ca]
[force:26426] [10] XiFoam[0x42ddff]
[force:26426] [11] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xf0)[0x7fbc53bc0830]
[force:26426] [12] XiFoam[0x43d3b9]
[force:26426] *** End of error message ***
/opt/openfoam4/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so(_ZN4Foam7argListC1ERiRPPcbbb+0x47a)[0x7fd16ff037ca]
[force:26428] [10] XiFoam[0x42ddff]
[force:26428] [11] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xf0)[0x7fd16edf8830]
[force:26428] [12] XiFoam[0x43d3b9]
[force:26428] *** End of error message ***
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 26426 on node force exited on signal 7 (Bus error).


mpirun works smoothly when I use the cores on my notebook, but I will soon be running bigger cases and will need 10+ cores.
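From the backtraces the crash seems to happen inside Open MPI's shared-memory transports (mca_btl_sm / mca_btl_vader) during MPI_Init, so my guess is something on the server side, for example a full /dev/shm or /tmp where the shared-memory backing files live, rather than OpenFOAM itself. The next things I plan to test, based on standard Open MPI options (just a sketch, not yet verified on this machine):

    # check that the filesystems used for the shared-memory backing files are not full
    df -h /dev/shm /tmp

    # rerun with the shared-memory transports disabled, forcing TCP between ranks
    mpirun --mca btl self,tcp -np 4 XiFoam -parallel

    # or equivalently, exclude only the sm and vader components
    mpirun --mca btl ^sm,vader -np 4 XiFoam -parallel

If forcing TCP makes the bus error go away, that would at least confirm the shared-memory path is the problem.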

Any idea or help is much appreciated!

Bulut