mpirun OpenFOAM output is buffered, only output at the end

August 15, 2015, 11:28   #1
newbie_cfd (nija parkman), New Member
Hello,

Title says it all: when running any solver in parallel with mpirun, the OpenFOAM output is buffered and only appears once the run is finished, which makes it hard to follow the run.
I went through the mpirun manual and the forum, but so far I haven't found anything.
Is this also happening to some of you? Any fix?

Thanks,
N.

August 16, 2015, 13:08   #2
tas38 (Troy Snyder), Senior Member, Akron, OH
Quote: Originally Posted by newbie_cfd
What is the command you are using to submit the mpirun job?

I use something like the following to run the job and dump the output to a log file:

Code:
mpirun -np (# of processors) (executable) -parallel > output.log &
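Since that command backgrounds the run and sends everything to output.log, you can follow it live from the same or another terminal. A minimal example, assuming the log file name used above:

Code:
tail -f output.log
Press Ctrl+C to stop following the file; the run itself keeps going in the background.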

August 16, 2015, 13:24   #3
newbie_cfd (nija parkman), New Member
Hello,

Yes, exactly the same. The job runs fine and the output is as expected, but it is only available at the end.

Thanks,
N

August 16, 2015, 13:28   #4
tas38 (Troy Snyder), Senior Member, Akron, OH
I am not sure what you are asking. If you want output to both the file and stdout, you
could do the following:

Code:
mpirun -np (# of processors) (executable) -parallel 2>&1 | tee output.log
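One small aside, not from the thread: tee overwrites output.log on every run. If you would rather keep appending to an existing log, for example when restarting a run, tee's -a flag does that:

Code:
mpirun -np (# of processors) (executable) -parallel 2>&1 | tee -a output.log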

August 16, 2015, 14:29   #5
newbie_cfd (nija parkman), New Member
I'll try to be a bit clearer:
When I run the mpirun command (with the required arguments), the processes start OK and run fine. However, no standard output is generated until the process ends; only when it does end is the output produced.
That is why I used the term "buffered": it seems mpirun buffers all standard output and only spits it out at the end.

August 16, 2015, 15:18   #6
wyldckat (Bruno Santos), Retired Super Moderator, Lisbon, Portugal
Greetings to all!

@newbie_cfd: You will have to provide more details about your working environment, because that is not a common thing to occur with mpirun itself.
A few examples of what I mean by "work environment":
  1. What Operating System are you using?
  2. If it's Linux, which distribution?
  3. What MPI toolbox are you using? For example, run:
    Code:
    mpirun --version
  4. Are you using a workstation, a cluster or a supercomputer?
From your description, it looks like you're using a wrapper script that schedules the run, which would explain why it only outputs at the end.

The problem here is that we cannot guess what you're actually using, because mpirun almost always gives the output while it's running, unless a job scheduler is used on a cluster or supercomputer; and even then, the scheduler's submission command is usually not named "mpirun".
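A quick way to check this point yourself (my own suggestion, assuming a standard Linux shell, not something asked for above) is to see whether the mpirun in your PATH is a script or the real Open MPI launcher:

Code:
type mpirun
file -L "$(which mpirun)"
A wrapper usually shows up as a shell script, whereas the genuine Open MPI launcher is a compiled executable (often a symlink to orterun).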

Best regards,
Bruno

August 16, 2015, 15:32   #7
newbie_cfd (nija parkman), New Member
Hello,

Thanks for helping me out with this one.

So I am running on a workstation with CentOS.

Code:
mpirun --version
mpirun (Open MPI) 1.8.2


Code:
which mpirun
/usr/mpi/gcc/openmpi-1.8.2/bin/mpirun


Code:
cat /etc/centos-release
CentOS release 6.5 (Final)



Hope this helps,
Thanks,
N

August 16, 2015, 15:44   #8
wyldckat (Bruno Santos), Retired Super Moderator, Lisbon, Portugal
Try the following (change the number of cores and solver name to your own):
Code:
mpirun -np 4 -output-filename file.log interFoam -parallel &
Wait a few seconds for it to actually start running, then run:
Code:
tail -f file.log.1.0
This command follows the output of the file "file.log.1.0". Use the key combination Ctrl+C to abort the tailing.
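In case it helps (my addition, not part of Bruno's post): with Open MPI's -output-filename option each MPI rank gets its own file (file.log.1.0, file.log.1.1, and so on), so assuming GNU tail you can follow all of them at once:

Code:
tail -f file.log.*
For a typical OpenFOAM run only the master rank writes the solver log, so most of the other per-rank files will stay nearly empty.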

Another detail: perhaps you shouldn't use all of the cores available in the workstation. For example, try a case with only 2 sub-domains, so you're certain that your machine still has enough free cores to do other things.

If this still has issues, then something else is getting in the way, possibly some configuration of the file system.

----------

Edit:
Also, it seems that you're using a custom installation of Open-MPI, because the default is 1.8.1 on CentOS. Check the contents of the file "openmpi-mca-params.conf", which you can find with this command:
Code:
find /usr/mpi/gcc/openmpi-1.8.2/ -name "*mca-params.conf"
That file holds the runtime settings for Open-MPI, and something configured in it might be causing the delayed output you're witnessing.
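A quick way to see whether that setting is present (my own suggestion, simply reusing the find command above; the path is whatever find reports):

Code:
grep -n "opal_event_include" $(find /usr/mpi/gcc/openmpi-1.8.2/ -name "*mca-params.conf")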

Last edited by wyldckat; August 16, 2015 at 15:55. Reason: see "Edit:"

August 16, 2015, 16:08   #9
newbie_cfd (nija parkman), New Member
Ok, great. Solved!

Running with:
Code:
 -output-filename file.log
didn't output anything to the file.log.* files (it created the files, but they were blank).


But the "mca-params.conf" file you mentioned was the culprit: from here, it's mentioned that some configurations have "opal_event_include=poll" in their config files, which shouldn't be there.
I had "opal_event_include=epoll", which I commented out.

That fixed my issue, and I now get the output updated as the simulation runs.
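For reference, a minimal sketch of what the change looks like inside openmpi-mca-params.conf (the path in the comment is the typical etc/ location for this installation, an assumption rather than something quoted in the thread):

Code:
# file: /usr/mpi/gcc/openmpi-1.8.2/etc/openmpi-mca-params.conf
# opal_event_include=epoll
Newly started mpirun jobs pick the change up immediately; nothing needs to be rebuilt.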

Thanks, wyldckat!
N
