
CFD Online Discussion Forums (https://www.cfd-online.com/Forums/)
-   OpenFOAM Running, Solving & CFD (https://www.cfd-online.com/Forums/openfoam-solving/)
-   -   mpirun openfoam output is buffered, only output at the end (https://www.cfd-online.com/Forums/openfoam-solving/158106-mpirun-openfoam-output-buffered-only-output-end.html)

newbie_cfd August 15, 2015 11:28

mpirun openfoam output is buffered, only output at the end
 
Hello,

Title says it all: when using mpirun to run any solver in parallel, the OpenFOAM output is buffered and only appears once the run has finished, which makes the run hard to follow.
I went through the mpirun manual and this forum, but so far I haven't found anything.
Is this happening to anyone else? Any fix?

Thanks,
N.

tas38 August 16, 2015 13:08

Quote:

Originally Posted by newbie_cfd (Post 559680)
Hello,

Title says it all: when using mpirun to run any solver in parallel, the OpenFOAM output is buffered and only appears once the run has finished, which makes the run hard to follow.
I went through the mpirun manual and this forum, but so far I haven't found anything.
Is this happening to anyone else? Any fix?

Thanks,
N.

What is the command you are using to submit the mpirun job?

I use something like the following to run the job and dump the output to a log file:

Code:

mpirun -np (# of processors) (executable) -parallel > output.log &

newbie_cfd August 16, 2015 13:24

Hello,

Yes exactly the same. The job runs fine, output as expected but only available at the end.

Thanks,
N

tas38 August 16, 2015 13:28

I am not sure what you are asking. If you want the output to go to both the file and stdout, you
could do the following:

Code:

mpirun -np (# of processors) (executable) -parallel 2>&1 | tee output.log
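As an aside, the order matters here: writing `2>&1` before the pipe duplicates stderr onto the piped stdout, so tee captures both streams. A minimal sketch of the same pipeline, with plain echo standing in for the solver (no MPI needed):

```shell
# Hedged sketch: echo stands in for the solver. "2>&1" before the pipe
# merges stderr into the piped stdout, so tee writes both streams to
# output.log while also echoing them to the terminal.
{ echo "to stdout"; echo "to stderr" >&2; } 2>&1 | tee /tmp/output.log
```

Both lines end up in /tmp/output.log and on screen; with `> output.log` alone, stderr would have gone to the terminal only.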

newbie_cfd August 16, 2015 14:29

I'll try to be a bit clearer:
when I run the mpirun command (with the required arguments), the processes start and run fine, but no standard output is generated until the process ends. Only once it ends is the output produced.
That's why I used the term "buffered": it seems mpirun buffers all standard output and only spits it out at the end.

wyldckat August 16, 2015 15:18

Greetings to all!

@newbie_cfd: You will have to provide more details about your working environment, because this is not common behaviour for mpirun itself.
A few examples of what I mean by "working environment":
  1. What Operating System are you using?
  2. If it's Linux, which distribution?
  3. What MPI toolbox are you using? For example, run:
    Code:

    mpirun --version
  4. Are you using a workstation, a cluster or a supercomputer?
Because from your description, it looks like you're using a wrapper script that schedules the run, which would explain why it only outputs at the end.

The problem here is that we cannot guess what you're actually using, because mpirun usually gives us the output while it's running, unless a job scheduler is used on a cluster or supercomputer; although in those cases the scheduler is usually not named "mpirun".

Best regards,
Bruno

newbie_cfd August 16, 2015 15:32

Hello,

Thanks for helping me out with this one.

So I am running on a workstation with CentOS.

Code:

mpirun --version
mpirun (Open MPI) 1.8.2


Code:

which mpirun
/usr/mpi/gcc/openmpi-1.8.2/bin/mpirun


Code:

cat /etc/centos-release
CentOS release 6.5 (Final)



Hope this helps,
Thanks,
N

wyldckat August 16, 2015 15:44

Try the following (change the number of cores and solver name to your own):
Code:

mpirun -np 4 -output-filename file.log interFoam -parallel &
Wait a few seconds for it to effectively start running and then run:
Code:

tail -f file.log.1.0
This command follows the output of the file "file.log.1.0". Use the key combination Ctrl+C to abort the tailing.

Another detail is that perhaps you shouldn't use all of the cores available in the workstation. For example, try with a case with only 2 sub-domains, so that this way you're certain that your machine still has enough cores to do other stuff.

If this still has issues, then something else is getting in the way, possibly some configuration of the file system.

----------

Edit:
Also, it seems that you're using a custom installation of Open MPI, because the default on CentOS is 1.8.1. Check the contents of the file "openmpi-mca-params.conf", which you can find with this command:
Code:

find /usr/mpi/gcc/openmpi-1.8.2/ -name "*mca-params.conf"
That file holds the runtime settings for Open MPI, and something configured in it might be causing the delayed output you're witnessing.
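To see at a glance which parameters are active in that file, filtering out the comment lines works well. A sketch against a sample file (the real path is whatever the find command above reports; the sample contents below just mimic the offending setup):

```shell
# Sketch: list only the active (non-comment) MCA parameters.
# A sample file is created here so the commands are self-contained;
# point grep at the real openmpi-mca-params.conf instead.
cat > /tmp/sample-mca-params.conf <<'EOF'
# Open MPI MCA parameters (sample)
opal_event_include=epoll
EOF
grep -v '^[[:space:]]*#' /tmp/sample-mca-params.conf
```

Any line the grep prints is a setting Open MPI will actually apply at run time.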

newbie_cfd August 16, 2015 16:08

Ok, great. Solved!

Running with:
Code:

-output-filename file.log
didn't output anything to the file.log.* files (it did create the files, but they were blank).


But the "mca-params.conf" file you mentioned was the culprit: from here it's mentioned that some configurations have "opal_event_include=poll" in their config files, which shouldn't be there.
I had "opal_event_include=epoll", which I commented out.
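For reference, that edit can also be scripted. A sketch using sed on a sample copy (run it against the real conf file path from the find command above, with a backup and the appropriate permissions):

```shell
# Sketch: comment out the opal_event_include line in place with sed.
# Shown on a sample copy; substitute the real conf file path.
cat > /tmp/mca-params.conf <<'EOF'
opal_event_include=epoll
EOF
sed -i 's/^opal_event_include/#&/' /tmp/mca-params.conf
cat /tmp/mca-params.conf
```

After the edit the line reads `#opal_event_include=epoll`, so Open MPI falls back to its default event handling.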

It fixes my issues and I get the output updated as the simulations runs.

Thanks Wyldckat!
N

