Problem with parallelization and a constant field located at constant folder

November 12, 2019, 21:41   #1
rucky96 (New Member, Join Date: Sep 2019, Posts: 18)
Hi Foamers,

I have the following problem: I am solving MHD with mhdFoam for a cylinder with an external constant magnetic field. I had to modify the solver to take into account this external, constant field. The idea was to split B into a part fixed in time, BFixed(r), and a part varying in time and space, BVble(r,t). Since BFixed is the same at every time step, I placed it in the constant folder; otherwise the same file would be written to every time directory.
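For reference, BFixed is created in createFields.H roughly like this (a minimal sketch of the idea, not my exact code; dimensions and boundary conditions come from the field file itself):

Code:
Info << "Reading external magnetic field BFixed" << endl;

volVectorField BFixed
(
    IOobject
    (
        "BFixed",
        runTime.constant(),      // read from the constant/ directory
        mesh,
        IOobject::MUST_READ,
        IOobject::NO_WRITE       // constant in time, so it is never written to the time directories
    ),
    mesh
);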

The solver works well, but the problem appears when I want to divide the work into several subdomains. I ran decomposePar, then foamJob mpirun -np 2 cylinderMhdFoam -parallel &. But BFixed is not decomposed for some reason:
Code:
[0]
[0]
[0] --> FOAM FATAL ERROR:
[0] cannot find file "/home/foamusr/localfolder/tfg/cylinder/processor0/constant/BFixed"
[0]
[0]     From function virtual Foam::autoPtr<Foam::ISstream> Foam::fileOperations::uncollatedFileOperation::readStream(Foam::regIOobject&, const Foam::fileName&, const Foam::word&, bool) const
[0]     in file global/fileOperations/uncollatedFileOperation/uncollatedFileOperation.C at line 538.
[0]
FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI COMMUNICATOR 3 SPLIT FROM 0
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[1]
[1]
[1] --> FOAM FATAL ERROR:
[1] cannot find file "/home/foamusr/localfolder/tfg/cylinder/processor1/constant/BFixed"
[1]
[1]     From function virtual Foam::autoPtr<Foam::ISstream> Foam::fileOperations::uncollatedFileOperation::readStream(Foam::regIOobject&, const Foam::fileName&, const Foam::word&, bool) const
[1]     in file global/fileOperations/uncollatedFileOperation/uncollatedFileOperation.C at line 538.
[1]
FOAM parallel run exiting
[1]
[7cafdb3f0a70:00343] 1 more process has sent help message help-mpi-api.txt / mpi-abort
[7cafdb3f0a70:00343] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
Any idea?

Thanks in advance

November 17, 2019, 15:05   #2
rucky96 (New Member, Join Date: Sep 2019, Posts: 18)
The parallelization problem with a field located in the constant folder cannot be solved (easily, at least). So I found a much simpler solution: just make the solver read the field located in the 0 time folder at every time step. This gives the same result.
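In essence, the field file stays in the 0 directory (which decomposePar does distribute to every processor) and createFields.H reads it from there. A rough sketch of the idea, not my exact code:

Code:
volVectorField BFixed
(
    IOobject
    (
        "BFixed",
        runTime.timeName(),      // "0" at start-up, so processorN/0/BFixed exists after decomposePar
        mesh,
        IOobject::MUST_READ,
        IOobject::NO_WRITE       // the field never changes, so there is no need to write it again
    ),
    mesh
);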

July 17, 2020, 04:31   #3
john myce (Member, Join Date: Sep 2018, Location: France, Posts: 62)
Quote:
Originally Posted by rucky96
The parallelization problem with a field located in the constant folder cannot be solved (easily, at least). So I found a much simpler solution: just make the solver read the field located in the 0 time folder at every time step. This gives the same result.
Hi!

I am not sure I understand what you are saying. I have the same issue: the solver needs to read a field located in the constant folder, which is not included when I decompose the case. How do you force the solver to read this field?

Cheers.

January 26, 2021, 22:36   #4
saavedra00 (New Member, Join Date: Jan 2021, Location: Edmonton, Posts: 4)
Hello, I am adapting an immiscible two-phase flow solver for porous media to my own case. The solver runs fine in serial but not in parallel.

Solver compilation and the preliminary steps do not show any errors. Also, no error appears from "setFields" or "decomposePar". The error appears only when running the solver in parallel (please see below).

I believe the error has to do with the permeability "K" (a constant cell-centered scalar field) being in the "constant" folder. When I run "decomposePar", the field "K" is not decomposed into the processor directories.

Any feedback you could give me will be very appreciated.

Thank you very much for your attention and all the very best.

Sebastian


Reading porosity field eps (if present)

Reading permeability field K
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 3 in communicator MPI COMMUNICATOR 3 SPLIT FROM 0
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------

--------------------------------------------------------------------------
[0]
[0]
[0] --> FOAM FATAL ERROR:
[0] cannot find file "/home/lopezsaa/OpenFOAM/lopezsaa-8/run/porousMultiphaseFoam-openfoam-v8/tutorials/lcl-tutorials/two_phase/paSaBCAnisob5PARALLEL/processor0/constant/K"
[0]
[0] From function virtual Foam::autoPtr<Foam::ISstream> Foam::fileOperations::uncollatedFileOperation::readStream(Foam::regIOobject&, const Foam::fileName&, const Foam::word&, bool) const
[0] in file global/fileOperations/uncollatedFileOperation/uncollatedFileOperation.C at line 538.
[0]
FOAM parallel run exiting
[0]
[1]
[1]
[1] --> FOAM FATAL ERROR:
[1] cannot find file "/home/lopezsaa/OpenFOAM/lopezsaa-8/run/porousMultiphaseFoam-openfoam-v8/tutorials/lcl-tutorials/two_phase/paSaBCAnisob5PARALLEL/processor1/constant/K"
[1]
[1] From function virtual Foam::autoPtr<Foam::ISstream> Foam::fileOperations::uncollatedFileOperation::readStream(Foam::regIOobject&, const Foam::fileName&, const Foam::word&, bool) const
[1] in file global/fileOperations/uncollatedFileOperation/uncollatedFileOperation.C at line 538.
[1]
FOAM parallel run exiting
[1]
[2]
[2]
[2] --> FOAM FATAL ERROR:
[2] cannot find file "/home/lopezsaa/OpenFOAM/lopezsaa-8/run/porousMultiphaseFoam-openfoam-v8/tutorials/lcl-tutorials/two_phase/paSaBCAnisob5PARALLEL/processor2/constant/K"
[2]
[2] From function virtual Foam::autoPtr<Foam::ISstream> Foam::fileOperations::uncollatedFileOperation::readStream(Foam::regIOobject&, const Foam::fileName&, const Foam::word&, bool) const
[2] in file global/fileOperations/uncollatedFileOperation/uncollatedFileOperation.C at line 538.
[2]
FOAM parallel run exiting
[2]
[3]
[3]
[3] --> FOAM FATAL ERROR:
[3] cannot find file "/home/lopezsaa/OpenFOAM/lopezsaa-8/run/porousMultiphaseFoam-openfoam-v8/tutorials/lcl-tutorials/two_phase/paSaBCAnisob5PARALLEL/processor3/constant/K"
[3]
[3] From function virtual Foam::autoPtr<Foam::ISstream> Foam::fileOperations::uncollatedFileOperation::readStream(Foam::regIOobject&, const Foam::fileName&, const Foam::word&, bool) const
[3] in file global/fileOperations/uncollatedFileOperation/uncollatedFileOperation.C at line 538.
[3]
FOAM parallel run exiting
[3]


January 28, 2021, 04:55   #5
geth03 (Senior Member, Join Date: Dec 2019, Location: Cologne, Germany, Posts: 355)
Did you modify anything in the solver code?
How do you create this K on the solver side?
volScalarField K ( IOobject ... from where does it read? ...

More input is required ...

January 28, 2021, 11:12   #6
saavedra00 (New Member, Join Date: Jan 2021, Location: Edmonton, Posts: 4)
Hello geth03.

After consulting experts I was able to solve the problem.

The problem was that the volScalarField K was being read from the "constant" folder. I learned that one shouldn't read fields from the constant folder; they belong in the time directory, which is what gets decomposed.

So I just had to modify the "createFields.H" file from this:

Quote:
//
Info << nl << "Reading permeability field K" << endl;
volScalarField K
(
    IOobject
    (
        "K",
        runTime.constant(),
        mesh,
        IOobject::MUST_READ,
        IOobject::AUTO_WRITE
    ),
    mesh
);

to this:


Quote:
//
Info << nl << "Reading permeability field K" << endl;
volScalarField K
(
    IOobject
    (
        "K",
        runTime.timeName(),
        mesh,
        IOobject::MUST_READ,
        IOobject::AUTO_WRITE
    ),
    mesh
);
After that I recompiled the solver, ran "decomposePar", ran in parallel, and finally ran "reconstructPar" to visualize the results in ParaView.
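For the record, the run sequence was roughly the following (the solver name here is just a placeholder for my modified solver; I used 4 subdomains):

Code:
decomposePar
mpirun -np 4 myPorousSolver -parallel
reconstructPar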

Thanks a lot for offering help!

January 29, 2021, 01:35   #7
geth03 (Senior Member, Join Date: Dec 2019, Location: Cologne, Germany, Posts: 355)
No problem,

I assumed that K was taken from the constant folder; that is why I asked how you read and instantiate it.

Anyway, glad you solved it already.

January 29, 2021, 16:54   #8
random_ran (Member, Join Date: Aug 2016, Posts: 69)
> runTime.constant(),

what if

> runTime.monkey(),

What does this line of code really do?
__________________
Yours in CFD,

Ran


Tags
cannot find, foam fatal error, magnetic field, mhdfoam, parallelization





