
MPI error in parallel application

June 30, 2017, 23:28   #1
usv001
Senior Member
Join Date: Sep 2015
Location: Singapore
Posts: 102
Hello Foamers,

I have coded a utility application capable of running in parallel. I tested it on my laptop (4-8 cores) and on a cluster (48-96 cores), and it ran successfully every time. Recently, I tried to run it on a mesh using 96 cores, but it keeps crashing with the following error:

Code:
[93] 
[93] --> FOAM FATAL IO ERROR: 
[93] wrong token type - expected int, found on line 0 the word 'x'
[93] 
[93] file: IOstream at line 0.
[93] 
[93]     From function operator>>(Istream&, int&)
[93]     in file primitives/ints/int/intIO.C at line 68.
[93] 
FOAM parallel run exiting
[93] 

[... identical errors from ranks 45, 87, 88, 89, 90, 91, 92, 94 and 95 elided ...]

[std0741:29931] 9 more processes have sent help message help-mpi-api.txt / mpi-abort
[std0741:29931] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
Sometimes the word in the error message is 'p' or 'X' instead of 'x'. Strangely enough, it runs fine with 48 processors on the same mesh. Has anybody faced a similar problem, or does anyone have an idea why this might happen when the number of processors is increased?
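
For context, the utility exchanges data between processors over OPstream/IPstream. Below is a minimal sketch of such a pairwise exchange; the names and payload are placeholders, not my actual code. The error message itself comes from OpenFOAM's int parser in intIO.C, i.e. the receiving stream held a word where an integer was expected, which is what you would see if a receive were ever paired with the wrong send:

Code:
#include "OPstream.H"
#include "IPstream.H"
#include "labelList.H"

// Minimal sketch only; 'myLabels' is a placeholder payload.
// Each rank sends a labelList to every other rank, then receives one
// back. If a receive is ever paired with the wrong send, the parser
// hits foreign bytes and fails with
// "wrong token type - expected int, found ... the word 'x'".
labelList myLabels(10, Pstream::myProcNo());

for (label procI = 0; procI < Pstream::nProcs(); procI++)
{
    if (procI == Pstream::myProcNo()) continue;

    OPstream toProc(Pstream::blocking, procI);
    toProc << myLabels;
}

for (label procI = 0; procI < Pstream::nProcs(); procI++)
{
    if (procI == Pstream::myProcNo()) continue;

    IPstream fromProc(Pstream::blocking, procI);
    labelList otherLabels(fromProc);    // parses ints; fails in intIO.C
}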

I'll be glad to provide any additional details.

Many thanks,
USV

August 1, 2017, 00:34   #2
usv001
Senior Member
Join Date: Sep 2015
Location: Singapore
Posts: 102
Dear Foamers,

I have not managed to figure out the problem yet. As you can see, the error mentions "IOstream", which I am certain refers to the IPstream from which the program is supposed to receive some data. The perplexing thing is that there was no problem running this program on 48 cores; the error appears only when I increase the number of cores to 192.

Code:
[119] --> FOAM FATAL IO ERROR:
[119] error in IOstream "IOstream" for operation operator>>(Istream&, List<T>&)
[119]
[119] file: IOstream at line 0.
[119]
[119]     From function IOstream::fatalCheck(const char*) const
[119]     in file db/IOstreams/IOstreams/IOstream.C at line 114.
[119]
FOAM parallel run exiting
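For context, a typical exchange of a List<T> between processors uses PstreamBuffers, and its receive side goes through exactly this operator>>(Istream&, List<T>&) path. A minimal sketch (placeholder names and payload, not my actual code):

Code:
#include "PstreamBuffers.H"
#include "UOPstream.H"
#include "UIPstream.H"
#include "scalarList.H"

// Minimal sketch only; 'myData' is a placeholder payload.
// PstreamBuffers completes all sends before any rank starts reading,
// so receives cannot cross with the wrong sends.
List<scalar> myData(10, 1.0);

PstreamBuffers pBufs(Pstream::nonBlocking);

for (label procI = 0; procI < Pstream::nProcs(); procI++)
{
    if (procI != Pstream::myProcNo())
    {
        UOPstream toProc(procI, pBufs);
        toProc << myData;
    }
}

pBufs.finishedSends();    // all sends must complete before the reads

for (label procI = 0; procI < Pstream::nProcs(); procI++)
{
    if (procI != Pstream::myProcNo())
    {
        UIPstream fromProc(procI, pBufs);
        List<scalar> received;
        fromProc >> received;    // operator>>(Istream&, List<T>&)
    }
}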
It would be a great help if someone has an idea of why this happens.

Many thanks,
USV

September 14, 2017, 11:30   #3
MrLento234 (Lento)
New Member
Join Date: Oct 2012
Posts: 2
I am facing a similar issue.

What I noticed is that it also depends on the decomposition method: the run works when I decompose with 'simple' but not with 'scotch'.

So far, I don't know what the source of the error is.
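
For reference, switching between the two approaches is just the method entry in system/decomposeParDict. A sketch (the subdomain count and simpleCoeffs values below are only an example, not from my case):

Code:
// system/decomposeParDict (sketch; example values)
numberOfSubdomains  96;

method              scotch;    // fails for me

// method           simple;    // works
// simpleCoeffs
// {
//     n            (4 4 6);   // 4 x 4 x 6 = 96 subdomains
//     delta        0.001;
// }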


Tags
error, mpi, parallel





