August 8, 2013, 22:47
Boundary condition with mpirun

#1
Senior Member
Joachim
Join Date: Mar 2012
Location: Paris, France
Posts: 145
Rep Power: 15
Hello everyone!
I just implemented Lund's recycled boundary condition (thank you Perry Johnson ) in OpenFOAM, and it seems to work just fine on a single processor. However, when I run the simulation on several processors, I get a pretty uggly error (sigSegv, sigFpe, etc...). Is there anything I should be doing to parallelize my code? Basically, the algorithms starts with void Foam::scaledMappedVelocityFixedValueFvPatchField:: updateCoeffs() { if (updated()) { return; } // Since we're inside initEvaluate/evaluate there might be processor // comms underway. Change the tag we use. int oldTag = UPstream::msgType(); UPstream::msgType() = oldTag+1; // Get the mappedPatchBase const mappedPatchBase& mpp = refCast<const mappedPatchBase> ( scaledMappedVelocityFixedValueFvPatchField:atch( ).patch() ); const fvMesh& nbrMesh = refCast<const fvMesh>(mpp.sampleMesh()); const word& fieldName = dimensionedInternalField().name(); const volVectorField& nbrField = nbrMesh.lookupObject<volVectorField>(fieldName); after that, I extract a plane from the domain, scale it, and eventually return the inlet as follows: // return the velocity values to inlet condition operator==(scaledU); // Restore tag UPstream::msgType() = oldTag; fixedValueFvPatchVectorField::updateCoeffs(); } If someone had an idea, it would be really great! I don't really feel like running my LES on a single processor. Thanks! Joachim |
Similar Threads
Thread | Thread Starter | Forum | Replies | Last Post |
Domain Imbalance | HMR | CFX | 5 | October 10, 2016 05:57 |
Radiation interface | hinca | CFX | 15 | January 26, 2014 17:11 |
An error has occurred in cfx5solve: | volo87 | CFX | 5 | June 14, 2013 17:44 |
Error finding variable "THERMX" | sunilpatil | CFX | 8 | April 26, 2013 07:00 |
Setting outlet Pressure boundary condition using CAFFA code | Mukund Pondkule | Main CFD Forum | 0 | March 16, 2011 03:23 |