#21
Member
sonu
Join Date: Jul 2013
Location: delhi
Posts: 92
Rep Power: 14
Greetings Bruno and everyone,
I am trying to run my cold-flow simulation with the rhoSimplecFoam solver, but after some run time it ends with the following error. Can you please help me understand where I am going wrong? Thanks in advance.
Code:
Time = 43
GAMG: Solving for Ux, Initial residual = 0.19957, Final residual = 0.0130805, No Iterations 1
GAMG: Solving for Uy, Initial residual = 0.432992, Final residual = 0.030986, No Iterations 1
GAMG: Solving for e, Initial residual = 0.00150254, Final residual = 0.000111488, No Iterations 1
GAMG: Solving for p, Initial residual = 0.164657, Final residual = 0.122731, No Iterations 1000
time step continuity errors : sum local = 0.0390443, global = -0.0148295, cumulative = -0.639136
rho max/min : 1 0.382345
GAMG: Solving for epsilon, Initial residual = 0.000488404, Final residual = 2.53703e-08, No Iterations 1
bounding epsilon, min: -4.33742e-06 max: 857.902 average: 1.75282
GAMG: Solving for k, Initial residual = 0.128439, Final residual = 0.0118675, No Iterations 1
bounding k, min: 1e-16 max: 5005.77 average: 1.11456
ExecutionTime = 1274.69 s  ClockTime = 1275 s

Time = 44
GAMG: Solving for Ux, Initial residual = 0.00778892, Final residual = 0.000427613, No Iterations 1
GAMG: Solving for Uy, Initial residual = 0.262673, Final residual = 0.0165556, No Iterations 1
GAMG: Solving for e, Initial residual = 1.08917e-05, Final residual = 6.74763e-07, No Iterations 1
GAMG: Solving for p, Initial residual = 0.198508, Final residual = 0.14558, No Iterations 1000
time step continuity errors : sum local = 0.0445931, global = -0.0148271, cumulative = -0.653963
rho max/min : 1 0.382345
GAMG: Solving for epsilon, Initial residual = 0.000403613, Final residual = 1.28867e-07, No Iterations 1
bounding epsilon, min: -3.4861e-08 max: 857.872 average: 1.76686
GAMG: Solving for k, Initial residual = 0.0963778, Final residual = 0.00239046, No Iterations 2
bounding k, min: -2.2769e-08 max: 18855.3 average: 2.6373
ExecutionTime = 1302.92 s  ClockTime = 1303 s

Time = 45
GAMG: Solving for Ux, Initial residual = 0.823156, Final residual = 0.0483317, No Iterations 1
GAMG: Solving for Uy, Initial residual = 0.673439, Final residual = 0.034677, No Iterations 1
GAMG: Solving for e, Initial residual = 0.00301749, Final residual = 0.000137252, No Iterations 1
#0  Foam::error::printStack(Foam::Ostream&) at ??:?
#1  Foam::sigFpe::sigHandler(int) at ??:?
#2  in "/lib/x86_64-linux-gnu/libc.so.6"
#3  double Foam::sumProd<double>(Foam::UList<double> const&, Foam::UList<double> const&) at ??:?
#4  Foam::PBiCG::solve(Foam::Field<double>&, Foam::Field<double> const&, unsigned char) const at ??:?
#5  Foam::GAMGSolver::solveCoarsestLevel(Foam::Field<double>&, Foam::Field<double> const&) const at ??:?
#6  Foam::GAMGSolver::Vcycle(Foam::PtrList<Foam::lduMatrix::smoother> const&, Foam::Field<double>&, Foam::Field<double> const&, Foam::Field<double>&, Foam::Field<double>&, Foam::Field<double>&, Foam::PtrList<Foam::Field<double> >&, Foam::PtrList<Foam::Field<double> >&, unsigned char) const at ??:?
#7  Foam::GAMGSolver::solve(Foam::Field<double>&, Foam::Field<double> const&, unsigned char) const at ??:?
#8  Foam::fvMatrix<double>::solveSegregated(Foam::dictionary const&) at ??:?
#9  Foam::fvMatrix<double>::solve(Foam::dictionary const&) at ??:?
#10  Foam::fvMatrix<double>::solve() at ??:?
#11  at ??:?
#12  __libc_start_main in "/lib/x86_64-linux-gnu/libc.so.6"
#13  at ??:?
Floating point exception (core dumped)

Last edited by wyldckat; August 27, 2013 at 18:28. Reason: Added [CODE][/CODE]
#22
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 130
Greetings yash.aesi,
There is not much information to work with, but from the output you've shown: Quote:
Best regards,
Bruno
__________________
#23
Member
sonu
Join Date: Jul 2013
Location: delhi
Posts: 92
Rep Power: 14
Thanks Bruno,
I will try to check the BCs.
#24
Member
sonu
Join Date: Jul 2013
Location: delhi
Posts: 92
Rep Power: 14
Hello Bruno,
I checked my BCs. The run now proceeds without crashing, but I think the underlying problem is not solved yet: the pressure issue you mentioned in your last post is still there. The output shows (without giving an error):
Code:
Time = 298
GAMG: Solving for Ux, Initial residual = 0.00248195, Final residual = 2.77057e-05, No Iterations 1
GAMG: Solving for Uy, Initial residual = 0.00686025, Final residual = 0.000236565, No Iterations 1
GAMG: Solving for e, Initial residual = 0.00343494, Final residual = 5.85371e-05, No Iterations 1
GAMG: Solving for p, Initial residual = 0.20715, Final residual = 0.152599, No Iterations 1000
time step continuity errors : sum local = 0.0140403, global = -0.011032, cumulative = -0.311197
rho max/min : 1 0.352192
GAMG: Solving for epsilon, Initial residual = 0.000197142, Final residual = 7.86035e-06, No Iterations 1
GAMG: Solving for k, Initial residual = 0.00474871, Final residual = 0.000184764, No Iterations 1
ExecutionTime = 860.06 s  ClockTime = 860 s

Time = 299
GAMG: Solving for Ux, Initial residual = 0.00247746, Final residual = 2.75936e-05, No Iterations 1
GAMG: Solving for Uy, Initial residual = 0.0068694, Final residual = 0.000236605, No Iterations 1
GAMG: Solving for e, Initial residual = 0.00342726, Final residual = 5.82198e-05, No Iterations 1
GAMG: Solving for p, Initial residual = 0.209186, Final residual = 0.153251, No Iterations 1000
time step continuity errors : sum local = 0.0140198, global = -0.0110261, cumulative = -0.322224
rho max/min : 1 0.352192
GAMG: Solving for epsilon, Initial residual = 0.000196554, Final residual = 7.82037e-06, No Iterations 1
GAMG: Solving for k, Initial residual = 0.00473606, Final residual = 0.000183578, No Iterations 1
ExecutionTime = 889.64 s  ClockTime = 890 s

Thanks a lot in advance.
Last edited by wyldckat; August 31, 2013 at 15:56. Reason: Added [CODE][/CODE]
#25
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 130
Hi Sonu,
From what I can see, the pressure at the outlet should not be defined as type "calculated": since you say the velocity there is of type "zeroGradient", that combination leaves the outlet boundary undefined. For ideas on what the boundary conditions should be, see the OpenFOAM tutorials and the "Boundary Conditions" link on this page: http://foam.sourceforge.net/docs/cpp/index.html
Best regards,
Bruno
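A common subsonic outlet pairing fixes the pressure and lets the velocity float. As a minimal sketch (the patch name and values here are illustrative, not taken from Sonu's case):
Code:
// 0/p -- illustrative sketch
outlet
{
    type            fixedValue;
    value           uniform 101325;
}

// 0/U -- illustrative sketch; inletOutlet acts as zeroGradient for
// outflow and clamps any backflow to inletValue
outlet
{
    type            inletOutlet;
    inletValue      uniform (0 0 0);
    value           uniform (0 0 0);
}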
__________________
#26
Member
sonu
Join Date: Jul 2013
Location: delhi
Posts: 92
Rep Power: 14
Greetings Bruno,
Thanks for suggesting those links; they are useful for my understanding. However, I had already changed the outlet BC to zeroGradient, and the simulation keeps running without converging. Right now I don't have output to show, but I will post it another day.
Regards,
Sonu
#27
Member
sonu
Join Date: Jul 2013
Location: delhi
Posts: 92
Rep Power: 14
Hello,
Here is what is shown in the output:
Code:
Time = 1499
GAMG: Solving for Ux, Initial residual = 0.000370436, Final residual = 1.78681e-06, No Iterations 1
GAMG: Solving for Uy, Initial residual = 0.000186964, Final residual = 4.67884e-06, No Iterations 1
GAMG: Solving for e, Initial residual = 0.000322874, Final residual = 4.85704e-06, No Iterations 1
GAMG: Solving for p, Initial residual = 0.0030172, Final residual = 0.000281682, No Iterations 8
time step continuity errors : sum local = 4.1905e-08, global = -8.58901e-19, cumulative = 2.08053e-17
rho max/min : 0.1 0.1
GAMG: Solving for epsilon, Initial residual = 1.37661e-06, Final residual = 4.21716e-08, No Iterations 1
GAMG: Solving for k, Initial residual = 0.000106335, Final residual = 3.04613e-06, No Iterations 1
ExecutionTime = 1238.57 s  ClockTime = 1404 s

Time = 1500
GAMG: Solving for Ux, Initial residual = 0.000370261, Final residual = 1.78544e-06, No Iterations 1
GAMG: Solving for Uy, Initial residual = 0.000187208, Final residual = 4.67032e-06, No Iterations 1
GAMG: Solving for e, Initial residual = 0.000322533, Final residual = 4.85232e-06, No Iterations 1
GAMG: Solving for p, Initial residual = 0.00300666, Final residual = 0.000292086, No Iterations 8
time step continuity errors : sum local = 4.34695e-08, global = -1.09357e-18, cumulative = 1.97117e-17
rho max/min : 0.1 0.1
GAMG: Solving for epsilon, Initial residual = 1.37571e-06, Final residual = 4.21285e-08, No Iterations 1
GAMG: Solving for k, Initial residual = 0.000106287, Final residual = 3.04439e-06, No Iterations 1
ExecutionTime = 1241.65 s  ClockTime = 1407 s
Code:
dimensions [1 -1 -2 0 0 0 0];
internalField uniform 101325;
boundaryField
{
fuel_inlet
{
type zeroGradient;
//type mixed;
refValue uniform 101325;
refGradient uniform 0;
valueFraction uniform 0.3;
}
coflow_inlet
{
type zeroGradient;
//type mixed;
refValue uniform 101325;
refGradient uniform 0;
valueFraction uniform 0.3;
}
Outlet
{
type zeroGradient;
//type mixed;
//refValue uniform 101325;
//refGradient uniform 0;
//valueFraction uniform 1;
//type transonicOutletPressure;
//U U;
//phi phi;
//gamma 1.4;
//psi psi;
//pInf uniform 101325;
}
Axis
{
type symmetryPlane;
}
Upper_wall
{
type zeroGradient;
}
frontAndBack
{
type empty;
}
}

Regards,
Sonu

Last edited by wyldckat; August 31, 2013 at 17:07. Reason: Added [CODE][/CODE]
#28
Senior Member
Ehsan
Join Date: Oct 2012
Location: Iran
Posts: 2,208
Rep Power: 28
The outlet BC for pressure should be fixedValue; assign a back pressure there.
__________________
Injustice anywhere is a threat to justice everywhere. Martin Luther King.
To be or not to be, that's the question!
The only stupid question is the one that goes unasked.
#29
Member
sonu
Join Date: Jul 2013
Location: delhi
Posts: 92
Rep Power: 14
Greetings Ehsan and Bruno,
I changed my outlet pressure BC from zeroGradient to fixedValue, but now it is giving an error again:
Code:
Time = 19
GAMG: Solving for Ux, Initial residual = 0.140082, Final residual = 0.0129345, No Iterations 1
GAMG: Solving for Uy, Initial residual = 0.111632, Final residual = 0.00749058, No Iterations 1
GAMG: Solving for e, Initial residual = 0.111638, Final residual = 0.00888687, No Iterations 1
GAMG: Solving for p, Initial residual = 0.0145399, Final residual = 0.00133764, No Iterations 14
time step continuity errors : sum local = 0.000262946, global = 2.02579e-05, cumulative = -3.88151e-05
rho max/min : 1.35725 0.657245
GAMG: Solving for epsilon, Initial residual = 0.0506423, Final residual = 0.000552153, No Iterations 1
bounding epsilon, min: -0.164911 max: 1392.55 average: 1.11572
GAMG: Solving for k, Initial residual = 0.0791398, Final residual = 0.000638985, No Iterations 1
ExecutionTime = 112.35 s  ClockTime = 115 s

Time = 20
GAMG: Solving for Ux, Initial residual = 0.163277, Final residual = 0.00968811, No Iterations 1
GAMG: Solving for Uy, Initial residual = 0.102144, Final residual = 0.0040134, No Iterations 1
GAMG: Solving for e, Initial residual = 0.999875, Final residual = 0.0148186, No Iterations 2
#0  Foam::error::printStack(Foam::Ostream&) at ??:?
#1  Foam::sigFpe::sigHandler(int) at ??:?
#2  in "/lib/x86_64-linux-gnu/libc.so.6"
#3  Foam::hePsiThermo<Foam::psiThermo, Foam::pureMixture<Foam::sutherlandTransport<Foam::species::thermo<Foam::hConstThermo<Foam::perfectGas<Foam::specie> >, Foam::sensibleInternalEnergy> > > >::calculate() at ??:?
#4  Foam::hePsiThermo<Foam::psiThermo, Foam::pureMixture<Foam::sutherlandTransport<Foam::species::thermo<Foam::hConstThermo<Foam::perfectGas<Foam::specie> >, Foam::sensibleInternalEnergy> > > >::correct() at ??:?
#5  at ??:?
#6  __libc_start_main in "/lib/x86_64-linux-gnu/libc.so.6"
#7  at ??:?
Floating point exception (core dumped)

Thanks

Last edited by wyldckat; September 1, 2013 at 08:11. Reason: Added [CODE][/CODE]
#30
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 130
Hi Sonu,
First, please follow the instructions in my second signature link for when you need to post output or code, namely: How to post code using [CODE]

Second, you still have epsilon values that are outside the normal operating range, namely:
Code:
epsilon, min: -0.164911

Last but not least, you should not jump directly to such high flow rates. With OpenFOAM, as with anything you do not yet know well, do not jump straight into the final case set-up: it is very unlikely that the full simulation will work at the first attempt, and you will then have too many possible reasons for failure, making it nearly impossible to fix all of the problems in a single step. Instead, evolve gradually from the simplest form of your problem, increasing the level of complexity one detail at a time.

Best regards,
Bruno
__________________
Last edited by wyldckat; September 1, 2013 at 10:16. Reason: fixed typos
#31
New Member
Join Date: Jan 2013
Location: Lisboa-Funchal
Posts: 23
Rep Power: 14
I was running a compressible LES simulation when it stopped with a similar or the same error, displayed just below.
Code:
Mean and max Courant Numbers = 0.0327681 0.0831581 Time = 0.00128178 diagonal: Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0 diagonal: Solving for rhoUx, Initial residual = 0, Final residual = 0, No Iterations 0 diagonal: Solving for rhoUy, Initial residual = 0, Final residual = 0, No Iterations 0 diagonal: Solving for rhoUz, Initial residual = 0, Final residual = 0, No Iterations 0 smoothSolver: Solving for Ux, Initial residual = 1.9902e-06, Final residual = 2.04476e-14, No Iterations 3 smoothSolver: Solving for Uy, Initial residual = 3.69969e-06, Final residual = 3.95952e-14, No Iterations 3 smoothSolver: Solving for Uz, Initial residual = 2.22835e-05, Final residual = 4.25565e-13, No Iterations 3 diagonal: Solving for rhoE, Initial residual = 0, Final residual = 0, No Iterations 0 smoothSolver: Solving for e, Initial residual = 9.02514e-06, Final residual = 8.30103e-13, No Iterations 3 ExecutionTime = 40520.5 s ClockTime = 40657 s Mean and max Courant Numbers = 0.0327679 0.083161 Time = 0.00128182 diagonal: Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0 diagonal: Solving for rhoUx, Initial residual = 0, Final residual = 0, No Iterations 0 diagonal: Solving for rhoUy, Initial residual = 0, Final residual = 0, No Iterations 0 diagonal: Solving for rhoUz, Initial residual = 0, Final residual = 0, No Iterations 0 smoothSolver: Solving for Ux, Initial residual = 2.00094e-06, Final residual = 2.06337e-14, No Iterations 3 smoothSolver: Solving for Uy, Initial residual = 3.70838e-06, Final residual = 4.11045e-14, No Iterations 3 smoothSolver: Solving for Uz, Initial residual = 2.23139e-05, Final residual = 4.33116e-13, No Iterations 3 diagonal: Solving for rhoE, Initial residual = 0, Final residual = 0, No Iterations 0 [24] #0 Foam::error::printStack(Foam::Ostream&)[56] #0 Foam::error::printStack(Foam::Ostream&)-------------------------------------------------------------------------- An MPI process has executed 
an operation involving a call to the "fork()" system call to create a child process. Open MPI is currently operating in a condition that could result in memory corruption or other system errors; your MPI job may hang, crash, or produce silent data corruption. The use of fork() (or system() or other calls that create child processes) is strongly discouraged. The process that invoked fork was: Local host: g03 (PID 33539) MPI_COMM_WORLD rank: 24 If you are *absolutely sure* that your application will successfully and correctly survive a call to fork(), you may disable this warning by setting the mpi_warn_on_fork MCA parameter to 0. -------------------------------------------------------------------------- [40] #0 Foam::error::printStack(Foam::Ostream&)[8] #0 Foam::error::printStack(Foam::Ostream&) in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so" [24] #1 Foam::sigFpe::sigHandler(int) in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so" [8] #1 Foam::sigFpe::sigHandler(int) in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so" [56] #1 Foam::sigFpe::sigHandler(int) in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so" [40] #1 Foam::sigFpe::sigHandler(int) in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so" [24] #2 in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so" [8] #2 in "/lib/x86_64-linux-gnu/libc.so.6" [8] #3 Foam::ePsiThermo<Foam::pureMixture<Foam::sutherlandTransport<Foam::specieThermo<Foam::hConstThermo<Foam::perfectGas> > > > >::calculate() in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so" [56] #2 in "/lib/x86_64-linux-gnu/libc.so.6" [24] #3 Foam::ePsiThermo<Foam::pureMixture<Foam::sutherlandTransport<Foam::specieThermo<Foam::hConstThermo<Foam::perfectGas> > > > >::calculate() in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libbasicThermophysicalModels.so" [8] #4 
Foam::ePsiThermo<Foam::pureMixture<Foam::sutherlandTransport<Foam::specieThermo<Foam::hConstThermo<Foam::perfectGas> > > > >::correct() in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libbasicThermophysicalModels.so" [8] #5 in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libbasicThermophysicalModels.so" [24] #4 Foam::ePsiThermo<Foam::pureMixture<Foam::sutherlandTransport<Foam::specieThermo<Foam::hConstThermo<Foam::perfectGas> > > > >::correct() in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so" [40] #2 in "/lib/x86_64-linux-gnu/libc.so.6" [56] #3 Foam::ePsiThermo<Foam::pureMixture<Foam::sutherlandTransport<Foam::specieThermo<Foam::hConstThermo<Foam::perfectGas> > > > >::calculate() in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libbasicThermophysicalModels.so" [24] #5 in "/lib/x86_64-linux-gnu/libc.so.6" [40] #3 Foam::ePsiThermo<Foam::pureMixture<Foam::sutherlandTransport<Foam::specieThermo<Foam::hConstThermo<Foam::perfectGas> > > > >::calculate() in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libbasicThermophysicalModels.so" [56] #4 Foam::ePsiThermo<Foam::pureMixture<Foam::sutherlandTransport<Foam::specieThermo<Foam::hConstThermo<Foam::perfectGas> > > > >::correct() [8] in "/opt/openfoam201/platforms/linux64GccDPOpt/bin/rhoCentralFoam" [8] #6 __libc_start_main in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libbasicThermophysicalModels.so" [56] #5 [24] in "/opt/openfoam201/platforms/linux64GccDPOpt/bin/rhoCentralFoam" [24] #6 __libc_start_main in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libbasicThermophysicalModels.so" [40] #4 Foam::ePsiThermo<Foam::pureMixture<Foam::sutherlandTransport<Foam::specieThermo<Foam::hConstThermo<Foam::perfectGas> > > > >::correct() in "/lib/x86_64-linux-gnu/libc.so.6" [8] #7 in "/lib/x86_64-linux-gnu/libc.so.6" [24] #7 in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libbasicThermophysicalModels.so" [40] #5 [56] in "/opt/openfoam201/platforms/linux64GccDPOpt/bin/rhoCentralFoam" [56] #6 
__libc_start_main[8] in "/opt/openfoam201/platforms/linux64GccDPOpt/bin/rhoCentralFoam" [g03:33523] *** Process received signal *** [g03:33523] Signal: Floating point exception (8) [g03:33523] Signal code: (-6) [g03:33523] Failing at address: 0x3f8000082f3 [g03:33523] [ 0] /lib/x86_64-linux-gnu/libc.so.6(+0x32480) [0x2b8d0a824480] [g03:33523] [ 1] /lib/x86_64-linux-gnu/libc.so.6(gsignal+0x35) [0x2b8d0a824405] [g03:33523] [ 2] /lib/x86_64-linux-gnu/libc.so.6(+0x32480) [0x2b8d0a824480] [g03:33523] [ 3] /opt/openfoam201/platforms/linux64GccDPOpt/lib/libbasicThermophysicalModels.so(_ZN4Foam10ePsiThermoINS_11pureMixtureINS_19sutherlandTransportINS_12specieThermoINS_12hConstThermoINS_10perfectGasEEEEEEEEEE9calculateEv+0x3f9) [0x2b8d081471e9] [g03:33523] [ 4] /opt/openfoam201/platforms/linux64GccDPOpt/lib/libbasicThermophysicalModels.so(_ZN4Foam10ePsiThermoINS_11pureMixtureINS_19sutherlandTransportINS_12specieThermoINS_12hConstThermoINS_10perfectGasEEEEEEEEEE7correctEv+0x32) [0x2b8d0814d262] [g03:33523] [ 5] rhoCentralFoam() [0x4236cb] [g03:33523] [ 6] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xfd) [0x2b8d0a810ead] [g03:33523] [ 7] rhoCentralFoam() [0x41c709] [g03:33523] *** End of error message *** [24] in "/opt/openfoam201/platforms/linux64GccDPOpt/bin/rhoCentralFoam" [g03:33539] *** Process received signal *** [g03:33539] Signal: Floating point exception (8) [g03:33539] Signal code: (-6) [g03:33539] Failing at address: 0x3f800008303 [g03:33539] [ 0] /lib/x86_64-linux-gnu/libc.so.6(+0x32480) [0x2af004454480] [g03:33539] [ 1] /lib/x86_64-linux-gnu/libc.so.6(gsignal+0x35) [0x2af004454405] [g03:33539] [ 2] /lib/x86_64-linux-gnu/libc.so.6(+0x32480) [0x2af004454480] [g03:33539] [ 3] /opt/openfoam201/platforms/linux64GccDPOpt/lib/libbasicThermophysicalModels.so(_ZN4Foam10ePsiThermoINS_11pureMixtureINS_19sutherlandTransportINS_12specieThermoINS_12hConstThermoINS_10perfectGasEEEEEEEEEE9calculateEv+0x3f9) [0x2af001d771e9] [g03:33539] [ 4] 
/opt/openfoam201/platforms/linux64GccDPOpt/lib/libbasicThermophysicalModels.so(_ZN4Foam10ePsiThermoINS_11pureMixtureINS_19sutherlandTransportINS_12specieThermoINS_12hConstThermoINS_10perfectGasEEEEEEEEEE7correctEv+0x32) [0x2af001d7d262] [g03:33539] [ 5] rhoCentralFoam() [0x4236cb] [g03:33539] [ 6] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xfd) [0x2af004440ead] [g03:33539] [ 7] rhoCentralFoam() [0x41c709] [g03:33539] *** End of error message *** in "/lib/x86_64-linux-gnu/libc.so.6" [56] #7 [40] in "/opt/openfoam201/platforms/linux64GccDPOpt/bin/rhoCentralFoam" [40] #6 __libc_start_main[56] in "/opt/openfoam201/platforms/linux64GccDPOpt/bin/rhoCentralFoam" [g02:10983] *** Process received signal *** [g02:10983] Signal: Floating point exception (8) [g02:10983] Signal code: (-6) [g02:10983] Failing at address: 0x3f800002ae7 [g02:10983] [ 0] /lib/x86_64-linux-gnu/libc.so.6(+0x32480) [0x2b7f3585c480] [g02:10983] [ 1] /lib/x86_64-linux-gnu/libc.so.6(gsignal+0x35) [0x2b7f3585c405] [g02:10983] [ 2] /lib/x86_64-linux-gnu/libc.so.6(+0x32480) [0x2b7f3585c480] [g02:10983] [ 3] /opt/openfoam201/platforms/linux64GccDPOpt/lib/libbasicThermophysicalModels.so(_ZN4Foam10ePsiThermoINS_11pureMixtureINS_19sutherlandTransportINS_12specieThermoINS_12hConstThermoINS_10perfectGasEEEEEEEEEE9calculateEv+0x3f9) [0x2b7f3317f1e9] [g02:10983] [ 4] /opt/openfoam201/platforms/linux64GccDPOpt/lib/libbasicThermophysicalModels.so(_ZN4Foam10ePsiThermoINS_11pureMixtureINS_19sutherlandTransportINS_12specieThermoINS_12hConstThermoINS_10perfectGasEEEEEEEEEE7correctEv+0x32) [0x2b7f33185262] [g02:10983] [ 5] rhoCentralFoam() [0x4236cb] [g02:10983] [ 6] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xfd) [0x2b7f35848ead] [g02:10983] [ 7] rhoCentralFoam() [0x41c709] [g02:10983] *** End of error message *** in "/lib/x86_64-linux-gnu/libc.so.6" [40] #7 [40] in "/opt/openfoam201/platforms/linux64GccDPOpt/bin/rhoCentralFoam" [g02:10967] *** Process received signal *** [g02:10967] Signal: Floating 
point exception (8) [g02:10967] Signal code: (-6) [g02:10967] Failing at address: 0x3f800002ad7 [g02:10967] [ 0] /lib/x86_64-linux-gnu/libc.so.6(+0x32480) [0x2ae44b8f5480] [g02:10967] [ 1] /lib/x86_64-linux-gnu/libc.so.6(gsignal+0x35) [0x2ae44b8f5405] [g02:10967] [ 2] /lib/x86_64-linux-gnu/libc.so.6(+0x32480) [0x2ae44b8f5480] [g02:10967] [ 3] /opt/openfoam201/platforms/linux64GccDPOpt/lib/libbasicThermophysicalModels.so(_ZN4Foam10ePsiThermoINS_11pureMixtureINS_19sutherlandTransportINS_12specieThermoINS_12hConstThermoINS_10perfectGasEEEEEEEEEE9calculateEv+0x3f9) [0x2ae4492181e9] [g02:10967] [ 4] /opt/openfoam201/platforms/linux64GccDPOpt/lib/libbasicThermophysicalModels.so(_ZN4Foam10ePsiThermoINS_11pureMixtureINS_19sutherlandTransportINS_12specieThermoINS_12hConstThermoINS_10perfectGasEEEEEEEEEE7correctEv+0x32) [0x2ae44921e262] [g02:10967] [ 5] rhoCentralFoam() [0x4236cb] [g02:10967] [ 6] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xfd) [0x2ae44b8e1ead] [g02:10967] [ 7] rhoCentralFoam() [0x41c709] [g02:10967] *** End of error message *** [g03:33510] 3 more processes have sent help message help-mpi-runtime.txt / mpi_init:warn-fork [g03:33510] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages -------------------------------------------------------------------------- mpirun noticed that process rank 56 with PID 10983 on node g02 exited on signal 8 (Floating point exception). -------------------------------------------------------------------------- I also checked my mesh with the command checkMesh, the output was Code:
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time
Create polyMesh for time = 0
Time = 0
Mesh stats
points: 5440550
faces: 15827778
internal faces: 15337566
cells: 5194224
boundary patches: 6
point zones: 0
face zones: 0
cell zones: 0
Overall number of cells of each type:
hexahedra: 5194224
prisms: 0
wedges: 0
pyramids: 0
tet wedges: 0
tetrahedra: 0
polyhedra: 0
Checking topology...
Boundary definition OK.
Cell to face addressing OK.
Point usage OK.
Upper triangular ordering OK.
Face vertices OK.
Number of regions: 1 (OK).
Checking patch topology for multiply connected surfaces ...
Patch Faces Points Surface topology
entrada 6840 7150 ok (non-closed singly connected)
topo 14784 15425 ok (non-closed singly connected)
saida 6840 7150 ok (non-closed singly connected)
parede 28896 30125 ok (non-closed singly connected)
tras 216426 217622 ok (non-closed singly connected)
frente 216426 217622 ok (non-closed singly connected)
Checking geometry...
Overall domain bounding box (-0.05 0 -0.06) (0.2 0.22 0.06)
Mesh (non-empty, non-wedge) directions (1 1 1)
Mesh (non-empty) directions (1 1 1)
Boundary openness (-2.66602e-15 1.51299e-14 4.75939e-14) OK.
Max cell openness = 2.21102e-16 OK.
Max aspect ratio = 15.8197 OK.
Minumum face area = 1.00095e-07. Maximum face area = 2.88462e-06. Face area magnitudes OK.
Min volume = 5.00473e-10. Max volume = 1.2275e-09. Total volume = 0.00372. Cell volumes OK.
Mesh non-orthogonality Max: 0 average: 0
Non-orthogonality check OK.
Face pyramids OK.
Max skewness = 4.99527e-06 OK.
Mesh OK.
End
As far as I can see from the residuals and Courant number, all of them seem all right. In summary: this error happens when I run simulations across two computers. So is there a problem when parallel computing is performed with the processors of two machines? I mean, with the communication between them?
__________________
Se Urso Vires Foge Tocando Gaita Para Hamburgo (If you see a bear, flee to Hamburg playing the bagpipes)
#32
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 130
Greetings guilha,
To assess whether the problem is related to the communication between the two machines, try running with 16 cores on each machine, for 32 cores in total. This way you can isolate whether the problem is due to using too many cores, a bad decomposition, or the communication itself. In addition, there are a few other things that can affect this:
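As an aside, the 16-cores-per-machine test (32 total) corresponds to the decomposition settings in system/decomposeParDict; this is only an illustrative sketch, and the method choice is an assumption rather than something prescribed in this thread:
Code:
// system/decomposeParDict -- illustrative sketch for 32 subdomains
numberOfSubdomains  32;

method              scotch;   // needs no coefficients; alternatively use
                              // hierarchical or simple with explicit n (nx ny nz)

After editing it, re-run decomposePar -force and launch mpirun with the matching number of ranks.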
Bruno
__________________
|
#33
New Member
Join Date: Jan 2013
Location: Lisboa-Funchal
Posts: 23
Rep Power: 14
Hello Bruno and all other FOAMers, thanks for your help and patience.
I have not yet had time to test the communication between the machines. Now, following your list:
1 - Yes, I have cyclic patches; my case is almost two-dimensional, and LES;
2 - The "commsType" is set to nonBlocking. If I change it, do I have to recompile anything?
3 - I don't know; I talked to the administrator and neither of us knows, probably because I do not really understand what file sharing means here. However, it does not seem to be NFS, since she said we do not use it;
4 - The communication between the machines is Ethernet;
5 - It is OK, and the output is just below (checkMesh with more options).
Code:
/*---------------------------------------------------------------------------*\
| ========= | |
| \\ / F ield | OpenFOAM: The Open Source CFD Toolbox |
| \\ / O peration | Version: 2.0.1 |
| \\ / A nd | Web: www.OpenFOAM.com |
| \\/ M anipulation | |
\*---------------------------------------------------------------------------*/
Build : 2.0.1-51f1de99a4bc
Exec : checkMesh -allGeometry -allTopology
Date : Sep 30 2013
Time : 11:24:57
Host : g01
PID : 47387
Case : /home/guilha/cavidade_216kx24_les_smagorinsky_galego_euler
nProcs : 1
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time
Create polyMesh for time = 0
Time = 0
Mesh stats
points: 5440550
faces: 15827778
internal faces: 15337566
cells: 5194224
boundary patches: 6
point zones: 0
face zones: 0
cell zones: 0
Overall number of cells of each type:
hexahedra: 5194224
prisms: 0
wedges: 0
pyramids: 0
tet wedges: 0
tetrahedra: 0
polyhedra: 0
Checking topology...
Boundary definition OK.
Cell to face addressing OK.
Point usage OK.
Upper triangular ordering OK.
Face vertices OK.
Topological cell zip-up check OK.
Face-face connectivity OK.
Number of regions: 1 (OK).
Checking patch topology for multiply connected surfaces ...
Patch Faces Points Surface topology Bounding box
entrada 6840 7150 ok (non-closed singly connected) (-0.05 0.12 -0.06) (-0.05 0.22 0.06)
topo 14784 15425 ok (non-closed singly connected) (-0.05 0.22 -0.06) (0.2 0.22 0.06)
saida 6840 7150 ok (non-closed singly connected) (0.2 0.12 -0.06) (0.2 0.22 0.06)
parede 28896 30125 ok (non-closed singly connected) (-0.05 0 -0.06) (0.2 0.12 0.06)
tras 216426 217622 ok (non-closed singly connected) (-0.05 0 -0.06) (0.2 0.22 -0.06)
frente 216426 217622 ok (non-closed singly connected) (-0.05 0 0.06) (0.2 0.22 0.06)
Checking geometry...
Overall domain bounding box (-0.05 0 -0.06) (0.2 0.22 0.06)
Mesh (non-empty, non-wedge) directions (1 1 1)
Mesh (non-empty) directions (1 1 1)
Boundary openness (-2.66602e-15 1.51299e-14 4.75939e-14) OK.
Max cell openness = 2.21102e-16 OK.
Max aspect ratio = 15.8197 OK.
Minumum face area = 1.00095e-07. Maximum face area = 2.88462e-06. Face area magnitudes OK.
Min volume = 5.00473e-10. Max volume = 1.2275e-09. Total volume = 0.00372. Cell volumes OK.
Mesh non-orthogonality Max: 0 average: 0
Non-orthogonality check OK.
Face pyramids OK.
Max skewness = 4.99527e-06 OK.
Face tets OK.
Min/max edge length = 0.000316061 0.005 OK.
All angles in faces OK.
Face flatness (1 = flat, 0 = butterfly) : average = 1 min = 1
All face flatness OK.
Cell determinant (wellposedness) : minimum: 0.031003 average: 0.371808
Cell determinant check OK.
Concave cell check OK.
Mesh OK.
End
http://www.cfd-online.com/Forums/ope...ple-cores.html

I do not need to do the decomposition in parallel, but it gave me that alert; is the problem in my decomposition? I have been doing it by simply typing decomposePar -force. If I wish to decompose for more processors, do I have to change anything in the command? I ask because I thought I could simply write the number of processors for the decomposition in a script, although I am not using scripts to do the decomposition.

And to finish: of the two cases I was running, now on SINGLE MACHINES, one stopped with almost the same errors (almost, because I compared the messages and they have very few differences). Remember that the two cases are the same, but one has a more refined mesh, and the one that gave the error is the more refined case; the simulation time at which the error appeared is roughly the time a particle at speed U0 would take to travel the whole domain twice. The other case, which I am running to get more samples for statistics, has had no problems so far. So I am totally dizzied. Nothing seems unphysical, judging from the residuals and Courant number (and the other, coarser mesh gave great results). The new error message is this one:
Code:
Mean and max Courant Numbers = 0.0354678 0.0909748 Time = 0.00225237 diagonal: Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0 diagonal: Solving for rhoUx, Initial residual = 0, Final residual = 0, No Iterations 0 diagonal: Solving for rhoUy, Initial residual = 0, Final residual = 0, No Iterations 0 diagonal: Solving for rhoUz, Initial residual = 0, Final residual = 0, No Iterations 0 smoothSolver: Solving for Ux, Initial residual = 1.52504e-06, Final residual = 2.11393e-14, No Iterations 3 smoothSolver: Solving for Uy, Initial residual = 2.81461e-06, Final residual = 2.76122e-14, No Iterations 3 smoothSolver: Solving for Uz, Initial residual = 1.2395e-05, Final residual = 1.6969e-13, No Iterations 3 diagonal: Solving for rhoE, Initial residual = 0, Final residual = 0, No Iterations 0 smoothSolver: Solving for e, Initial residual = 5.03938e-06, Final residual = 4.94582e-13, No Iterations 3 ExecutionTime = 214443 s ClockTime = 216156 s Mean and max Courant Numbers = 0.0354677 0.0909748 Time = 0.00225242 diagonal: Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0 diagonal: Solving for rhoUx, Initial residual = 0, Final residual = 0, No Iterations 0 diagonal: Solving for rhoUy, Initial residual = 0, Final residual = 0, No Iterations 0 diagonal: Solving for rhoUz, Initial residual = 0, Final residual = 0, No Iterations 0 smoothSolver: Solving for Ux, Initial residual = 1.52931e-06, Final residual = 2.19558e-14, No Iterations 3 smoothSolver: Solving for Uy, Initial residual = 2.82524e-06, Final residual = 2.91866e-14, No Iterations 3 smoothSolver: Solving for Uz, Initial residual = 1.24457e-05, Final residual = 1.75215e-13, No Iterations 3 diagonal: Solving for rhoE, Initial residual = 0, Final residual = 0, No Iterations 0 [16] #0 Foam::error::printStack(Foam::Ostream&)[0] #0 Foam::error::printStack(Foam::Ostream&)-------------------------------------------------------------------------- An MPI process has executed an 
operation involving a call to the "fork()" system call to create a child process. Open MPI is currently operating in a condition that could result in memory corruption or other system errors; your MPI job may hang, crash, or produce silent data corruption. The use of fork() (or system() or other calls that create child processes) is strongly discouraged. The process that invoked fork was: Local host: g01 (PID 38756) MPI_COMM_WORLD rank: 16 If you are *absolutely sure* that your application will successfully and correctly survive a call to fork(), you may disable this warning by setting the mpi_warn_on_fork MCA parameter to 0. -------------------------------------------------------------------------- in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so" [16] #1 Foam::sigFpe::sigHandler(int) in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so" [0] #1 Foam::sigFpe::sigHandler(int) in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so" [16] #2 in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so" [0] #2 in "/lib/x86_64-linux-gnu/libc.so.6" [16] #3 Foam::ePsiThermo<Foam::pureMixture<Foam::sutherlandTransport<Foam::specieThermo<Foam::hConstThermo<Foam::perfectGas> > > > >::calculate() in "/lib/x86_64-linux-gnu/libc.so.6" [0] #3 Foam::ePsiThermo<Foam::pureMixture<Foam::sutherlandTransport<Foam::specieThermo<Foam::hConstThermo<Foam::perfectGas> > > > >::calculate() in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libbasicThermophysicalModels.so" [16] #4 Foam::ePsiThermo<Foam::pureMixture<Foam::sutherlandTransport<Foam::specieThermo<Foam::hConstThermo<Foam::perfectGas> > > > >::correct() in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libbasicThermophysicalModels.so" [0] #4 Foam::ePsiThermo<Foam::pureMixture<Foam::sutherlandTransport<Foam::specieThermo<Foam::hConstThermo<Foam::perfectGas> > > > >::correct() in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libbasicThermophysicalModels.so" [0] #5 in 
"/opt/openfoam201/platforms/linux64GccDPOpt/lib/libbasicThermophysicalModels.so" [16] #5 [16] in "/opt/openfoam201/platforms/linux64GccDPOpt/bin/rhoCentralFoam" [16] #6 __libc_start_main[0] in "/opt/openfoam201/platforms/linux64GccDPOpt/bin/rhoCentralFoam" [0] #6 __libc_start_main in "/lib/x86_64-linux-gnu/libc.so.6" [16] #7 in "/lib/x86_64-linux-gnu/libc.so.6" [0] #7 [0] in "/opt/openfoam201/platforms/linux64GccDPOpt/bin/rhoCentralFoam" [g01:38740] *** Process received signal *** [g01:38740] Signal: Floating point exception (8) [g01:38740] Signal code: (-6) [g01:38740] Failing at address: 0x3f800009754 [g01:38740] [ 0] /lib/x86_64-linux-gnu/libc.so.6(+0x32480) [0x2b4c82bf3480] [g01:38740] [ 1] /lib/x86_64-linux-gnu/libc.so.6(gsignal+0x35) [0x2b4c82bf3405] [g01:38740] [ 2] /lib/x86_64-linux-gnu/libc.so.6(+0x32480) [0x2b4c82bf3480] [g01:38740] [ 3] /opt/openfoam201/platforms/linux64GccDPOpt/lib/libbasicThermophysicalModels.so(_ZN4Foam10ePsiThermoINS_11pureMixtureINS_19sutherlandTransportINS_12specieThermoINS_12hConstThermoINS_10perfectGasEEEEEEEEEE9calculateEv+0x3f9) [0x2b4c805161e9] [g01:38740] [ 4] /opt/openfoam201/platforms/linux64GccDPOpt/lib/libbasicThermophysicalModels.so(_ZN4Foam10ePsiThermoINS_11pureMixtureINS_19sutherlandTransportINS_12specieThermoINS_12hConstThermoINS_10perfectGasEEEEEEEEEE7correctEv+0x32) [0x2b4c8051c262] [g01:38740] [ 5] rhoCentralFoam() [0x4236cb] [g01:38740] [ 6] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xfd) [0x2b4c82bdfead] [g01:38740] [ 7] rhoCentralFoam() [0x41c709] [g01:38740] *** End of error message *** [16] in "/opt/openfoam201/platforms/linux64GccDPOpt/bin/rhoCentralFoam" [g01:38756] *** Process received signal *** [g01:38756] Signal: Floating point exception (8) [g01:38756] Signal code: (-6) [g01:38756] Failing at address: 0x3f800009764 [g01:38756] [ 0] /lib/x86_64-linux-gnu/libc.so.6(+0x32480) [0x2add71707480] [g01:38756] [ 1] /lib/x86_64-linux-gnu/libc.so.6(gsignal+0x35) [0x2add71707405] [g01:38756] [ 2] 
/lib/x86_64-linux-gnu/libc.so.6(+0x32480) [0x2add71707480] [g01:38756] [ 3] /opt/openfoam201/platforms/linux64GccDPOpt/lib/libbasicThermophysicalModels.so(_ZN4Foam10ePsiThermoINS_11pureMixtureINS_19sutherlandTransportINS_12specieThermoINS_12hConstThermoINS_10perfectGasEEEEEEEEEE9calculateEv+0x3f9) [0x2add6f02a1e9] [g01:38756] [ 4] /opt/openfoam201/platforms/linux64GccDPOpt/lib/libbasicThermophysicalModels.so(_ZN4Foam10ePsiThermoINS_11pureMixtureINS_19sutherlandTransportINS_12specieThermoINS_12hConstThermoINS_10perfectGasEEEEEEEEEE7correctEv+0x32) [0x2add6f030262] [g01:38756] [ 5] rhoCentralFoam() [0x4236cb] [g01:38756] [ 6] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xfd) [0x2add716f3ead] [g01:38756] [ 7] rhoCentralFoam() [0x41c709] [g01:38756] *** End of error message *** [g01:38739] 1 more process has sent help message help-mpi-runtime.txt / mpi_init:warn-fork [g01:38739] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages -------------------------------------------------------------------------- mpirun noticed that process rank 16 with PID 38756 on node g01 exited on signal 8 (Floating point exception). --------------------------------------------------------------------------
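On the decomposition question above: the number of subdomains is read from system/decomposeParDict, not from the decomposePar command line, so decomposing for more processors means editing that dictionary before re-running decomposePar -force. A minimal sketch, with purely illustrative values (the method and count are not taken from this case):

```
// system/decomposeParDict (illustrative values only)
numberOfSubdomains 32;        // total number of processor directories to create

method          scotch;       // let scotch balance the cells automatically

// then re-decompose, overwriting the old processor* directories:
//   decomposePar -force
```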
__________________
Se Urso Vires Foge Tocando Gaita Para Hamburgo
#34
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 130
Hi guilha,
Either your case is 2D or it isn't. In OpenFOAM, "2D" means that the front and back patches are defined as "empty" and the mesh is only one cell thick in the Z direction. As for LES in 2D... I vaguely remember that it's not exactly a good idea, because turbulence is inherently 3D. But then again, I vaguely remember that OpenFOAM has one or two tutorials working with LES in 2D.
Since it's only two machines, I guess that's enough.
Code:
Foam::ePsiThermo<Foam::pureMixture<Foam::sutherlandTransport<Foam::specieThermo<Foam::hConstThermo<Foam::perfectGas> > > > >::calculate() The problem here is that there is no clear indication of which operation produced the division by zero. I would write frequent time snapshots near the crash time and then visually inspect where the fields reach unusually high or low values. Best regards, Bruno
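Bruno's suggestion of frequent snapshots near the crash can be set up in system/controlDict; a minimal sketch, with illustrative values only:

```
// system/controlDict (illustrative values)
startFrom       latestTime;   // restart from the last saved time before the crash
writeControl    timeStep;     // write by time-step count instead of run time
writeInterval   1;            // write every single step while hunting the blow-up
purgeWrite      50;           // keep only the last 50 snapshots to limit disk use
```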
#35
New Member
Join Date: Jan 2013
Location: Lisboa-Funchal
Posts: 23
Rep Power: 14
Hello Bruno, thank you for your analysis. I have been very busy lately.
I could not do the test between the machines, because I cannot use x processors of one machine and y processors of the other. However, I did run a test (the same case, only with a bigger time step) on a single processor, and it failed here: Code:
Mean and max Courant Numbers = 0.200668 0.571628 Time = 2.12e-05 diagonal: Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0 diagonal: Solving for rhoUx, Initial residual = 0, Final residual = 0, No Iterations 0 diagonal: Solving for rhoUy, Initial residual = 0, Final residual = 0, No Iterations 0 diagonal: Solving for rhoUz, Initial residual = 0, Final residual = 0, No Iterations 0 smoothSolver: Solving for Ux, Initial residual = 0.000425058, Final residual = 9.90843e-09, No Iterations 3 smoothSolver: Solving for Uy, Initial residual = 0.000855807, Final residual = 9.09303e-09, No Iterations 3 smoothSolver: Solving for Uz, Initial residual = 0.00843877, Final residual = 1.09127e-11, No Iterations 6 diagonal: Solving for rhoE, Initial residual = 0, Final residual = 0, No Iterations 0 smoothSolver: Solving for e, Initial residual = 0.00155423, Final residual = 9.73893e-12, No Iterations 6 ExecutionTime = 1511.9 s ClockTime = 1513 s Mean and max Courant Numbers = 0.200544 0.571627 Time = 2.16e-05 diagonal: Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0 diagonal: Solving for rhoUx, Initial residual = 0, Final residual = 0, No Iterations 0 diagonal: Solving for rhoUy, Initial residual = 0, Final residual = 0, No Iterations 0 diagonal: Solving for rhoUz, Initial residual = 0, Final residual = 0, No Iterations 0 smoothSolver: Solving for Ux, Initial residual = 0.000381028, Final residual = 9.74391e-09, No Iterations 3 smoothSolver: Solving for Uy, Initial residual = 0.000849587, Final residual = 8.09615e-09, No Iterations 3 smoothSolver: Solving for Uz, Initial residual = 0.00710211, Final residual = 8.182e-12, No Iterations 6 diagonal: Solving for rhoE, Initial residual = 0, Final residual = 0, No Iterations 0 smoothSolver: Solving for e, Initial residual = 0.00131698, Final residual = 1.03579e-11, No Iterations 6 ExecutionTime = 1547.3 s ClockTime = 1549 s Mean and max Courant Numbers = 0.200412 0.593619 Time = 
2.2e-05 diagonal: Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0 diagonal: Solving for rhoUx, Initial residual = 0, Final residual = 0, No Iterations 0 diagonal: Solving for rhoUy, Initial residual = 0, Final residual = 0, No Iterations 0 diagonal: Solving for rhoUz, Initial residual = 0, Final residual = 0, No Iterations 0 smoothSolver: Solving for Ux, Initial residual = 0.000346998, Final residual = 8.89842e-09, No Iterations 3 smoothSolver: Solving for Uy, Initial residual = 0.000814717, Final residual = 7.78361e-09, No Iterations 3 smoothSolver: Solving for Uz, Initial residual = 0.00670942, Final residual = 6.52104e-12, No Iterations 6 diagonal: Solving for rhoE, Initial residual = 0, Final residual = 0, No Iterations 0 smoothSolver: Solving for e, Initial residual = 0.00145455, Final residual = 9.90931e-08, No Iterations 3 ExecutionTime = 1567.72 s ClockTime = 1569 s Mean and max Courant Numbers = 0.200281 0.711964 Time = 2.24e-05 diagonal: Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0 diagonal: Solving for rhoUx, Initial residual = 0, Final residual = 0, No Iterations 0 diagonal: Solving for rhoUy, Initial residual = 0, Final residual = 0, No Iterations 0 diagonal: Solving for rhoUz, Initial residual = 0, Final residual = 0, No Iterations 0 smoothSolver: Solving for Ux, Initial residual = 0.000390677, Final residual = 8.75177e-09, No Iterations 3 smoothSolver: Solving for Uy, Initial residual = 0.000795014, Final residual = 7.67302e-09, No Iterations 3 smoothSolver: Solving for Uz, Initial residual = 0.0075196, Final residual = 8.23954e-12, No Iterations 6 diagonal: Solving for rhoE, Initial residual = 0, Final residual = 0, No Iterations 0 #0 Foam::error::printStack(Foam::Ostream&) in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so" #1 Foam::sigFpe::sigHandler(int) in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so" #2 in "/lib/libc.so.6" #3 
Foam::ePsiThermo<Foam::pureMixture<Foam::sutherlandTransport<Foam::specieThermo<Foam::hConstThermo<Foam::perfectGas> > > > >::calculate() in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libbasicThermophysicalModels.so" #4 Foam::ePsiThermo<Foam::pureMixture<Foam::sutherlandTransport<Foam::specieThermo<Foam::hConstThermo<Foam::perfectGas> > > > >::correct() in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libbasicThermophysicalModels.so" #5 in "/opt/openfoam201/platforms/linux64GccDPOpt/bin/rhoCentralFoam" #6 __libc_start_main in "/lib/libc.so.6" #7 in "/opt/openfoam201/platforms/linux64GccDPOpt/bin/rhoCentralFoam" Floating point exception About my simulations, they are 3D LES, of course. By "almost 2D" I meant the type of flow, which is quasi-2D. For the cases with a time step smaller than the one I posted above, running in parallel, the errors occur at random times, and I cannot keep all the results stored since that would use a lot of memory. But for the last test I did (relatively big time step, on a single processor), the error is not random: I ran the case twice and confirmed it. From the post-processing, my velocity grows at a sharp corner, and this is why the Courant number increases, but I think it is compatible with the perfect-fluid solution. In this simulation, I think it becomes unstable because of the Courant number increase; that is, with a smaller time step it might stay within the stability limit. I also looked at the function you told me about, ePsiThermo, where there is an alpha. For certain boundary conditions (alphaSGS, muSGS and muTilda), I used standard values (as I could not find any values for these variables in the literature) which I saw in an OpenFOAM tutorial, and they are essentially 0. Regarding the cyclic patch, in this link http://www.cfd-online.com/Forums/ope...tml#post241413, I think I have this in the computer, Code:
//- Keep owner and neighbour on same processor for faces in patches: // (makes sense only for cyclic patches) //preservePatches (cyclic_half0 cyclic_half1);
__________________
Se Urso Vires Foge Tocando Gaita Para Hamburgo
#36
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 130
Hi guilha,
Higher values should only be used if you know what you are doing.
Code:
preservePatches (
up_patch
downPatch
patch_left
thenRight
);
Bruno
#37
New Member
Join Date: Jan 2013
Location: Lisboa-Funchal
Posts: 23
Rep Power: 14
Good evening,
I am back in this thread because recently I have had a weird error. When I run my case on 16 or 24 processors (the cases tested), no problems appear; however, with more processors, like 30, 32 or 64 (the cases tested), this error appears: Code:
/*---------------------------------------------------------------------------*\
| ========= | |
| \\ / F ield | OpenFOAM: The Open Source CFD Toolbox |
| \\ / O peration | Version: 2.0.1 |
| \\ / A nd | Web: www.OpenFOAM.com |
| \\/ M anipulation | |
\*---------------------------------------------------------------------------*/
Build : 2.0.1-51f1de99a4bc
Exec : rhoCentralFoam -parallel
Date : Feb 15 2014
Time : 21:23:14
Host : g01
PID : 35702
Case : /home/guilha/cavidade_LES_130kx24_smagorinsky_v1_perfil_power_law_v4
nProcs : 32
Slaves :
31
(
g01.35703
g01.35704
g01.35705
g01.35706
g01.35707
g01.35708
g01.35709
g01.35710
g01.35711
g01.35712
g01.35713
g01.35714
g01.35715
g01.35716
g01.35717
g01.35718
g01.35719
g01.35720
g01.35721
g01.35722
g01.35723
g01.35724
g01.35725
g01.35726
g01.35727
g01.35728
g01.35729
g01.35730
g01.35731
g01.35732
g01.35733
)
Pstream initialized with:
floatTransfer : 0
nProcsSimpleSum : 0
commsType : nonBlocking
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time
Create mesh for time = 0
[g01:35706] *** Process received signal ***
[g01:35706] Signal: Segmentation fault (11)
[g01:35706] Signal code: Address not mapped (1)
[g01:35706] Failing at address: 0xfffffffe03990ad8
[g01:35706] [ 0] /lib/x86_64-linux-gnu/libc.so.6(+0x32480) [0x2ad84978a480]
[g01:35706] [ 1] /lib/x86_64-linux-gnu/libc.so.6(+0x728fa) [0x2ad8497ca8fa]
[g01:35706] [ 2] /lib/x86_64-linux-gnu/libc.so.6(+0x74d64) [0x2ad8497ccd64]
[g01:35706] [ 3] /lib/x86_64-linux-gnu/libc.so.6(__libc_malloc+0x70) [0x2ad8497cf420]
[g01:35706] [ 4] /usr/lib/x86_64-linux-gnu/libstdc++.so.6(_Znwm+0x1d) [0x2ad84907268d]
[g01:35706] [ 5] /usr/lib/x86_64-linux-gnu/libstdc++.so.6(_Znam+0x9) [0x2ad8490727a9]
[g01:35706] [ 6] /opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam5error10printStackERNS_7OstreamE+0x128b) [0x2ad848ad81db]
[g01:35706] [ 7] /opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam7sigSegv10sigHandlerEi+0x30) [0x2ad848acaec0]
[g01:35706] [ 8] /lib/x86_64-linux-gnu/libc.so.6(+0x32480) [0x2ad84978a480]
[g01:35706] [ 9] /opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam18processorPolyPatch10updateMeshERNS_14PstreamBuffersE+0x2da) [0x2ad84894af3a]
[g01:35706] [10] /opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam16polyBoundaryMesh10updateMeshEv+0x2b1) [0x2ad848951fc1]
[g01:35706] [11] /opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam8polyMeshC2ERKNS_8IOobjectE+0x10ea) [0x2ad8489a316a]
[g01:35706] [12] /opt/openfoam201/platforms/linux64GccDPOpt/lib/libfiniteVolume.so(_ZN4Foam6fvMeshC1ERKNS_8IOobjectE+0x19) [0x2ad8462d25f9]
[g01:35706] [13] rhoCentralFoam() [0x41f624]
[g01:35706] [14] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xfd) [0x2ad849776ead]
[g01:35706] [15] rhoCentralFoam() [0x41c709]
[5] #0 Foam::error::printStack(Foam::Ostream&)[g01:35706] *** End of error message ***
--------------------------------------------------------------------------
An MPI process has executed an operation involving a call to the
"fork()" system call to create a child process. Open MPI is currently
operating in a condition that could result in memory corruption or
other system errors; your MPI job may hang, crash, or produce silent
data corruption. The use of fork() (or system() or other calls that
create child processes) is strongly discouraged.
The process that invoked fork was:
Local host: g01 (PID 35707)
MPI_COMM_WORLD rank: 5
If you are *absolutely sure* that your application will successfully
and correctly survive a call to fork(), you may disable this warning
by setting the mpi_warn_on_fork MCA parameter to 0.
--------------------------------------------------------------------------
in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[5] #1 Foam::sigSegv::sigHandler(int) in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[5] #2 in "/lib/x86_64-linux-gnu/libc.so.6"
[5] #3 Foam::processorPolyPatch::updateMesh(Foam::PstreamBuffers&) in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[5] #4 Foam::polyBoundaryMesh::updateMesh()--------------------------------------------------------------------------
mpirun noticed that process rank 4 with PID 35706 on node g01 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------
__________________
Se Urso Vires Foge Tocando Gaita Para Hamburgo
#38
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 130
Hi guilha,
It could be a problem in the installation of OpenFOAM on one of the machines. Try running checkMesh in parallel, the same way you run rhoCentralFoam. I also had a few questions, but I don't remember the details.
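Running checkMesh "the same way" as the solver can be sketched as follows; the core count and host file name here are placeholders, not taken from this thread:

```
# decompose first, then launch checkMesh with the same mpirun invocation
# that is used for rhoCentralFoam (illustrative values)
decomposePar -force
mpirun -np 32 -hostfile machines checkMesh -parallel
```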
Bruno
#39
New Member
Join Date: Jan 2013
Location: Lisboa-Funchal
Posts: 23
Rep Power: 14
Bruno, thanks a lot for your replies and all the support.
Yes, running checkMesh in parallel gives an error: Code:
/*---------------------------------------------------------------------------*\
| ========= | |
| \\ / F ield | OpenFOAM: The Open Source CFD Toolbox |
| \\ / O peration | Version: 2.0.1 |
| \\ / A nd | Web: www.OpenFOAM.com |
| \\/ M anipulation | |
\*---------------------------------------------------------------------------*/
Build : 2.0.1-51f1de99a4bc
Exec : checkMesh -parallel
Date : Feb 15 2014
Time : 22:29:05
Host : g04
PID : 3313
Case : /home/guilha/testes
nProcs : 32
Slaves :
31
(
g04.3314
g04.3315
g04.3316
g04.3317
g04.3318
g04.3319
g04.3320
g04.3321
g04.3322
g04.3323
g04.3324
g04.3325
g04.3326
g04.3327
g04.3328
g04.3329
g04.3330
g04.3331
g04.3332
g04.3333
g04.3334
g04.3335
g04.3336
g04.3337
g04.3338
g04.3339
g04.3340
g04.3341
g04.3342
g04.3343
g04.3344
)
Pstream initialized with:
floatTransfer : 0
nProcsSimpleSum : 0
commsType : nonBlocking
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
[0]
[0]
[0] --> FOAM FATAL ERROR:
[0] checkMesh: cannot open case directory "/home/guilha/testes/processor0"
[0]
[0]
FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 3313 on
node g04 exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
I have cyclic boundary conditions. As for decomposePar, I think it works perfectly fine; this is its output for the 32 processors: Code:
Processor 0
Number of cells = 97416
Number of faces shared with processor 1 = 2736
Number of faces shared with processor 2 = 1243
Number of faces shared with processor 3 = 1205
Number of faces shared with processor 30 = 720
Number of processor patches = 4
Number of processor faces = 5904
Number of boundary faces = 9750
Processor 1
Number of cells = 98736
Number of faces shared with processor 0 = 2736
Number of faces shared with processor 12 = 1632
Number of faces shared with processor 13 = 864
Number of faces shared with processor 14 = 768
Number of faces shared with processor 30 = 1560
Number of processor patches = 5
Number of processor faces = 7560
Number of boundary faces = 8444
Processor 2
Number of cells = 97046
Number of faces shared with processor 0 = 1243
Number of faces shared with processor 3 = 2014
Number of faces shared with processor 3 = 3
Number of faces shared with processor 7 = 1272
Number of processor patches = 4
Number of processor faces = 4532
Number of boundary faces = 10102
Processor 3
Number of cells = 97426
Number of faces shared with processor 0 = 1205
Number of faces shared with processor 2 = 2014
Number of faces shared with processor 2 = 3
Number of faces shared with processor 6 = 1289
Number of faces shared with processor 7 = 103
Number of processor patches = 5
Number of processor faces = 4614
Number of boundary faces = 9914
Processor 4
Number of cells = 97976
Number of faces shared with processor 5 = 2142
Number of faces shared with processor 5 = 3
Number of faces shared with processor 5 = 1
Number of faces shared with processor 6 = 1222
Number of processor patches = 4
Number of processor faces = 3368
Number of boundary faces = 11280
Processor 5
Number of cells = 97168
Number of faces shared with processor 4 = 2142
Number of faces shared with processor 4 = 1
Number of faces shared with processor 4 = 3
Number of faces shared with processor 6 = 74
Number of faces shared with processor 7 = 1224
Number of processor patches = 5
Number of processor faces = 3444
Number of boundary faces = 11238
Processor 6
Number of cells = 98412
Number of faces shared with processor 3 = 1289
Number of faces shared with processor 4 = 1222
Number of faces shared with processor 5 = 74
Number of faces shared with processor 7 = 2178
Number of faces shared with processor 7 = 3
Number of processor patches = 5
Number of processor faces = 4766
Number of boundary faces = 10228
Processor 7
Number of cells = 97044
Number of faces shared with processor 2 = 1272
Number of faces shared with processor 3 = 103
Number of faces shared with processor 5 = 1224
Number of faces shared with processor 6 = 2178
Number of faces shared with processor 6 = 3
Number of processor patches = 5
Number of processor faces = 4780
Number of boundary faces = 9894
Processor 8
Number of cells = 99552
Number of faces shared with processor 9 = 2040
Number of faces shared with processor 10 = 216
Number of faces shared with processor 11 = 1392
Number of faces shared with processor 24 = 1392
Number of processor patches = 4
Number of processor faces = 5040
Number of boundary faces = 9880
Processor 9
Number of cells = 98677
Number of faces shared with processor 8 = 2040
Number of faces shared with processor 10 = 1152
Number of faces shared with processor 12 = 1480
Number of faces shared with processor 13 = 750
Number of faces shared with processor 24 = 1008
Number of faces shared with processor 29 = 600
Number of processor patches = 6
Number of processor faces = 7030
Number of boundary faces = 8224
Processor 10
Number of cells = 98044
Number of faces shared with processor 8 = 216
Number of faces shared with processor 9 = 1152
Number of faces shared with processor 11 = 1728
Number of faces shared with processor 13 = 528
Number of faces shared with processor 15 = 1260
Number of processor patches = 5
Number of processor faces = 4884
Number of boundary faces = 9534
Processor 11
Number of cells = 98040
Number of faces shared with processor 8 = 1392
Number of faces shared with processor 10 = 1728
Number of processor patches = 2
Number of processor faces = 3120
Number of boundary faces = 11242
Processor 12
Number of cells = 99813
Number of faces shared with processor 1 = 1632
Number of faces shared with processor 9 = 1480
Number of faces shared with processor 13 = 2268
Number of faces shared with processor 29 = 1224
Number of faces shared with processor 30 = 576
Number of processor patches = 5
Number of processor faces = 7180
Number of boundary faces = 8322
Processor 13
Number of cells = 100166
Number of faces shared with processor 1 = 864
Number of faces shared with processor 9 = 750
Number of faces shared with processor 10 = 528
Number of faces shared with processor 12 = 2268
Number of faces shared with processor 14 = 1512
Number of faces shared with processor 15 = 1968
Number of processor patches = 6
Number of processor faces = 7890
Number of boundary faces = 8342
Processor 14
Number of cells = 100368
Number of faces shared with processor 1 = 768
Number of faces shared with processor 13 = 1512
Number of faces shared with processor 15 = 1944
Number of processor patches = 3
Number of processor faces = 4224
Number of boundary faces = 11580
Processor 15
Number of cells = 100364
Number of faces shared with processor 10 = 1260
Number of faces shared with processor 13 = 1968
Number of faces shared with processor 14 = 1944
Number of processor patches = 3
Number of processor faces = 5172
Number of boundary faces = 10312
Processor 16
Number of cells = 96459
Number of faces shared with processor 17 = 2438
Number of faces shared with processor 17 = 2
Number of faces shared with processor 18 = 1392
Number of faces shared with processor 19 = 96
Number of processor patches = 4
Number of processor faces = 3928
Number of boundary faces = 11084
Processor 17
Number of cells = 97625
Number of faces shared with processor 16 = 2438
Number of faces shared with processor 16 = 2
Number of faces shared with processor 19 = 1248
Number of faces shared with processor 22 = 336
Number of faces shared with processor 23 = 1707
Number of faces shared with processor 23 = 2
Number of processor patches = 6
Number of processor faces = 5733
Number of boundary faces = 9469
Processor 18
Number of cells = 97056
Number of faces shared with processor 16 = 1392
Number of faces shared with processor 19 = 1944
Number of faces shared with processor 31 = 1416
Number of processor patches = 3
Number of processor faces = 4752
Number of boundary faces = 9816
Processor 19
Number of cells = 97107
Number of faces shared with processor 16 = 96
Number of faces shared with processor 17 = 1248
Number of faces shared with processor 18 = 1944
Number of faces shared with processor 22 = 1986
Number of faces shared with processor 28 = 1512
Number of faces shared with processor 31 = 528
Number of processor patches = 6
Number of processor faces = 7314
Number of boundary faces = 8090
Processor 20
Number of cells = 95424
Number of faces shared with processor 21 = 1824
Number of faces shared with processor 23 = 1320
Number of processor patches = 2
Number of processor faces = 3144
Number of boundary faces = 11048
Processor 21
Number of cells = 96816
Number of faces shared with processor 20 = 1824
Number of faces shared with processor 22 = 1560
Number of faces shared with processor 23 = 192
Number of faces shared with processor 26 = 96
Number of faces shared with processor 27 = 1512
Number of processor patches = 5
Number of processor faces = 5184
Number of boundary faces = 9412
Processor 22
Number of cells = 96477
Number of faces shared with processor 17 = 336
Number of faces shared with processor 19 = 1986
Number of faces shared with processor 21 = 1560
Number of faces shared with processor 23 = 1584
Number of faces shared with processor 26 = 1848
Number of faces shared with processor 28 = 48
Number of processor patches = 6
Number of processor faces = 7362
Number of boundary faces = 8042
Processor 23
Number of cells = 96844
Number of faces shared with processor 17 = 1707
Number of faces shared with processor 17 = 2
Number of faces shared with processor 20 = 1320
Number of faces shared with processor 21 = 192
Number of faces shared with processor 22 = 1584
Number of processor patches = 5
Number of processor faces = 4805
Number of boundary faces = 9659
Processor 24
Number of cells = 98250
Number of faces shared with processor 8 = 1392
Number of faces shared with processor 9 = 1008
Number of faces shared with processor 25 = 2225
Number of faces shared with processor 29 = 1176
Number of processor patches = 4
Number of processor faces = 5801
Number of boundary faces = 9339
Processor 25
Number of cells = 96462
Number of faces shared with processor 24 = 2225
Number of faces shared with processor 26 = 1224
Number of faces shared with processor 27 = 1176
Number of faces shared with processor 28 = 216
Number of faces shared with processor 29 = 744
Number of processor patches = 5
Number of processor faces = 5585
Number of boundary faces = 9407
Processor 26
Number of cells = 97678
Number of faces shared with processor 21 = 96
Number of faces shared with processor 22 = 1848
Number of faces shared with processor 25 = 1224
Number of faces shared with processor 27 = 2250
Number of faces shared with processor 28 = 2232
Number of processor patches = 5
Number of processor faces = 7650
Number of boundary faces = 8150
Processor 27
Number of cells = 98402
Number of faces shared with processor 21 = 1512
Number of faces shared with processor 25 = 1176
Number of faces shared with processor 26 = 2250
Number of processor patches = 3
Number of processor faces = 4938
Number of boundary faces = 10182
Processor 28
Number of cells = 99528
Number of faces shared with processor 19 = 1512
Number of faces shared with processor 22 = 48
Number of faces shared with processor 25 = 216
Number of faces shared with processor 26 = 2232
Number of faces shared with processor 29 = 1656
Number of faces shared with processor 30 = 192
Number of faces shared with processor 31 = 1536
Number of processor patches = 7
Number of processor faces = 7392
Number of boundary faces = 8294
Processor 29
Number of cells = 98184
Number of faces shared with processor 9 = 600
Number of faces shared with processor 12 = 1224
Number of faces shared with processor 24 = 1176
Number of faces shared with processor 25 = 744
Number of faces shared with processor 28 = 1656
Number of faces shared with processor 30 = 1320
Number of processor patches = 6
Number of processor faces = 6720
Number of boundary faces = 8182
Processor 30
Number of cells = 98856
Number of faces shared with processor 0 = 720
Number of faces shared with processor 1 = 1560
Number of faces shared with processor 12 = 576
Number of faces shared with processor 28 = 192
Number of faces shared with processor 29 = 1320
Number of faces shared with processor 31 = 2136
Number of processor patches = 6
Number of processor faces = 6504
Number of boundary faces = 8838
Processor 31
Number of cells = 98856
Number of faces shared with processor 18 = 1416
Number of faces shared with processor 19 = 528
Number of faces shared with processor 28 = 1536
Number of faces shared with processor 30 = 2136
Number of processor patches = 4
Number of processor faces = 5616
Number of boundary faces = 9486
Number of processor faces = 87968
Max number of cells = 100368 (2.407444% above average 98008.5)
Max number of processor patches = 7 (51.35135% above average 4.625)
Max number of faces between processors = 7890 (43.50673% above average 5498)
Processor 0: field transfer
Processor 1: field transfer
Processor 2: field transfer
Processor 3: field transfer
Processor 4: field transfer
Processor 5: field transfer
Processor 6: field transfer
Processor 7: field transfer
Processor 8: field transfer
Processor 9: field transfer
Processor 10: field transfer
Processor 11: field transfer
Processor 12: field transfer
Processor 13: field transfer
Processor 14: field transfer
Processor 15: field transfer
Processor 16: field transfer
Processor 17: field transfer
Processor 18: field transfer
Processor 19: field transfer
Processor 20: field transfer
Processor 21: field transfer
Processor 22: field transfer
Processor 23: field transfer
Processor 24: field transfer
Processor 25: field transfer
Processor 26: field transfer
Processor 27: field transfer
Processor 28: field transfer
Processor 29: field transfer
Processor 30: field transfer
Processor 31: field transfer
End.
__________________
Se Urso Vires Foge Tocando Gaita Para Hamburgo
#40
New Member
Join Date: Jan 2013
Location: Lisboa-Funchal
Posts: 23
Rep Power: 14
In the previous post I showed the checkMesh output before decomposition; after decomposing the mesh, running checkMesh -parallel aborts with the following error:
Code:
/*---------------------------------------------------------------------------*\
| ========= | |
| \\ / F ield | OpenFOAM: The Open Source CFD Toolbox |
| \\ / O peration | Version: 2.0.1 |
| \\ / A nd | Web: www.OpenFOAM.com |
| \\/ M anipulation | |
\*---------------------------------------------------------------------------*/
Build : 2.0.1-51f1de99a4bc
Exec : checkMesh -parallel
Date : Feb 15 2014
Time : 22:38:21
Host : g04
PID : 3429
Case : /home/guilha/testes
nProcs : 32
Slaves :
31
(
g04.3430
g04.3431
g04.3432
g04.3433
g04.3434
g04.3435
g04.3436
g04.3437
g04.3438
g04.3439
g04.3440
g04.3441
g04.3442
g04.3443
g04.3444
g04.3445
g04.3446
g04.3447
g04.3448
g04.3449
g04.3450
g04.3451
g04.3452
g04.3453
g04.3454
g04.3455
g04.3456
g04.3457
g04.3458
g04.3459
g04.3460
)
Pstream initialized with:
floatTransfer : 0
nProcsSimpleSum : 0
commsType : nonBlocking
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time
Create polyMesh for time = 0
[g04:03433] *** Process received signal ***
[g04:03433] Signal: Segmentation fault (11)
[g04:03433] Signal code: Address not mapped (1)
[g04:03433] Failing at address: 0xfffffffe0351be78
[g04:03433] [ 0] /lib/x86_64-linux-gnu/libc.so.6(+0x32480) [0x2b6814c14480]
[g04:03433] [ 1] /lib/x86_64-linux-gnu/libc.so.6(+0x728fa) [0x2b6814c548fa]
[g04:03433] [ 2] /lib/x86_64-linux-gnu/libc.so.6(+0x74d64) [0x2b6814c56d64]
[g04:03433] [ 3] /lib/x86_64-linux-gnu/libc.so.6(__libc_malloc+0x70) [0x2b6814c59420]
[g04:03433] [ 4] /usr/lib/x86_64-linux-gnu/libstdc++.so.6(_Znwm+0x1d) [0x2b68144fc68d]
[g04:03433] [ 5] /usr/lib/x86_64-linux-gnu/libstdc++.so.6(_Znam+0x9) [0x2b68144fc7a9]
[g04:03433] [ 6] /opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam5error10printStackERNS_7OstreamE+0x128b) [0x2b6813f631db]
[g04:03433] [ 7] /opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam7sigSegv10sigHandlerEi+0x30) [0x2b6813f55ec0]
[g04:03433] [ 8] /lib/x86_64-linux-gnu/libc.so.6(+0x32480) [0x2b6814c14480]
[g04:03433] [ 9] /opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam18processorPolyPatch10updateMeshERNS_14PstreamBuffersE+0x2da) [0x2b6813dd5f3a]
[g04:03433] [10] /opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam16polyBoundaryMesh10updateMeshEv+0x2b1) [0x2b6813ddcfc1]
[g04:03433] [11] /opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam8polyMeshC1ERKNS_8IOobjectE+0xd0b) [0x2b6813e2b8bb]
[5] #0 Foam::error::printStack(Foam::Ostream&)[g04:03433] [12] checkMesh() [0x41b1d4]
[g04:03433] [13] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xfd) [0x2b6814c00ead]
[g04:03433] [14] checkMesh() [0x407f79]
[g04:03433] *** End of error message ***
--------------------------------------------------------------------------
An MPI process has executed an operation involving a call to the
"fork()" system call to create a child process. Open MPI is currently
operating in a condition that could result in memory corruption or
other system errors; your MPI job may hang, crash, or produce silent
data corruption. The use of fork() (or system() or other calls that
create child processes) is strongly discouraged.
The process that invoked fork was:
Local host: g04 (PID 3434)
MPI_COMM_WORLD rank: 5
If you are *absolutely sure* that your application will successfully
and correctly survive a call to fork(), you may disable this warning
by setting the mpi_warn_on_fork MCA parameter to 0.
--------------------------------------------------------------------------
in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[5] #1 Foam::sigSegv::sigHandler(int) in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[5] #2 in "/lib/x86_64-linux-gnu/libc.so.6"
[5] #3 Foam::processorPolyPatch::updateMesh(Foam::PstreamBuffers&) in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[5] #4 Foam::polyBoundaryMesh::updateMesh() in "/opt/openfoam201/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[5] #5 Foam::polyMesh::polyMesh(Foam::IOobject const&)--------------------------------------------------------------------------
mpirun noticed that process rank 4 with PID 3433 on node g04 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------
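For context, this is a sketch of the command sequence that leads to the crash (it assumes a sourced OpenFOAM 2.0.1 environment and a system/decomposeParDict set to 32 subdomains, consistent with the logs above; it is not runnable outside such an environment):

```shell
# Sketch of the workflow shown in the logs above; requires an OpenFOAM
# environment, so the binaries and core count here are assumptions.
decomposePar                        # decomposition itself completes (see earlier output)
mpirun -np 32 checkMesh -parallel   # parallel mesh check; this is the step that segfaults
```

The serial `checkMesh` (previous post) passed, so the segfault in Foam::processorPolyPatch::updateMesh points at the decomposed processor patches rather than the underlying mesh itself.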