Home > Forums > OpenFOAM Running, Solving & CFD

FATAL ERROR:Maximum number of iterations exceeded


Post #1 | December 8, 2013, 09:13
zqlhzx (赵庆良), Member; Join Date: Aug 2013; Posts: 56
Hi Foamers,
I have a problem. At first I ran my case (CH4 combustion) and there was no problem: the temperature rose and the iterations proceeded normally. But once the calculated temperature rose above 1365 K, it gave me the following fatal error:
Quote:
acer@ubuntu:~/OpenFOAM/acer-2.2.2/run/singlefirepoolorg$ [0]
[0]
[0] --> FOAM FATAL ERROR:
[0] Maximum number of iterations exceeded
[0]
[0] From function thermo<Thermo, Type>::T(scalar f, scalar T0, scalar (thermo<Thermo, Type>::*F)(const scalar) const, scalar (thermo<Thermo, Type>::*dFdT)(const scalar) const, scalar (thermo<Thermo, Type>::*limit)(const scalar) const) const
[0] in file /home/opencfd/OpenFOAM/OpenFOAM-2.2.2/src/thermophysicalModels/specie/lnInclude/thermoI.H at line 76.
[0]
FOAM parallel run aborting
[0]
[0] #0 Foam::error::printStack(Foam::Ostream&) in "/opt/openfoam222/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[0] #1 Foam::error::abort() in "/opt/openfoam222/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[0] #2 Foam::species::thermo<Foam::janafThermo<Foam::perfectGas<Foam::specie> >, Foam::absoluteEnthalpy>::T(double, double, double, double (Foam::species::thermo<Foam::janafThermo<Foam::perfectGas<Foam::specie> >, Foam::absoluteEnthalpy>::*)(double, double) const, double (Foam::species::thermo<Foam::janafThermo<Foam::perfectGas<Foam::specie> >, Foam::absoluteEnthalpy>::*)(double, double) const, double (Foam::species::thermo<Foam::janafThermo<Foam::perfectGas<Foam::specie> >, Foam::absoluteEnthalpy>::*)(double) const) const in "/opt/openfoam222/platforms/linux64GccDPOpt/lib/libreactionThermophysicalModels.so"
[0] #3 Foam::hePsiThermo<Foam::psiReactionThermo, Foam::SpecieMixture<Foam::reactingMixture<Foam::sutherlandTransport<Foam::species::thermo<Foam::janafThermo<Foam::perfectGas<Foam::specie> >, Foam::absoluteEnthalpy> > > > >::calculate() in "/home/acer/OpenFOAM/acer-2.2.2/platforms/linux64GccDPOpt/lib/libmyThermo.so"
[0] #4 Foam::hePsiThermo<Foam::psiReactionThermo, Foam::SpecieMixture<Foam::reactingMixture<Foam::sutherlandTransport<Foam::species::thermo<Foam::janafThermo<Foam::perfectGas<Foam::specie> >, Foam::absoluteEnthalpy> > > > >::correct() in "/home/acer/OpenFOAM/acer-2.2.2/platforms/linux64GccDPOpt/lib/libmyThermo.so"
[0] #5
[0] in "/home/acer/OpenFOAM/acer-2.2.2/platforms/linux64GccDPOpt/bin/buoyantFireFoamorg"
[0] #6 __libc_start_main in "/lib/x86_64-linux-gnu/libc.so.6"
[0] #7
[0] in "/home/acer/OpenFOAM/acer-2.2.2/platforms/linux64GccDPOpt/bin/buoyantFireFoamorg"
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
*** glibc detected *** buoyantFireFoamorg: corrupted double-linked list: 0x0000000000f984a0 ***
======= Backtrace: =========
/lib/x86_64-linux-gnu/libc.so.6(+0x7eb96)[0x7f820167cb96]
/lib/x86_64-linux-gnu/libc.so.6(+0x7ef00)[0x7f820167cf00]
/lib/x86_64-linux-gnu/libc.so.6(+0x7fb29)[0x7f820167db29]
/lib/x86_64-linux-gnu/libc.so.6(+0x3b968)[0x7f8201639968]
/lib/x86_64-linux-gnu/libc.so.6(+0x3b985)[0x7f8201639985]
/usr/lib/libopen-rte.so.0(orte_ess_base_app_abort+0x20)[0x7f81fef5e200]
/usr/lib/libopen-rte.so.0(orte_errmgr_base_error_abort+0xfd)[0x7f81fef5d73d]
/usr/lib/libmpi.so.0(ompi_mpi_abort+0x255)[0x7f81ff1bbec5]
/opt/openfoam222/platforms/linux64GccDPOpt/lib/libOpenFOAM.so(_ZN4Foam5error5abortEv+0x1c6)[0x7f82025c9cd6]
/opt/openfoam222/platforms/linux64GccDPOpt/lib/libreactionThermophysicalModels.so(_ZNK4Foam7species6thermoINS_11janafThermoINS_10perfectGasINS_6specieEEEEENS_16absoluteEnthalpyEE1TEdddMS8_KFdddESA_MS8_KFddE+0x19c)[0x7f82055aca5c]
/home/acer/OpenFOAM/acer-2.2.2/platforms/linux64GccDPOpt/lib/libmyThermo.so(_ZN4Foam11hePsiThermoINS_17psiReactionThermoENS_13SpecieMixtureINS_15reactingMixtureINS_19sutherlandTransportINS_7species6thermoINS_11janafThermoINS_10perfectGasINS_6specieEEEEENS_16absoluteEnthalpyEEEEEEEEEE9calculateEv+0x2b7)[0x7f81fa865657]
/home/acer/OpenFOAM/acer-2.2.2/platforms/linux64GccDPOpt/lib/libmyThermo.so(_ZN4Foam11hePsiThermoINS_17psiReactionThermoENS_13SpecieMixtureINS_15reactingMixtureINS_19sutherlandTransportINS_7species6thermoINS_11janafThermoINS_10perfectGasINS_6specieEEEEENS_16absoluteEnthalpyEEEEEEEEEE7correctEv+0x32)[0x7f81fa87d8c2]
buoyantFireFoamorg[0x431e69]
/lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xed)[0x7f820161f76d]
buoyantFireFoamorg[0x440a1d]
======= Memory map: ========
00400000-004cb000 r-xp 00000000 07:00 410661 /home/acer/OpenFOAM/acer-2.2.2/platforms/linux64GccDPOpt/bin/buoyantFireFoamorg
006ca000-006cd000 r--p 000ca000 07:00 410661 /home/acer/OpenFOAM/acer-2.2.2/platforms/linux64GccDPOpt/bin/buoyantFireFoamorg
006cd000-006ce000 rw-p 000cd000 07:00 410661 /home/acer/OpenFOAM/acer-2.2.2/platforms/linux64GccDPOpt/bin/buoyantFireFoamorg
00e23000-0fab1000 rw-p 00000000 00:00 0 [heap]
7f81f24ed000-7f81f4bc3000 rw-p 00000000 00:00 0
7f81f4bc3000-7f81f8bc5000 rw-s 00000000 07:00 356545 /tmp/openmpi-sessions-acer@ubuntu_0/2104/1/shared_mem_pool.ubuntu (deleted)
7f81f8bc5000-7f81f8bd2000 r-xp 00000000 07:00 330798 /usr/lib/openmpi/lib/openmpi/mca_osc_rdma.so
7f81f8bd2000-7f81f8dd1000 ---p 0000d000 07:00 330798 /usr/lib/openmpi/lib/openmpi/mca_osc_rdma.so
7f81f8dd1000-7f81f8dd2000 r--p 0000c000 07:00 330798 /usr/lib/openmpi/lib/openmpi/mca_osc_rdma.so
7f81f8dd2000-7f81f8dd3000 rw-p 0000d000 07:00 330798 /usr/lib/openmpi/lib/openmpi/mca_osc_rdma.so
7f81f8dd3000-7f81f8ddd000 r-xp 00000000 07:00 330797 /usr/lib/openmpi/lib/openmpi/mca_osc_pt2pt.so
7f81f8ddd000-7f81f8fdc000 ---p 0000a000 07:00 330797 /usr/lib/openmpi/lib/openmpi/mca_osc_pt2pt.so
7f81f8fdc000-7f81f8fdd000 r--p 00009000 07:00 330797 /usr/lib/openmpi/lib/openmpi/mca_osc_pt2pt.so
7f81f8fdd000-7f81f8fde000 rw-p 0000a000 07:00 330797 /usr/lib/openmpi/lib/openmpi/mca_osc_pt2pt.so
7f81f8fde000-7f81f8ff5000 r-xp 00000000 07:00 330771 /usr/lib/openmpi/lib/openmpi/mca_coll_tuned.so
7f81f8ff5000-7f81f91f4000 ---p 00017000 07:00 330771 /usr/lib/openmpi/lib/openmpi/mca_coll_tuned.so
7f81f91f4000-7f81f91f5000 r--p 00016000 07:00 330771 /usr/lib/openmpi/lib/openmpi/mca_coll_tuned.so
7f81f91f5000-7f81f91f6000 rw-p 00017000 07:00 330771 /usr/lib/openmpi/lib/openmpi/mca_coll_tuned.so
7f81f91f6000-7f81f91f9000 r-xp 00000000 07:00 330770 /usr/lib/openmpi/lib/openmpi/mca_coll_sync.so
7f81f91f9000-7f81f93f8000 ---p 00003000 07:00 330770 /usr/lib/openmpi/lib/openmpi/mca_coll_sync.so
7f81f93f8000-7f81f93f9000 r--p 00002000 07:00 330770 /usr/lib/openmpi/lib/openmpi/mca_coll_sync.so
7f81f93f9000-7f81f93fa000 rw-p 00003000 07:00 330770 /usr/lib/openmpi/lib/openmpi/mca_coll_sync.so
7f81f93fa000-7f81f93ff000 r-xp 00000000 07:00 330769 /usr/lib/openmpi/lib/openmpi/mca_coll_sm.so
7f81f93ff000-7f81f95fe000 ---p 00005000 07:00 330769 /usr/lib/openmpi/lib/openmpi/mca_coll_sm.so
7f81f95fe000-7f81f95ff000 r--p 00004000 07:00 330769 /usr/lib/openmpi/lib/openmpi/mca_coll_sm.so
7f81f95ff000-7f81f9600000 rw-p 00005000 07:00 330769 /usr/lib/openmpi/lib/openmpi/mca_coll_sm.so
7f81f9600000-7f81f9602000 r-xp 00000000 07:00 330768 /usr/lib/openmpi/lib/openmpi/mca_coll_self.so
7f81f9602000-7f81f9801000 ---p 00002000 07:00 330768 /usr/lib/openmpi/lib/openmpi/mca_coll_self.so
7f81f9801000-7f81f9802000 r--p 00001000 07:00 330768 /usr/lib/openmpi/lib/openmpi/mca_coll_self.so
7f81f9802000-7f81f9803000 rw-p 00002000 07:00 330768 /usr/lib/openmpi/lib/openmpi/mca_coll_self.so
7f81f9803000-7f81f9806000 r-xp 00000000 07:00 330767 /usr/lib/openmpi/lib/openmpi/mca_coll_inter.so
7f81f9806000-7f81f9a05000 ---p 00003000 07:00 330767 /usr/lib/openmpi/lib/openmpi/mca_coll_inter.so
7f81f9a05000-7f81f9a06000 r--p 00002000 07:00 330767 /usr/lib/openmpi/lib/openmpi/mca_coll_inter.so
7f81f9a06000-7f81f9a07000 rw-p 00003000 07:00 330767 /usr/lib/openmpi/lib/openmpi/mca_coll_inter.so
7f81f9a07000-7f81f9a0b000 r-xp 00000000 07:00 330766 /usr/lib/openmpi/lib/openmpi/mca_coll_hierarch.so
7f81f9a0b000-7f81f9c0a000 ---p 00004000 07:00 330766 /usr/lib/openmpi/lib/openmpi/mca_coll_hierarch.so
7f81f9c0a000-7f81f9c0b000 r--p 00003000 07:00 330766 /usr/lib/openmpi/lib/openmpi/mca_coll_hierarch.so
7f81f9c0b000-7f81f9c0c000 rw-p 00004000 07:00 330766 /usr/lib/openmpi/lib/openmpi/mca_coll_hierarch.so
7f81f9c0c000-7f81f9c14000 r-xp 00000000 07:00 330765 /usr/lib/openmpi/lib/openmpi/mca_coll_basic.so
7f81f9c14000-7f81f9e13000 ---p 00008000 07:00 330765 /usr/lib/openmpi/lib/openmpi/mca_coll_basic.so
7f81f9e13000-7f81f9e14000 r--p 00007000 07:00 330765 /usr/lib/openmpi/lib/openmpi/mca_coll_basic.so
7f81f9e14000-7f81f9e15000 rw-p 00008000 07:00 330765 /usr/lib/openmpi/lib/openmpi/mca_coll_basic.so
7f81f9e15000-7f81f9e20000 r-xp 00000000 07:00 330762 /usr/lib/openmpi/lib/openmpi/mca_btl_tcp.so
7f81f9e20000-7f81fa01f000 ---p 0000b000 07:00 330762 /usr/lib/openmpi/lib/openmpi/mca_btl_tcp.so
7f81fa01f000-7f81fa020000 r--p 0000a000 07:00 330762 /usr/lib/openmpi/lib/openmpi/mca_btl_tcp.so
7f81fa020000-7f81fa021000 rw-p 0000b000 07:00 330762 /usr/lib/openmpi/lib/openmpi/mca_btl_tcp.so
7f81fa021000-7f81fa0a1000 rw-p 00000000 00:00 0
7f81fa0a1000-7f81fa0a6000 r-xp 00000000 07:00 330761 /usr/lib/openmpi/lib/openmpi/mca_btl_sm.so
7f81fa0a6000-7f81fa2a5000 ---p 00005000 07:00 330761 /usr/lib/openmpi/lib/openmpi/mca_btl_sm.so
7f81fa2a5000-7f81fa2a6000 r--p 00004000 07:00 330761 /usr/lib/openmpi/lib/openmpi/mca_btl_sm.so
7f81fa2a6000-7f81fa2a7000 rw-p 00005000 07:00 330761 /usr/lib/openmpi/lib/openmpi/mca_btl_sm.so
7f81fa2a7000-7f81fa2aa000 r-xp 00000000 07:00 330760 /usr/lib/openmpi/lib/openmpi/mca_btl_self.so
7f81fa2aa000-7f81fa4a9000 ---p 00003000 07:00 330760 /usr/lib/openmpi/lib/openmpi/mca_btl_self.so
7f81fa4a9000-7f81fa4aa000 r--p 00002000 07:00 330760 /usr/lib/openmpi/lib/openmpi/mca_btl_self.so
7f81fa4aa000-7f81fa4ab000 rw-p 00003000 07:00 330760 /usr/lib/openmpi/lib/openmpi/mca_btl_self.so
7f81fa7d8000-7f81fa8dd000 r-xp 00000000 07:00 479396 /home/acer/OpenFOAM/acer-2.2.2/platforms/linux64GccDPOpt/lib/libmyThermo.so
7f81fa8dd000-7f81faadc000 ---p 00105000 07:00 479396 /home/acer/OpenFOAM/acer-2.2.2/platforms/linux64GccDPOpt/lib/libmyThermo.so
7f81faadc000-7f81faae2000 r--p 00104000 07:00 479396 /home/acer/OpenFOAM/acer-2.2.2/platforms/linux64GccDPOpt/lib/libmyThermo.so
7f81faae2000-7f81faae4000 rw-p 0010a000 07:00 479396 /home/acer/OpenFOAM/acer-2.2.2/platforms/linux64GccDPOpt/lib/libmyThermo.so
7f81faae4000-7f81faae8000 r-xp 00000000 07:00 330757 /usr/lib/openmpi/lib/openmpi/mca_bml_r2.so
7f81faae8000-7f81face7000 ---p 00004000 07:00 330757 /usr/lib/openmpi/lib/openmpi/mca_bml_r2.so
7f81face7000-7f81face8000 r--p 00003000 07:00 330757 /usr/lib/openmpi/lib/openmpi/mca_bml_r2.so
7f81face8000-7f81face9000 rw-p 00004000 07:00 330757 /usr/lib/openmpi/lib/openmpi/mca_bml_r2.so
7f81faeed000-7f81faf00000 r-xp 00000000 07:00 330806 /usr/lib/openmpi/lib/openmpi/mca_pml_ob1.so
7f81faf00000-7f81fb100000 ---p 00013000 07:00 330806 /usr/lib/openmpi/lib/openmpi/mca_pml_ob1.so
7f81fb100000-7f81fb101000 r--p 00013000 07:00 330806 /usr/lib/openmpi/lib/openmpi/mca_pml_ob1.so
7f81fb101000-7f81fb102000 rw-p 00014000 07:00 330806 /usr/lib/openmpi/lib/openmpi/mca_pml_ob1.so
7f81fb31e000-7f81fb323000 r-xp 00000000 07:00 330775 /usr/lib/openmpi/lib/openmpi/mca_dpm_orte.so
7f81fb323000-7f81fb522000 ---p 00005000 07:00 330775 /usr/lib/openmpi/lib/openmpi/mca_dpm_orte.so
7f81fb522000-7f81fb523000 r--p 00004000 07:00 330775 /usr/lib/openmpi/lib/openmpi/mca_dpm_orte.so
7f81fb523000-7f81fb524000 rw-p 00005000 07:00 330775 /usr/lib/openmpi/lib/openmpi/mca_dpm_orte.so
7f81fb524000-7f81fb527000 r-xp 00000000 07:00 330808 /usr/lib/openmpi/lib/openmpi/mca_pubsub_orte.so
7f81fb527000-7f81fb726000 ---p 00003000 07:00 330808 /usr/lib/openmpi/lib/openmpi/mca_pubsub_orte.so
7f81fb726000-7f81fb727000 r--p 00002000 07:00 330808 /usr/lib/openmpi/lib/openmpi/mca_pubsub_orte.so
7f81fb727000-7f81fb728000 rw-p 00003000 07:00 330808 /usr/lib/openmpi/lib/openmpi/mca_pubsub_orte.so
7f81fb728000-7f81fb72a000 r-xp 00000000 07:00 329784 /usr/lib/openmpi/lib/libmca_common_sm.so.1.0.0
7f81fb72a000-7f81fb929000 ---p 00002000 07:00 329784 /usr/lib/openmpi/lib/libmca_common_sm.so.1.0.0
7f81fb929000-7f81fb92a000 r--p 00001000 07:00 329784 /usr/lib/openmpi/lib/libmca_common_sm.so.1.0.0
7f81fb92a000-7f81fb92b000 rw-p 00002000 07:00 329784 /usr/lib/openmpi/lib/libmca_common_sm.so.1.0.0
7f81fb92b000-7f81fb92d000 r-xp 00000000 07:00 330793 /usr/lib/openmpi/lib/openmpi/mca_mpool_sm.so
7f81fb92d000-7f81fbb2c000 ---p 00002000 07:00 330793 /usr/lib/openmpi/lib/openmpi/mca_mpool_sm.so
7f81fbb2c000-7f81fbb2d000 r--p 00001000 07:00 330793 /usr/lib/openmpi/lib/openmpi/mca_mpool_sm.so
7f81fbb2d000-7f81fbb2e000 rw-p 00002000 07:00 330793 /usr/lib/openmpi/lib/openmpi/mca_mpool_sm.so
7f81fbb2e000-7f81fbb31000 r-xp 00000000 07:00 330792 /usr/lib/openmpi/lib/openmpi/mca_mpool_rdma.so
7f81fbb31000-7f81fbd30000 ---p 00003000 07:00 330792 /usr/lib/openmpi/lib/openmpi/mca_mpool_rdma.so
7f81fbd30000-7f81fbd31000 r--p 00002000 07:00 330792 /usr/lib/openmpi/lib/openmpi/mca_mpool_rdma.so
7f81fbd31000-7f81fbd32000 rw-p 00003000 07:00 330792 /usr/lib/openmpi/lib/openmpi/mca_mpool_rdma.so
7f81fbd32000-7f81fbd33000 r-xp 00000000 07:00 330791 /usr/lib/openmpi/lib/openmpi/mca_mpool_fake.so
7f81fbd33000-7f81fbf32000 ---p 00001000 07:00 330791 /usr/lib/openmpi/lib/openmpi/mca_mpool_fake.so
7f81fbf32000-7f81fbf33000 r--p 00000000 07:00 330791 /usr/lib/openmpi/lib/openmpi/mca_mpool_fake.so
7f81fbf33000-7f81fbf34000 rw-p 00001000 07:00 330791 /usr/lib/openmpi/lib/openmpi/mca_mpool_fake.so
7f81fbf34000-7f81fbf37000 r-xp 00000000 07:00 330811 /usr/lib/openmpi/lib/openmpi/mca_rcache_vma.so
7f81fbf37000-7f81fc136000 ---p 00003000 07:00 330811 /usr/lib/openmpi/lib/openmpi/mca_rcache_vma.so
7f81fc136000-7f81fc137000 r--p 00002000 07:00 330811 /usr/lib/openmpi/lib/openmpi/mca_rcache_vma.so
7f81fc137000-7f81fc138000 rw-p 00003000 07:00 330811 /usr/lib/openmpi/lib/openmpi/mca_rcache_vma.so
7f81fc138000-7f81fc13a000 r-xp 00000000 07:00 330756 /usr/lib/openmpi/lib/openmpi/mca_allocator_bucket.so
7f81fc13a000-7f81fc339000 ---p 00002000 07:00 330756 /usr/lib/openmpi/lib/openmpi/mca_allocator_bucket.so
7f81fc339000-7f81fc33a000 r--p 00001000 07:00 330756 /usr/lib/openmpi/lib/openmpi/mca_allocator_bucket.so
7f81fc33a000-7f81fc33b000 rw-p 00002000 07:00 330756 /usr/lib/openmpi/lib/openmpi/mca_allocator_bucket.so
7f81fc33b000-7f81fc33d000 r-xp 00000000 07:00 330755 /usr/lib/openmpi/lib/openmpi/mca_allocator_basic.so
7f81fc33d000-7f81fc53c000 ---p 00002000 07:00 330755 /usr/lib/openmpi/lib/openmpi/mca_allocator_basic.so
7f81fc53c000-7f81fc53d000 r--p 00001000 07:00 330755 /usr/lib/openmpi/lib/openmpi/mca_allocator_basic.so
7f81fc53d000-7f81fc53e000 rw-p 00002000 07:00 330755 /usr/lib/openmpi/lib/openmpi/mca_allocator_basic.so
7f81fc53e000-7f81fc54a000 r-xp 00000000 07:00 234599 /lib/x86_64-linux-gnu/libnss_files-2.15.so
7f81fc54a000-7f81fc749000 ---p 0000c000 07:00 234599 /lib/x86_64-linux-gnu/libnss_files-2.15.so
7f81fc749000-7f81fc74a000 r--p 0000b000 07:00 234599 /lib/x86_64-linux-gnu/libnss_files-2.15.so
7f81fc74a000-7f81fc74b000 rw-p 0000c000 07:00 234599 /lib/x86_64-linux-gnu/libnss_files-2.15.so
7f81fc74b000-7f81fc755000 r-xp 00000000 07:00 234508 /lib/x86_64-linux-gnu/libnss_nis-2.15.so
7f81fc755000-7f81fc955000 ---p 0000a000 07:00 234508 /lib/x86_64-linux-gnu/libnss_nis-2.15.so
7f81fc955000-7f81fc956000 r--p 0000a000 07:00 234508 /lib/x86_64-linux-gnu/libnss_nis-2.15.so
7f81fc956000-7f81fc957000 rw-p 0000b000 07:00 234508 /lib/x86_64-linux-gnu/libnss_nis-2.15.so
7f81fc957000-7f81fc95f000 r-xp 00000000 07:00 234598 /lib/x86_64-linux-gnu/libnss_compat-2.15.so
7f81fc95f000-7f81fcb5e000 ---p 00008000 07:00 234598 /lib/x86_64-linux-gnu/libnss_compat-2.15.so
7f81fcb5e000-7f81fcb5f000 r--p 00007000 07:00 234598 /lib/x86_64-linux-gnu/libnss_compat-2.15.so
7f81fcb5f000-7f81fcb60000 rw-p 00008000 07:00 234598 /lib/x86_64-linux-gnu/libnss_compat-2.15.so
7f81fcb60000-7f81fcb64000 r-xp 00000000 07:00 330818 /usr/lib/openmpi/lib/openmpi/mca_routed_binomial.so
7f81fcb64000-7f81fcd63000 ---p 00004000 07:00 330818 /usr/lib/openmpi/lib/openmpi/mca_routed_binomial.so
7f81fcd63000-7f81fcd64000 r--p 00003000 07:00 330818 /usr/lib/openmpi/lib/openmpi/mca_routed_binomial.so
7f81fcd64000-7f81fcd65000 rw-p 00004000 07:00 330818 /usr/lib/openmpi/lib/openmpi/mca_routed_binomial.so
7f81fcd65000-7f81fcd74000 r-xp 00000000 07:00 330796 /usr/lib/openmpi/lib/openmpi/mca_oob_tcp.so
7f81fcd74000-7f81fcf73000 ---p 0000f000 07:00 330796 /usr/lib/openmpi/lib/openmpi/mca_oob_tcp.so
7f81fcf73000-7f81fcf74000 r--p 0000e000 07:00 330796 /usr/lib/openmpi/lib/openmpi/mca_oob_tcp.so
7f81fcf74000-7f81fcf75000 rw-p 0000f000 07:00 330796 /usr/lib/openmpi/lib/openmpi/mca_oob_tcp.so
7f81fcf75000-7f81fcf7a000 r-xp 00000000 07:00 330817 /usr/lib/openmpi/lib/openmpi/mca_rml_oob.so
7f81fcf7a000-7f81fd179000 ---p 00005000 07:00 330817 /usr/lib/openmpi/lib/openmpi/mca_rml_oob.so
7f81fd179000-7f81fd17a000 r--p 00004000 07:00 330817 /usr/lib/openmpi/lib/openmpi/mca_rml_oob.so
7f81fd17a000-7f81fd17b000 rw-p 00005000 07:00 330817 /usr/lib/openmpi/lib/openmpi/mca_rml_oob.so
7f81fd17b000-7f81fd17e000 r-xp 00000000 07:00 330783 /usr/lib/openmpi/lib/openmpi/mca_grpcomm_bad.so
[ubuntu:02668] *** Process received signal ***
[ubuntu:02668] Signal: Aborted (6)
[ubuntu:02668] Signal code: (-6)
The error indicates that the maximum number of iterations was exceeded. Following the error message, I located thermoI.H, which contains the following inline function:
Code:
template<class Thermo, template<class> class Type>
inline Foam::scalar Foam::species::thermo<Thermo, Type>::T
(
    scalar f,
    scalar p,
    scalar T0,
    scalar (thermo<Thermo, Type>::*F)(const scalar, const scalar) const,
    scalar (thermo<Thermo, Type>::*dFdT)(const scalar, const scalar)
        const,
    scalar (thermo<Thermo, Type>::*limit)(const scalar) const
) const
{
    scalar Test = T0;
    scalar Tnew = T0;
    scalar Ttol = T0*tol_;
    int    iter = 0;
    do
    {
        Test = Tnew;
        Tnew =
            (this->*limit)
            (Test - ((this->*F)(p, Test) - f)/(this->*dFdT)(p, Test));
        if (iter++ > maxIter_)
        {
            FatalErrorIn
            (
                "thermo<Thermo, Type>::T(scalar f, scalar T0, "
                "scalar (thermo<Thermo, Type>::*F)"
                "(const scalar) const, "
                "scalar (thermo<Thermo, Type>::*dFdT)"
                "(const scalar) const, "
                "scalar (thermo<Thermo, Type>::*limit)"
                "(const scalar) const"
                ") const"
            )   << "Maximum number of iterations exceeded"
                << abort(FatalError);
        }
    } while (mag(Tnew - Test) > Ttol);
    return Tnew;
}
So the error is raised by the check "iter++ > maxIter_". I then looked up the definition of maxIter_:
Code:
const int Foam::species::thermo<Thermo, Type>::maxIter_ = 100;
So I changed it to const int Foam::species::thermo<Thermo, Type>::maxIter_ = 500; and rebuilt these classes with wmake. Then I ran my case again, but as soon as the temperature rose above 1365 K it gave me the same error. Finally I set maxIter_ = 50000 and ran the case once more; it still failed with the same error.
Why? Changing the value of maxIter_ should have made a difference, but it did not. Can someone give me some advice? Any reply will be appreciated!

Post #2 | December 16, 2013, 04:39
zqlhzx (赵庆良), Member; Join Date: Aug 2013; Posts: 56
Has anyone else met a problem like this? I think it may be a bug in OpenFOAM 2.2.x.

Post #3 | December 26, 2013, 01:52
Antimony, Member; Join Date: Aug 2013; Posts: 34
Hi,

Have you tried changing the relaxation factors of that equation/variable in the fvSolution file? You might want to give that a shot.

Generally, when you see a "maximum number of iterations" error, changing the relaxation factor values is worth a try.
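For anyone unsure where these live: under-relaxation is set in the relaxationFactors sub-dictionary of system/fvSolution. A sketch of what such an entry might look like (the fields and values below are purely illustrative, not a recommendation for this case):

Code:
relaxationFactors
{
    fields
    {
        p       0.3;
        rho     0.5;
    }
    equations
    {
        U       0.7;
        h       0.5;
    }
}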

Hope this helps.

Regards,

Antimony

Post #4 | December 29, 2013, 08:07
zqlhzx (赵庆良), Member; Join Date: Aug 2013; Posts: 56
Hi Antimony,
Thank you for your advice!
I did as you suggested and changed the relaxation factors of those equations/variables (h, rho, p, U), but it did not work for me. I am using an LES model instead of RANS; could it be that the relaxation factors of those equations/variables have no effect with an LES model?
Regarding the convergence check "(mag(Tnew - Test) > Ttol)", I found that the value of mag(Tnew - Test) is about 9.7, while Ttol is only about 0.11. No matter which value of maxIter_ I selected, even maxIter_ = 500000, mag(Tnew - Test) stayed at about 9.7. I do not know why the Newton update "(this->*limit)(Test - ((this->*F)(p, Test) - f)/(this->*dFdT)(p, Test));" cannot converge when the temperature is about 1365 K. Could you give me other advice? Thank you!
