
Function Object cloudInfo

March 6, 2014, 07:42
Mentalo (New Member, joined Dec 2012, Posts: 12)

I have seen that since OpenFOAM v2.2.0 a functionality called cloudInfo has been available. I'm very interested in using this tool, but I have no idea how this is done. Do I have to write a custom pre-processor to use this utility? Help would be appreciated. Thanks in advance.
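For what it's worth, cloudInfo is a function object, so no custom pre-processor is needed: it is enabled from the functions sub-dictionary of system/controlDict. A minimal sketch follows; the cloud name (reactingCloud1 here) must match the Lagrangian cloud your solver actually creates, and the library name is my assumption for this OpenFOAM version:

```
functions
{
    cloudInfo1
    {
        // assumed library name; check your installation's lib directory
        type               cloudInfo;
        functionObjectLibs ("libcloudFunctionObjects.so");

        outputControl      timeStep;
        outputInterval     1;

        // hypothetical cloud name; replace with the cloud defined by your solver
        clouds             ( reactingCloud1 );
    }
}
```

With this entry in place, cloudInfo reports per-cloud quantities such as the number of parcels and the cloud mass at each output interval.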


March 7, 2014, 04:06
Mentalo
Ok, now I got it running. But there is still a strange problem. On a single core, without decomposing the case, the cloudInfo object works perfectly fine. But when starting the run with MPI, I get an error message:

faceSource cuttingplane_average output:
    areaAverage(sampledSurface) for p = 99991.3
    areaAverage(sampledSurface) for U = (7.71105e-05 -6.53393e-06 20.1689)
    areaAverage(sampledSurface) for T = 280.929
    areaAverage(sampledSurface) for rho = 1.23573
    areaAverage(sampledSurface) for C7H16 = 5.37203e-98

[philipp-Home:3137] *** An error occurred in MPI_Recv
[philipp-Home:3137] *** on communicator MPI_COMM_WORLD
[philipp-Home:3137] *** MPI_ERR_TRUNCATE: message truncated
[philipp-Home:3137] *** MPI_ERRORS_ARE_FATAL: your MPI job will now abort
mpirun has exited due to process rank 0 with PID 3137 on
node philipp-Home exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
The faceSource object works perfectly fine, whether running on a single core or in parallel.

I tried to execute the post-processing after reconstructing the case via execFlowFunctionObjects, but then OpenFOAM gets stuck, saying:

Time = 0
    Reading phi
--> FOAM Warning : 
cannot find file

file: /home/philipp/cfd/OF_22x/full_spray_SP1_5/0/phi at line 0.

    From function regIOobject::readStream()
    in file db/regIOobject/regIOobjectRead.C at line 73.

Time = 0.0005
    Reading phi
    Reading U
    Reading p
No finite volume options present

Selecting thermodynamics package 
    type            hePsiThermo;
    mixture         reactingMixture;
    transport       sutherland;
    thermo          janaf;
    energy          sensibleEnthalpy;
    equationOfState perfectGas;
    specie          specie;

Selecting chemistryReader chemkinReader
Reading CHEMKIN thermo data in new file format
and it won't continue.
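For reference, the workflow I am attempting can be sketched as follows; the time-range argument is an assumption and would need adjusting to the case's actual write times:

```
# reconstruct the decomposed fields back onto the full mesh
reconstructPar

# re-run the controlDict function objects on the reconstructed fields
# (-time selects which time directories to process; '0:' means all)
execFlowFunctionObjects -time '0:'
```

The hang occurs during the chemistry/thermo reading step shown in the log above, before any function object output is produced.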


Thread Tools
Display Modes

Posting Rules
You may not post new threads
You may not post replies
You may not post attachments
You may not edit your posts

BB code is On
Smilies are On
[IMG] code is On
HTML code is Off
Trackbacks are On
Pingbacks are On
Refbacks are On

Similar Threads
Thread Thread Starter Forum Replies Last Post
Annoying issue of automatic "Rescale to Data Range " with paraFoam/paraview 3.12 keepfit OpenFOAM Paraview & paraFoam 60 September 18, 2013 03:23
swak4Foam installation problem Claudio87 OpenFOAM 9 May 8, 2013 10:20
Compile problem ivanyao OpenFOAM Running, Solving & CFD 1 October 12, 2012 09:31
BlockMeshmergePatchPairs hjasak OpenFOAM Native Meshers: blockMesh 11 August 15, 2008 07:36
Axisymmetrical mesh Rasmus Gjesing (Gjesing) OpenFOAM Native Meshers: blockMesh 10 April 2, 2007 14:00

All times are GMT -4. The time now is 07:19.