Parallel OpenFOAM to Parallel EnSight (SoS)
OpenFOAM Parallel to EnSight Parallel (SoS) without merging.
In EnSight 10.0 and 10.1, if you had a parallel OpenFOAM dataset, you had basically two options:
a. use 'reconstructPar' to merge the dataset back into a single set of files, then either load it into a single EnSight Server or use AutoDecompose to re-decompose the dataset you just merged.
b. use the native OpenFOAM reader to read the parallel dataset into a single EnSight Server (no SoS allowed).
Option A is slow, contains steps that essentially undo one another (decompose, reconstruct, decompose again), and AutoDecompose will not perform well.
Option B cannot be used with datasets which require EnSight SoS operation.
** NEW **
In EnSight 10.2.0(a), parallel OpenFOAM datasets can be loaded into parallel EnSight (SoS) without merging, splitting, or running reconstructPar on the dataset.
To do this, the user must perform two basic steps. Automating the steps is currently left to the user.
After the parallel OpenFOAM solver finishes, you have a series of "processor0, processor1, ..., processorN" directories. No reconstructPar is performed.
Step 1: Convert each of the processorN partitions into its own set of EnSight Case format files. This is relatively easy, as the OpenFOAM utility 'foamToEnsight' has a '-case' option for exactly this purpose. The user therefore needs to invoke:
foamToEnsight -case processor0
foamToEnsight -case processor1
foamToEnsight -case processorN
At this point, each OpenFOAM partition has been converted into its own set of EnSight Case format files.
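The per-partition conversions can be scripted. Below is a minimal Python sketch (the directory layout and the 'foamToEnsight -case' invocation come from the steps above; the helper names are my own, and OpenFOAM must already be installed and sourced in the environment):

```python
import re
import subprocess
from pathlib import Path

def find_processor_dirs(root="."):
    """Return the processorN directories, sorted by their numeric suffix."""
    dirs = [d for d in Path(root).iterdir()
            if d.is_dir() and re.fullmatch(r"processor\d+", d.name)]
    return sorted(dirs, key=lambda d: int(d.name[len("processor"):]))

def convert_partitions(root="."):
    """Run 'foamToEnsight -case processorN' for each partition found."""
    for d in find_processor_dirs(root):
        subprocess.run(["foamToEnsight", "-case", str(d)], check=True)

if __name__ == "__main__":
    convert_partitions()
```

The numeric sort matters: a plain alphabetical sort would place processor10 before processor2.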
Step 2: Create a global header file so that EnSight can reference all of the separate files. There are technically three options, although only the third is likely to be practical.
Option 1 is to modify a single .case file and use the "APPEND_CASEFILES" capability to add in all of the remaining partitions' case files. This is only meant for single EnSight Server operation, so it is really only of academic interest.
Option 2 is an .sos file which references each partition's case file for its own EnSight Server. This is straightforward, but it requires that the number of OpenFOAM partitions equal the number of EnSight Servers, which is probably inefficient (far too many Servers).
Option 3 is an .sos file which utilizes the MULTIPLE_CASEFILES option. You still need to specify the location of each partition's case file, but you can specify any number of Servers to be used.
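For orientation, a simple .sos file in the Option 2 form (one EnSight Server per partition) might look roughly like the sketch below. The section keywords follow the common EnSight SOS layout, and the EnSight/ output paths reflect typical foamToEnsight output, but both are assumptions here; the example dataset mentioned below contains authoritative versions of all three file types, and the EnSight User Manual documents the exact syntax (including the MULTIPLE_CASEFILES form of Option 3).

```
FORMAT
type: master_server gold

NETWORK_INTERFACES
number of network interfaces: 1
network interface: localhost

SERVERS
number of servers: 2

machine id: localhost
executable: ensight_server
casefile: processor0/EnSight/processor0.case

machine id: localhost
executable: ensight_server
casefile: processor1/EnSight/processor1.case
```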
At this point, the user is ready to load the model into EnSight HPC as normal (ensight102 -hpc -sos).
What in EnSight 10.2 allows this? Prior to EnSight 10.2, every partition of a dataset had to contain the full list of parts. The conversion of each OpenFOAM partition, however, yields its own 'local' set of parts, which previously could not be read into EnSight as a single case. EnSight 10.2 now allows the Servers to have 'local' part lists.
I've created a small example dataset (90 MB) which can be used/shared/given out to test the operation. This example dataset includes examples of each of the three files referenced in Step 2 above.
To download the file:
I've also created a short (9 minute) video tutorial on using this new capability. It covers both Step 1 and Step 2 above.
I've also created a Python-based utility which performs Step 1 and Step 2 above. Namely, it attempts to determine how many processorN directories you have, runs 'foamToEnsight -case processorN' to convert the data into EnSight format, and then creates a sample SoS file utilizing the "MULTIPLE_CASEFILES" option, so that you should be able to load the .sos file into an EnSight HPC session. See the file attachment below (foamCEI_parallel.py)
Usage for this utility:
cpython32 foamCEI_parallel.py <number_of_ensight_servers>
This utility needs to be executed on a machine which has OpenFOAM installed and set up.
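The SoS-generation half of such a utility can be sketched as below. This is not the attached foamCEI_parallel.py: it emits the simpler one-Server-per-partition form rather than MULTIPLE_CASEFILES, and the SOS keywords are assumptions to verify against the EnSight User Manual for your version.

```python
def build_sos(casefiles, machine="localhost"):
    """Build the text of a simple .sos file: one EnSight Server per casefile.

    The keywords below follow the common EnSight SOS layout; verify them
    against the EnSight User Manual before use.
    """
    lines = [
        "FORMAT",
        "type: master_server gold",
        "",
        "NETWORK_INTERFACES",
        "number of network interfaces: 1",
        f"network interface: {machine}",
        "",
        "SERVERS",
        f"number of servers: {len(casefiles)}",
    ]
    for cf in casefiles:
        lines += [
            "",
            f"machine id: {machine}",
            "executable: ensight_server",
            f"casefile: {cf}",
        ]
    return "\n".join(lines) + "\n"

if __name__ == "__main__":
    # Hypothetical paths: foamToEnsight typically writes into an EnSight/
    # subdirectory of each case directory.
    cases = ["processor0/EnSight/processor0.case",
             "processor1/EnSight/processor1.case"]
    print(build_sos(cases))
```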