manyspheres.py fails in parallel with pvbatch
manyspheres.py fails when run in parallel with MPI under pvbatch. Here is how to replicate:
mpiMagicStuff pvbatch /.../manyspheres.py
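(mpiMagicStuff above is a placeholder for whatever MPI launcher your site uses. As an illustration only, assuming Open MPI with 16 ranks to match the 16 sources shown in the output below, the launch would look roughly like:

mpirun -np 16 pvbatch /.../manyspheres.py

The exact launcher and rank count depend on your MPI installation and job environment.)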
It fails as follows:
Generating bounding box
Generating all spheres
Assigning colors
source 0: generating 6 spheres from 0 to 6
source 6: generating 6 spheres from 36 to 42
source 4: generating 6 spheres from 24 to 30
source 12: generating 7 spheres from 72 to 79
source 2: generating 6 spheres from 12 to 18
source 11: generating 6 spheres from 66 to 72
source 9: generating 6 spheres from 54 to 60
source 13: generating 7 spheres from 79 to 86
source 3: generating 6 spheres from 18 to 24
source 1: generating 6 spheres from 6 to 12
source 7: generating 6 spheres from 42 to 48
source 10: generating 6 spheres from 60 to 66
source 5: generating 6 spheres from 30 to 36
source 15: generating 7 spheres from 93 to 100
source 14: generating 7 spheres from 86 to 93
source 8: generating 6 spheres from 48 to 54
Repositioning initial camera
Rendering first frame
Saving frame 0 screenshot
Gathering geometry counts
Beginning benchmark loop
(datetime.timedelta(0, 9, 670969), ['CL_DS_RS[0] 283060 / 4451200', 'CL_DS_RS[1] 238852 / 4451200', 'CL_DS_RS[2] 237756 / 4451200', 'CL_DS_RS[3] 235876 / 4451200', 'CL_DS_RS[4] 237636 / 4451200', 'CL_DS_RS[5] 237944 / 4451200', 'CL_DS_RS[6] 237456 / 4451200', 'CL_DS_RS[7] 235208 / 4451200', 'CL_DS_RS[8] 237792 / 4451200', 'CL_DS_RS[9] 238080 / 4451200', 'CL_DS_RS[10] 237292 / 4451200', 'CL_DS_RS[11] 234908 / 4451200', 'CL_DS_RS[12] 237364 / 4451200', 'CL_DS_RS[13] 238132 / 4451200', 'CL_DS_RS[14] 237580 / 4451200', 'CL_DS_RS[15] 234920 / 4451200'])
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 8 in communicator MPI_COMM_WORLD
with errorcode 15.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.