Catalyst: using two data pipes with two different scripts fails in coprocessing.py
When using Catalyst, users will sometimes want to create different datasets to send to different Catalyst Python scripts. For example, in SPARC users might send surface/wall data to one script and volume/airflow data to another.
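As a hypothetical sketch (not the actual coprocessing.py API), the driver needs to route each named input channel to the pipeline script that consumes it, rather than assuming a single script handles every channel; the script names and channel names below are invented for illustration:

```python
# Hypothetical sketch of per-script channel routing; names are invented,
# this is not the real paraview.coprocessing API.

def make_pipeline(script_name, channels):
    """Return a handler that processes only the channels it registered for."""
    def handler(data_by_channel):
        return {ch: f"{script_name} processed {data_by_channel[ch]}"
                for ch in channels if ch in data_by_channel}
    return handler

# Two scripts, each bound to a different data pipe (as in the SPARC case).
pipelines = [
    make_pipeline("surface_script", ["wall_data"]),
    make_pipeline("volume_script", ["airflow_data"]),
]

def coprocess(data_by_channel):
    # Each pipeline sees the full set of channels but only touches its own.
    results = {}
    for handler in pipelines:
        results.update(handler(data_by_channel))
    return results

print(coprocess({"wall_data": "mesh0", "airflow_data": "grid0"}))
```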
In ParaView 5.10 and 5.11.0 this capability fails; it worked in 5.9. The problem is in coprocessing.py. I am attaching my altered coprocessing.py, which works. You can diff it against the 5.11.0 Wrapping/Python/paraview/coprocessing.py to see the changes I had to make to get it working. I don't insist on exactly these changes, but our ioss catalyst test suite (which Ben Boeckle can run on SNL HPCs such as vortex and eclipse) will need to pass the related two-pipe/two-script tests.
I also undid a change that disallowed multiple data pipes from having the same name. In our sierra catalyst implementation we have been using the same name ("input") for all incoming data pipes, and it has worked fine. I don't absolutely need the ability to give multiple pipes the same name, but it would be nice to be formally warned about the change.
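A guess at why a uniqueness requirement silently breaks the same-name pattern: if registered pipes are keyed by name in a dict-like container, every pipe after the first one named "input" is dropped, whereas a list-based registry keeps them all. A minimal illustration:

```python
# Illustration only -- not the real coprocessing.py data structures.
# Several pipes all registered under the name "input".
pipes = [("input", "surface pipe"), ("input", "volume pipe")]

# Keying by name collapses duplicates: only the last "input" survives.
by_name = dict(pipes)
print(len(by_name))   # 1 -- one pipe was silently lost

# A list preserves every registration, duplicate names included.
as_list = list(pipes)
print(len(as_list))   # 2
```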