Ghost Filter consumes more memory than it should
We have a user here at LANL who was trying to load a vtm dataset (unstructured grid) made of about 1 billion cells, 1.6 billion points, and 8 cell data arrays. It should be about 80 GB uncompressed. He wanted to load the data, apply the Merge Blocks filter, and show a surface colored by one of the variables. With an allocation of 4 nodes of Trinity (512 GB of memory), he would get an out-of-memory error at the surface-rendering step. With 8 nodes of Trinity (1 TB of memory), it worked fine.
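A quick back-of-envelope check of the 80 GB figure, under assumptions not stated above (8-byte double-precision cell data, 4-byte single-precision point coordinates, connectivity and ghost arrays excluded):

```python
# Rough size estimate for the dataset described above.
# Assumptions (mine, not from the report): cell data stored as
# 8-byte doubles, point coordinates as 4-byte floats (x, y, z).
GB = 1_000_000_000  # decimal gigabytes

n_cells = 1_000_000_000
n_points = 1_600_000_000
n_cell_arrays = 8

cell_data_bytes = n_cells * n_cell_arrays * 8   # 8 scalar arrays of doubles
point_coord_bytes = n_points * 3 * 4            # xyz coordinates as floats

total_gb = (cell_data_bytes + point_coord_bytes) / GB
print(f"cell data: {cell_data_bytes / GB:.1f} GB")   # 64.0 GB
print(f"points:    {point_coord_bytes / GB:.1f} GB") # 19.2 GB
print(f"total:     {total_gb:.1f} GB")               # 83.2 GB
```

So roughly 80 GB of raw payload is plausible, before counting connectivity, ghost cells, or any copies the pipeline makes.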
The memory requirements seem excessive. Can anything be done to reduce the memory used?
One idea I discussed with @patchett2002: right now, vtm (multiblock) files load all variables at once. It would be nice to be able to select which variables to load from a vtm file, as the readers for some other file formats already allow.