Memory problem when reading large files
I have a problem reading large files that were previously generated in a VTK application (VTK 7.1.1, 64 bit) on Windows 10. In this particular case I generated a model with 500 million points in double precision and a scalar array in single precision. No cell info, just a plain point and scalar set; the file is ~15 GB on disk. This dataset can easily be saved with `vtkDataWriter`, but when trying to read it back using `vtkDataReader` I get this error message:

```
ERROR: In D:\VTK-7.1.1\IO\Legacy\vtkDataReader.cxx, line 430
vtkDataReader (0000029EDEF98B90): Unable to open file.
```

Can anyone explain this and hopefully suggest a solution? If I reduce the dataset to 50 million points everything works fine, so maybe there is a size limit somewhere, but I cannot find any such information.
Cory Quammen (Kitware) suggested: **Since this is on Windows, is your data file > 2.1 GB? I suspect so, and I suspect VTK 7.1.1 does not make use of the versions of file IO functions needed on Windows, such as `_fseeki64`.**
I would very much appreciate a fix for this problem.
Thanks, b.r. Oleg