Wednesday, March 24, 2021

Potential memory leaks in reading binary files

I would like to ask if this part of the code might suffer from memory leaks (I'm quite sure it does, but how severely?).

The "input" variable is a pointer to double, i.e. double* input. The reason I didn't use float (more compatible in this case) is because I wanted to maintain compatibility with other parts of the code.

else if(filetype=="BinaryFile"){
    char* memblock;
    std::ifstream file(filename1, std::ios::binary | std::ios::in);
    file.seekg(0, std::ios::end);
    int size = file.tellg();
    file.seekg(0, std::ios::beg);
    std::cout << "Size=" << size << " [in bytes]" << "\n";
    std::cout << "There are overall " << grid_points << "^3 = " << std::setprecision(10)
              << pow(grid_points,3) << " values of the field1, written as float type.\n";
    memblock = new char[size];
    file.seekg(0, std::ios::beg);
    file.read(memblock, size);
    file.close();
    float* values = (float*)memblock; // reinterpret as float, because the file was saved as float
    for(int i=0; i<grid_points*grid_points*grid_points; i++){
        input1[i] = (double)values[i]; // cast to double, since input1 is an array of doubles
    }
    file.close();
    delete[] memblock;
}

The files I need to work with are quite big, coming from cosmological simulations; for example, one file is 4 GB and another can be 20 GB. I'm using supercomputer infrastructure for that reason.

This kind of reading works for files that hold 512^3 float values (e.g. a density field evaluated at the points of a cube with 512 points per side), but memory leaks appear for a file with 1024^3 entries.
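
For scale, here is a quick back-of-the-envelope check (a sketch, assuming the usual 4-byte IEEE-754 float): 1024^3 single-precision values occupy 4 GiB, which no longer fits in a signed 32-bit int, so storing file.tellg() in an int overflows at exactly this grid size:

    #include <cstdint>
    #include <iostream>
    #include <limits>

    int main(){
        std::uint64_t n512  = 512ULL*512*512;    // 134,217,728 values
        std::uint64_t n1024 = 1024ULL*1024*1024; // 1,073,741,824 values
        std::cout << "512^3  floats: " << n512*sizeof(float)  << " bytes\n"; // 536,870,912 (~0.5 GiB)
        std::cout << "1024^3 floats: " << n1024*sizeof(float) << " bytes\n"; // 4,294,967,296 (4 GiB)
        std::cout << "INT_MAX:       " << std::numeric_limits<int>::max() << "\n"; // 2,147,483,647
    }

So for the 512^3 file the byte count still fits in an int, while for the 1024^3 file it does not, which lines up with the grid size at which the problems start.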

I had thought I should also delete[] the "values" array, but when I do that things get even worse: the program crashes even in the 512^3 case where everything was previously calculated correctly.
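
A minimal sketch of what the two pointers refer to here: values is memblock reinterpreted, not a second allocation, so both pointers name the same block of memory:

    #include <iostream>

    int main(){
        char*  memblock = new char[16];
        float* values   = (float*)memblock; // no new allocation: same bytes viewed as float
        std::cout << (static_cast<void*>(memblock) == static_cast<void*>(values)) << "\n"; // prints 1
        // delete[]-ing both pointers would free the same block twice (undefined
        // behavior, typically a crash), so the buffer must be freed exactly once:
        delete[] memblock;
    }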

How could I improve this code? I would have used the std::vector container, but I have to pass the data to the FFTW library, which takes raw arrays.
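
For reference, one way the block could be restructured (a sketch, assuming input1 was allocated elsewhere, e.g. with fftw_malloc, and that filename1 and grid_points are as above): the temporary float buffer lives in a std::vector, which frees itself, while the FFTW-facing array stays a raw double*:

    #include <cstdint>
    #include <fstream>
    #include <string>
    #include <vector>

    // Sketch: read `count` single-precision values from a binary file and
    // widen them into an existing array of doubles (e.g. an fftw_malloc'd one).
    bool read_binary_floats(const std::string& filename, double* input, std::uint64_t count){
        std::ifstream file(filename, std::ios::binary);
        if(!file) return false;

        file.seekg(0, std::ios::end);
        std::uint64_t size = static_cast<std::uint64_t>(file.tellg()); // 64-bit: no overflow at 4 GB
        file.seekg(0, std::ios::beg);
        if(size != count*sizeof(float)) return false; // file must hold exactly `count` floats

        std::vector<float> values(count); // freed automatically, no delete[] to forget
        file.read(reinterpret_cast<char*>(values.data()), size);
        if(!file) return false;

        for(std::uint64_t i = 0; i < count; i++)
            input[i] = static_cast<double>(values[i]); // widen float -> double
        return true;
    }

This would be called as read_binary_floats(filename1, input1, std::uint64_t(grid_points)*grid_points*grid_points). For the 20 GB files, reading and converting in fixed-size chunks instead of one big vector would also shrink the extra memory from a full float copy of the field down to a single chunk.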

https://stackoverflow.com/questions/66792054/potential-memory-leaks-in-reading-binary-files March 25, 2021 at 10:05AM
