Accessing hundreds of files as fast as possible
So I have a program that takes in up to 1000 paths to files, and the
idea is to read 3 specific bytes from each file to return a date, which all
works fine. The problem is that as soon as it starts, memory usage shoots
up towards the maximum and within a few seconds my PC freezes because of it.
I'm guessing the opening process uses up a few MB per file or something... Any
ideas on how to achieve what I need without this massive memory usage?
NOTE: The files I am opening are something along the lines of 15 GB each.
#include <cstdio>
#include <fstream>
#include <iostream>
#include <string>

using namespace std;

int main(int argc, char *argv[])
{
    string paths[1000];
    int date[3] = {0};

    cout << "Arg count: " << argc << endl;
    if (argc <= 1)
        paths[0] = "PRIV.EDB";
    else
        for (int i = 1; i < argc; i++)
            paths[i - 1] = argv[i];

    cout << "Start\n\n";
    for (int i = 0; i < 1000; i++)
    {
        if (paths[i].empty())
            break;
        cout << paths[i] << endl;

        ifstream pFile(paths[i], ios::binary);
        if (!pFile) {
            cerr << "Could not open " << paths[i] << endl;
            continue;
        }
        // Jump straight to offset 195 and read the three date bytes.
        pFile.seekg(195);
        date[0] = pFile.get();
        date[1] = pFile.get();
        date[2] = pFile.get();
        cout << date[0] << " : " << date[1] << " : " << date[2] << " \n";
        cout << endl;
    }
    cout << "Fin\n";
    if (argc <= 1)
        getchar();
    return date[0];
}