I have a script that worked perfectly on localhost for two years, across different PHP versions; it loads and parses data from a CSV file. The files are not particularly large: the biggest is 22 MB.
After the latest reinstall of the system (Win7 64-bit Ultimate, replacing Home Premium) and reconfiguration of the web server (new minor versions of Apache, PHP, and MySQL), the import suddenly broke: file(), fgets(), file_get_contents()
and other functions started returning only a fragment of the file. The cutoff hovers around 65 KB; it never reads more than that, sometimes less.
The config specifies memory_limit = 512M and max_execution_time = 1800.
Full PHP config: pastebin.com/rTiRr53t
Full Apache config: pastebin.com/uSmpP684
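Since a reinstall can silently switch which php.ini actually gets loaded, it may be worth verifying the effective settings from inside PHP rather than trusting the file that was edited (a minimal sketch; nothing here is specific to my setup):

```php
<?php
// Confirm which php.ini is actually loaded and what the effective limits are.
echo 'Loaded php.ini: ', var_export(php_ini_loaded_file(), true), PHP_EOL;
echo 'memory_limit: ', ini_get('memory_limit'), PHP_EOL;
echo 'max_execution_time: ', ini_get('max_execution_time'), PHP_EOL;
// If mbstring function overloading is enabled, strlen() counts characters
// rather than bytes, which can make reads look shorter than they are.
echo 'mbstring.func_overload: ', var_export(ini_get('mbstring.func_overload'), true), PHP_EOL;
```

If the loaded php.ini path is not the one I edited, that alone would explain the settings having no effect.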
The file is read like this:

$file_loc = 'pathtofile/file.dat';
$lines = file($file_loc);
foreach ($lines as $line_num => $line) {
    // parse $line ...
}
At some point the data simply breaks off mid-string. Google points to memory_limit and reading the file in chunks with various tricks, but this used to load in one pass, since the files are not huge. Rolling back to the previous PHP version did not help.
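To pin down whether the data is actually truncated on read, or only appears short when measured, one could compare the on-disk size against the number of bytes returned (a minimal sketch; the path is illustrative):

```php
<?php
$file_loc = 'pathtofile/file.dat'; // illustrative path

clearstatcache(); // make sure filesize() is not served from a stale cache
$expected = filesize($file_loc);          // size on disk, in bytes
$data     = file_get_contents($file_loc); // read the whole file in one call

printf("on disk: %d bytes, read: %d bytes\n", $expected, strlen($data));
if (strlen($data) !== $expected) {
    echo "Mismatch: either the read is truncated, or strlen() is being\n";
    echo "overloaded (e.g. by mbstring.func_overload) and counts characters.\n";
}
```

If the two numbers match, the truncation happens later in the parsing, not in the read itself.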
What am I missing?