

Can exporting a huge database without mysqli work normally?





jmraker
On my Windows laptop (a fresh install of WampServer), which is the temporary home of a 3+ million record table, I'm writing a program to export the whole table to a single CSV file, and I hit a strange problem. The script was running out of memory because mysql_fetch_array() doesn't seem to free the memory of previous records. The only unusual thing about the table is that one field is a varchar(2000).

The database's previous home used the same kind of loop to export it the same way; it managed to export 2 million of the records to a CSV file before failing. Luckily, the program that added the records also appended everything to its own CSV file.

I have PHP's memory limit set to 512 MB, which should have been more than enough. When run, the script gets through about 551 thousand records and then either dies with no message or shows the memory allocation fatal error.
Code:

         $sql = 'SELECT * FROM file ORDER BY file_site, file_sid';
         $result2 = mysql_query($sql);
         if(mysql_num_rows($result2) == 0)
            break;   // this excerpt sits inside an outer loop in the full program
         $num1 = 0;
         while($rec = mysql_fetch_array($result2, MYSQL_ASSOC)){   // Go through data
            $lnum++;   // $lnum and $numRecords are set earlier in the full program
/*            The second query stuff */

            echo number_format($lnum) . ' of ' . number_format($numRecords) . ' (' . number_format($lnum / $numRecords * 100, 3) . '%)  ' . number_format(memory_get_usage(true)) . "\n";
            $rec = array($rec['file_site'], $rec['file_sid'], $rec['file_name'], $tags, $rec['file_rating'], $rec['file_date']);
            fputcsv($this->out, $rec);
         }


It starts off using about 299 MB, and 551,010 records later it goes over the 512 MB limit.
Code:
1 of 3,525,177 (0.000%)  299,106,304
2 of 3,525,177 (0.000%)  299,106,304
3 of 3,525,177 (0.000%)  299,106,304
...
100,000 of 3,525,177 (2.837%)  342,097,920
100,001 of 3,525,177 (2.837%)  342,097,920
100,002 of 3,525,177 (2.837%)  342,097,920
...
200,000 of 3,525,177 (5.673%)  385,351,680
200,001 of 3,525,177 (5.674%)  385,351,680
200,002 of 3,525,177 (5.674%)  385,351,680
...
300,000 of 3,525,177 (8.510%)  428,605,440
300,001 of 3,525,177 (8.510%)  428,605,440
300,002 of 3,525,177 (8.510%)  428,605,440
300,003 of 3,525,177 (8.510%)  428,605,440
...
400,000 of 3,525,177 (11.347%)  471,859,200
400,001 of 3,525,177 (11.347%)  471,859,200
400,002 of 3,525,177 (11.347%)  471,859,200
...
551,007 of 3,525,177 (15.631%)  536,870,912
551,008 of 3,525,177 (15.631%)  536,870,912
551,009 of 3,525,177 (15.631%)  536,870,912

Fatal error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 32 bytes) in ...
PHP Fatal error:  Allowed memory size of 536870912 bytes exhausted (tried to allocate 32 bytes) in ...


To fix it I tried to:
- Split it up by querying a few thousand records at a time, but each query takes a few minutes (a keyset-style query, sketched just after this list, avoids that slowdown).
- Monitor memory, and when usage was about to pass 500 MB, call mysql_free_result(), rerun the query, and use mysql_data_seek() to jump back to where it left off and continue; again with a long delay between queries.
- Use mysqli, whose mysqli::use_result() doesn't eat up the memory, but it doesn't allow other queries on the same connection while the result is open, and my program has to run another query on every pass through the record loop, so that query fails.
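
On the first attempt: if the chunks were taken with LIMIT/OFFSET, the slowness comes from MySQL re-scanning past every skipped row on each query. A keyset (seek) query avoids that by restarting from the last exported key, which the existing ORDER BY on (file_site, file_sid) supports, assuming those two columns are indexed together. A rough sketch in the old mysql_* functions (the chunk size and variable names are just for illustration):
Code:
      $lastSite = null;
      $lastSid  = null;
      do {
         if ($lastSite === null) {
            $sql = 'SELECT * FROM file ORDER BY file_site, file_sid LIMIT 10000';
         } else {
            // Seek past the last exported key instead of re-scanning with OFFSET
            $site = mysql_real_escape_string($lastSite);
            $sid  = mysql_real_escape_string($lastSid);
            $sql  = "SELECT * FROM file" .
                    " WHERE file_site > '$site' OR (file_site = '$site' AND file_sid > '$sid')" .
                    " ORDER BY file_site, file_sid LIMIT 10000";
         }
         $result = mysql_query($sql);
         $got = 0;
         while ($rec = mysql_fetch_array($result, MYSQL_ASSOC)) {
            $got++;
            $lastSite = $rec['file_site'];
            $lastSid  = $rec['file_sid'];
            // ... per-record work and fputcsv() as before ...
         }
         mysql_free_result($result);   // drop this chunk before fetching the next
      } while ($got == 10000);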

The solution I have so far is to open 2 mysqli connections, the 1st for the 3+ million record loop and the 2nd for the query it has to run on every iteration. The 1st connection goes through all 3+ million records in one query and doesn't eat the memory.

Code:
      $sql = 'SELECT * FROM file ORDER BY file_site, file_sid';
      $res1 = $db->query($sql, MYSQLI_USE_RESULT);   // unbuffered: rows stream from the server as they're fetched
      while($rec = $res1->fetch_array(MYSQLI_ASSOC)){   // Go through data
         $lnum++;
         $tags = $rec['file_tags'];   // comma-separated list of tag ids
         $sql2 = 'SELECT tag_title FROM tag WHERE tag_id IN (' . $tags . ') ORDER BY tag_title';
         $tags = array();
         $res2 = $db2->query($sql2);   // second connection: $db is busy until $res1 is fully read
         while($rec2 = $res2->fetch_array(MYSQLI_ASSOC)){
            $tags[] = $rec2['tag_title'];
         }
         $res2->free();   // release each per-row result so they don't pile up
         $tags = implode(',', $tags);

         echo "\r" . number_format($lnum) . ' of ' . number_format($numRecords) . ' (' . number_format($lnum / $numRecords * 100, 3) . '%)  ' . number_format(memory_get_usage(true)) . '  ';
         $rec = array($rec['file_site'], $rec['file_sid'], $rec['file_name'], $tags, $rec['file_rating'], $rec['file_date']);
         fputcsv($this->out, $rec);
      }
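
The reason the second connection is unavoidable: with MYSQLI_USE_RESULT the server keeps the result set open on that connection and refuses new commands (typically with a "Commands out of sync" error) until every row has been read or the result is freed, so any per-row query has to travel over its own connection.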

I've never before had a problem like this that only mysqli could handle.

Is there a way to loop through a huge database table with the "deprecated but still popular" mysql functions without turning off the memory limit?
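
As it turns out, the old extension does have a streaming mode too: mysql_unbuffered_query() is the mysql_* counterpart of MYSQLI_USE_RESULT, with the same catch that the connection can't run another query until the result is exhausted (and mysql_num_rows() doesn't work until then either). A minimal sketch, assuming a second link $link2 opened with mysql_connect()'s $new_link flag set, and $out as the output file handle:
Code:
   $result = mysql_unbuffered_query('SELECT * FROM file ORDER BY file_site, file_sid');
   while ($rec = mysql_fetch_array($result, MYSQL_ASSOC)) {
      // The tag lookup must use the second link; the first is busy
      // until this unbuffered result has been read to the end.
      $res2 = mysql_query('SELECT tag_title FROM tag WHERE tag_id IN (' . $rec['file_tags'] . ') ORDER BY tag_title', $link2);
      $tags = array();
      while ($rec2 = mysql_fetch_array($res2, MYSQL_ASSOC)) {
         $tags[] = $rec2['tag_title'];
      }
      mysql_free_result($res2);
      fputcsv($out, array($rec['file_site'], $rec['file_sid'], $rec['file_name'],
                          implode(',', $tags), $rec['file_rating'], $rec['file_date']));
   }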

What do you think is the best way to export such a huge database table to a CSV file in PHP?
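
One option that skips PHP's memory entirely, if the server is local (as on this WampServer setup) and the MySQL account has the FILE privilege, is SELECT ... INTO OUTFILE, which writes the CSV on the server side. A sketch, with a correlated GROUP_CONCAT/FIND_IN_SET subquery standing in for the per-row tag lookup; this assumes file_tags stores its ids with no spaces around the commas, and the output path here is invented:
Code:
   $sql = "SELECT f.file_site, f.file_sid, f.file_name,
                  (SELECT GROUP_CONCAT(t.tag_title ORDER BY t.tag_title)
                     FROM tag t WHERE FIND_IN_SET(t.tag_id, f.file_tags)) AS tags,
                  f.file_rating, f.file_date
             INTO OUTFILE 'C:/wamp/tmp/file_export.csv'
           FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
            LINES TERMINATED BY '\\n'
             FROM file f
            ORDER BY f.file_site, f.file_sid";
   $db->query($sql) or die($db->error);   // the file is written by the server, not through PHP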
Marcuzzo
I had a similar problem with one of my last projects.

A guy had asked me to update/upgrade his website, but after looking at the code I knew it would be impossible to modify the existing code because it was written in a terrible way.
The DB was also a mess, the messiest database I had ever seen; the person who had written the code didn't have the slightest notion of db normalization.

After I had normalized the database and written the PHP code for the website, I needed to migrate the data from the old db to the new one. This was a 5M+ record database that didn't have any foreign key constraints and used relation tables only.

So I wrote a migration script that would run a select query on the old database, gather the needed information, and run an insert query on the new database.
At the beginning I limited the output of the select query for testing purposes and it worked fine, but when I removed the record limit I had memory issues as well.
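
The usual fix for that kind of memory blow-up is to batch the work: page through the old table in bounded chunks and insert as you go, so only one chunk of rows is in PHP's memory at a time. A rough sketch, with invented table and column names and $old/$new as the two mysqli connections:
Code:
$ins = mysqli_prepare($new, 'INSERT INTO users (id, name, email) VALUES (?, ?, ?)');
mysqli_stmt_bind_param($ins, 'iss', $id, $name, $email);   // bound once, by reference

$batch  = 10000;
$lastId = 0;
do {
   $res = mysqli_query($old, "SELECT user_id, user_name, user_email FROM old_users" .
                             " WHERE user_id > $lastId ORDER BY user_id LIMIT $batch");
   $n = 0;
   while ($row = mysqli_fetch_assoc($res)) {
      $n++;
      $lastId = $id = (int)$row['user_id'];
      $name   = $row['user_name'];
      $email  = $row['user_email'];
      mysqli_stmt_execute($ins);   // insert into the new db as we read from the old
   }
   mysqli_free_result($res);      // only one batch is ever held in memory
} while ($n == $batch);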

I ended up dumping the database with MySQL Workbench, importing it into a local database running on my machine, and using the mysqli functions.

I wrote this function to fetch the data from the old db; without it I wouldn't have been able to migrate the data.

Code:
function run_query($con, $query)
{
  set_time_limit(0);   // long-running query: disable the script time limit once, up front
  $rows = array();
  if ( $result = mysqli_query($con, $query) )
  {
    // Note: this buffers every row into one PHP array, so it's only
    // suitable when the result set is known to fit in memory
    while ($row = mysqli_fetch_assoc($result)) {
      $rows[] = $row;
    }
    mysqli_free_result($result); /* free result set */
  }
  return $rows;
}
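
A hypothetical call, just to show the shape of what it returns (connection details and table name invented for the example):
Code:
$old = mysqli_connect('localhost', 'user', 'password', 'old_db');
// Keep the LIMIT bounded: run_query() buffers the whole result in one array
$rows = run_query($old, 'SELECT * FROM old_users LIMIT 10000');
foreach ($rows as $row) {
   // build and run the INSERT for the new schema from $row here
}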