
I am stuck with a problem and really need your help. I am trying to export MySQL data using a PHP script, but it fails with the error below: "Fatal error: Allowed memory size of 67108864 bytes exhausted....". I have seen some posts suggesting a change to the php.ini file, but as I am hosted on a shared server I do not have that access. The table I am trying to export holds more than 2.2GB of data. I am posting below the function I am using to export the data. Can you please help me solve this issue?

<?php
set_time_limit(0);
backup_tables('hostname', 'username', 'password', 'databasename');

function backup_tables($host, $user, $pass, $name, $table = 'Table Name')
{
    $link = mysql_connect($host, $user, $pass);
    mysql_select_db($name, $link);

    $result = mysql_query('SELECT * FROM '.$table);
    $num_fields = mysql_num_fields($result);

    // Start the dump with the table's CREATE statement.
    $return = '';
    $row2 = mysql_fetch_row(mysql_query('SHOW CREATE TABLE '.$table));
    $return .= "\n\n".$row2[1].";\n\n";

    // Build one INSERT statement per row.
    while ($row = mysql_fetch_row($result)) {
        $return .= 'INSERT INTO '.$table.' VALUES(';
        for ($j = 0; $j < $num_fields; $j++) {
            if (isset($row[$j])) {
                $row[$j] = addslashes($row[$j]);
                $row[$j] = ereg_replace("\n", "\\n", $row[$j]);
                $return .= '"'.$row[$j].'"';
            } else {
                $return .= '""';
            }
            if ($j < ($num_fields - 1)) {
                $return .= ',';
            }
        }
        $return .= ");\n";
    }
    $return .= "\n\n\n";

    // Save the dump to a file.
    $handle = fopen('db-backup-'.time().'-'.$table.'.sql', 'w+');
    fwrite($handle, $return);
    fclose($handle);
}
?>
  • Please don't use mysql_* functions in new code. They were removed in PHP 7.0.0 (2015). Instead, use prepared statements via PDO or MySQLi. See Why shouldn't I use mysql_* functions in PHP? for more information. Commented Sep 14, 2012 at 8:28
  • You probably want to divide the work into chunks, exporting part of the table each time. Commented Sep 14, 2012 at 8:29
  • 1
    Is there a specific reason you want to do this using PHP, rather than (say) schedule mysqldump to do it for you? Commented Sep 14, 2012 at 8:30
  • Actually, I am using a shared server and they do not allow me to store more than 2GB of data in a single table, so I have to move the data to a dedicated server. That is why I want the backup. Commented Sep 14, 2012 at 8:40

5 Answers


I think you are trying to reinvent the wheel here. MySQL ships with the excellent mysqldump executable, which will quickly, quietly, and with minimal resources generate the very file you are building by hand. Best of all, you can call it directly from within a PHP script using exec, with something like this:

exec('mysqldump --user=... --password=... --host=... DB_NAME > /path/to/output/file.sql');
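In practice you would also want to shell-escape the credentials and check the exit status. A minimal sketch (the credentials and paths below are placeholders, not real values):

<?php
// Placeholder connection details - substitute your own.
$user = 'username';
$pass = 'password';
$host = 'hostname';
$db   = 'databasename';
$out  = '/path/to/output/file.sql';

// Build the command with shell-escaped arguments, then run it
// and capture the exit code so failures are not silent.
$cmd = sprintf(
    'mysqldump --user=%s --password=%s --host=%s %s > %s',
    escapeshellarg($user),
    escapeshellarg($pass),
    escapeshellarg($host),
    escapeshellarg($db),
    escapeshellarg($out)
);
exec($cmd, $output, $status);

if ($status !== 0) {
    die('mysqldump failed with exit code '.$status);
}
?>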

2 Comments

Why the downvote? The OP is making a SQL text file that will re-create his table. This does the same thing, except faster, using very few resources and in a safe manner.
@Aelios It will happily export a 10-row database or one with billions of rows.

The table I am trying to export holds more than 2.2GB of data

And you're trying to load it all into a PHP array.

Even if you had enough memory, the performance would be appalling.

Write the rows fetched from the database as you read them...

while ($row = mysql_fetch_row($result)) {
    $values = array();
    foreach ($row as $value) {
        // NULL columns become a literal NULL; everything else is escaped and quoted.
        $values[] = is_null($value) ? 'NULL' : '"'.mysql_real_escape_string($value).'"';
    }
    fputs($outfile, 'INSERT INTO '.$table.' VALUES('.implode(',', $values).");\n");
}

But a far better solution would be to use mysqldump.

Update

I forgot to say that your method of escaping the output is wrong: use mysql_real_escape_string(), not addslashes() + ereg_replace() (and if you must use string replacement functions, str_replace is much faster than ereg_replace/preg_replace).

2 Comments

I have access to phpMyAdmin. When I use the "Export" option, the connection gets reset, so it throws an error when I try it that way. Can you suggest something through phpMyAdmin? I also have FTP details, but nothing more than that.
Then you're going to have to take a very different tack. Regardless of how you implement it, it will take a very long time to extract the data; HTTP is not designed for operations that take a long time.

Try zipping your backup SQL statements with ZipArchive::addFromString:

http://php.net/manual/fr/book.zip.php

http://php.net/manual/fr/zip.examples.php

<?php
// Create (or open) the archive that will hold the backup.
$zip = new ZipArchive();
$filename = "./backup.zip";

if ($zip->open($filename, ZipArchive::CREATE) !== TRUE) {
    exit("Could not open <$filename>\n");
}
// Add the generated SQL dump as a file inside the archive.
$zip->addFromString("backup.txt", "sql statement");
$zip->close();
?>

Try dividing your backup into several zip files (for example, one table per zip), as in the sketch below.
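A rough sketch of that idea, assuming a hypothetical dump_table() helper that returns one table's SQL dump as a string:

<?php
// dump_table() is a hypothetical helper, not a built-in function;
// it is assumed to return one table's dump as a string.
$tables = array('table_a', 'table_b', 'table_c'); // placeholder table names

foreach ($tables as $table) {
    $zip = new ZipArchive();
    if ($zip->open("./backup-$table.zip", ZipArchive::CREATE) !== TRUE) {
        exit("Could not create archive for $table\n");
    }
    $zip->addFromString("$table.sql", dump_table($table));
    $zip->close();
}
?>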



Instead of collecting everything into (a huge) $return, move the fopen call to the start of the function and write each piece of the result directly with fwrite rather than appending it to $return. The memory problem you're encountering is because $return takes up so much space.

1 Comment

See symcbean's answer. I agree with him and others that mysqldump would be the better option though.

Hm, I guess the best way to do this is with the SQL query "SELECT * INTO OUTFILE...", because the PHP function exec can be forbidden by php.ini safe mode.
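A minimal sketch of that approach (the file path and table name are placeholders; the MySQL user needs the FILE privilege, the target file must not already exist, and the file is written on the database server's own filesystem):

<?php
// INTO OUTFILE makes the MySQL server write the file itself,
// so the data never passes through PHP's memory at all.
// '/tmp/backup.csv' and 'tablename' are placeholders.
$sql = "SELECT * INTO OUTFILE '/tmp/backup.csv'
        FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
        LINES TERMINATED BY '\\n'
        FROM tablename";
mysql_query($sql) or die(mysql_error());
?>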

