
I have a text file containing around 10000 points in the form x1,y1 x2,y2 x3,y3 ... and so on, 10000 times.

I read them with a loop in PHP, store them in an array, and then loop again inserting one row at a time into my database. It takes a really long time. Is there any way I can insert the whole array at once?

for ($i = 0; $i < 10000; $i++)
{
    $sql = '
        INSERT INTO `firefly`.`FreeFormPoly` 
            (`markedObjectID`, `order`, `x`, `y`) 
        VALUES 
            ('.$markedObjectsID.', '.$order.', '.$valuesx[$i].', '.$valuesy[$i].')';

    $db->query($sql);
}

4 Answers


Try using a multi-row insert statement: generate one INSERT containing all the rows and submit it as a single statement.

INSERT INTO tbl_name (a,b,c) VALUES(1,2,3),(4,5,6),(7,8,9);

So:

$sql = 'INSERT INTO `firefly`.`FreeFormPoly` (`markedObjectID`, `order`, `x`, `y`) VALUES ';
for ($i = 0; $i < 10000; $i++) {
    if ($i != 0) $sql .= ',';
    $sql .= '('.$markedObjectsID.', '.$order.', '.$valuesx[$i].', '.$valuesy[$i].')';
}
$db->query($sql);

2 Comments

There is a limit to how large an INSERT you can send (MySQL's max_allowed_packet). You might need an extra loop around this one to split the rows into batches. Depending on the size of the SQL, I recall having to cap it at around 3,000 rows per grouped insert, but the right number will vary.
Perhaps there is a limit. In that case, batch them into groups as large as the limit allows. No doubt the slowdown comes from the SQL parser handling so many individual statements.
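A sketch of that batching idea, assuming the same `$valuesx`/`$valuesy` arrays and table as the question; the helper name and the batch size of 3000 are illustrative, not from the original answer:

```php
<?php
// Build multi-row INSERT statements in batches so no single statement
// risks exceeding max_allowed_packet. Tune $batchSize for your server.
function buildBatchedInserts(array $xs, array $ys, $objectId, $order, $batchSize = 3000)
{
    $statements = [];
    foreach (array_chunk(array_keys($xs), $batchSize) as $chunk) {
        $rows = [];
        foreach ($chunk as $i) {
            // Cast to numeric types so no unescaped strings reach the SQL.
            $rows[] = '('.(int)$objectId.', '.(int)$order.', '
                . (float)$xs[$i].', '.(float)$ys[$i].')';
        }
        $statements[] = 'INSERT INTO `firefly`.`FreeFormPoly` '
            . '(`markedObjectID`, `order`, `x`, `y`) VALUES '
            . implode(',', $rows);
    }
    return $statements; // run each one with $db->query($statement)
}
```

With 10000 points and a batch size of 3000 this produces four statements instead of 10000, which is where most of the speedup comes from.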

I would do something like this:

$sql = 'INSERT INTO `firefly`.`FreeFormPoly` (`markedObjectID`, `order`, `x`, `y`) VALUES ';
for ($i = 0; $i < $length; $i++) {
    $sql .= '('.$markedObjectsID.', '.$order.', '.$valuesx[$i].', '.$valuesy[$i].'),';
}
$sql = substr($sql, 0, -1);
$db->query($sql);

Explanation:

The syntax for inserting multiple records is

INSERT INTO TABLE_NAME VALUES(VAL1, VAL2, ....), (...), (...);

The loop concatenates a (val1, val2, ...) group onto the SQL on every iteration, so you end up with one extra trailing , after the last group; substr($sql, 0, -1) trims it off.

Preferably, though, I would do

$sql = 'INSERT INTO `firefly`.`FreeFormPoly` (`markedObjectID`, `order`, `x`, `y`) VALUES ';
for ($i = 0; $i < $length; $i++) {
    $sql .= '('.$markedObjectsID.', '.$order.', '.$valuesx[$i].', '.$valuesy[$i].'),';
}
$sql = substr($sql, 0, -1);
$result = mysqli_query($db, $sql)
    or die('Error in querying the database');



You should be able to speed it up considerably by sending BEGIN TRANSACTION before the loop and COMMIT after it. I once had to insert 14,000 data points into SQLite: inserting them one at a time took 20 minutes, but wrapped in a single transaction it completed in 0.3 seconds.
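A runnable sketch of that pattern using PDO against an in-memory SQLite database (the database this answer mentions); the table and columns follow the question, the stand-in data is made up, and for MySQL you would only swap the DSN:

```php
<?php
// One transaction around all inserts: the engine does a single commit
// (one fsync) instead of one per row.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec('CREATE TABLE FreeFormPoly (markedObjectID INT, "order" INT, x REAL, y REAL)');

$valuesx = range(0, 999);        // stand-in data
$valuesy = range(1000, 1999);
$markedObjectsID = 1;
$order = 1;

$db->beginTransaction();         // BEGIN TRANSACTION
for ($i = 0; $i < count($valuesx); $i++) {
    $db->exec('INSERT INTO FreeFormPoly (markedObjectID, "order", x, y) VALUES ('
        . (int)$markedObjectsID.', '.(int)$order.', '
        . (float)$valuesx[$i].', '.(float)$valuesy[$i].')');
}
$db->commit();                   // COMMIT: all 1000 rows in one shot
```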



First off, you can use prepared statements to reduce the parsing and network overhead. Assuming PDO:

$stmt = $db->prepare('INSERT INTO mytable (foo, bar, baz) VALUES (:foo, :bar, :baz)');

for ($i = 0; $i < $length; ++$i) {
    $stmt->execute(array(
        ':foo' => $data[$i]['foo'],
        ':bar' => $data[$i]['bar'],
        ':baz' => $data[$i]['baz'],
    ));
}

Second, you could wrap the whole code inside $db->beginTransaction() and $db->commit().
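Putting the two suggestions together, a minimal runnable sketch against in-memory SQLite (assumptions: the table and column names come from the question, the data is made up, and for MySQL only the DSN changes):

```php
<?php
// Prepared statement + transaction: the SQL is parsed once, executed
// many times, and committed once.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec('CREATE TABLE FreeFormPoly (markedObjectID INT, "order" INT, x REAL, y REAL)');

$stmt = $db->prepare(
    'INSERT INTO FreeFormPoly (markedObjectID, "order", x, y) VALUES (:id, :ord, :x, :y)'
);

$valuesx = [1.5, 2.5, 3.5];      // stand-in data
$valuesy = [4.5, 5.5, 6.5];

$db->beginTransaction();
foreach ($valuesx as $i => $x) {
    $stmt->execute([':id' => 7, ':ord' => $i, ':x' => $x, ':y' => $valuesy[$i]]);
}
$db->commit();
```

Binding values through the prepared statement also sidesteps the quoting bugs that string-concatenated SQL invites.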

