
I want to upload a CSV file into a MySQL database through PHP. I already have more than 20,000 records in the database. Now, when I upload a CSV file containing around 1,000 records, it takes a very long time, even on my local machine.

Please help and suggest an optimized query to upload a CSV file into a MySQL database that already holds a large number of records.

Does the number of records affect database performance?

EDIT from comments

Currently used code:

LOAD DATA INFILE '$file_name' IGNORE 
INTO TABLE import 
FIELDS TERMINATED BY '|' 
LINES TERMINATED BY '\n' 
IGNORE 1 LINES (@srno,@customer_name,@date,@mobno,@city,@state,@type,@telecaller) 
SET customer_name=@customer_name,date=@date,mobno=@mobno,city=@city, state=@state,type=@type,telecaller=@telecaller,datetime='$datetime';
  • Do you have many indexes on that database? How do you insert the new entries (code)? Commented May 17, 2013 at 11:35
  • 20,000 and 1,000 are not large numbers of records. You are asking for help with performance, but you are not showing us how you are currently loading data into the database; we can't help without knowing that. Commented May 17, 2013 at 11:36
  • Right now I am using the LOAD DATA INFILE query shown in the edit above to upload the CSV into the MySQL database, but it takes about half an hour to upload a CSV containing more than 2,000 records. Commented May 17, 2013 at 11:37
  • I don't have many indexes. I only have 8 columns, with id as the primary key and mobno as unique. Commented May 17, 2013 at 11:38
  • How could it possibly take too long to insert a measly 1,000-line CSV file? How long is it taking, and why do you need it to be done faster? Commented May 17, 2013 at 11:43

2 Answers


Use MySQL's LOAD DATA INFILE statement:

http://dev.mysql.com/doc/refman/5.1/en/load-data.html


2 Comments

I am currently using the LOAD DATA INFILE query, but it takes too much time to upload the CSV file.
Please check the indexes on your table. If possible, drop the indexes first, then import the data, then add the indexes again (a sketch of this follows).
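
For what it's worth, here is a minimal sketch of that drop-and-rebuild approach. It assumes a mysqli connection in $db, the table layout from the question, and that the unique index on mobno is named mobno (verify with SHOW INDEX FROM import); adjust the file path to your upload.

    // Sketch only: drop the secondary index, bulk-load, then rebuild it.
    $db->query("ALTER TABLE import DROP INDEX mobno");

    $db->query("
        LOAD DATA INFILE '/path/to/upload.csv' IGNORE
        INTO TABLE import
        FIELDS TERMINATED BY '|'
        LINES TERMINATED BY '\\n'
        IGNORE 1 LINES
    ");

    // Caveat: with the unique index gone, IGNORE no longer skips duplicate
    // mobno values during the load, so the rebuild below fails if duplicates
    // slipped in; deduplicate before this step if necessary.
    $db->query("ALTER TABLE import ADD UNIQUE INDEX mobno (mobno)");

Whether this is a net win depends on table size: rebuilding the index over the 20,000+ existing rows can cost more than it saves for a 1,000-row import.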

Use the fgetcsv() function to load the data from the CSV file in PHP, format it the way you want, and build the query to submit to the MySQL db. I have used it to create a db with 1.5M rows from CSV files containing more than 10,000 records each, without any problem; 1,000 records shouldn't be a problem.

Example:

      $h = fopen("your_file.csv", "r");
      $data = array();
      while (($row = fgetcsv($h, 10000, ",")) !== false) {
          $data[] = $row;   // each CSV line becomes one array of fields
      }

Each line of the CSV file ends up as one entry in $data, so $data[0] is the first line and $data[0][0] contains its first field (delimited by ","). For example, if you have "cat,dog,rat", then:

    $data[0][0] = "cat"
    $data[0][1] = "dog"

and so on. Now that the records are in the $data array, you can use them to build SQL statements and insert the rows into the db, as sketched below.
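
For instance, a minimal sketch under the same assumptions (a mysqli connection in $db, $data filled by the fgetcsv() loop above, and the first CSV line being a header): batch all rows into a single multi-row INSERT rather than issuing one INSERT per row.

    $values = array();
    foreach (array_slice($data, 1) as $row) {   // skip the header line
        $row = array_slice($row, 1);            // drop srno, as the question's query does
        $row = array_map(function ($f) use ($db) {
            return "'" . $db->real_escape_string($f) . "'";
        }, $row);
        $values[] = '(' . implode(',', $row) . ')';
    }
    // One extended INSERT is far cheaper than one statement per row.
    // The question's query also sets a datetime column; append it to the
    // column list and to each tuple if your schema requires it.
    $db->query("INSERT IGNORE INTO import
        (customer_name, `date`, mobno, city, state, `type`, telecaller)
        VALUES " . implode(',', $values));

For bigger files, chunk $values into batches so each statement stays under MySQL's max_allowed_packet.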

2 Comments

Can you please put up an example? It will help me a lot.
That would not be faster in most cases (when not using MyISAM tables). LOAD DATA INFILE is faster than a bulk INSERT, as there is less overhead in parsing the data; the difference, however, is negligible. See this related answer.
