Keep in mind, I am not an experienced database person.

I have a CSV flat file that I want to import into a SQL database. It's not THAT big: 35 MB or so, with about 2,500 rows and 3,900 columns, and both the rows and the columns are unique. The first row is the header.

I am having a hard time importing this into the local MySQL database that I am using with WAMP. It either times out, or I sit through a spinning death wheel for almost an hour before I lose patience (it's only 35 MB!) and cancel the upload.

I also find it hard to accept that I would have to add the columns one by one, typing out an INSERT for EACH column.

Is there a way to upload this to MySQL efficiently? Thanks in advance.

  • 3900 columns - seriously? Commented Apr 4, 2020 at 15:07
  • Yeah... is that a bad thing? Commented Apr 4, 2020 at 15:08
  • You should read dev.mysql.com/doc/refman/8.0/en/column-count-limit.html. Commented Apr 4, 2020 at 15:12
  • from what I gather I basically have to divide this into smaller sets/tables? Commented Apr 4, 2020 at 15:19
  • If you are going to use a relational database like MySQL, you should normalise your data. You could chop the CSV file into digestible chunks, load them into staging tables, and then push the data into their final tables (see the sketch after these comments), or you may choose to chop up your data to match the final DB tables before you import. Either way it looks like you have a lot of work to do. Commented Apr 4, 2020 at 15:23
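
A minimal sketch of that staging-table approach, assuming the wide CSV has already been pre-split into chunk files of a few hundred columns each; the table, column, and file names below are all hypothetical:

```sql
-- Hypothetical staging table for the first chunk of columns; in practice
-- the column list would be generated from the CSV header.
CREATE TABLE staging_chunk_1 (
    row_key  VARCHAR(64) NOT NULL,  -- unique row identifier, repeated in every chunk file
    col_0001 VARCHAR(255),
    col_0002 VARCHAR(255)
    -- ... remaining columns of this chunk
);

-- Bulk-load one chunk file in a single statement instead of per-row INSERTs.
LOAD DATA LOCAL INFILE '/path/to/chunk_1.csv'
INTO TABLE staging_chunk_1
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;  -- skip the header row

-- Once all chunks are staged, push the data into the final, normalised tables:
-- INSERT INTO final_table (row_key, ...)
--   SELECT row_key, ... FROM staging_chunk_1;
```

LOAD DATA is typically much faster than issuing one INSERT per row, but note that LOAD DATA LOCAL INFILE requires the local_infile option to be enabled on both the server and the client.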
