
I have several thousand CSVs that I need to load into Postgres, one table per file. The problem is that these tables are not identical in structure. I'm looking for a way to create each table on the fly from the structure of its CSV and then load the CSV into that table. If I were to do it manually, it would involve two steps:

1. Create the table based on the CSV's structure.
2. Dump the CSV data into the created table.

But since I have thousands of these CSVs, it would be extremely inefficient to do this manually. I'm looking for a way to dynamically create a Postgres table based on a CSV's structure, load the data into it, and automate the whole process for thousands of files.

Most of the research I did points me to Postgres commands for loading data into a single existing table, but those solutions won't work here because the number of tables is so large.

Efficient way to import a lot of csv files into PostgreSQL db - This pointed me to a similar problem, but the tables in that question are all identical in structure, unlike mine.

  • Extract the header from the csv and then use that to make your columns (see the first sketch below). Commented Aug 23, 2018 at 19:22
  • You can use pandas to read the csv files and create the tables without having to specify individual schemas; the relevant functions are pandas.read_csv and pandas.to_sql (see the second sketch below). Commented Aug 23, 2018 at 19:44
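
A minimal sketch of the first suggestion (read each file's header, build a CREATE TABLE from it, then bulk-load with COPY). It assumes psycopg2, a local database named mydb, a csvs/ directory, every column typed as text, and each table named after its file; all of those are placeholders to adjust:

    import csv
    import glob
    import os

    import psycopg2
    from psycopg2 import sql

    conn = psycopg2.connect("dbname=mydb user=postgres")  # assumed connection details
    cur = conn.cursor()

    for path in glob.glob("csvs/*.csv"):  # assumed directory holding the CSV files
        table = os.path.splitext(os.path.basename(path))[0]  # table named after the file

        with open(path, newline="") as f:
            header = next(csv.reader(f))  # first row = column names

        # Create the table with every column as text; types can be tightened later.
        columns = sql.SQL(", ").join(
            sql.SQL("{} text").format(sql.Identifier(col.strip())) for col in header
        )
        cur.execute(
            sql.SQL("CREATE TABLE IF NOT EXISTS {} ({})").format(sql.Identifier(table), columns)
        )

        # Bulk-load the file with COPY, skipping the header row.
        copy_stmt = sql.SQL("COPY {} FROM STDIN WITH CSV HEADER").format(sql.Identifier(table))
        with open(path, newline="") as f:
            cur.copy_expert(copy_stmt.as_string(cur), f)

        conn.commit()

    cur.close()
    conn.close()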
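
And a sketch of the pandas route from the second comment: read_csv infers the columns (and rough dtypes), and to_sql creates the table and inserts the rows. It assumes SQLAlchemy is installed; the connection URL, directory, and chunksize are placeholders:

    import glob
    import os

    import pandas as pd
    from sqlalchemy import create_engine

    engine = create_engine("postgresql://postgres:password@localhost:5432/mydb")  # assumed URL

    for path in glob.glob("csvs/*.csv"):  # assumed directory holding the CSV files
        table = os.path.splitext(os.path.basename(path))[0]  # table named after the file
        df = pd.read_csv(path)

        # to_sql creates the table from the DataFrame's columns and dtypes, then
        # inserts the rows; chunksize batches the inserts on large files.
        df.to_sql(table, engine, if_exists="replace", index=False, chunksize=10000)

The pandas version is the least code, but COPY is considerably faster for very large files, so the two approaches can also be combined: let pandas (or the header) define the table, then load the data with COPY.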
