
I'd like to write a pandas DataFrame to a PostgreSQL table without using SQLAlchemy.

The table name should correspond to the name of the DataFrame variable, and the table should be replaced if it already exists. The column data types need to match as well.

I'd like to avoid pandas' to_sql function, which relies on SQLAlchemy, for several reasons.

import pandas as pd
from getpass import getpass
import psycopg2

your_pass = getpass(prompt='Password: ', stream=None)
conn_cred = {
    'host': your_host,
    'port': your_port,
    'dbname': your_dbname,
    'user': your_user,
    'password': your_pass
}
conn = psycopg2.connect(**conn_cred)
conn.autocommit = True

my_data = pd.DataFrame({'col1': [1, 2], 'col2': [3, 4]})

def store_dataframe_to_postgre(df, schema, active_conn):
    # df = pandas dataframe to store as a table
    # schema = schema for the table
    # active_conn = open connection to a PostgreSQL db
    # ...
    # Bonus: require an explicit commit here, even though conn.autocommit = True
    pass  # implementation intentionally left open; this is what the question asks for


store_dataframe_to_postgre(my_data, 'my_schema', conn)

This should be the result in the PostgreSQL database:

SELECT * FROM my_schema.my_data;
   col1  col2
     1     3
     2     4
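
For reference, the result could also be checked from Python (a small sketch that reuses the conn connection opened above and assumes the table has been created as described):

cur = conn.cursor()
cur.execute('SELECT * FROM my_schema.my_data')
print(cur.fetchall())  # expected: [(1, 3), (2, 4)]
cur.close()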
  • here is what you need -> initd.org/psycopg/docs/… Commented Feb 4, 2019 at 16:07
  • I'm not sure how to handle the data types (the data won't have any custom types) Commented Feb 4, 2019 at 16:54

1 Answer


You can try something along these lines in your function, using psycopg2's copy_from (a complete, runnable sketch follows below):

 cursor = active_conn.cursor()
 # copy_from reads rows from a file-like object (csv_buffer here, e.g. an io.StringIO
 # holding the DataFrame as CSV) and takes the table name, not the schema, as its second argument
 cursor.copy_from(csv_buffer, 'my_data', sep=',', null='', columns=('col1', 'col2'))

Reference code: copy dataframe to postgres table with column that has default value
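
For completeness, here is one way the requested store_dataframe_to_postgre function could be filled in with plain psycopg2. This is a sketch under stated assumptions, not a definitive implementation: the target table name is passed explicitly as a table_name parameter (a DataFrame's Python variable name cannot reliably be recovered at runtime), the PG_TYPES mapping and its TEXT fallback are assumptions covering only common dtypes, and identifiers are interpolated with str.format for brevity (for untrusted names, use psycopg2.sql to quote them properly).

import io

# Assumed mapping from pandas dtypes to PostgreSQL column types;
# anything unrecognized falls back to TEXT
PG_TYPES = {
    'int64': 'BIGINT',
    'float64': 'DOUBLE PRECISION',
    'bool': 'BOOLEAN',
    'datetime64[ns]': 'TIMESTAMP',
    'object': 'TEXT',
}


def store_dataframe_to_postgre(df, schema, active_conn, table_name):
    # df = pandas DataFrame to store as a table
    # schema = schema for the table
    # active_conn = open psycopg2 connection to a PostgreSQL db
    # table_name = name of the target table (passed explicitly; an assumption)

    # Bonus: temporarily turn autocommit off so the explicit commit below is
    # genuinely required (assumes the connection is idle, as in the question)
    active_conn.autocommit = False
    cursor = active_conn.cursor()

    # Build the column definitions from the DataFrame's dtypes
    column_defs = ', '.join(
        '{} {}'.format(col, PG_TYPES.get(str(dtype), 'TEXT'))
        for col, dtype in df.dtypes.items()
    )

    # Replace the table if it already exists
    cursor.execute('DROP TABLE IF EXISTS {}.{}'.format(schema, table_name))
    cursor.execute('CREATE TABLE {}.{} ({})'.format(schema, table_name, column_defs))

    # Stream the DataFrame through an in-memory CSV buffer into COPY
    csv_buffer = io.StringIO()
    df.to_csv(csv_buffer, index=False, header=False)
    csv_buffer.seek(0)
    copy_sql = 'COPY {}.{} ({}) FROM STDIN WITH (FORMAT CSV)'.format(
        schema, table_name, ', '.join(df.columns))
    cursor.copy_expert(copy_sql, csv_buffer)

    # Explicit commit, then restore autocommit as it was set in the question
    active_conn.commit()
    active_conn.autocommit = True
    cursor.close()


# Example call, reusing conn and my_data from the question:
# store_dataframe_to_postgre(my_data, 'my_schema', conn, 'my_data')

The sketch uses copy_expert with an explicit COPY ... FROM STDIN statement rather than copy_from, because newer psycopg2 releases quote the table name passed to copy_from, which prevents using a schema-qualified name like my_schema.my_data; if the target table is on the search_path, copy_from as shown in the answer works as well.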
