Autocommit pandas dataframe rows into IRIS
Hi,
I'm getting unexpected behavior when using the pandas to_sql() function, which relies on sqlalchemy-iris. After the first execution, a transaction seems to be left open, and all rows inserted afterwards are lost when the connection is closed:
engine = create_engine(f"iris://{args['username']}:{args['password']}@{args['hostname']}:{args['port']}/{args['namespace']}")
conn = engine.connect()
# rows are kept after closing the connection
train_df.to_sql(name='table1', con=conn, if_exists='replace', index=False)
# rows **aren't** kept after closing the connection
train_df.to_sql(name='table2', con=conn, if_exists='replace', index=False)
conn.close()
engine.dispose()
I did some research and, based on this Stack Overflow post and these two doc pages, I changed the connection and it worked:
conn = engine.connect().execution_options(isolation_level="AUTOCOMMIT")
Is this the best way to achieve the desired behavior?
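For reference, SQLAlchemy can also apply the isolation level once at engine creation instead of per connection, so every connection the engine hands out autocommits. A minimal sketch using the same URL as above, assuming sqlalchemy-iris honors the engine-level isolation_level parameter the same way the per-connection option does:
engine = create_engine(
    f"iris://{args['username']}:{args['password']}@{args['hostname']}:{args['port']}/{args['namespace']}",
    isolation_level="AUTOCOMMIT",  # every connection from this engine autocommits
)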
Comments
Hi,
If I remember correctly, the default behavior of the to_sql() method is to wrap the inserts in a transaction.
What I do is use a with statement to ensure that the transaction is committed and the connection is closed after the insert:
with engine.connect() as conn:
    train_df.to_sql(name='table1', con=conn, if_exists='replace', index=False)
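If you'd rather make the commit explicit instead of relying on pandas to manage the transaction, SQLAlchemy's engine.begin() context manager should also work here: it opens a transaction and commits it on a clean exit (or rolls back on an exception). A minimal sketch:
# begin() commits automatically when the block exits without an exception
with engine.begin() as conn:
    train_df.to_sql(name='table1', con=conn, if_exists='replace', index=False)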
Otherwise, you can commit the transaction manually:
conn = engine.connect()
train_df.to_sql(name='table1', con=conn, if_exists='replace', index=False)
conn.commit()  # explicitly commit the pending transaction before closing
conn.close()
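Either way, a quick sanity check is to read the rows back on a brand-new connection, so you only see data that was actually committed; a small sketch, assuming the table name above:
import pandas as pd

# a fresh connection only sees committed rows
with engine.connect() as check_conn:
    count = pd.read_sql('SELECT COUNT(*) AS n FROM table1', check_conn)
print(count)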
That's what I do; hope it helps.
It looks better than my approach! Thanks for sharing! :)