In 'normal' mode SQLAlchemy maintains a connection pool, hands connections out to the application as needed, tracks them, and tries to keep everything tidy. When the need arises for a 'native' DBAPI connection [for this example I'm using PostgreSQL] it is possible to explicitly check a connection out from the pool - after which it is yours to deal with: tracking the isolation level, closing it, etc.
Assuming the database connection has already been created and bound to the session factory with something like:
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

Session = sessionmaker( )
...
engine = create_engine( orm_dsn, **{ 'echo': orm_logging } )
Session.configure( bind=engine )
- then sessions can be created simply by calling "db = Session( )"; each session draws its connection from the pool, and when "db.close( )" is performed that connection goes back into the pool.
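For instance, a minimal sketch of that life-cycle [ "SomeEntity" here is a hypothetical mapped class, just for illustration ]:
db = Session( )
try:
    results = db.query( SomeEntity ).all( )  # normal ORM usage
finally:
    db.close( )  # the session's connection returns to the pool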
If a connection is to be used outside of all the mechanics that SQLAlchemy provides, it can be checked out, as mentioned before, using a rather verbose call:
conn = Session( ).connection( ).connection.checkout( ).connection
Now "conn" is a 'native' DBAPI connection. You can perform low level operations such as setting the isolation level and creating cursors:
import psycopg2.extensions

# autocommit is required for low-level features such as LISTEN/NOTIFY
conn.set_isolation_level( psycopg2.extensions.ISOLATION_LEVEL_AUTOCOMMIT )
curs = conn.cursor( )
curs.execute( .... )
This is not a frequent need, but for very backend-specific operations it is the simplest approach [otherwise you can extend SQLAlchemy...]. One use case is using PostgreSQL's asynchronous notification channels to capture database events; for this purpose the application needs to select( ) on the DBAPI connection, and there's no need for an ORM in the mix when you are just capturing events.
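As a sketch of that use case - assuming "conn" is the autocommit DBAPI connection checked out above, and "orm_events" is a hypothetical channel name - the psycopg2 notification loop looks roughly like:
import select

curs = conn.cursor( )
curs.execute( 'LISTEN orm_events;' )
while True:
    # wait up to five seconds for the connection to become readable
    if select.select( [ conn ], [ ], [ ], 5 ) == ( [ ], [ ], [ ] ):
        continue  # timed out, no events arrived
    conn.poll( )  # consume input, filling conn.notifies
    while conn.notifies:
        notify = conn.notifies.pop( 0 )
        print( 'event on "{0}": {1}'.format( notify.channel, notify.payload ) )
Any other session that performs "NOTIFY orm_events, 'some payload';" will then show up in this loop.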