In Python with sqlite, is it necessary to close a cursor?

Here is the scenario: in your function you're executing statements using a cursor, but one of them fails and an exception is thrown. Your program exits the function before closing the cursor it was working with. Will the cursor float around taking up space? Do I have to close the cursor?

Additionally, the Python documentation has an example of cursor use and says: "We can also close the cursor if we are done with it." The key word is "can," not "must." What exactly do they mean by this?


Solution 1:

It's probably a good idea (it might not matter much with sqlite specifically, but it will make your code more portable). Further, with recent Python (2.5+), it's easy:

from __future__ import with_statement  # only needed on Python 2.5
from contextlib import closing

# db is assumed to be an open sqlite3 connection
with closing(db.cursor()) as cursor:
    # do some stuff; the cursor is closed automatically when the block exits
    ...
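
For a fully self-contained illustration, here's a minimal sketch using an in-memory database; sqlite3 cursors don't act as context managers on their own, which is why contextlib.closing is used to call close() on the way out:

import sqlite3
from contextlib import closing

db = sqlite3.connect(":memory:")
with closing(db.cursor()) as cursor:
    cursor.execute("CREATE TABLE t (x INTEGER)")
    cursor.execute("INSERT INTO t VALUES (1)")
# the cursor has been closed at this point, even if an exception was raised above
db.commit()
db.close()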

Solution 2:

You're not obliged to call close() on the cursor; it can be garbage collected like any other object.

But even if waiting for garbage collection sounds OK, I still think it is good style to ensure that a resource such as a database cursor gets closed whether or not an exception occurs.
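
For instance, a try/finally guarantees the close even when execute() raises; a small sketch, assuming conn is an open sqlite3 connection:

cursor = conn.cursor()
try:
    cursor.execute("SELECT 1")
    rows = cursor.fetchall()
finally:
    cursor.close()  # runs whether or not an exception was raised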

Solution 3:

Interestingly, the Python 3.0 docs say "We can also close the cursor if we are done with it", while the Python 2.7 and 3.6 docs say "We can also close the connection if we are done with it".

The Python 2.7 and 3.0-3.4 docs don't document the cursor's .close() method, but the Python 3.5 and 3.6 docs do:

Close the cursor now (rather than whenever __del__ is called).

The cursor will be unusable from this point forward; a ProgrammingError exception will be raised if any operation is attempted with the cursor.
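
That behaviour is easy to see for yourself; a quick sketch with an in-memory database:

import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.close()
try:
    cur.execute("SELECT 1")
except sqlite3.ProgrammingError as exc:
    print(exc)  # e.g. "Cannot operate on a closed cursor."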

Solution 4:

All,

I was experiencing a gradual memory leak in my code (Python 3.8) using sqlite3. I traced the likely cause to my database classes: as it turned out, I would open and use a cursor but never close it. The database connection remained open for the life of the program (a Windows Service) and was closed only on exit.

Once I began closing the cursors in all of my db operations which used them, my memory leak stopped, and the memory footprint became steady.

I would therefore suggest that you take the time to close your cursors. It makes the code more consistent and, apparently, helps control memory consumption.

Here's an example of how I close the cursor:

def write_to_db(self, cache_item: CacheEntry):
    '''Write a single cache entry to the database'''
    crsr = self._db_con.cursor()

    # Load some data elements
    fax_line_path = cache_item._dir_part
    phone_line = cache_item._phone_line
    sub_folder = cache_item._subfolder
    fname = cache_item._fname
    work_done = cache_item.get_workdone()

    try:
        crsr.execute(FilenameCacheDB.INSERT_CACHE,
                     (fax_line_path,
                      phone_line,
                      sub_folder,
                      fname,
                      work_done))

    except Exception as e:
        LOG.warning(f"Could not write {cache_item} to db because {e}")
        raise  # re-raise after logging so callers still see the failure

    finally:
        #
        # I was *not* closing the cursor prior
        #
        crsr.close()
        self._db_con.commit()
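
The same close-even-on-error guarantee can also be had with contextlib.closing from Solution 1. A rough sketch of that shape, where conn, INSERT_SQL and params are stand-ins for the real connection, statement and values:

from contextlib import closing

with closing(conn.cursor()) as crsr:
    crsr.execute(INSERT_SQL, params)  # the cursor is closed even if this raises
conn.commit()  # note: this commits only if the insert succeeded, unlike the finally above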