Is there a need to close files that have no reference to them?

Solution 1:

The Pythonic way to deal with this is the with statement, which treats each open file as a context manager:

with open(from_file) as in_file, open(to_file, 'w') as out_file:
    indata = in_file.read()
    out_file.write(indata)

Used with files like this, the with statement ensures all the necessary cleanup is done for you, even if read() or write() raise an exception.
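If it helps to see why that is, with behaves roughly like an explicit try/finally around each file (a simplified sketch; the real mechanism goes through the files' __enter__ and __exit__ methods):

in_file = open(from_file)
try:
    out_file = open(to_file, 'w')
    try:
        indata = in_file.read()
        out_file.write(indata)
    finally:
        out_file.close()   # runs even if read() or write() raised
finally:
    in_file.close()        # runs even if opening to_file failed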

Solution 2:

You asked about the "basic concepts", so let's take it from the top: When you open a file, your program gains access to a system resource, that is, to something outside the program's own memory space. This is basically a bit of magic provided by the operating system (a system call, in Unix terminology). Hidden inside the file object is a reference to a "file descriptor", the actual OS resource associated with the open file. Closing the file tells the system to release this resource.
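You can peek at that file descriptor from Python: it's just a small integer handed out by the OS. A quick sketch (/etc/hostname is only a stand-in for any readable file on your system):

f = open("/etc/hostname")   # any existing, readable file will do
print(f.fileno())           # the OS-level file descriptor, e.g. 3
f.close()                   # tells the OS to release the descriptor
print(f.closed)             # True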

Because open files are an OS resource, the number a process can keep open is limited: long ago the per-process limit was about 20 on Unix. Right now my OS X box imposes a limit of 256 open files (though this limit is merely a default, and can be raised). Other systems set limits of a few thousand, or in the tens of thousands (per user, not per process, in that case). When your program ends, all of its resources are released automatically, so if your program opens a few files, does something with them and exits, you can be sloppy and you'll never know the difference. But if your program will be opening thousands of files, you'll do well to release open files as you go, to avoid exceeding OS limits.
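If you're curious what the limit is on your own machine, Unix-like systems let you ask for it through the resource module (a quick check; resource isn't available on Windows):

import resource

# soft limit (currently enforced) and hard limit (the ceiling it can be raised to)
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(soft, hard)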

There's another benefit to closing files before your process exits: if a file was opened for writing, closing it will first "flush its output buffer". I/O libraries optimize disk use by collecting ("buffering") what you write and saving it to disk in batches, and closing the file forces out whatever is still sitting in the buffer. If you write text to a file and immediately try to reopen and read it without first closing the output handle, you'll find that not everything has been written out. Also, if your program terminates too abruptly (from a signal, or occasionally even through normal exit), the output may never be flushed at all.
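You can watch the buffering happen with a small experiment (a sketch only; how much gets buffered depends on the platform and buffer size, and scratch.txt is just a throwaway name):

out = open("scratch.txt", "w")      # throwaway file, just for the demo
out.write("hello\n")                # a small write usually sits in the buffer

with open("scratch.txt") as check:
    print(repr(check.read()))       # quite possibly '' -- nothing on disk yet

out.close()                         # flushes the buffer and releases the descriptor

with open("scratch.txt") as check:
    print(repr(check.read()))       # now 'hello\n'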

There are already plenty of other answers on how to release files, so here's just a brief list of the approaches:

  1. Explicitly, with close(). (Note for Python newbies: don't forget the parens! My students like to write in_file.close, which does nothing; there's a short demonstration of this after the list.)

  2. Recommended: Implicitly, by opening files with the with statement. The close() method will be called when the end of the with block is reached, even in the event of abnormal termination (from an exception).

    with open("data.txt") as in_file:
        data = in_file.read()
    
  3. Implicitly, by the reference counter or garbage collector, if your Python implementation provides one. This is not recommended since it's not entirely portable; see the other answers for details. (That's why the with statement was added to Python.)

  4. Implicitly, when your program ends. If a file is open for output, this runs the risk of the program exiting before everything has been flushed to disk.
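To see the parenthesis gotcha from option 1 in action (a tiny demonstration, reusing the data.txt name from the example above):

in_file = open("data.txt")
in_file.close            # merely looks up the method object; the file stays open
print(in_file.closed)    # False -- nothing happened

in_file.close()          # with the parens, the file really is closed
print(in_file.closed)    # True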