The Best Memory Leak Definition [closed]

There are two definitions (at least for me):

Naive definition: Failure to release unreachable memory — memory the program can no longer access, and which the system cannot reclaim for the lifetime of the allocating process. This can mostly be cured by GC (Garbage Collection) techniques or detected by automated tools.

Subtle definition: Failure to release reachable memory which is no longer needed for your program to function correctly. This is nearly impossible to detect with automated tools or by programmers who are not familiar with the code. While technically it is not a leak, it has the same implications as the naive one. This is not just my own idea: you can come across projects that are written in a garbage-collected language but still mention fixing memory leaks in their changelogs.


Allocated memory that cannot be used because the reference to it has been lost.


Definition: Failure to release memory after allocation.


The process in which memory resources are allocated and not properly released once no longer required, often introduced through bad coding practices.

Some languages have built-in ways to help prevent them, although the best way to avoid them is through diligent observation of code execution paths and code reviews. Keeping methods short and singularly purposed helps keep resource usage tightly scoped and less prone to get lost in the shuffle, as well.


There are two ways a memory leak may be defined.

First, if data is not freed when there are no longer any references to it, that data is unreachable (unless you have some corrupt pointer, or read past the data in a buffer, or something similar). Basically, if you don't free/delete data allocated on the heap, it becomes unusable and simply wastes memory.

There may be cases where a pointer is lost but the data is still accessible. For example, if you store the pointer in an int, or store an offset to the pointer (using pointer arithmetic), you can still get the original pointer back.
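A sketch of that edge case, assuming a platform where a pointer round-trips through `uintptr_t` (true on common platforms, though only implementation-defined by the C standard; `stash_and_recover` is a made-up name):

```c
#include <stdint.h>
#include <stdlib.h>

/* No char* names the block after `p = NULL`, yet the block is
 * still recoverable: its address survives inside an integer. */
char *stash_and_recover(void) {
    char *p = malloc(8);
    uintptr_t stashed = (uintptr_t)p;  /* pointer "lost" into an int */
    p = NULL;                          /* no pointer refers to it now */
    return (char *)stashed;            /* ...but we can get it back */
}
```

This is also why conservative garbage collectors scan integers that merely look like pointers: data may be reachable in ways a strict type-based analysis would miss.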

This first kind of leak is the one garbage collectors handle, by tracking which data is still referenced (via reference counting or by tracing reachable objects).

Second, memory is essentially leaked if it is not freed/deleted when last used. It may still be referenced, and immediately freeable, but the mistake has been made of not doing so. There may be a valid reason (e.g. in the case where a destructor has some weird side effect), but that indicates bad program design (in my opinion).

This second type of memory leak often happens when writing small programs which use file IO. You open the file, write your data, but don't close it once you're done. The FILE* may still be within scope, and easily closeable. Again, there may be some reason for doing this (such as keeping the file locked against writes by other programs), but to me that's a flag of bad design.
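A minimal sketch of that file-handle version of the problem (the function names and log path are hypothetical): the handle stays in scope and could be closed, but one exit path forgets to.

```c
#include <stdio.h>

/* Resource leak analogous to the second kind of memory leak:
 * `f` is still in scope and closeable, but the early return on
 * a write error skips fclose, so the handle is never released. */
int write_log_leaky(const char *path, const char *msg) {
    FILE *f = fopen(path, "w");
    if (!f) return -1;
    if (fprintf(f, "%s\n", msg) < 0)
        return -1;               /* leak: f is never closed here */
    return fclose(f);
}

/* Fixed: every exit path after fopen succeeds closes the handle. */
int write_log(const char *path, const char *msg) {
    FILE *f = fopen(path, "w");
    if (!f) return -1;
    int rc = (fprintf(f, "%s\n", msg) < 0) ? -1 : 0;
    if (fclose(f) != 0)
        rc = -1;
    return rc;
}
```

The same "close on every path" discipline applies to memory: the fix is structural, not something a tool can reliably find for you, because the handle (like the reachable memory above) never looks lost.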

In this second definition, data is not handled by garbage collectors, unless the compiler/interpreter is smart (or dumb) enough to know it won't be used any longer, and that freeing the data won't cause any side effects.