Try setting up your environment with a virtualenv, and install only the required libraries there.

Some details on working with virtualenv are here: https://virtualenv.pypa.io/en/stable/
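The same idea can also be scripted with the standard-library `venv` module. This is only a sketch: `build-env` and the package list are placeholders for your own project, and on Windows the interpreter lives under `Scripts\` instead of `bin/`.

```python
import subprocess
import sys
import venv
from pathlib import Path

def make_build_env(env_dir: Path, packages: list[str], *, with_pip: bool = True) -> Path:
    """Create a clean virtual environment and install only the given packages.

    Returns the path to the environment's own interpreter, which you would
    then use to run PyInstaller so nothing from the global site-packages
    leaks into the bundle.
    """
    venv.create(env_dir, with_pip=with_pip)
    # Interpreter location differs between Windows and POSIX layouts.
    if sys.platform == "win32":
        python = env_dir / "Scripts" / "python.exe"
    else:
        python = env_dir / "bin" / "python"
    if packages:
        # e.g. packages=["pyinstaller", "requests"] -- only what the app needs.
        subprocess.check_call([str(python), "-m", "pip", "install", *packages])
    return python
```

You would then run `<env_python> -m PyInstaller yourscript.py` with that interpreter, so the analysis only sees the minimal environment.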


For me, it was simply the case that using pandas made the exe huge.

Removing certain directories was helpful, and running UPX on the build also helped a great deal.

I got the size down a lot; PyInstaller was not doing this by default.

That being said, the final and most important solution is discussed here: Importing Python modules from a select location . There was a feature that did all of this automatically, but for now some manual handling is involved because multipackage-bundles is broken.

Now for the simple solution when you have lots of exes.

If you have many executables, I highly recommend this approach:

pyinstaller --onedir abc.py (include all imports of both scripts)
pyinstaller --onedir abd.py (include all imports of both scripts)

Now put abd.exe into abc's one-folder output directory, along with any other external scripts. Be sure the executables are named differently, or only one script will run.

This works really well because all the dependencies live in one folder; only the exes themselves are duplicated. This is how it should be. Say your one-folder build is 40 MB: each additional exe afterwards only adds about 5 MB (however big that exe is) rather than another 40 MB each.
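The manual copy step above can be sketched with a small helper. This is hypothetical glue code, not part of PyInstaller: `src_dist` and `dst_dist` stand for the two `dist/<name>` folders the builds produced, and the `*.exe` glob assumes a Windows one-folder layout.

```python
import shutil
from pathlib import Path

def share_onedir_deps(src_dist: Path, dst_dist: Path) -> Path:
    """Copy the executable from one one-folder build into another build's
    folder, so both exes share a single set of dependency files."""
    exe = next(src_dist.glob("*.exe"))  # assumes a Windows one-folder layout
    target = dst_dist / exe.name
    if target.exists():
        # Matches the warning above: identically named exes would collide.
        raise FileExistsError(f"{target} already exists; rename one exe first")
    shutil.copy2(exe, target)
    return target
```

After this, `dst_dist` contains both exes next to one shared copy of the dependencies, and `src_dist` can be deleted.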


The python interpreter and all imported modules are included in the executable.
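At run time you can detect whether you are inside such a bundle: the PyInstaller bootloader sets `sys.frozen` on its embedded interpreter (and `sys._MEIPASS` points at the unpacked data directory). A minimal check:

```python
import sys

def running_frozen() -> bool:
    """True when running inside a PyInstaller bundle.

    The PyInstaller bootloader sets sys.frozen on its embedded interpreter;
    a normal interpreter has no such attribute, so getattr defaults to False.
    """
    return bool(getattr(sys, "frozen", False))

# When frozen, bundled data files are unpacked under sys._MEIPASS.
```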

You can try adding modules you want to exclude to the excludes list under Analysis in your spec file.

You could also try compressing the executable using UPX. See A note on using UPX


I use an Anaconda environment, so the virtualenv solution isn't an option. My way was to exclude unnecessary modules in my spec file, e.g.:

in Analysis(...)

excludes=['pandas', 'numpy'],

(these are modules that increase the file size enormously)
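For context, this is roughly where `excludes` sits in a generated spec file. It is a trimmed sketch, not a complete spec: `myscript.py` is a placeholder entry point, and a spec file only runs under PyInstaller itself.

```python
# mySpec.spec -- the relevant part of a PyInstaller spec file (rest omitted)
a = Analysis(
    ['myscript.py'],               # placeholder entry point
    excludes=['pandas', 'numpy'],  # modules that inflate the build
)
# ... PYZ / EXE / COLLECT sections as generated by PyInstaller ...
```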

For every build, I use this adjusted spec file to create the exe:

pyinstaller "mySpec.spec" --distpath="<path>"