ValueError: numpy.ndarray size changed, may indicate binary incompatibility. Expected 88 from C header, got 80 from PyObject
Importing pyxdameraulevenshtein gives the following error. I have
pyxdameraulevenshtein==1.5.3,
pandas==1.1.4,
scikit-learn==0.20.2, and
numpy==1.16.1.
It works fine in Python 3.6 but fails in Python 3.7. Has anyone faced a similar issue with Python 3.7 (3.7.9), docker image python:3.7-buster?
__init__.pxd:242: in init pyxdameraulevenshtein
???
E ValueError: numpy.ndarray size changed, may indicate binary incompatibility. Expected 88 from C header, got 80 from PyObject
I'm on Python 3.8.5. It sounds too simple to be real, but I had this same issue, and all I did was reinstall numpy. Gone.
pip install --upgrade numpy
or
pip uninstall numpy
pip install numpy
Try with numpy==1.20.0. This worked here, even though the other circumstances were different (Python 3.8 on Alpine 3.12).
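For example, with a plain pip pin:
pip install numpy==1.20.0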
Indeed, (building and) installing with numpy>=1.20.0 should work, as pointed out e.g. by the answer above. However, I thought some background might be interesting; it also leads to some alternative solutions.
There was a change in the C API in numpy 1.20.0. In some cases, pip seems to download the latest version of numpy for the build stage, but the program is then run with the installed version of numpy. If the version used for the build is >=1.20 but the installed version is <1.20, this produces exactly the error above: the compiled module expects the 88-byte ndarray struct described by the newer headers, while the older runtime provides the 80-byte one. (The other way around should not matter, because of backwards compatibility; an installed numpy<1.20 simply could not anticipate the upcoming change.)
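If it is unclear which numpy is active at runtime, the struct size can be inspected directly. A quick diagnostic sketch in Python (the 80/88 byte values assume a 64-bit build):

import numpy as np

# Cython-built extensions compare the ndarray struct size from the numpy
# headers they were compiled against with the size exposed by the numpy
# imported at runtime; "Expected 88, got 80" thus means: built against
# numpy>=1.20, running on numpy<1.20.
print(np.__version__)            # runtime numpy version
print(np.ndarray.__basicsize__)  # 80 for numpy<1.20, 88 for numpy>=1.20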
This leads to several possible ways to solve the problem (example commands below):
- upgrade (the installed version) to numpy>=1.20.0
- use the minimum supported numpy version in pyproject.toml (oldest-supported-numpy)
- install with --no-binary
- install with --no-build-isolation
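For illustration, the last two flags might be used like this (pyxdameraulevenshtein is simply the package from the question; note that with --no-build-isolation the build dependencies, including numpy, must already be installed in the environment):

pip install --no-binary :all: pyxdameraulevenshtein
pip install --no-build-isolation pyxdameraulevenshtein

For the pyproject.toml route, a package author can pin the build-time numpy via the oldest-supported-numpy meta-package. A minimal sketch (the exact requires list depends on the package):

[build-system]
requires = ["setuptools", "wheel", "Cython", "oldest-supported-numpy"]
build-backend = "setuptools.build_meta"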
For a more detailed discussion of potential solutions, see https://github.com/scikit-learn-contrib/hdbscan/issues/457#issuecomment-773671043.