ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1056)

TL;DR

The remote website seems to be the problem, not Python. There is likely no fix for this other than to fix the website.

Longer Explanation

The website/server you are dealing with is apparently configured incorrectly. This has nothing directly to do with Python. That said, you can ignore any certificate errors with, e.g.:

import requests

# verify=False disables certificate verification entirely; use only for debugging
r = requests.get(url=URL, params=PARAMS, verify=False)

or you can otherwise try to point Python at the missing certificates (as pointed out by @dave_thompson_085 in the comments).
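As a sketch of that second option, assuming the missing issuer certificate has been saved locally as `my_ca.pem` (a hypothetical filename): requests accepts a path to a CA bundle via `verify=`, and the standard library can load extra certificates into an SSL context.

```python
import ssl

# With requests, verify= also takes a bundle path instead of a bool:
#   r = requests.get(url=URL, params=PARAMS, verify="my_ca.pem")

# With the standard library, start from the system trust store...
ctx = ssl.create_default_context()
# ...and add the missing issuer ("my_ca.pem" is a hypothetical path):
# ctx.load_verify_locations("my_ca.pem")

# Unlike verify=False, this keeps full verification enabled
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # → True
```

The context can then be passed to, e.g., `urllib.request.urlopen(url, context=ctx)`.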

However, this is unlikely to do any good here, as the server then apparently responds with a 500 Internal Server Error (verified with curl) and a Content-Length: 0, which suggests an error in the processing of api.php itself (i.e. there is no JSON to process anyway).


I don't think the server is necessarily the problem. I'm doing something similar, but my first two lines are the following.

import pandas as pd

BCD = pd.read_csv('https://archive.ics.uci.edu/ml/machine-learning-databases/breast-cancer/breast-cancer.data')

I'm doing this simultaneously on a MacBook Pro running Mojave 10.14.6 and a Microsoft Surface running Windows 10 Enterprise, 10.0.17134, both using Jupyter. My Python installation is 3.7.3 on both.

Both are accessing the Internet through the same home Wi-Fi.

The Surface grabbed it on the first try. The MacBook gives me the same error as the OP.

So, it's really not likely that the UCI database's server is the problem.
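One way to compare the two machines is to inspect which CA bundle each interpreter actually uses; if the paths differ, or the file is missing on the Mac, that would explain identical code behaving differently. (On python.org macOS builds, the default bundle may not be populated until the bundled "Install Certificates.command" script has been run.)

```python
import ssl

# Show where this interpreter looks for trusted root certificates
paths = ssl.get_default_verify_paths()
print(paths.cafile)  # CA bundle file, or None if not found
print(paths.capath)  # CA certificate directory, or None
```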


In our case the issue was related to SSL certificates signed by our own CA root & intermediate certificates. The solution, after finding the location of certifi's cacert.pem file (import certifi; certifi.where()), was to append our own CA root & intermediate certificates to that cacert.pem file. Of course, those certificates were in PEM format.
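A minimal sketch of that approach, assuming certifi is installed and the private chain sits in a hypothetical file `own_ca_chain.pem`:

```python
import certifi

def append_private_ca(pem_path: str) -> str:
    """Append a PEM-format CA chain to certifi's default bundle."""
    bundle = certifi.where()          # e.g. .../site-packages/certifi/cacert.pem
    with open(pem_path) as src, open(bundle, "a") as dst:
        dst.write("\n" + src.read())  # PEM blocks can simply be concatenated
    return bundle

# Usage ("own_ca_chain.pem" is a hypothetical filename):
# append_private_ca("own_ca_chain.pem")
```

Note that this modified bundle is overwritten whenever certifi is upgraded, so the step has to be repeated after upgrades.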