
Jamie Skipworth

Technology Generalist | Software & Data


Working at any large enterprise means you’ll more than likely be stuck behind a proxy server that restricts your access to the internet. It’s also likely to intercept your web requests and perform a man-in-the-middle (MITM) inspection of your encrypted data. There’s no privacy on corporate networks!

This can drive developers bonkers. You might be blocked from accessing resources you need. I’ve seen companies block access to sites like Docker Hub and Stack Overflow over ridiculous perceived ‘security risks’.

If your friendly neighbourhood proxy also performs MITM inspection as mentioned above, then another thing it will do is break your HTTPS requests. Why’s that?

First, a little primer on SSL/TLS.

SSL/TLS certificates ensure that communications between devices are secure, and that the service you’re talking to is legitimate. This is achieved by the service (e.g. a website) having an SSL/TLS certificate which is issued, or signed, by a trusted Certificate Authority (CA). Most browsers and systems come with a bundle of trusted Root CA certificates pre-installed. So when you visit Google, for example, its certificate will be signed by one of these trusted Root CAs, which both validates that you’re talking to Google and ensures that your connection is securely encrypted.

Back to Your Proxy

Your organisation’s proxy will use a certificate signed by its own internal CA, one that isn’t signed by any externally trusted root CA. So when you make HTTPS requests to external sites or APIs through it, you’ll get a ton of security errors.

A quick example. Here’s what it looks like when I use openssl to connect to www.google.com directly:

$ openssl s_client -connect www.google.com:443
depth=2 OU = GlobalSign Root CA - R2, O = GlobalSign, CN = GlobalSign
verify return:1
depth=1 C = US, O = Google Trust Services, CN = Google Internet Authority G3
verify return:1
depth=0 C = US, ST = California, L = Mountain View, O = Google LLC, CN = *.google.com
verify return:1
Certificate chain
 0 s:/C=US/ST=California/L=Mountain View/O=Google LLC/CN=*.google.com
   i:/C=US/O=Google Trust Services/CN=Google Internet Authority G3
 1 s:/C=US/O=Google Trust Services/CN=Google Internet Authority G3
   i:/OU=GlobalSign Root CA - R2/O=GlobalSign/CN=GlobalSign

You can see that there are 2 certificates in the chain. At the top (0) we have the subject (domain) certificate, issued by the Google Trust Services intermediate certificate. Next in the chain (1) is the intermediate certificate, which is issued by the trusted GlobalSign root CA.

Now let’s try the same thing via the proxy:

$ openssl s_client -proxy proxy.service.vandalay:80 -connect www.google.com:443
depth=2 DC = com, DC = vandalay, DC = global, CN = Vandalay Global CA 01 v2
verify error:num=20:unable to get local issuer certificate
verify return:0
Certificate chain
 0 s:/C=US/ST=California/L=Mountain View/O=Google LLC/CN=*.google.com
   i:/C=AU/ST=Somewhere/L=Nice/O=Vandalay Industries Limited/OU=Technology/CN=Vandalay Global SSL Visibility CA v2
 1 s:/C=AU/ST=Somewhere/L=Nice/O=Vandalay Industries Limited/OU=Technology/CN=Vandalay Global SSL Visibility CA v2
   i:/DC=com/DC=vandalay/DC=global/CN=Vandalay Global CA 01 v2
 2 s:/DC=com/DC=vandalay/DC=global/CN=Vandalay Global CA 01 v2
   i:/DC=com/DC=vandalay/DC=global/CN=Vandalay Global Root CA v2

As before, the top one is the domain certificate, but this time the issuer (the certificate authority) is Vandalay Industries’ SSL Visibility CA, which is in turn issued by the Vandalay Global CA, which is finally issued by the Vandalay Global Root CA.

This is what causes issues. The Vandalay Industries certificate chain is rooted in a self-signed internal CA that isn’t trusted outside of the organisation.
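You can see this trust failure in miniature without a proxy at all. The sketch below creates a throwaway self-signed CA certificate (the name is made up to match the example above): verification against the default trust store fails, but succeeds once you explicitly supply the CA yourself, which is exactly what we’ll do for the proxy later.

```shell
# Create a throwaway self-signed "internal CA" certificate (name is illustrative)
openssl req -x509 -newkey rsa:2048 -nodes -keyout ca.key -out ca.pem \
    -subj "/CN=Vandalay Global Root CA v2" -days 1 2>/dev/null

# Verifying against the default trust store fails: nothing vouches for it
openssl verify ca.pem

# Verifying against a bundle that contains the CA succeeds
openssl verify -CAfile ca.pem ca.pem
# ca.pem: OK
```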

I hit these issues using Google Cloud Platform’s client libraries for Python. Here’s what the problem looks like:

Test code:

from googleapiclient.discovery import build
from oauth2client.client import GoogleCredentials

# Load the application default credentials (e.g. set up via gcloud auth)
credentials = GoogleCredentials.get_application_default()

# Building the Dataflow client fetches the API discovery document over HTTPS,
# which is where the certificate verification blows up behind the proxy
dataflow = build('dataflow', 'v1b3', credentials=credentials, cache_discovery=False)


<snip giant stack-trace>
File "/Users/skip/.pyenv/versions/3.6.6/lib/python3.6/ssl.py", line 689, in do_handshake
ssl.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:841)

I hate these sorts of errors, because they give you zero information on how to fix them without digging around online. What we need to do is tell our machine to trust these dodgy-looking self-signed certificates.

First we’ll need to create a certificate authority (CA) bundle file - basically just a big list of these certificates. One way to get them all is to export them from a web browser like Firefox (Preferences > Certificates > View Certificates > Authorities), exporting everything bearing your organisation’s name.

[Screenshots: Firefox’s certificate manager, and my exported list of organisation certificates]
Then we concatenate them all together:

cat Vandalay*.crt > Vandalay-ca-bundle.pem
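If you’d rather not click through a browser, you can also capture the chain the proxy presents straight off the wire. This is a sketch assuming the same illustrative proxy address and target host as above; the awk filter keeps every PEM block in the output, including the leaf certificate, which you can delete from the bundle afterwards.

```shell
# Capture every certificate the proxy presents and keep just the PEM blocks
openssl s_client -showcerts -proxy proxy.service.vandalay:80 \
    -connect www.google.com:443 </dev/null 2>/dev/null \
    | awk '/-----BEGIN CERTIFICATE-----/,/-----END CERTIFICATE-----/' \
    > Vandalay-ca-bundle.pem
```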

Now you have a CA bundle. The Google googleapiclient library uses httplib2, which can be told to use a specific CA certificates file by setting the following environment variable:

export HTTPLIB_CA_CERTS_PATH=/path/to/Vandalay-ca-bundle.pem

Some additional Python packages are also required for httplib2 to pick this up:

pip install httplib2.system-ca-certs-locater pbr

This fixes the CERTIFICATE_VERIFY_FAILED errors when using httplib2 requests. If you use the requests library anywhere, you’ll need to set a different environment variable:

export REQUESTS_CA_BUNDLE=/path/to/Vandalay-ca-bundle.pem

It’s also probably a good idea to have the following packages installed. They contain CA certificates and other SSL/TLS features that might otherwise be missing.

pip install certifi urllib3[secure] requests[security]

If you just want to use curl, then you can set the following:

export CURL_CA_BUNDLE=/path/to/Vandalay-ca-bundle.pem
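Since each HTTP stack reads its own variable, it’s convenient to set all three in your shell profile (~/.bashrc or ~/.zshrc, say) so new sessions pick the bundle up automatically. A sketch, assuming the bundle path from above:

```shell
# Point every common HTTP client at the corporate CA bundle
export HTTPLIB_CA_CERTS_PATH=/path/to/Vandalay-ca-bundle.pem  # httplib2
export REQUESTS_CA_BUNDLE=/path/to/Vandalay-ca-bundle.pem     # requests
export CURL_CA_BUNDLE=/path/to/Vandalay-ca-bundle.pem         # curl
```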

All fixed!