Working at any large enterprise means you’ll more than likely be stuck behind a proxy server that restricts your access to the internet. It’s also likely to intercept your web requests and perform a man-in-the-middle (MITM) inspection of your encrypted data. There’s no privacy on corporate networks!
This can drive developers bonkers. You might be blocked from accessing resources you need. I’ve seen companies block access to sites like Docker Hub and Stack Overflow over ridiculous perceived ‘security risks’.
If your friendly neighbourhood proxy also performs MITM inspection as mentioned above, then another thing it will do is break your HTTPS requests. Why’s that?
First, a little primer on SSL/TLS.
SSL/TLS certificates ensure that communications between devices are secure, and that the service you’re talking to is legitimate. This is achieved by the service (e.g. a website) having an SSL/TLS certificate which is issued or signed by a trusted Certificate Authority (CA). Most browsers and systems come with a bundle of trusted Root CA certificates pre-installed. So when you visit Google.com, Google’s certificate will be signed by one of these trusted Root CAs, which validates both that you’re talking to Google and that your connection is securely encrypted.
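You can actually see this pre-installed trust store from Python’s standard library. A quick illustration (output path varies by OS and Python build):

```python
import ssl

# Every Python install knows where its trusted root CA bundle lives.
# These are the same kinds of roots a browser consults when checking
# that a site's certificate chains up to a trusted CA.
paths = ssl.get_default_verify_paths()
print(paths.cafile or paths.openssl_cafile)
```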
Back to Your Proxy
Your organisation’s proxy re-signs traffic using its own self-signed certificate, which isn’t signed by any external trusted root CA. So when you make HTTPS requests to external sites or APIs you’ll get a ton of security errors.
A quick example. Here’s what it looks like when I use openssl to connect to googleapis.com directly:
```
$ openssl s_client -connect www.googleapis.com:443
CONNECTED(00000007)
depth=2 OU = GlobalSign Root CA - R2, O = GlobalSign, CN = GlobalSign
verify return:1
depth=1 C = US, O = Google Trust Services, CN = Google Internet Authority G3
verify return:1
depth=0 C = US, ST = California, L = Mountain View, O = Google LLC, CN = *.googleapis.com
verify return:1
---
Certificate chain
 0 s:/C=US/ST=California/L=Mountain View/O=Google LLC/CN=*.googleapis.com
   i:/C=US/O=Google Trust Services/CN=Google Internet Authority G3
 1 s:/C=US/O=Google Trust Services/CN=Google Internet Authority G3
   i:/OU=GlobalSign Root CA - R2/O=GlobalSign/CN=GlobalSign
```
You can see that there are 2 certificates in the chain. At the top (0) we have the subject (domain) certificate, issued by the Google Trust Services intermediate certificate. Next in the chain (1) is the intermediate certificate itself, which is issued by the trusted GlobalSign root CA.
Now let’s try the same thing via the proxy:
```
$ openssl s_client -proxy proxy.service.vandalay:80 -connect www.googleapis.com:443
CONNECTED(00000005)
depth=2 DC = com, DC = vandalay, DC = global, CN = Vandalay Global CA 01 v2
verify error:num=20:unable to get local issuer certificate
verify return:0
---
Certificate chain
 0 s:/C=US/ST=California/L=Mountain View/O=Google LLC/CN=*.googleapis.com
   i:/C=AU/ST=Somewhere/L=Nice/O=Vandalay Industries Limited/OU=Technology/CN=Vandalay Global SSL Visibility CA v2
 1 s:/C=AU/ST=Somewhere/L=Nice/O=Vandalay Industries Limited/OU=Technology/CN=Vandalay Global SSL Visibility CA v2
   i:/DC=com/DC=vandalay/DC=global/CN=Vandalay Global CA 01 v2
 2 s:/DC=com/DC=vandalay/DC=global/CN=Vandalay Global CA 01 v2
   i:/DC=com/DC=vandalay/DC=global/CN=Vandalay Global Root CA v2
```
As before, the top entry is the googleapis.com domain certificate, but this time the issuer (the certificate authority) is Vandalay Industries’ SSL Visibility CA, which is in turn issued by the Vandalay Global CA, which is finally issued by the Vandalay Global Root CA.
This is what causes the issues: the Vandalay Industries certificates are self-signed and aren’t trusted outside of the organisation.
I hit these issues using Google Cloud Platform’s client libraries for Python. Here’s what the problem looks like:
```python
from googleapiclient.discovery import build
from oauth2client.client import GoogleCredentials

credentials = GoogleCredentials.get_application_default()
dataflow = build('dataflow', 'v1b3', credentials=credentials, cache_discovery=False)
```
```
<snip giant stack-trace>
  File "/Users/skip/.pyenv/versions/3.6.6/lib/python3.6/ssl.py", line 689, in do_handshake
    self._sslobj.do_handshake()
ssl.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:841)
```
I hate these sorts of errors, because they give you zero information on how to fix them without digging around online. What we need to do is tell our machine to trust these dodgy-looking self-signed certificates.
First we’ll need to create a certificate authority file - basically just a big list of these certificates. One way to get them all is to export them from a web browser like Firefox (Preferences > Certificates > View Certificates > Authorities), and export anything with your organisation name.
My list of certificates looks like this:
```
VandalayGlobalCA01v2.crt
VandalayGlobalCA02v2.crt
VandalayGlobalRootCAv2.crt
VandalayGlobalSSLDecryptionCAv2.crt
VandalayGlobalSSLVisibilityCAv2.crt
VandalayGlobalSSLVisibilityECCCAv2.crt
VandalayGlobalTestCA01v2.crt
```
Then we concatenate them all together:
```
cat Vandalay*.crt > Vandalay-ca-bundle.pem
```
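With the bundle created, you can already point Python’s standard library at it directly. Here’s a minimal sketch, using the bundle filename from above and skipping the load if the file isn’t present:

```python
import os
import ssl

CA_BUNDLE = "Vandalay-ca-bundle.pem"  # the bundle created above

# Start from the system's default trust store...
context = ssl.create_default_context()

# ...and additionally trust the corporate CAs, if the bundle exists.
if os.path.exists(CA_BUNDLE):
    context.load_verify_locations(cafile=CA_BUNDLE)

# 'context' can now be passed to urllib.request.urlopen(), http.client, etc.
```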
Now you have a CA bundle. The Google `googleapiclient` client library uses `httplib2`, which can be told to use a specific CA certificates file via an environment variable.
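To the best of my recollection the variable `httplib2`’s CA locater checks is `HTTPLIB2_CA_CERTS` — treat the name as an assumption and verify it against the `httplib2.system-ca-certs-locater` documentation for your version:

```shell
# Assumed variable name -- check the httplib2.system-ca-certs-locater docs.
export HTTPLIB2_CA_CERTS="$PWD/Vandalay-ca-bundle.pem"
```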
Some additional Python libraries for `httplib2` are also required:

```
pip install httplib2.system-ca-certs-locater pbr
```
This fixes any `CERTIFICATE_VERIFY_FAILED` errors when using `httplib2`. If you use the `requests` library anywhere, you’ll need to set a different environment variable.
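`requests` reads the path to a custom CA file from `REQUESTS_CA_BUNDLE` (this one is documented in the requests advanced-usage guide):

```shell
export REQUESTS_CA_BUNDLE="$PWD/Vandalay-ca-bundle.pem"
```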
It’s also probably a good idea to have the following installed. These contain certificates and other SSL/TLS features that might be missing.
```
pip install certifi urllib3[secure] requests[security]
```
If you just want to use `curl`, you can point it at the bundle directly.
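curl’s documented options for a custom CA file are the `--cacert` flag for a one-off request, or the `CURL_CA_BUNDLE` environment variable for the whole session. For example:

```shell
# One-off request with the custom bundle:
curl --cacert Vandalay-ca-bundle.pem https://www.googleapis.com/

# Or set it for the whole shell session:
export CURL_CA_BUNDLE="$PWD/Vandalay-ca-bundle.pem"
```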