I started web development around late 1994. Some of my earliest paid web work is still online (dated June 1995). Clearly, that was a simpler time for content! I went on to be ‘Webmaster’ (yes, for those joining us in the last decade, that was a job title once) for UWA, and then for Hartley Poynton/JDV.com at a time when security became important as commerce boomed online.
At the dawn of the web era, backwards compatibility with older web clients (browsers) was deemed important; content had to degrade gracefully, even with no CSS applied. As the years stretched out, that legacy grew longer and longer. Until now.
In mid-2018, the Payment Card Industry (PCI) Data Security Standard (DSS) 3.2 comes into effect, requiring cardholder data environments to use (at minimum) TLS 1.2 for the encrypted transfer of data. Of course, that’s also the maximum version typically available today (TLS 1.3 is at draft 21 at the time of writing). This effort by the PCI is forcing people to adopt new browsers that can speak the TLS 1.2 protocol (and the cipher suites it permits), typically a modern, recent Chrome, Firefox, Safari or Edge. For the majority of people that browser is Chrome, and the majority of those installations auto-update on every release.
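To make that concrete, here’s a minimal sketch of enforcing a TLS 1.2 floor on a Node.js HTTPS server (TypeScript; the certificate paths are placeholders, and the minVersion option assumes a reasonably recent Node.js release):

```typescript
import * as https from "https";
import * as fs from "fs";

// Minimal HTTPS server that refuses anything older than TLS 1.2.
// The key/certificate paths below are placeholders for illustration.
const server = https.createServer(
  {
    key: fs.readFileSync("/etc/ssl/private/example.key"),
    cert: fs.readFileSync("/etc/ssl/certs/example.crt"),
    minVersion: "TLSv1.2", // clients offering only SSLv3/TLS 1.0/1.1 fail the handshake
  },
  (req, res) => {
    res.writeHead(200, { "Content-Type": "text/plain" });
    res.end("Negotiated TLS 1.2 or better\n");
  }
);

server.listen(443);
```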
Many are pushing to be compliant with the 2018 PCI DSS 3.2 deadline as early as possible; logging the negotiated protocol and cipher for each connection will show whether your client base is ready as well. I’ve already worked with one government agency to demonstrate they were ready, and have helped disable TLS 1.0 and 1.1 on their public-facing web sites (and, previously, SSL v3). We’ve removed RC4 and 3DES ciphers, and enabled ephemeral-key ciphers to provide forward secrecy.
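As a sketch of that logging, continuing the server example above: Node.js exposes the negotiated protocol and cipher on each TLS socket, which is enough to spot stragglers before you pull the plug on the old protocols.

```typescript
import * as tls from "tls";

// Log the protocol and cipher each client actually negotiated, so you can
// verify no real traffic still depends on TLS 1.0/1.1 or weak ciphers
// (RC4, 3DES) before disabling them. "server" is the HTTPS server above.
server.on("secureConnection", (socket: tls.TLSSocket) => {
  const protocol = socket.getProtocol(); // e.g. "TLSv1.2"
  const cipher = socket.getCipher();     // e.g. { name: "ECDHE-RSA-AES128-GCM-SHA256", ... }
  console.log(`${socket.remoteAddress} ${protocol} ${cipher.name}`);
});
```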
Web developers (writing JavaScript and using various frameworks) can rejoice: the age of having to support legacy MS IE 6/7/8/9/10 is pretty much over. None of those browsers support TLS 1.2 out of the box (IE 10 can have it turned on, but for some reason it is off by default). This makes JavaScript code smaller, as it no longer needs conditional code to work around the quirks of those older clients.
But as we find ourselves with modern clients, we can now ask those clients to assist in our attempts to secure the content we serve. They understand modern security constructs such as Content Security Policy and the other security-related HTTP headers.
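To illustrate, here’s a small sketch of setting a few of those headers on a response; the policy values shown are illustrative starting points only, not a recommendation for any particular site.

```typescript
import { ServerResponse } from "http";

// Example security headers. Every value here is a placeholder policy;
// tune each one to your own site before deploying.
function addSecurityHeaders(res: ServerResponse): void {
  // Only load scripts, styles and other resources from our own origin.
  res.setHeader("Content-Security-Policy", "default-src 'self'");
  // Ask the browser to insist on HTTPS for the next year.
  res.setHeader("Strict-Transport-Security", "max-age=31536000; includeSubDomains");
  // Refuse to MIME-sniff responses away from the declared Content-Type.
  res.setHeader("X-Content-Type-Options", "nosniff");
  // Disallow other origins framing this site.
  res.setHeader("X-Frame-Options", "SAMEORIGIN");
}
```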
There are two tools I am currently using to help in this battle to improve web security. One is SSLLabs.com, the work of Ivan Ristić (and now owned/sponsored by Qualys). This tool gives a good view of the encryption in flight (protocols, ciphers), the chain of trust (certificate), and, as a recent addition, checks DNS for CAA records (support for which I, and others, piled onto a feature request for AWS Route53 to add). The second tool is Scott Helme’s SecurityHeaders.io, which looks at the HTTP headers that web content uses to ask browsers to enforce security on the client side.
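For the curious, a CAA record simply names the certificate authorities allowed to issue for a domain (for example, a zone entry like `example.com. IN CAA 0 issue "letsencrypt.org"`). Here’s a hedged sketch of checking one yourself, assuming a Node.js version whose dns module includes resolveCaa:

```typescript
import { promises as dns } from "dns";

// Look up the CAA records for a domain to see which CAs it authorises.
// "example.com" is a placeholder; substitute your own domain.
async function showCaa(domain: string): Promise<void> {
  const records = await dns.resolveCaa(domain);
  for (const record of records) {
    // e.g. { critical: 0, issue: "letsencrypt.org" }
    console.log(record);
  }
}

showCaa("example.com").catch(console.error);
```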
There’s a really important reason why these tools are good: they are maintained. As guidance on ciphers, protocols, signature algorithms and other measures evolves, these tools are updated to match. And they are produced by very small but agile teams (one-person teams, in effect), without the bureaucracy and lag associated with large enterprise tools. But they shouldn’t be used blindly: these services make suggestions, and you should research them yourself. Not every recommendation will suit your own risk profile. Personally, I’m uncomfortable with Public-Key-Pins, so that one can wait for a while; indeed, Chrome has now signalled it will drop support for it.
So while PCI is hitting merchants with their DSS-compliance stick (and making it plainly obvious what they have to do), we get a welcome side effect: a concrete reason for drawing a line under how far back our backwards compatibility must stretch, and the ability to have the web client assist in ensuring the security of our content.