Amazon Linux, EC2, S3, Perl, SSL Wildcard Certificates

Amazon Linux, one of the distributions recommended for Amazon EC2 customers, recently had an update: 2011.09. It refreshed a whole raft of libraries, including the Perl LWP (libwww-perl) library, which moved to perl-libwww-perl-5.837 (previously 5.833), along with other related modules.

One of the changes is a new default for “verify hostname” in SSL connections made via LWP::UserAgent: previously, verification of the certificate against the hostname given was disabled by default, and in an effort to improve security it has been turned on. You’ll see this mentioned in the LWP::UserAgent documentation: “The no checks behaviour was the default for libwww-perl-5.837 and earlier releases”. What’s unusual is that the no-checks behaviour of Amazon Linux’s package of 5.837 is DIFFERENT from this statement; I suspect this one-line change of default got back-ported into 5.837 ‘in the interest of security’.

Unfortunately, this breaks a lot of scripts and other modules/libraries out there, one of which is the Amazon-issued S3 library. S3 is the Amazon Simple Storage Service (SSS => S3), in which a user (customer) has their data arranged in “buckets”, with data in objects identified by ‘keys’ (like a file name). All data is put to, and read from, the S3 service over HTTPS; it’s not locally mounted (though some clever FUSE stuff may make it look that way, it is still over HTTPS).

A bucket in S3 has a name, and in the example I have, the name looks like a domain name (images.foo.com). When accessing this bucket, the Amazon S3 Perl library connects to an alias hostname (CNAME) made by combining the bucket name with “s3.amazonaws.com”, so our example here becomes “images.foo.com.s3.amazonaws.com”. This site uses a wildcard certificate for “*.s3.amazonaws.com” (you can see it as a Subject Alternative Name extension in the SSL certificate), which permits the certificate to be considered valid for any hostname directly under the s3.amazonaws.com domain. However, per RFC 2818, the only thing permitted before “s3.amazonaws.com” is a single name component (label), not a (seemingly valid) dotted domain name. So “com.s3.amazonaws.com” is OK with the wildcard certificate, but “images.foo.com.s3.amazonaws.com” is not.
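To make that single-label rule concrete, here’s a minimal sketch of the matching behaviour (my own illustration of the rule, not the actual verification code inside LWP or IO::Socket::SSL):

use strict;
use warnings;

# RFC 2818-style wildcard match: '*' covers exactly one DNS label.
sub wildcard_match {
    my ($pattern, $host) = @_;
    my $re = quotemeta($pattern);   # escape the literal dots
    $re =~ s/\\\*/[^.]+/;           # '*' becomes "one label, no dots"
    return $host =~ /^$re$/i;
}

# "com" is a single label, so the wildcard matches it...
print wildcard_match('*.s3.amazonaws.com', 'com.s3.amazonaws.com')
    ? "match\n" : "no match\n";     # prints "match"

# ...but "images.foo.com" is three labels, so it does not.
print wildcard_match('*.s3.amazonaws.com', 'images.foo.com.s3.amazonaws.com')
    ? "match\n" : "no match\n";     # prints "no match"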

There are several solutions. The easiest is to turn off SSL certificate verification again in your script. A handy environment variable may be set to do this: $ENV{PERL_LWP_SSL_VERIFY_HOSTNAME} = 0. Alternatively, if you are using LWP directly, you can pass an initialisation parameter to LWP::UserAgent of ssl_opts => { verify_hostname => 0 }. Both effectively abandon any certificate verification.
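Here’s a small sketch of both workarounds in one place (the bucket URL is the hypothetical example from above, and both approaches abandon verification, so use them knowingly):

use strict;
use warnings;
use LWP::UserAgent;

# Option 1: the environment variable, set before any requests are made.
$ENV{PERL_LWP_SSL_VERIFY_HOSTNAME} = 0;

# Option 2: per-agent, via the constructor's ssl_opts parameter.
my $ua = LWP::UserAgent->new(
    ssl_opts => { verify_hostname => 0 },
);

my $resp = $ua->get('https://images.foo.com.s3.amazonaws.com/');
print $resp->status_line, "\n";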

Somewhat more complicated, you can define a custom validation callback (procedure) to decide for yourself whether the certificate is valid. Accepting dotted names under a wildcard this way contravenes RFC 2818, and it seems like a lot more hassle as a workaround.
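In spirit, such a callback implements a deliberately relaxed match that lets the wildcard span dotted labels. The hook you attach it to depends on your SSL stack and its version, so this sketch shows only the matching logic, not the wiring:

# Hypothetical relaxed matcher: '*' may span multiple dotted labels.
# This is precisely what RFC 2818 says not to do; it is shown only to
# illustrate the kind of check a custom callback would perform.
sub relaxed_wildcard_match {
    my ($pattern, $host) = @_;
    my $re = quotemeta($pattern);
    $re =~ s/\\\*/.+/;              # '*' matches anything, dots included
    return $host =~ /^$re$/i;
}

# Now the dotted bucket hostname would pass:
print relaxed_wildcard_match('*.s3.amazonaws.com',
    'images.foo.com.s3.amazonaws.com') ? "match\n" : "no match\n";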

Perhaps the easiest solution here is to avoid using a period/dot (‘.’) in S3 bucket names, thereby removing the conflict with the strict wildcard checking entirely.
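To illustrate (images-foo-com being a hypothetical dot-free rename of the bucket above), the derived virtual-host name then stays a single label under s3.amazonaws.com, which the wildcard does cover:

use strict;
use warnings;

# A dot-free bucket name keeps the derived vhost to one label under
# s3.amazonaws.com, so "*.s3.amazonaws.com" can match it.
for my $bucket ('images.foo.com', 'images-foo-com') {
    my $host = "$bucket.s3.amazonaws.com";
    my $ok   = $bucket !~ /\./;     # single label => wildcard can match
    printf "%-42s %s\n", $host, $ok ? 'wildcard matches' : 'wildcard fails';
}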

The most important thing here is how lax we have been at verifying SSL certificates, and how much we have come to rely on that just working. It is good for scripts to verify that the SSL certificate matches the host: I don’t want to start sending authentication information over an SSL channel if we can easily see we’ve been duped at the remote end. I was not familiar with wildcard certificates only being valid for one component of a domain name; this reduces their effectiveness in my mind in some sense. They’ve always been more expensive than standard certificates, but being better aware of the FQDNs they will validate on is useful.

I’ve seen several other instances outside of this S3 example where invalid certificates have been blindly accepted by scripts (one CloudWatch example I saw involved a redirect ‘hop’ through an SSL site). This default change from lax checking to requiring legitimate certificates may actually encourage better adoption of the security that SSL can give (we’re already paying for SSL certs, after all), or at least lead us, as developers and architects, to acknowledge when we’re actively ignoring that layer of protection.

It’s early days now, but as this default change filters into Linux distributions (and Perl distributions on other platforms), we’ll start to see a lot of FAQs on this.

Rusty’s talk at PLUG

What a week for PLUG. After months of organisation, we were honoured by Rusty Russell flying to Perth for PLUG. He presented a talk entitled “Coding: let’s have fun”, which ranged from the simplicity and beauty of a regular expression engine in around 20 lines of C, to a wireframe flight sim from a recent IOCCC where the code itself was formatted in the outline of an aircraft, and then a potted history of his experiences and where he has found joy in coding.

After a pizza dinner break for the 46 (or thereabouts) people present, Rusty was corralled into a panel discussion with Dr Chris McDonald from UWA CompSci and Assistant Professor Robert Cunningham from UWA Law for a chat on various topics; it seems cloud computing was on everyone’s minds.

The PLUG AV crew streamed this event live, and recorded it: videos of the talk (93 MB mp4) and the panel (115 MB mp4) are now available (both are around an hour and a quarter). Older videos are here.

Rusty was very generous in refusing to accept the collected funds for his expenses, so we now have money to repeat this exercise of flying in another speaker. It’s up to PLUGgers to decide who they would like to see next! Time-wise it’s likely to be Q2 next year, as PLUG has a full schedule until then.

Big thanks to Chris, Robert and Rusty for speaking; they were all excellent. Thanks also to Daniel Harmsworth for co-ordinating tickets, the AV crew for their recording, and everyone who put their hand in their pocket to help the event come together.

Need a new PC. Time for a desktop?

My 2-year-old Dell Studio 1558 is doing it again: slowing to a snail’s pace, heating to an inferno, and then spontaneously powering off (which I think is a safety cut-out triggered when the CPU temperature reaches 100°C).

I had Dell come and replace parts on this laptop about 9 months ago when similar symptoms developed. I originally purchased this unit while I was in the UK, around January 2010 I think. I was hoping to get 3 years out of it; sadly, at around 20 months old, I’m getting too frustrated to put up with it. I’m now living in Australia, and getting any multi-national PC company to honour their warranty internationally is a challenge. Heck, the worst offender in this scenario is Sony, who want £20 just to answer the phone!

Now that I’m no longer living in a flat with a very transient lifestyle (most of the travel has gone, replaced by a 1-year-old boy), I’m much more rooted to my home office desk. So, in light of this, I’m thinking of getting a desktop with a reasonable screen. I saw Russell Coker’s post about a 27″ whopper from Dell for around AU$899, and was wondering what to pair it with, or whether to go for a slightly smaller screen. Then come the questions of the all-in-ones, and the touchscreens that are around.

What I’d like is something that’s got a few (2?) USB 3 ports for the next few years of my accessory usage, and SATA 3 so I can throw in a fast SSD. I’d potentially run Debian on this, so I possibly don’t want a Windows license. 4 GB RAM minimum, possibly 8.

So looking around, it’s a quagmire of details that 15 years ago I used to thrive on. Do I care about UEFI instead of a traditional BIOS? Do I really need SATA 3 instead of 2? What about legacy (!) 1394? HDMI connector: yes please; but do I still want a VGA port? What about a second HDMI? Hm. That 27″ screen’s native resolution is more than most on-board graphics can drive… perhaps drop to a 24″ screen. What form factor should this be: ATX, mini-ITX, smaller?

Then comes the choice of pre-built or custom built. Dell, I’m pretty upset about your product quality right now. HP, you’ve (a) killed my DreamScreen recently, and (b) put your entire PC business up the creek with indications that it is going away or being sold off. Lenovo? Acer?

So I’m at a computing crossroads. I can’t be bothered to build my own PC again; I’ve been living on laptops for almost a decade now. But they are expensive, and when something goes wrong there’s very little to salvage. Laptops suck, but do desktops suck less? Vendors suck, but then so does the time wasted on building your own. I think tablets suck for doing lots of data input (programming). All-in-ones: not sure. Touchscreens: probably a gimmick.

I am registered for LCA!

Yay! Not only did the programme come out, but registration for Linux.conf.au 2012 opened within hours of my last post. Well done, LCA team! I’m now registered, paid, and ready. Just need to sort out flights… it’s been a few years, but I’m looking forward to it.

LCA 2012 – registration opening soon, hopefully

Looking forward to getting myself sorted for Linux.conf.au 2012 this January in Ballarat, Victoria. A heap of miniconfs have been added beforehand; now comes the problem of choosing between them. Registrations were slated to open in early September; hopefully that’s soon, as I want to confirm my ticket and accommodation before booking flights… and they get more expensive as time passes. So, I guess everyone is watching the LCA web site intently!

May have a few days in Melbourne afterwards with my Mrs and son… we’ll see. 😉