Software security is a social good

Building software where security is not embedded at every layer is straightforwardly negligent.

I’ve been putting together the outline of a new book. For those who have never put together a book outline before, it’s just like any piece of writing – it has to tell a story with a beginning, a middle, and an end. A technical book, though, tends to stub out the building blocks that you build the story around. The particular book I want to write is an architecture book, so that means laying out the different common architectural components.

So, with pen in hand I cheerily drew a box, labelled it “Security”, and looked at the page thinking, “that’s not right – that can’t just stand on its own like that…”

I’m not sure how we got to this point in terms of the security of IT systems. It’s clear from the press that we, as an industry, just don’t get it. Search Google News for the word “breach” and every single hit relates to some IT security breach.

Yet we knew how to sort this out. Windows XP SP2 was a more-or-less rewrite of Windows and IE to try and reset the problem that Windows was leaky as anything. That push started back in 2002 with Microsoft’s Trustworthy Computing initiative. It should have sent a clear message to the industry that security was everyone’s problem.

Today though we have problems like the burgeoning IoT industry building a layer of network devices with zero security, the German government banning smartwatches for kids, and Uber paying off hackers. On the other side, we never hear about banks being hacked in the same casual way. Systems that are built around a culture of security do not get hacked like this. Banks don’t get hacked because, from the day the Bank of England opened its doors in 1694, the whole industry’s culture has been about security.

This tells us that either no one gives a stuff about building secure systems, or no one is willing to pay for it. I think it’s probably the former, because building secure systems isn’t expensive per se. A mindset change – “assume every call into your system is lying” combined with “assume the hackers are already on your network” – doesn’t actually cost anything, and gets you a long way through the problem.
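
To make that mindset concrete, here’s a minimal sketch in Go of what “assume every call is lying” looks like in practice. The endpoint, the bearer-token check, and the vehicle parameter are all invented for illustration, not taken from any real system:

```go
package main

import (
	"crypto/subtle"
	"log"
	"net/http"
	"strings"
)

// requireToken wraps a handler so that every request has to prove who
// it is – even requests arriving from "inside" the network, because we
// assume the hackers are already on it.
func requireToken(expected string, next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		got := strings.TrimPrefix(r.Header.Get("Authorization"), "Bearer ")
		// Constant-time comparison avoids leaking the token via timing.
		if subtle.ConstantTimeCompare([]byte(got), []byte(expected)) != 1 {
			http.Error(w, "unauthorized", http.StatusUnauthorized)
			return
		}
		next.ServeHTTP(w, r)
	})
}

func main() {
	api := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		// Assume the caller is lying: validate every input before use.
		id := r.URL.Query().Get("vehicle")
		if id == "" || len(id) > 32 {
			http.Error(w, "bad vehicle id", http.StatusBadRequest)
			return
		}
		w.Write([]byte("ok: " + id))
	})
	log.Fatal(http.ListenAndServe(":8080", requireToken("example-token", api)))
}
```

None of that is expensive – it’s a wrapper function and a handful of if-statements, written once and applied everywhere.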


Case in point – I was recently reviewing a vehicle tracking solution for a client. The whole thing – from the comms of the tracking units, to the app, to the web-based front-end – none of it used TLS. I challenged them on it and they said it was “on the list”. In what alternate reality is TLS an optional extra for a vehicle tracking application? The total cost of fixing that would have been a certificate and some server configuration. If they’d rolled it out like that on day one of v1, the management time involved would have been negligible.
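
To put that cost in perspective, here’s what the difference amounts to in, say, Go – a hypothetical sketch, where cert.pem and key.pem are placeholder paths for a certificate from a CA such as Let’s Encrypt, and the /position endpoint is invented:

```go
package main

import (
	"log"
	"net/http"
)

func main() {
	mux := http.NewServeMux()
	mux.HandleFunc("/position", func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte("tracking data"))
	})

	// The plaintext version would be: http.ListenAndServe(":80", mux)
	// Serving over TLS is the same call plus a certificate chain and a
	// private key – cert.pem and key.pem are placeholder file paths.
	log.Fatal(http.ListenAndServeTLS(":443", "cert.pem", "key.pem", mux))
}
```

One extra line on day one; an awkward retrofit and an apology to your users later.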

This is why the box on my architecture diagram for “security” was in the wrong place. Security needs to be part of the DNA of whatever the business does. In that vehicle tracking example, the entire tech team would have been justified in saying to the business, “it ain’t going out like that”. And if the tech team didn’t know how to do something that basic, they shouldn’t have been allowed to do the project. (I should say the actual solution was otherwise excellent, which made the disconnect weird. The team could obviously execute; just this one piece of security was missing.)

The actual issue here, though, is bigger than all of this: as these systems permeate further and further into our daily lives, we have a responsibility to look after people’s data.

We’ve had data protection legislation for a long time, but it is widely misunderstood. Nothing in UK data protection law – neither as it stands nor as it is set to change – makes it unlawful for the developers in the vehicle tracking example above to roll out a system without TLS. I’m not advocating that laws be created along the lines I’m about to imply, but laws do exist to stop businesses being negligent in ways that harm the public good. When we as a society make laws, we do it because we want society to be somehow improved.

Building software where security is not embedded in the team – through design, through execution and test, through release and maintenance – should be regarded as straightforwardly negligent. A breach like Uber’s recent one should be seen as if Uber had erected a crane in the middle of London only to have it collapse because no one had bothered to check that the bolts were tightened – dangerous, unreasonable behaviour that harms the public good. It’s simply extraordinary that we behave the way we do around security in IT systems.
