A group of Australian university students recently “cracked” the “encryption” used by public transport tickets in New South Wales (NSW), Australia. The ticket format’s security relied on people not knowing the layout of the data on the ticket’s magnetic stripe and not having access to equipment to create tickets. In other words, it relied on “security through obscurity”.
Security through obscurity is not a new concept. In 1883, Auguste Kerckhoffs argued that a cryptographic system should remain secure even if everything about it, except the key, falls into enemy hands — in other words, the key should be the only secret part. This became known as Kerckhoffs’s principle. Over half a century later, Claude Shannon rephrased it as assume “the enemy knows the system”, now called Shannon’s maxim. These ideas are as true in computer security today, particularly in cryptography, as they were then.
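Kerckhoffs’s point can be made concrete with a small sketch (the data and key here are invented for illustration): an obscurity-based scheme is broken the instant its transformation becomes known, whereas a keyed design such as HMAC stays sound even when the algorithm is entirely public.

```python
import codecs
import hashlib
import hmac

# Obscurity-based "protection": a fixed, secret transformation (ROT13 here).
# Once anyone learns the transformation, every message is readable.
def obscure(plaintext: str) -> str:
    return codecs.encode(plaintext, "rot13")

# Kerckhoffs-compliant integrity check: the algorithm (HMAC-SHA256) is public;
# only the key is secret, and the key can be replaced if it leaks.
def sign(message: bytes, key: bytes) -> str:
    return hmac.new(key, message, hashlib.sha256).hexdigest()

token = obscure("FARE=2.50;ZONE=1")
# An attacker who knows the scheme reverses it instantly:
recovered = codecs.encode(token, "rot13")
assert recovered == "FARE=2.50;ZONE=1"

# With a keyed design, knowing the algorithm alone is not enough to forge a tag:
key = b"per-deployment secret key"
tag = sign(b"FARE=2.50;ZONE=1", key)
assert hmac.compare_digest(tag, sign(b"FARE=2.50;ZONE=1", key))
```

The difference is where the secret lives: in the first case it is the algorithm itself, which cannot be changed after a single leak; in the second it is a key, which can.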
Ultimately, the poor security of the public transport ticket system is not the programmers’ or engineers’ fault: the customer and management decided the security provided was sufficient. It is similar to the Year 2000 problem. Programmers knew that two-digit years would stop working after the century changed, but why should the manager responsible spend resources fixing it when he or she was unlikely to still be around in the year 2000 to take credit for the work?
When people initially approach computer security, they start with cryptography and software security. If we build more secure encryption systems; if we fix every buffer overflow and every injection attack; if every object has the appropriate authentication, access control, auditing and so on, the “bad guys” will be helpless.
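As a concrete instance of the injection attacks mentioned above, here is a minimal sketch (the table, data and payload are invented for illustration) contrasting a SQL query built by string concatenation with a parameterized one:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, balance REAL)")
conn.execute("INSERT INTO users VALUES ('alice', 10.0)")

user_input = "alice' OR '1'='1"  # a classic injection payload

# Vulnerable: string concatenation lets the payload rewrite the query logic.
unsafe = conn.execute(
    "SELECT name FROM users WHERE name = '" + user_input + "'"
).fetchall()

# Safe: a parameterized query treats the payload as plain data, not SQL.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)
).fetchall()

assert unsafe == [("alice",)]  # injection succeeded: the OR clause matched every row
assert safe == []              # no user is literally named "alice' OR '1'='1"
```

Fixing this class of bug is mechanical once recognised, which is part of why the economics, rather than the techniques, dominate the discussion below.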
A lot of progress has been made in improving computer security. Organizations like OWASP and SAFECode provide software security guidance for software developers. Many vendors provide frameworks, such as Microsoft’s Security Development Lifecycle, and software patching is a regular occurrence. The US government has FIPS 140-2, a standard for hardware or software that performs cryptography, and Common Criteria is an attempt by multiple governments to provide a framework for software security assurance. A plethora of software security tools are also available, with interesting and technical-sounding names like “static analysis tools” and “fuzzers”.
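To give a flavour of what a fuzzer actually does, here is a minimal mutation-based sketch; the `parse_ticket` function is a hypothetical stand-in for the software under test, not any real parser:

```python
import random

def parse_ticket(record: bytes) -> dict:
    # Toy parser standing in for the software under test (hypothetical).
    fare, zone = record.split(b";")  # raises ValueError if not exactly one ';'
    return {"fare": float(fare.decode()), "zone": int(zone.decode())}

def fuzz(parser, seed: bytes, iterations: int = 1000) -> list:
    """Randomly mutate a known-good input and record any inputs the parser rejects."""
    crashes = []
    rng = random.Random(0)  # fixed seed so runs are reproducible
    for _ in range(iterations):
        data = bytearray(seed)
        for _ in range(rng.randint(1, 4)):  # corrupt a few random bytes
            data[rng.randrange(len(data))] = rng.randrange(256)
        try:
            parser(bytes(data))
        except ValueError as exc:  # UnicodeDecodeError is a subclass of ValueError
            crashes.append((bytes(data), repr(exc)))
    return crashes

crashes = fuzz(parse_ticket, b"2.50;1")
assert len(crashes) > 0  # random mutation quickly finds malformed inputs
```

Real fuzzers add coverage feedback, corpus management and crash triage, but the core loop — mutate, run, record failures — is exactly this.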
That said, computer security practitioners eventually realise it is a matter of economics instead — as Allan Schiffman said in 2004, “amateurs study cryptography; professionals study economics.” It is not that we do not know how to build secure systems. However, if the customer cannot differentiate the security (or quality) of two otherwise seemingly identical products, the cheaper (and likely less secure) product will win. Similarly, most software is sold “as is”, meaning the customer assumes all the risk. See Bruce Schneier’s “Schneier on Security” or “Freakonomics” by Steven D. Levitt and Stephen J. Dubner for more discussion.
Others take a different direction and focus on the sociology and behavioural patterns behind security. Although most people recognise computer security is important, the first reaction of most when confronted with a security warning is to ignore or bypass it. After all, social engineering attacks are behind many of the headline-grabbing computer security incidents of the last few years, such as Operation Aurora and Stuxnet. See Bruce Schneier’s “Liars and Outliers” for more discussion.
That said, there is little impetus to improve computer security without people willing to push the boundaries of systems and reveal flaws. Kudos and credit to the students for not only discovering the public transport ticket format and its “encryption” mechanism (if true encryption was ever actually used) but also revealing it in a responsible (or “white hat”) manner, giving the state authority the opportunity to disclose the flaw itself. Remember that just because the students discovered the ticket format does not mean others have not already done so and used it for less than honest purposes.
Interestingly, the NSW public transport ticket system is approximately 20 years old, introduced when computers were less powerful and less accessible and the required security standards were much lower. I wonder how many systems being built today will be considered secure in 20 years’ time, when computers will presumably be exponentially more powerful and more ubiquitous and the security standard much higher.
A home-cooked encryption algorithm is the weakest link of a cryptographic system. People tend to believe their genius head can cook up a better algorithm than, say, RSA or AES. Part of the blame lies with the engineers and architects too, who should have proved their point. As they say, security never sells; it’s fear that sells. If an engineer knows there is a problem, sell its fear and the solution will get accepted.
All good points. However, I would argue that people are often the weakest link in any system, and I think non-standard cryptography is far less prevalent now because well-researched algorithms are accessible. With the near-ubiquity of SSL and the algorithms it relies on, for example, I would hope there is little need for anyone to implement their own.
As for fear selling, I think that also may be less so now, at least in computer security. It is still a political motivator, particularly in the USA, but the marketing of computer security firms now tends to focus on more positive and abstract points like “confidence” and “safety”. Similarly, computer security within organizations is primarily motivated by compliance. Fear and threats need actual incidents to back them up or they lose credibility, like Aesop’s fable “The Boy Who Cried Wolf”. Companies that have suffered major breaches, such as Sony and Hartford, have recovered from their losses and are still in business. As long as organizations can prove they did something or had some process in place, they can escape relatively unscathed from an incident, and this is the focus of the blog post, too.