Wired Magazine’s current issue discusses critical vulnerabilities in Brinks’ CompuSafe Galileo. On its website, Brinks describes the safe as “a closed-loop cash management solution that helps stores eliminate deposit discrepancies, reduce theft, and free staff from the time-intensive task of counting, recounting and auditing cash.” It also “eliminates deposit discrepancies and increases security.”

[Image: Retrofitting-security.jpg]

Unfortunately, in deference to ease of use, the safe has an external USB port on the side of the touch-screen (on the left-hand side, not visible in the photo). Anyone with a thumb drive thereby has an interface for launching a script-driven attack on the safe and on whatever the safe is connected to, which, thanks to Brinks’ enhanced customer services, could be a lot. At the Def Con hacker conference in Las Vegas in early August, security firm Bishop Fox demonstrated that it could not only open the safe and remove its money in 60 seconds, but also alter or obliterate all records on the back end.
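
Bishop Fox has not published its exploit code, but the design flaw it exploits is plain: the firmware treats anything arriving on the USB port as trusted input. Below is a minimal sketch, in Python for readability, of the opposite posture. The command allowlist, the device key, and the function names are all hypothetical; nothing here describes Brinks’ actual firmware.

    import hashlib
    import hmac

    # Hypothetical per-device secret provisioned at manufacture; an
    # assumption for illustration, not part of the CompuSafe's design.
    FIRMWARE_KEY = b"provisioned-device-secret"

    # Only harmless maintenance commands are reachable over USB. Opening
    # the door or editing records is simply not on the list.
    ALLOWED_COMMANDS = {"export_audit_log", "sync_time"}

    def verify_usb_command(name: str, payload: bytes, signature: bytes) -> bool:
        """Accept a USB-delivered command only if it is allowlisted and
        carries a valid HMAC-SHA256 over its name and payload."""
        if name not in ALLOWED_COMMANDS:
            return False
        expected = hmac.new(FIRMWARE_KEY, name.encode() + payload,
                            hashlib.sha256).digest()
        # Constant-time comparison avoids leaking the key byte by byte.
        return hmac.compare_digest(expected, signature)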

Security is usually classified as one of the NFR aspects of quality. NFR means “non-functional requirement.” I build you a door that opens and closes. If you want a key to lock it, that’s a non-functional requirement, because your door will open and close with or without the key. It’s a distinction that looks flimsier the longer you look at it, and it allows software houses to treat security, performance, usability, accessibility, and other NFRs as “extras.” But the name of this object, “safe,” suggests that at least one NFR is not extra at all. In fact, as Dan Petro of Bishop Fox says, “In the instance of the CompuSafe Galileo, digitalization hasn’t magically rendered it more secure. Rather, it’s accomplished the exact opposite: It’s turned a safe into an (un)safe.”

The CompuSafe has laudable functionality. Not only can money be counted, posted, and audited digitally, but Brinks offers customers a service that tracks all cash inputs locally, regionally, and internationally. But if Bishop Fox can alter and delete financial records on the back end from a thumb drive, Brinks’ own corporate assets are at risk. According to Rule #1 of security engineering, any system is only as strong as its weakest link. Like the Brinks CompuSafe, all devices in the IoT (Internet of Things) must be secure against malicious attacks. If devices are connected, every one of them must be secure, because connectivity means that from this device I can get to that device.
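
For the record-tampering half of the attack, one standard mitigation is to make the audit trail tamper-evident, so that altered or deleted entries are at least detectable. Here is a minimal sketch of a hash-chained deposit log; it is illustrative only, since Brinks has not published its back-end design.

    import hashlib
    import json

    GENESIS = "0" * 64  # fixed starting value for the chain

    def chain_records(records):
        """Link each record to the hash of its predecessor. Rewriting or
        deleting any entry breaks every hash that follows it."""
        prev, chained = GENESIS, []
        for rec in records:
            body = json.dumps(rec, sort_keys=True)
            digest = hashlib.sha256((prev + body).encode()).hexdigest()
            chained.append({"record": rec, "prev": prev, "hash": digest})
            prev = digest
        return chained

    def verify_chain(chained):
        """Return True only if every link still matches its contents."""
        prev = GENESIS
        for entry in chained:
            body = json.dumps(entry["record"], sort_keys=True)
            digest = hashlib.sha256((prev + body).encode()).hexdigest()
            if entry["prev"] != prev or entry["hash"] != digest:
                return False
            prev = entry["hash"]
        return True

Quietly changing an amount, or deleting a deposit outright, now fails verification at every subsequent entry.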

IoT designers must build that security into both hardware and software. Test-and-fix is a weak and expensive strategy because the fixes are not part of the original design. Japan’s poka-yoke approach to manufacturing, developed by Shigeo Shingo at Toyota in the 1960s and surfacing in America in the 1990s in the lectures and books of people like Boris Beizer and James Tierney, asks designers to block users from making mistakes (that’s the prevention part) and to make all tasks (i.e., functions) testable.
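
Applied to software, poka-yoke means designing interfaces on which the mistake cannot even be expressed, rather than detecting it downstream. A small hypothetical illustration (not CompuSafe code):

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Deposit:
        """A deposit that cannot be constructed in an invalid state
        (the prevention part) and is trivially testable."""
        cents: int

        def __post_init__(self):
            if self.cents <= 0:
                raise ValueError("deposit must be a positive number of cents")

    # The mistake (a zero or negative amount) is blocked at the boundary,
    # so every function that accepts a Deposit can assume it is valid,
    # and the single rule it enforces takes a two-line test to check.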

The problem facing the CompuSafe, and IoT in general, is that to design for security, engineers must work from attack scenarios as well as user scenarios. A USB port for thumb-drive access makes life a little easier for retail clerks and armored-truck drivers, but it makes life much, much easier for coding-savvy robbers and terrorists, a scenario Brinks appears not to have considered.
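
In testing terms, that means every user story should ship with a mirror-image abuse story. Continuing the hypothetical verify_usb_command sketch above, with a sign helper that re-creates the firmware’s HMAC for the tests:

    import hashlib
    import hmac

    def sign(name: str, payload: bytes) -> bytes:
        """Test helper: produce the signature the firmware expects
        (same hypothetical FIRMWARE_KEY as in the earlier sketch)."""
        return hmac.new(FIRMWARE_KEY, name.encode() + payload,
                        hashlib.sha256).digest()

    def test_clerk_can_export_audit_log():
        # User scenario: a documented, correctly signed command succeeds.
        assert verify_usb_command("export_audit_log", b"",
                                  sign("export_audit_log", b""))

    def test_thumb_drive_cannot_open_door():
        # Attack scenario: an unlisted command is rejected even when
        # correctly signed.
        assert not verify_usb_command("open_door", b"", sign("open_door", b""))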