Sunday, February 23, 2014
Apple, Microsoft, Google all face big system testing issues to make sure their code doesn't have security holes or "back doors" : Columbia University article on testing
I could reasonably put this story in either the Internet Safety or IT Jobs blog, but the following piece on systems testing from Columbia University in NYC shows how “bad logic” (in this case, a redundant “goto fail”) can leave an open back door for hackers, and also explains why it is so hard to detect in the normal system or beta testing “test plans” typically set up by system architects. The link is here.
The code is part of the TLS implementation in Apple’s iOS, an operating system for the iPhone and iPad. I expect I’ll get a mandatory update soon for my iPad to fix it. I have to be careful about these when I plan to take the iPad on a trip and depend on it for connectivity “on the road” (Jack Kerouac style).
Adam Langley has a detailed blog post on the Apple bug on his “Imperial Violet” blog, here.
The programming languages involved are C, mostly Objective-C, and sometimes Objective-C++. Objective-C is considered an object-oriented (OOP) language (Wikipedia here).
At ING-ReliaStar in Minneapolis, we coded screen emulations in C, the GUI in PowerBuilder, the Data Access Layer in Java (with data collectors and data manipulators), the “bridge” to the GUI in C++, and batch replications (in MVS OS/390 COBOL) from mainframe legacy systems, with some direct connect to DB2. It turns out that the person who ran my own ISP supporting my book wrote all the Javadoc for the data access layer, and it got called “Dan’s Web Site”. In fact, everybody knew the difference between that and “Bill’s Web Site” for the books, which was all flat HTML in those days.