High reliability knocks down the door.

Updated: Jul 3, 2018

My first experiences with programming computers involved punched cards, which were loaded onto mainframe computers such as the IBM 360/370s, machines that used large arrays of core memory. The picture below shows a piece of core memory with about 100 rings, each ring laced with three wires. That comprises a modest 100 bits of memory.

Core memory, about 100 bits.

These technologies were largely obsolete by the time I arrived on the computing scene. While in college I worked for a bank, operating a rather large mainframe computer, the NCR-8500 "Criterion". Most weekends I could be found in the bank running Saturday's business, monitoring the ATMs, printing statements, or sorting checks. Those tasks consumed most of Saturday. By the time Sunday came around I could work on homework or just experiment with programming languages such as FORTRAN, NCR NEAT/3, RPG II, or "bare metal" assembly languages of various types. Over about ten years I used these kinds of systems in banking and aerospace applications.


Then I closed my eyes one day, and those hardware technologies were obsolete. I eventually found myself working for an oil company in Houston, developing pipeline control software on microprocessors that were only a fraction as powerful as either the IBM 360/370 or the old NCR-8500. This was the early 1980s, and these computers used Intel's newest processors: 8080s, 80186s, and the new "hot rod" 80286. What these computers lacked in power they made up for in size; in other words, they were small and compact. This was also the time that the legendary Richard Stallman launched the GNU "free software" project at MIT. Like the dinosaurs watching the extinction meteor cut across the night sky, I looked up and did not comprehend what was happening.


Then I again closed my eyes one day, and those technologies of the 1980s were gone. But not the GNU Project; like Mt. St. Helens, it was a volcano with a growing magma pool of "free software" simmering under the surface, waiting to explode onto the scene.


I eventually found myself owning a small consulting company in Cincinnati, designing electronic systems, firmware, and device drivers using ever more powerful computers and ever smaller machines. There the consultancy operated, with a dozen-odd employees and a half dozen regular customers -- all oblivious to the danger from the growing pool of "free software" residing in that hidden magma chamber.


As we rolled into the second decade after the new millennium, I initially felt that the size, weight, and power of these computer systems would be the driving factors of customer success and professional advancement. After all, we could now purchase for a buck a microprocessor (e.g. many an ARM Cortex-M) with the same power as the NCR-8500 or the IBM 360/370. The reduction of size and weight, with quantum leaps in computing power, had made what was difficult in the past look easy in the present. I should have kept my eyes open, because something remarkable happened, and until recently I missed it.


I opened my eyes and discovered that the GNU project had exploded onto the stage. The narrative of free software (and other forms of free intellectual property) was an attraction impossible for many companies to resist. And to those not experienced in the world of technology, "free" also means "devoid of value". Some industry players became lemmings, running their technology organizations off the cliff in order to reduce their populations of engineers.


In what seemed to be a mass hysteria of cost cutting, many solid and respected industry players sent high-value engineering work to third-world venues, where some holders of advanced university degrees proved less capable than some in the US with only high school diplomas. (P.S. -- If you doubt this narrative, check the academic credentials of Messrs. Bill Gates, Mark Zuckerberg, and Thomas Edison.) In earlier decades, US firms had largely used engineering talent emerging from the third world judiciously (e.g. the emergence of engineering talent in Taiwan in the 1980s, or in Japan in the 1960s), but going into the new millennium this was too often not the case.


Too often, key decision makers developed a mindset in which engineers were seen as interchangeable parts -- parts whose only distinguishing characteristic was cost. Sadly, I saw many fine business ventures suffer greatly because of this mindset. Generally these were businesses where engineering was deprived of a voice "at the table" of decision making. It was not a problem of engineering, nor a problem of talent from third-world venues; it was a problem of decision making.


In the world of today (summer 2018), many excellent GNU and open-source systems (e.g. Linux, BeagleBone, Git, and more) live side by side in the online world with vast numbers of electronic and software systems of less quality than a 6th grade volcano science project. Using these low-quality systems won't get value into customer products, into engineering consultancies, or into engineering salaries. Value in software systems is often no longer found in the acquisition price, because that is now "free" (thank you, Richard Stallman). From my viewpoint, the future value of electronic and software systems no longer derives from the software or electronic designs themselves, but rather from the evidence of quality which accompanies those designs. That evidence takes many forms: test procedures and test reports, qualification, demonstration, analysis, requirements, reviews, and numerous others.
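
To make "evidence of quality" concrete, here is a minimal sketch of one such artifact: a unit test traced back to a written requirement, of the kind that verification regimes like DO-178C or IEC 62304 expect to accompany a design. The requirement ID, limit values, and function names below are hypothetical, invented purely for illustration.

    /*
     * REQ-PUMP-042 (hypothetical): the controller shall command the
     * relief valve open when line pressure exceeds 900 psi.
     */
    #include <assert.h>
    #include <stdbool.h>
    #include <stdio.h>

    /* Unit under test: decides whether the relief valve must open. */
    static bool relief_valve_should_open(int pressure_psi)
    {
        return pressure_psi > 900;
    }

    /* Test case TC-PUMP-042-01: exercises REQ-PUMP-042 at its boundary. */
    int main(void)
    {
        assert(!relief_valve_should_open(900)); /* at the limit: stays closed */
        assert(relief_valve_should_open(901));  /* above the limit: must open */

        /* This record is the raw material of a test report. */
        printf("TC-PUMP-042-01 PASS (verifies REQ-PUMP-042)\n");
        return 0;
    }

The point is not the ten lines of logic; it is the traceable chain from requirement to test case to recorded result. That chain is where the value now lives.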


Regardless of whether you're designing a medical device, an autonomous robot, or just a desktop printer, evidencing quality through the methods of high-reliability engineering is now the key to building value and enabling success.


I'll discuss this more in future posts, but in the meantime, if you have an interest, you can explore online discussions around standards such as RTCA's DO-178B/C (for aviation software), DO-254 (for aviation hardware), or IEC 62304 (for medical device software). And in the interim, I think I'll keep my eyes open for a while. SC


© 2018 by Constellation Data Systems, Inc.