Lessons Will Be Avoided: A Critical Perspective on Software Development Not Having Learnt from Mistakes

Les Hatton

Emeritus Professor of Forensic Software Engineering at Kingston University, London, UK

A recurrent theme in software engineering is that it isn't soft and there is no engineering in it. This may be a little cruel, but it makes a perfect T-shirt. In truth, one of the principal defining characteristics of successful 'engineering' is that by analysing why it screws up, it learns from its mistakes. It is abundantly clear that, apart from isolated pockets of resistance, software 'engineering' does not. We are used to the mantra 'Lessons must be learned' accompanying political accounts of failed systems and procedures, only to be ignored later, but it is harder to understand why something nominally technical such as software development demonstrably fails to learn from its mistakes either. Using a number of case studies, I will try to understand whether this failure is pervasive throughout the software development hierarchy or is confined to suits with spreadsheets. These may be strong words, but they will resonate with anybody who has read the background to the Boeing 737 MAX debacle.

About Les Hatton

Les Hatton is Emeritus Professor of Forensic Software Engineering at Kingston University, London. Educated in mathematics at King's College, Cambridge (1967-1970) and at the University of Manchester, where he received his PhD in 1973 for his work on tornadoes, he was awarded the Conrad Schlumberger Award of the European Association of Geoscientists and Engineers in 1987. Later he became interested in software reliability but now wishes he hadn’t bothered. Apart from a spell as a peripatetic music teacher, he has published as a jobbing mathematician in meteorology, geophysics, computer science, sports biomechanics, computational reproducibility and, latterly, information theory in biology.
