Consider an airplane: the pilot relies heavily on software, and that software takes its input from sensors. The data is processed and presented to the pilot, who uses it to make navigation decisions; in autopilot mode, the system makes those decisions itself.
What happens if the system makes a bad decision? The result can be catastrophic.
The Boeing 737 MAX had a major bug in its system that killed more than 300 people in two crashes just a few months apart, leading governments to ground the model. The impact on the airlines, the business, and the public was enormous, and the domino effect was huge.
So how did this major bug go undetected, despite all the documentation, use cases, planning, and software testing?
Because no one on the team, from the planners to the developers to the testers, realized this situation could occur. Sometimes, as software engineers, we only discover a bug after deployment; it slips past us during development and during the testing period. Sadly, the bug that shipped on board the 737 MAX was a major one, one that involved lives.

As of today, 13 April 2019, Boeing has stated that it is testing an upgraded version of the software and trying to regain its customers' trust. How can customers trust them now? Boeing needs to simulate the conditions that led to the crashes and sign off that this will not happen again. If it does happen again, crash and burn will follow.
— Adam —