Following an urgent request from our customer for the future mission capability rates of a particular aircraft population, we quickly responded with an analysis in an audience-tailored presentation. The analysis illustrated both the platform's deterioration over time and, based on current actuals, the high accuracy of the predictive models we had generated from past data. The team was tired, but the customer was happy and reassured!
On a Thursday afternoon, we received a call from our OEM customer requesting immediate analysis for an executive leadership meeting the following day. We held a quick telecon with our customer to discuss the problem and the format of the output. We were quite familiar with the analysis because we had performed a very similar task a year prior. The customer needed a time series forecast of mission capability rates by block (configuration group) of aircraft. The analysis would help give executives evidence to support their assertions that if improvements were not made within the program, readiness was going to decline.
The customer wanted the most recent dataset used to evaluate the mission capability differences between two blocks of aircraft, with the average mission capability of each block forecast five years into the future. They also asked us to retrieve the previous year's analysis and compare it to current data, so they could show their customer (the aircraft end user) how well we could predict future mission capability rates by block.
After receiving the new data, we realized that a large amount of work was needed to reformat it. The new data had not yet been cleaned and imported into a database, and it was missing attributes required to perform the analysis properly. We were able to mitigate these issues through a strong understanding of the platform and its data. Once the information was in order, we were able to start our analysis.
We started by aggregating the data to obtain an average mission capability by block. We considered adding standard deviations to show the distribution of values, but knew from previous experience that doing so would inhibit understanding for the target audience, so we decided against it. A plot of the data revealed a negative trend for both blocks of aircraft. Two models appeared applicable to the data: a linear model and an exponential model.
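The aggregation step can be sketched in plain Python. The record layout, block names, and rate values below are illustrative assumptions, not the program's actual data:

```python
from collections import defaultdict

# Hypothetical records: (block, month, mission-capable rate) tuples.
# Field names and values are made up for the sketch.
records = [
    ("Block A", "2024-01", 0.72), ("Block A", "2024-01", 0.68),
    ("Block B", "2024-01", 0.65), ("Block A", "2024-02", 0.70),
    ("Block B", "2024-02", 0.63), ("Block B", "2024-02", 0.61),
]

# Collect rates per (block, month), then average them -- this is the
# series that feeds the trend plot and the forecast.
grouped = defaultdict(list)
for block, month, rate in records:
    grouped[(block, month)].append(rate)

avg_mc = {key: sum(vals) / len(vals) for key, vals in grouped.items()}
for (block, month), rate in sorted(avg_mc.items()):
    print(f"{block} {month}: {rate:.3f}")
```

In practice the same grouping is a one-liner in a dataframe library, but the idea is identical: one averaged series per block over time.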
When fitting all of the data, the linear model performed better. However, when we split the data into training and test sets, the exponential model performed better on the test set. (A training set is used to fit a model; the model then predicts the values in the test set, which measures the accuracy of its predictions on data it has not seen.) The exponential model also made sense from a theoretical perspective: extended far enough, a linear model would eventually predict mission capability rates of zero and then negative values. Negative mission capability is not possible, so the linear model does not make sense from a real-world perspective. By contrast, an exponential model has an asymptote, which allows the prediction to stabilize around a positive, non-zero value. This behavior, along with the better predictive power, led us to choose the exponential model for the forecast.
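A model comparison along these lines can be sketched with NumPy and SciPy. The series values, the hold-out split, and the starting guesses are illustrative assumptions, not the program's data:

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative yearly mission-capability averages (fractions) for one
# block -- made-up values that decline and flatten out.
years = np.arange(10.0)
rates = np.array([0.850, 0.776, 0.724, 0.688, 0.662,
                  0.643, 0.631, 0.622, 0.615, 0.611])

# Hold out the last three points as a test set.
train_x, test_x = years[:7], years[7:]
train_y, test_y = rates[:7], rates[7:]

# Linear model: extended far enough, it predicts impossible negative rates.
slope, intercept = np.polyfit(train_x, train_y, 1)
lin_pred = slope * test_x + intercept

# Exponential-decay model with an asymptote c, so long-run predictions
# level off at a positive floor instead of going negative.
def exp_model(t, a, k, c):
    return a * np.exp(-k * t) + c

params, _ = curve_fit(exp_model, train_x, train_y, p0=(0.25, 0.3, 0.6))
exp_pred = exp_model(test_x, *params)

# Compare mean absolute error on the held-out points.
lin_err = float(np.mean(np.abs(lin_pred - test_y)))
exp_err = float(np.mean(np.abs(exp_pred - test_y)))
print(f"linear test MAE: {lin_err:.4f}  exponential test MAE: {exp_err:.4f}")
```

On a flattening series like this one, the straight line keeps falling past the test points while the exponential levels off near its asymptote, which is exactly the behavior that favored the exponential model in the analysis.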
Since we had performed a similar analysis the year before, we decided to take the previous year's forecast and show our customer how well its predictions had held up. As it turned out, our predictions were only 3%-4% off the actual values on average, which made our customer very happy. This gave them confidence in the new forecast when presenting to their leadership, and gave their leadership even more evidence to support their position with the end user.
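This kind of back-test amounts to a mean absolute percentage error between last year's forecast and the actuals that later materialized. A minimal sketch, with made-up numbers in place of the real series:

```python
# Illustrative numbers only: last year's forecast for four periods
# versus the actuals observed since then.
forecast = [0.710, 0.690, 0.670, 0.660]
actual   = [0.735, 0.715, 0.695, 0.680]

# Mean absolute percentage error: on average, how far off were we?
mape = sum(abs(f - a) / a for f, a in zip(forecast, actual)) / len(forecast)
print(f"average forecast error: {mape:.1%}")
```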
In addition to the forecast, we found a significant anomaly in the trend. In the last year of the data, the entire fleet had deviated significantly from the previous three-year trend; the previous year's analysis would have been even more accurate had this anomalous behavior not existed. It appeared that part of the fleet had dropped an average of 5% in mission capability and would not be returning to its previous range. As we did not have time to explore this behavior before our customer's meeting the following day, we noted the deviation on the graph and developed the final report.
The work had taken us several hours into the night, but we knew the report would probably need to be reworked by the customer in the morning. Typically, our reports include comments and additional details that are very useful for our customer but too much detail for their leadership. The details give our customer more confidence when presenting to their leadership, so they are necessary, but reworking the slides takes time. We decided to create a slide deck with graphs of the current performance, the future projections, and the quality of the previous year's forecast. We also noted the trend deviation in mission capability in the last year. We produced an Excel file of all the data because the customer was more comfortable with Excel; this allowed them to change the graphs quickly without needing our input. We also included an email that detailed the analysis and highlighted the need to investigate the trend deviation in the last year. The trend deviation was especially important to us because our report was going to become evidence supporting the notion of a persistent deterioration of availability within the fleet. We felt that if we could find out why there was a significant loss in mission capability, we could find a solution to help increase the performance of the platform.
Early the next morning, we conducted a meeting with the customer to review the results. The customer was very pleased with the results and understood the output clearly. In addition to the block forecasts, the customer requested a forecast of the entire fleet. The analysis for the fleet was quite simple, as the data only needed to be aggregated one level higher. The data and graphs were returned quickly, and the customer was able to meet with their leadership to discuss the results.
After a week we reached out to the customer to review the outcome from the executive meeting. They were delighted to inform us that the executive leadership received exactly what they desired and our customer was praised for the quick turnaround. In addition to our customer receiving praise, executive leadership worked quickly to reward our team for the work we had performed.