There's a notion going around that ...
“I don’t believe you can make a model of a possible future.”
This ignores the principles of predictive analytics and the direct management actions taken to produce outcomes from analytical models, found everywhere from project management to grocery store management to model-based systems engineering. Those holding this view are really saying: we don't understand what models are for, how they're built, or how to apply them to model possible futures, and we haven't looked beyond personal anecdotes.
All Models are Wrong, Some Models are Useful - George E. P. Box
This quote is used many times to avoid asking and answering questions about models, forecasting, and the assessment of possible future states of systems - a project being a system. The actual quote is from "Science and Statistics," George E. P. Box, Journal of the American Statistical Association, December 1976, pp. 791-799. The book that contains that paper and provides the specific approach to modeling the future is Statistics for Experimenters: An Introduction to Design, Data Analysis, and Model Building, George E. P. Box, William G. Hunter, and J. Stuart Hunter. You can get this book on Amazon for $10.00; read it and confirm that the opening conjecture is a fallacy.
What George Box says about models is summarized as
2.3 Parsimony
Since all models are wrong the scientist cannot obtain a "correct" one by excessive elaboration. On the contrary, following William of Occam, he should seek an economical description of natural phenomena. Just as the ability to devise simple but evocative models is the signature of the great scientist so overelaboration and overparameterization is often the mark of mediocrity.
Modeling is at the heart of Program Planning and Controls. PP&C lives in the domain of Project Management and Controls, which is one of the Technical Management Processes of ISO 15288.
It's the Planning and Controls processes, working together with the Technical Development processes, that provide the ability to forecast what should happen in the future if we keep going as we are going now, using a Model of the possible future outcomes. What will happen, of course, depends on externalities, some under our control, some not. Those under our control need to be part of the model. Those not under our control need alternative plans should they come true. In ISO 15288 this is the role of the Risk Management Processes.
A second Critical Success Factor is the ability to predict what will happen in the future, given the model of the project's activities and risks, the designed alternatives, and the emerging designs and external processes.
It's the creation of a Model of the future that is the starting point for increasing the probability of project success. This model then steers the project toward success. Data from the past is useful, but that data is just that - data from the past. It can inform the decision makers about the validity of past decisions, but it must be applied to a model of the future to be of any use in informing the decision makers about possible future outcomes, so they can make choices before that future arrives.
The paradigm used to deliver actionable information to the decision makers through this model of the Future is Predictive Analytics.
The model of the project or program tells us what the cost, schedule, and performance - Effectiveness and Performance - need to be for the project to be a success. This is the desired outcomes model. The supporting model is the possible outcomes model, which is developed by applying Risk Management to the desired outcomes model to create a model of a possible future.
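A possible outcomes model can be sketched with a simple Monte Carlo simulation over the desired outcomes model. The tasks, durations, and triangular risk ranges below are hypothetical illustrations, not from any real program:

```python
import random

# Hypothetical desired-outcomes model: planned task durations in weeks.
planned = {"design": 8, "build": 12, "test": 6}

def sample_duration(plan):
    # Risk-adjust each task with a triangular distribution:
    # best case 90% of plan, most likely the plan, worst case 150% of plan.
    return random.triangular(0.9 * plan, 1.5 * plan, plan)

def simulate(trials=10_000):
    # Each trial is one possible future; together they form the
    # possible outcomes model derived from the desired outcomes model.
    return sorted(sum(sample_duration(p) for p in planned.values())
                  for _ in range(trials))

totals = simulate()
planned_total = sum(planned.values())                    # 26 weeks
p_success = sum(t <= planned_total * 1.1 for t in totals) / len(totals)
p80 = totals[int(0.8 * len(totals))]                     # 80th-percentile finish
print(f"Plan: {planned_total} wk; "
      f"P(finish within 110% of plan) = {p_success:.0%}; "
      f"80% confidence finish = {p80:.1f} wk")
```

The output is not a single-point estimate but a distribution of possible futures, which is what the decision makers need to set margins and management reserve.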
So the open question is...
Which comes first? Past performance data or the Model of the needed Future performance of the project?
Performance Analysis and Prediction
Cost, schedule, and technical outcomes are the primary measures of success for the future outcomes of projects. These characteristics can be considered along with other dimensions of project performance.
Understanding the needed, current, and possible cost, schedule, and technical performance, and the current and possible approaches to achieving and maintaining these performance values, is the first step toward improving them. For approaches that have been implemented on existing systems, obtaining such understanding may require measurement and analysis. For scenarios where the project under consideration does not yet exist, performance prediction using analytical modeling or simulation is necessary.
These performance measures, along with other factors impacting project performance, can all be modeled [2]. Modeling of projects with Systems Dynamics can be done with freely available tools like Vensim (www.vensim.com).
With a Systems Dynamics tool, models of how the project works can easily be built, simulated, and assessed to produce the probability of success. This model can be built BEFORE there is data from past projects. The outcomes from the simulation can then be compared to current data and data from other projects, and used to produce future data to guide the management of the project. [18] Here's an example of the impacts on project performance in a Systems Dynamics model that can be used to forecast future cost, schedule, and technical performance.
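To make the idea concrete, here is a minimal stock-and-flow sketch of the classic project rework cycle, in the spirit of a Vensim model but coded directly. All rates (staff productivity, error rate, rework discovery rate) are hypothetical:

```python
# Stocks: work_to_do, undiscovered_rework, work_done.
# Flows: completion, flawed work entering the rework pool, discovered rework
# returning to the to-do stock. Euler integration with a 1-week time step.

def simulate_project(scope=1000.0, staff=10, productivity=1.0,
                     error_rate=0.15, rework_discovery=0.05, dt=1.0):
    """Simulate weekly until the remaining work is (nearly) exhausted."""
    work_to_do, undiscovered_rework = scope, 0.0
    work_done, week = 0.0, 0
    while work_to_do + undiscovered_rework > 1.0 and week < 500:
        completion = min(staff * productivity * dt, work_to_do)
        flawed = completion * error_rate          # looks done, but isn't
        discovered = undiscovered_rework * rework_discovery * dt
        work_to_do += discovered - completion
        undiscovered_rework += flawed - discovered
        work_done += completion - flawed
        week += 1
    return week

naive = 1000 / (10 * 1.0)        # 100 weeks if there were no rework loop
with_rework = simulate_project()
print(f"Naive estimate: {naive:.0f} wk; with rework dynamics: {with_rework} wk")
```

Even this toy model shows why a static, linear schedule understates duration: the feedback loop of undiscovered rework stretches the finish well past the naive capacity calculation, and it does so before any actual performance data exists.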
Model-Based Project Management
Let's start with a definition
- Systems Engineering is an interdisciplinary, holistic approach to realize successful systems. It often involves a combined effort of a team of professionals from different disciplines and backgrounds. [1]
The life-cycle processes of a project are defined by ISO 15288.
The Project Planning Process is where project management is anchored. The interactions between project management and Systems Engineering are shown here [1]. It's in Project Management and Control that forecasts of future cost, schedule, and technical performance take place. This starts with a Model of what these attributes need to be for the project to be a success.
Measuring Progress to Plan and Forecasting (Predicting) The Future
The traditional approaches to project performance management and the production of the Estimate to Complete and Estimate at Completion make use of past performance. This is the basis of Earned Value Management: past performance is used to produce a Cost Performance Index (CPI) and a Schedule Performance Index (SPI), and those indices are used to forecast future performance. This approach uses past performance to predict future performance.
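The standard EVM forecasting arithmetic looks like this, with illustrative (hypothetical) numbers:

```python
# Standard Earned Value Management forecasting formulas.
BAC = 1_000_000   # Budget at Completion ($)
EV  = 400_000     # Earned Value: budgeted cost of work performed
AC  = 500_000     # Actual Cost of work performed
PV  = 450_000     # Planned Value: budgeted cost of work scheduled

CPI = EV / AC     # 0.80 -> getting $0.80 of work for every $1.00 spent
SPI = EV / PV     # ~0.89 -> behind schedule

ETC = (BAC - EV) / CPI                         # Estimate to Complete
EAC = AC + ETC                                 # Estimate at Completion (= BAC / CPI here)
EAC_combined = AC + (BAC - EV) / (CPI * SPI)   # cost- and schedule-weighted variant

print(f"CPI={CPI:.2f} SPI={SPI:.2f} "
      f"EAC=${EAC:,.0f} EAC(CPI*SPI)=${EAC_combined:,.0f}")
```

Note that every input to these formulas is past performance; the formulas simply extrapolate the recorded indices forward, which is exactly the limitation the next paragraphs address.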
But those indices are applied to a static model of the project - the Performance Measurement Baseline (PMB). This static model is usually encapsulated in the Integrated Master Schedule (IMS). The IMS is then baselined in the PMB. Changing the PMB is discouraged in the EVM compliance world. But the IMS and its structure IS the model of the project's work. The IMS shows what work needs to be performed, in what order, and what the measures of performance and effectiveness must be for success. But when we treat this model as static - baselined - we are missing a powerful tool for keeping the project moving toward those measures of effectiveness and performance.
This is a fundamental problem with traditional project management processes: we focus on cost and schedule compliance first, and only then on Effectiveness and Performance compliance.
The project work sequence, planned and baselined, may be considered static in principle, but in practice all project work is dynamic, since this work operates in the presence of reducible (Epistemic) and irreducible (Aleatory) uncertainties, with changes needed to respond to the emerging conditions of the project. In the construction business, this is well understood. In our aerospace and defense business - not so much.
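The two uncertainty classes are handled differently in a model of the future. A minimal sketch, with entirely hypothetical risk registers and spreads, shows the distinction:

```python
import random

# Aleatory: irreducible natural variability of the work itself (+/-10%).
ALEATORY_SPREAD = 0.10
# Epistemic: discrete risk events, reducible through handling plans.
RISKS = [
    {"name": "vendor slip",  "probability": 0.30, "impact_weeks": 4},
    {"name": "test failure", "probability": 0.15, "impact_weeks": 6},
]
BASELINE_WEEKS = 40

def one_trial():
    # Aleatory: duration varies no matter what we do; only margin protects us.
    duration = random.gauss(BASELINE_WEEKS, ALEATORY_SPREAD * BASELINE_WEEKS)
    # Epistemic: each risk either occurs or it doesn't; buying down the
    # probability (better vendor, earlier testing) changes this part of the model.
    for risk in RISKS:
        if random.random() < risk["probability"]:
            duration += risk["impact_weeks"]
    return duration

trials = sorted(one_trial() for _ in range(20_000))
p50 = trials[len(trials) // 2]
p80 = trials[int(0.8 * len(trials))]
print(f"Median finish {p50:.1f} wk, 80% confidence {p80:.1f} wk")
```

The design point is that epistemic entries can be removed or reduced by management action (they are reducible), while the aleatory spread can only be covered with margin - which is exactly the split ISO 15288's Risk Management Processes are meant to manage.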
So for the chicken-or-the-egg question, the answer has to be that the model comes first, since the model is the to-be future performance needed to be successful. Then data from past performance can be used to modify the model. Our traditional approach - in EVM - is to use past data to forecast the future. This new paradigm - predictive analytics - uses the model to guide the work to success.
The predictive analytics approach provides leading indicators of the work processes, rather than reporting past performance. But these leading indicators must inform the model of the project to show what needs to be changed in the execution of the work, according to the model, to arrive at success. This means we need a model to start, and only then make use of past performance and model-produced performance forecasts.
This means comparing current performance against the predicted performance and revealing where to change the execution processes to keep the program Green. This approach goes beyond just the reporting processes. It means using the model of the project to provide corrective actions to keep the program Green.
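In practice this comparison can be as simple as checking actuals against the model's predicted confidence band and flagging where corrective action is needed. The band values and actuals below are hypothetical model outputs and observations:

```python
# week -> (low, high) % complete predicted by the model's confidence band
predicted_band = {
    4: (10, 15), 8: (22, 30), 12: (38, 48), 16: (55, 65),
}
actuals = {4: 12, 8: 24, 12: 35, 16: 50}   # observed % complete

def status(week, value):
    """Leading-indicator check of one observation against the model."""
    low, high = predicted_band[week]
    if value < low:
        return "RED: below model band - corrective action needed"
    if value > high:
        return "AHEAD: above model band - re-examine the forecast"
    return "GREEN: within model band"

for week, value in sorted(actuals.items()):
    print(f"week {week:2d}: {value}% -> {status(week, value)}")
```

The trigger for action is deviation from the model of the future, not a backward-looking index - the model says where the project needs to be, and the variance says what to fix now.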
Build the Model First, then identify the data needed to confirm the model and the data needed to take corrective and preventive actions to assure the project turns out like the Model.
With this approach, we can cause the outcomes to be what we want them to be when we can control the Epistemic and Aleatory uncertainties. (This, of course, makes a huge assumption: that we've identified all the uncertainties, have corrective or preventive action plans, and that these plans are effective.) But this is a principle as a starting point.
It's the Ontological uncertainties that cause projects to fail - uncertainties we didn't see coming.
Resources for Model-Based Design and Forecasting
Some of these resources require memberships (IEEE, INCOSE). Links are provided for those that are not so easy to find; Google will find the rest. Each of these resources has a bibliography that can be followed further. This is the basis of the first course in graduate school - research methods - where before any idea is considered for study or any opinion is considered credible, a literature search is required. Only then will your colleagues or professors consider your opinion.
- Systems Engineering Handbook: A Guide for System Life Cycle Processes and Activities, Fourth Edition, INCOSE, John Wiley & Sons
- "Modeling Project Behaviour," Alberto De Marco, Time Management, Methodology, Cost Control, Skill Development, 2006.
- "Introduction To Model-Based System Engineering (MBSE) and SysML," Laura E. Hart, Delaware Valley INCOSE Chapter Meeting, July 30, 2015.
- "Agile Model-Based Systems Engineering," Bill Schindel and Rick Dove, 26th Annual INCOSE International Symposium.
- "Simulation of Strategic Management Process of Software Projects: A Dynamic Approach," Masood Uzzafer, University of Nottingham, 2014.
- "Is There a Complete Project Plan? A Model-Based Project Planning Approach," Amira Sharon, Olivier de Weck, and Dov Dori, INCOSE International Symposium, Session 1, Track 1, Model-Based Systems Engineering, 2014.
- "Model-Based Systems Engineering and SAFe"
- "Model‐Based Project‐Product Lifecycle Management and Gantt Chart Models: A Comparative Study," Amira Sharon and Dov Dori, Systems Engineering, 20, 5, (447-466), (2017). Wiley Online Library
- "Interactive Models as a System Design Tool: Applications to System Project Management," Paul T. Grogan, Olivier L. de Weck, Adam M. Ross, and Donna H. Rhodes, Procedia Computer Science, 44, (285), (2015).
- "A Project–Product Model-Based Approach to Planning Work Breakdown Structures of Complex System Projects," A. Sharon and D. Dori, IEEE Systems Journal, 9, 2, (366), (2015).
- "Introduction to the Agile Systems Engineering Life Cycle MBSE Pattern,"
- "Improving Project-Product Lifecycle Management with Model-Based Design Structure Matrix: A joint project management and systems engineering approach," Amira Sharon, Olivier de Weck, and Dov Dori, Systems Engineering 16(4), pp. 413-426, 2013.
- "Simulation-Based Acquisition: An Affordable Mechanism for Fielding Complex technologies," Dr. Patricia Sanders, Program Management, September - October 1997, pp. 72-77.
- "Modeling the Management of Systems Engineering Projects," Daniel Spencer and Shaun Wilson, Aerospace Concepts, DTSO-GD-0734, Proceedings of the 2012 Model-Based Systems Engineering Symposium, 27-28 November 2012.
- "Estimating Models for Program Management," Norman Womer and Jeff Camm, Final Technical Reports, N00014-00-1-0280, University of Mississippi, 2003.
- "STARDUST: Implementing a New Manage-to-Budget Paradigm," Bredt Martin, Kenneth Atkins, Joseph Velingam and Rick Price, 1998 IEEE Aerospace Conference. (Rick Price is a colleague I work with at Lockheed Martin on several space flight programs)
- "Application of Model-Based Systems Engineering Methods to Development of Combat System Architectures," John M. Green, 6th Annual Acquisition Research Symposium of the Naval Post Graduate School: Volume 1: Defense Acquisition in Transition, May 13-14, 2009.
- "Prediction of Project Performance - Development of prediction model for predicting future performance of an OG&C project in EPC environment,” Naresh Kaushik, Thesis Report, Delft University.
- Visualizing Project Management: Models and Frameworks for Mastering Complex Systems, Kevin Forsberg, Hal Mooz, and Howard Cotterman
- "The Prediction of Success in Project Management – Predictive Project Analytics," Jochen Fauser, Markus Schmidthuysen, and Benjamin Scheffold, BDU
- "A Predictive Analytics Approach to Project Management: Reducing Project Failures in Web & Software Development Projects," Tazeen Fatima, Int'l Conf. Data Mining | DMIN'17
- "Predicting Project Success in Construction Using an Evolutionary Gaussian Process Inference Model," Min-Yuan Cheng, Chin-Chi Huang, and Andreas Frankie Van Roy, Journal of Civil Engineering and Management, January 2014.
- "One Model, Many Interests, Many Views," Zane Scott and David Long, Vitech Whitepaper, 2018, www.vitech.com
- "The Systems Perspective," Zane Scott, Vitech Whitepaper, 2016, www.vitech.com
- "Applying the Principles of Systems Dynamics to Project Risk Management or 'The Domino Effect'," Mark Gray and Azin Shahidi, Risk Management, Quality Management, Methodology, Complexity 22 October 2011.
- "Modeling and Simulation of Project Management through the PMBOK Standard Using Complex Networks," Luz Stella Cardona-Meza and Gerard Olivar-Tost, Complexity, Volume 2017.
- “Developing analytics models for software project management,” Morakot Choetkiertikul, University of Wollongong, September 9, 2018.
- "A review of analytical models, approaches and decision support tools in project monitoring and control," Oncu Hazir, International Journal of Project Management, September 2014, 33(4)
Background on Probability and Statistics Used for Model-Based Design
Modeling projects is based on the probability and statistics of emerging stochastic processes - all project work is driven by uncertainty, Epistemic and Aleatory, that creates reducible and irreducible risk. Here are some resources on probability and statistics as applied to project performance forecasting. So again ...
When you hear someone conjecture, "I don't believe you can make a model of a possible future," request that they produce the evidence in support of that conjecture. No evidence? Then it's just an unsubstantiated personal opinion.
- How to Lie with Statistics, Darrell Huff - this is the first book you should have on your shelf, no matter what role you play at work or in life.
- Discover Probability: How to Use It, How to Avoid Misusing It, and How It Affects Every Aspect of Your Life, Arieh Ben-Naim
- Chances Are: Adventures in Probability, Michael Kaplan and Ellen Kaplan
- Bayesian Methods for Hackers: Probabilistic Programming and Bayesian Inference, Cameron Davidson-Pilon
- Reasoning about Uncertainty, Joseph Y. Halpern
- Predictably Irrational: The Hidden Forces That Shape Our Decisions, Dan Ariely
- How to Take a Chance, Darrell Huff and Irving Geis
- Dueling Idiots and Other Probability Puzzlers, Paul J. Nahin
- Probability and Statistics, Julius Blum and Judah Rosenblatt - this was a graduate school text that got me started on the path of applying probability and statistics to everything I do.
- Statistics for Experimenters: An Introduction to Design, Data Analysis, and Model Building, George E. P. Box, William Hunter, and J. Stuart Hunter - this is the book where Box states All Models are Wrong, Some are Useful. This is another grad school book that set me on my way to understanding how to apply statistics in decision making.
- Regression Analysis by Example, Samprit Chatterjee and Ali S. Hadi
- Applied Regression Analysis Third Edition, Norman Draper and Harry Smith
- Forecasting: Methods and Applications, Spyros Makridakis, Steven C. Wheelwright, and Rob J. Hyndman - in our program planning and controls domain ARIMA (Autoregressive Integrated Moving Average) is a powerful tool for forecasting future project behaviors based on past performance.
- How to Predict the Unpredictable: The Art of Outsmarting Almost Everyone, William Poundstone - this book is the practical side of Kahneman and Tversky's representativeness behavioral bias.
- Probability Methods for Cost Uncertainty Analysis: A Systems Engineering Perspective, Paul R. Garvey - this book is fundamental to managing cost and the supporting schedules on all the projects I work. Systems Engineering is the basis of all we do, so this book enables the execution of the programs driven by Systems Engineering.
- Practical Nonparametric Statistics, W. J. Conover - Nonparametric statistics is a statistical method where data is not required to fit a normal distribution. Nonparametric statistics uses data that is often ordinal, meaning it does not rely on numbers, but rather a ranking or order of sorts. This is found many times in project work around risk management and other performance measures that are ordinal rather than cardinal.
- Standard Deviations: Flawed Assumptions, Tortured Data, and Other Ways to Lie with Statistics, Gary Smith - as a companion to How to Lie with Statistics, this book shows how we make fallacious decisions based on bad statistics.
- Guesstimation: Solving the World's Problems on the Back of a Cocktail Napkin, Lawrence Weinstein and John A. Adam
- Advanced Statistics DeMystified, Dr. Larry J. Stephens - this is a good introduction to applying statistics to engineering problems.
- Principles of Statistics, M. G. Bulmer - a classic statistics book
- The Practically Cheating Statistics Handbook: The Sequel (2nd Edition) - this is a handy book when you encounter the need to solve a problem without actually having to do the heavy work
- Flaws and Fallacies in Statistical Thinking, Stephen K. Campbell - we all make fallacious decisions based on statistics or know people who do. This is one of those mandatory reading books that will show why those fallacies exist and how to avoid them. Especially on our projects where uncertainty abounds.
- Statistics: A Very Short Introduction, David J. Hand - a handy book with summaries of all the principles.
- Probability, Statistics, and Queuing Theory with Computer Science Applications, Arnold O. Allen - this was a grad school text as well. Long before Kanban (actually a queuing method), we took this course for performance management of particle accelerator data flowing from a remote site onto our data server over a slow dial-up line in the late '70s.
- Introduction to Stochastic Processes, Paul G. Hoel, Sidney C. Port, and Charles J. Stone - this is a graduate school text where I learned everything in the universe is a non-stationary stochastic process.
- The Art of Modeling Dynamic Systems: Forecasting for Chaos, Randomness and Determinism, Foster Morrison - when you hear we can't forecast the future, read this book to find out how that is a fallacy and how to do it with the proper limitations of available modeling tools and processes in our engineering domain.
- Introduction to Stochastic Models, Second Edition, Roe Goodman - another stochastic process book needed for managing project work in the presence of uncertainty