These repositories support the Bayesian Probability, Analysis and Causality courses at the Bauer Graduate Business School and HP Enterprise Data Science Institute.
Enterprises operate in a complex, dynamic economy. The scope of analyses includes processes throughout the value chain, considering customer, competitor, supplier, product, and many other dimensions and drivers (drivers being dimensions that have a significant effect on, and possibly cause changes in, operational outcomes).
Data sources range from structured transactions and databases, to data stored in documents and audio/video files, to the opinions and intuition of experienced experts. Regardless of data source and quality, analysts must build models that integrate relevant dimensions and evaluate their effect and reliability. Bayesian analysis is both a philosophy and a methodology for building models that integrate all of this data, updating as new data are created. The tools of a Bayesian analyst include:
- Integration of Opinions, Experience, and Evidence. Often a model requires information that is not well defined or does not yet exist (e.g., new products or markets). Expert opinion brings a dimension to models that data often cannot capture; it may be less reliable, but more relevant. Beliefs, experience, and evidence can be integrated into probabilistic models that quantify relevance and reliability, and evolve as new data are observed and integrated.
- Explanatory Parameters. Bayesian models produce effect parameters with probabilities that can be used to assess causality and reliability, build projections, and evaluate scenarios or interventions. This addresses a major weakness of machine learning and AI applications, which are good at producing accurate predictions but not at explaining the causes behind them.
- Multilevel Analysis with Effects. Business value chains and transaction environments are almost always multilevel/hierarchical, and effects cascade in ways that must be modeled by level to understand the options for change or intervention (another major weakness of machine learning and AI).
- Granular Control over Model Design and Tuning. Model effects can be tuned (or generalized) down to specific levels and parameters.
- Bayesian Updating. Bayesian models have a unique structure that can quickly integrate new data, often in near real time. This makes a huge difference in adjusting models to business dynamics (a minimal sketch of updating follows this list).
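
As a rough illustration of the first and last points above, here is a minimal sketch of conjugate Beta-Binomial updating in Python. The defect-rate scenario, the prior values, and the use of SciPy are illustrative assumptions, not course material:

```python
import numpy as np
from scipy import stats

# Expert belief: defect rate around 10%, worth roughly 20 prior "observations".
# Beta(2, 18) has mean 0.10 -- the prior encodes the expert's opinion.
prior_a, prior_b = 2.0, 18.0

# First batch of evidence: 4 defects in 50 inspected units.
defects, inspected = 4, 50

# Conjugate update: posterior is Beta(a + defects, b + non-defects).
post_a = prior_a + defects
post_b = prior_b + (inspected - defects)

posterior = stats.beta(post_a, post_b)
print(f"posterior mean defect rate: {posterior.mean():.3f}")
print(f"90% credible interval: {posterior.ppf([0.05, 0.95]).round(3)}")

# A second batch arrives: the posterior simply becomes the new prior.
defects2, inspected2 = 1, 30
post_a += defects2
post_b += inspected2 - defects2
print(f"updated mean after new data: {stats.beta(post_a, post_b).mean():.3f}")
```

The same pattern scales to the non-conjugate models used in the courses; only the update step (sampling rather than a closed-form formula) changes.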
Site Repositories (cloning is recommended, as the content is frequently updated):
- Foundations. Basic statistics and linear algebra concepts necessary for Bayesian analysis (in R and Python). These are assumed prerequisites and will only be reviewed in brief (you can complete course assignments in either R or Python, or both; this is not a programming course).
- Introduction. Introduction to Bayesian analysis, with a focus on conditional probability, distributions, and regression modeling basics (introductory statistics is also an assumed prerequisite).
- Modeling. Advanced modeling (multilevel, pooling, effects, extended equations) with Stan; a minimal multilevel sketch follows this list.
- Applications. Case studies include planning (e.g., order planning, pricing policy, supply chain agreements, and industry trends), operations and process automation (e.g., pricing products, assets, and derivatives), and assurance (e.g., project deliverables quality testing, controls testing, and financial statement assertions).
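
To give a flavor of the multilevel modeling covered in the Modeling repository, below is a minimal varying-intercepts sketch in Stan, run from Python with CmdStanPy. The model, the toy data, and the CmdStanPy interface are assumptions for illustration; the course materials may structure this differently:

```python
from pathlib import Path
import numpy as np
from cmdstanpy import CmdStanModel

# Varying-intercepts model: group intercepts partially pooled toward a global mean.
stan_code = """
data {
  int<lower=1> N;                          // observations
  int<lower=1> J;                          // groups (e.g., regions)
  array[N] int<lower=1, upper=J> group;    // group index per observation
  vector[N] y;                             // outcome
}
parameters {
  real mu;                                 // global mean
  real<lower=0> tau;                       // between-group spread
  vector[J] alpha;                         // group intercepts
  real<lower=0> sigma;                     // within-group noise
}
model {
  mu ~ normal(0, 5);
  tau ~ normal(0, 2);
  alpha ~ normal(mu, tau);                 // partial pooling
  sigma ~ normal(0, 2);
  y ~ normal(alpha[group], sigma);
}
"""
Path("varying_intercepts.stan").write_text(stan_code)
model = CmdStanModel(stan_file="varying_intercepts.stan")

# Toy data: three groups with different true means.
rng = np.random.default_rng(7)
group = np.repeat([1, 2, 3], 20)
y = rng.normal(loc=np.array([1.0, 2.0, 3.5])[group - 1], scale=1.0)

fit = model.sample(data={"N": len(y), "J": 3, "group": group.tolist(), "y": y.tolist()})
print(fit.summary())  # posteriors for mu, tau, alpha[1..3], sigma
```

The group-level intercepts are shrunk toward the global mean in proportion to how much data each group provides, which is the behavior the Modeling course examines in detail.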
About Professor Terry
Ellen Terry began her career in M&A analysis at Deloitte, focusing on forecasting and valuation modeling. She later joined Microsoft, leading development of industry models and solutions (i.e., cross-platform templates), where she was also involved in research work at the Santa Fe Institute. Post-Microsoft, she joined GE as Director of Planning and Programs, and later, JP Morgan as VP - Data Science. She's currently a Professor at Bauer and an independent consultant. She is also a jazz musician (ellynsong).