ModelOps
ModelOp Special Edition
by Stu Bailey
ModelOps For Dummies®, ModelOp Special Edition
Published by: John Wiley & Sons, Inc., 111 River St., Hoboken, NJ 07030-5774, www.wiley.com
Copyright © 2022 by John Wiley & Sons, Inc.
No part of this publication may be reproduced, stored in a retrieval system or transmitted in any 
form or by any means, electronic, mechanical, photocopying, recording, scanning or otherwise, 
except as permitted under Sections 107 or 108 of the 1976 United States Copyright Act, without the 
prior written permission of the Publisher. Requests to the Publisher for permission should be 
addressed to the Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 
07030, (201) 748-6011, fax (201) 748-6008, or online at http://www.wiley.com/go/permissions.
Trademarks: Wiley, For Dummies, the Dummies Man logo, The Dummies Way, Dummies.com, 
Making Everything Easier, and related trade dress are trademarks or registered trademarks of 
John Wiley & Sons, Inc. and/or its affiliates in the United States and other countries, and may not 
be used without written permission. ModelOp and the ModelOp logo are registered trademarks of 
ModelOp. All other trademarks are the property of their respective owners. John Wiley & Sons, 
Inc., is not associated with any product or vendor mentioned in this book.
LIMIT OF LIABILITY/DISCLAIMER OF WARRANTY: WHILE THE PUBLISHER AND AUTHORS HAVE 
USED THEIR BEST EFFORTS IN PREPARING THIS WORK, THEY MAKE NO REPRESENTATIONS 
OR WARRANTIES WITH RESPECT TO THE ACCURACY OR COMPLETENESS OF THE CONTENTS OF 
THIS WORK AND SPECIFICALLY DISCLAIM ALL WARRANTIES, INCLUDING WITHOUT LIMITATION 
ANY IMPLIED WARRANTIES OF MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE. 
NO WARRANTY MAY BE CREATED OR EXTENDED BY SALES REPRESENTATIVES, WRITTEN 
SALES MATERIALS OR PROMOTIONAL STATEMENTS FOR THIS WORK. THE FACT THAT AN 
ORGANIZATION, WEBSITE, OR PRODUCT IS REFERRED TO IN THIS WORK AS A CITATION AND/
OR POTENTIAL SOURCE OF FURTHER INFORMATION DOES NOT MEAN THAT THE PUBLISHER 
AND AUTHORS ENDORSE THE INFORMATION OR SERVICES THE ORGANIZATION, WEBSITE, OR 
PRODUCT MAY PROVIDE OR RECOMMENDATIONS IT MAY MAKE. THIS WORK IS SOLD WITH 
THE UNDERSTANDING THAT THE PUBLISHER IS NOT ENGAGED IN RENDERING PROFESSIONAL 
SERVICES. THE ADVICE AND STRATEGIES CONTAINED HEREIN MAY NOT BE SUITABLE FOR 
YOUR SITUATION. YOU SHOULD CONSULT WITH A SPECIALIST WHERE APPROPRIATE. FURTHER, 
READERS SHOULD BE AWARE THAT WEBSITES LISTED IN THIS WORK MAY HAVE CHANGED 
OR DISAPPEARED BETWEEN WHEN THIS WORK WAS WRITTEN AND WHEN IT IS READ. 
NEITHER THE PUBLISHER NOR AUTHORS SHALL BE LIABLE FOR ANY LOSS OF PROFIT OR ANY 
OTHER COMMERCIAL DAMAGES, INCLUDING BUT NOT LIMITED TO SPECIAL, INCIDENTAL, 
CONSEQUENTIAL, OR OTHER DAMAGES.
For general information on our other products and services, or how to create a custom For 
Dummies book for your business or organization, please contact our Business Development 
Department in the U.S. at 877-409-4177, contact info@dummies.biz, or visit www.wiley.com/go/
custompub. For information about licensing the For Dummies brand for products or services, 
contact BrandedRights&Licenses@Wiley.com.
ISBN: 978-1-119-85047-2 (pbk); ISBN: 978-1-119-85048-9 (ebk). Some blank pages in the print 
version may not be included in the ePDF version.
Publisher’s Acknowledgments
Some of the people who helped bring this book to market include the following:
Project Manager and Development Editor: Carrie Burchfield-Leighton
Acquisitions Editor: Ashley Coffey
Sr. Managing Editor: Rev Mengle
Business Development Representative: William Hull
ModelOp Acknowledgments
This book is made possible in part by the leading and market-defining efforts of Royal Bank of Canada in ModelOps and, more broadly, Enterprise AI. ModelOp would like to give a special thanks to Gunjan Modha, managing director and head of banking technology & architecture.
Table of Contents
INTRODUCTION
    About This Book
    Foolish Assumptions
    Icons Used in This Book
    Beyond the Book
CHAPTER 1: Realizing that Models are Eating the World
    Tracing the Evolution of Models
    Understanding the Differences Between Models and Software
    Seeing the Tip of the AI-Iceberg
    Framing ModelOps for the Exec Team and CEO
CHAPTER 2: Understanding the Role Played by ModelOps in Enterprise AI
    Knowing the Key Requirements for ModelOps
    Recognizing that Models Are Enterprise Assets
    Understanding the Differences between ModelOps and MLOps
    Introducing the Enterprise AI Team
        Identifying the key stakeholders
        Understanding the needs of each stakeholder
CHAPTER 3: Architecting Your Enterprise ModelOps Capability
    Understanding Model Life Cycles
    Assigning Enterprise Ownership for ModelOps
    Understanding Key Aspects of an Enterprise ModelOps Platform
        Identifying the key requirements
        Answering the key questions
    Integrating with Your Enterprise Processes and Systems
    Understanding and Meeting Regulatory and Compliance Requirements
    Establishing ModelOps as the Foundation for Model Governance
    Visualizing the Results
CHAPTER 4: Putting ModelOps into Action
    Benchmarking Your Place in the AI Journey
    Choosing Where to Start
    Learning from Real-World Use Cases
        Achieving straight-through processing
        Reducing fraud and bootstrapping ModelOps
        Optimizing bond trading with AI
CHAPTER 5: Ten Things to Know about ModelOps
    Models Are Automating Decisions at Unprecedented Scale
    Models Are Among an Enterprise’s Most Valuable Assets
    Model Governance Is a Make-or-Break Issue
    Ineffective Model Operations Is the Leading Cause of Failed AI Initiatives
    Treating Models Like Conventional Software Leads to Trouble
    Know Where Your Models Are and What They Are Doing
    Model Development Is a Business Unit Function; ModelOps Is an Enterprise Function
    Keeping ModelOps Separate from Data Science Makes Both Better
    Good ModelOps Prevents Shadow IT
    Every Journey Starts with the First Step
Introduction
There’s never been a more exciting time for artificial intelligence (AI) technology. After decades of unfulfilled dreams, AI has moved solidly into the mainstream, and organizations of all types are betting big on AI’s transformational power. But many organizations have struggled to see returns on their investments in AI, and worse, are seeing themselves falling behind others who seem to have cracked the code. As many have found, the problem isn’t so much in figuring out how to turn data into models that demonstrate the ability to create value but rather in getting those models out of development and into production where they can actually deliver value, and even more importantly, making sure that their models don’t bear risks that can damage or even sink the ship.
About This Book
This book explains an enterprise-level operational discipline called 
ModelOps, which has emerged as a key to unlocking the power of 
AI.  ModelOps is a new enterprise capability that integrates and 
automates all the business, technical, and compliance stakeholders 
and activities across the organization to ensure that AI models — 
and all types of models — are governed, operated efficiently, and 
monitored continuously, producing value while remaining com-
pliant. Because ModelOps is new, many organizations aren’t fully 
aware of what it does, how to do it well, and where it fits in the 
organization. But many are at least coming to the realization that 
ModelOps has become a must-have for organizations that will 
survive and thrive in the coming model-driven era.
ModelOps For Dummies, ModelOp Special Edition, consists of five 
chapters that explore the following:
 » How models have evolved, why AI models pose significant 
new challenges, and why ModelOps is a critical capability for 
enterprises (Chapter 1)
 » The key requirements and stakeholders for ModelOps and 
how to structure ModelOps as an enterprise capability 
(Chapter 2)
 » Implementing the most effective ModelOps capability 
(Chapter 3)
 » How to assess the maturity of your AI initiatives and when 
and where to start your ModelOps journey, with examples 
from successful organizations that have implemented 
ModelOps (Chapter 4)
 » Ten key fun facts about ModelOps (Chapter 5)
Foolish Assumptions
I made some assumptions about you, the reader, when I wrote 
this book. Mainly, I assume the following:
 » You have a stake in the success of your organization’s AI 
initiatives and work in one of the following:
• A business unit that wants to unlock the power of AI
• An operational team such as DataOps, ITOps, or DevOps
• Risk management and compliance
• A C-level executive with accountability to the board
 » You’re past the point of deciding if AI is important. You’ve 
done pilot projects and proven that you can generate 
significant value from automating decisions using AI and 
other types of models. Now you’re ready to scale up.
 » You’ve seen the challenges with operationalizing AI 
models. You know you’re missing out on value because it’s 
taking too long to get models deployed in production. And 
you’re becoming aware of the significant risks associated 
with industry regulations and consumer expectations 
regarding the use of AI in decisions that impact people’s 
lives.
 » You’re asking or being asked questions and not getting 
good answers. How many models do we have in produc-
tion? Where are they running? Are they generating value? 
What’s the return on investment (ROI)? What’s our exposure? 
Do we have visibility and control over all of our model-driven 
initiatives?
Don’t be afraid to become your organization’s expert in ModelOps! 
ModelOps by definition tightly integrates a variety of complex 
and mature enterprise and business unit functions, including the 
following:
 » Business analysis
 » Data science
 » Enterprise digital architecture and strategy
 » DataOps
 » IT
 » Digital security
 » Risk
 » Business intelligence
 » DevOps
 » Corporate governance
While you may have a background and detailed experience in 
one or two of these disciplines, most readers won’t have com-
plete depth and experience in all the required disciplines that 
come together to fulfill a mature, capstone ModelOps capability. 
As you use this book to gain a full understanding of ModelOps, 
engage the experts, early and often, in your organization who are 
responsible for various existing capabilities that come together 
with ModelOps. Very quickly you can become your organization’s 
expert in ModelOps and gain a much fuller appreciation for the 
depth and complexity of all the disciplines necessary to ensure 
that your organization has a world-class Enterprise AI capability. 
Remember, the capstone of ModelOps becomes the foundation for 
Enterprise AI.
Icons Used in This Book
Throughout this book, you see icons in the margins that draw 
your attention to certain kinds of information. Here’s what the 
icons mean.
This book is a reference, but some nuggets of information are so 
important that you should commit them to your memory. With 
this content, I use the Remember icon.
The Tip icon indicates information that saves you time or money 
or just makes your life a little easier.
The Warning icon alerts you to issues in ModelOps that could 
cause you a headache. I give you this information so you can avoid 
it at all costs.
Sometimes I like to include statistical information or other pieces 
of information that may seem technical to some folks. When I do 
that, I include the Technical Stuff icon.
Beyond the Book
This book focuses on the high-level concepts regarding the role of 
ModelOps, how to architect an effective ModelOps platform, and 
how to structure the capability within your organization. If you 
want even more information, check out the following resources:
 » State of ModelOps 2021: This report summarizes perspectives on the state of ModelOps from senior executives in large, global enterprises. For more information, visit www.modelop.com/wp-content/uploads/2021/04/State-of-ModelOps-2021.pdf.
 » 4 Steps to Successful Model Operations: This guide walks you through four steps that any organization can take to successfully operationalize AI and machine learning (ML) or any other type of model using ModelOps best practices. Head to www.modelop.com/wp-content/uploads/2021/03/modelop_ebook_4-Steps-1.pdf for more info.
 » ModelOp technical education: This series of videos 
includes master classes for ModelOps and the ModelOp 
Center platform. You take deep dives into a range of 
subjects, including how to manage AI/ML model risk; how to 
build a model life cycle with automated monitoring, remedia-
tions, and retraining; how to create model life cycles that 
measure and monitor bias for AI/ML models in production; 
how to report on model revenue contribution; and more. To 
start watching, visit www.modelop.com/learning.
 » Events and webinars: This page shows upcoming events 
and also has archives of past events, including fireside chats 
with industry practitioners, panel discussions, and the 2021 
ModelOps Summit. Visit www.modelop.com/events.
 » The ModelOps Saga: This short video takes a comical look at 
critical C-level concerns with AI and how ModelOps is key to 
addressing them. Visit bit.ly/ModelOpsSaga-YouTube.
 » Stu Bailey’s insights on Gartner Market Guide for AI Trust, Risk, and Security Management: This video shares key highlights from the first Gartner market guide on AI trust, risk, and security management, which offers some useful perspectives on the topic and gives you a list of representative vendors, including ModelOp for the ModelOps pillar. Watch the video at www.linkedin.com/pulse/gartner-market-guide-ai-trust-risk-security-stu-bailey.
Chapter 1
Realizing that Models are Eating the World

IN THIS CHAPTER
 » Recognizing how models have evolved
 » Understanding why AI models are different from conventional software
 » Seeing the approaching AI iceberg
 » Framing ModelOps for the CEO and key execs
In 2011, noted venture capitalist Marc Andreessen famously observed that “software is eating the world.” History proved him right. Low-cost computing in the cloud and pervasive connectivity have enabled entire industries to be transformed by 
software.
Today, an even greater transformation is underway driven by 
special kinds of software called models. By leveraging advances in 
artificial intelligence (AI) technologies and the vast pools of data 
now available in most organizations, models are playing a lead-
ing role in transforming companies and industries. Models have 
powered companies like YouTube, Netflix, and, more recently, 
TikTok to extraordinary heights. But these well-known exam-
ples are barely the start of a phenomenon that’s deepening and 
spreading across all aspects of our lives. In this new era, one can 
say that “models are eating the world.”
Models aren’t new and have been used for decades in many indus-
tries, primarily to assist human decision making. But something 
new is happening: Models have become much more powerful and 
are increasingly being used to actually make decisions automati-
cally without human intervention. This has profound implications 
because models, just like people, have to be accountable for the 
decisions they make.
While many organizations are making huge investments to 
develop AI models, few are able to deploy, manage, and govern 
them at scale. AI models have unique requirements, and as they 
move from business unit (BU) data science teams into produc-
tion, they expose big gaps in most organizations’ operational 
capabilities. Even those enterprises that have been using conven-
tional models in their businesses for years are being challenged 
to operationalize today’s more complex models. Many organiza-
tions find that it can take 6 to 12 months to get a model that’s 
deemed “ready” deployed into production and generating value 
for use cases such as detecting fraud, deciding prices, or opti-
mizing manufacturing machinery. Of even greater concern, most 
organizations don’t have enterprise-level visibility into which 
models are in use in the business, how they’re performing, or if 
they’re in compliance. With the number of use cases for models 
skyrocketing, the rate of model development accelerating, and the 
attention of regulators growing, organizations are realizing that 
their success will be heavily dependent on their abilities to oper-
ationalize and govern models at enterprise scale.
In this chapter, you discover how models have evolved, under-
stand why AI models pose significant new challenges, and learn 
why ModelOps is a critical capability for enterprises that seek to 
maximize the power of AI while managing the risks.
Tracing the Evolution of Models
Models are software artifacts that drive decisions based on data. 
For the purposes of this book and discussion, I define models as 
digital representations of processes or systems. Given a set of data 
inputs, a model produces an output, usually a value or set of val-
ues, that’s then used to dictate decisions and actions for the busi-
ness use case. For example, a model might output “yes” (a person will receive a loan), “$1,500” (the amount by which a credit line will be increased), or “29, 12, 4” (the inputs that a machine operator will enter into the factory equipment at this time).
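
To make this concrete, here’s a minimal Python sketch of a model as a function from data inputs to a decision output. The field names and coefficients are invented for illustration; no real scoring formula works this way out of the box.

# A minimal sketch (not from any real scoring system) of "a model is a
# function from data inputs to a decision output," using a hypothetical
# credit-line use case.

def credit_line_increase_model(inputs: dict) -> float:
    """Given applicant data, return the dollar amount of a credit-line increase.

    The field names (income, utilization, missed_payments) and the
    coefficients are illustrative assumptions, not a real scoring formula.
    """
    base = 500.0
    base += 0.01 * inputs["income"]          # higher income, larger increase
    base -= 2000.0 * inputs["utilization"]   # high utilization, smaller increase
    base -= 250.0 * inputs["missed_payments"]
    return max(0.0, round(base, 2))

# The model's output is then used to dictate a business action.
decision = credit_line_increase_model(
    {"income": 85_000, "utilization": 0.35, "missed_payments": 1}
)
print(f"Approved credit-line increase: ${decision}")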
Many enterprises have employed models for decades to add 
speed, accuracy, and consistency to human decision making. 
Some conventional model types that have been in use for many 
years include
 » Rules-based models: A typical example is an if-then-else 
decision tree.
 » Algorithmic models: These encode mathematical opera-
tions, such as linear regression.
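
As a deliberately simplified illustration, the following sketch contrasts the two conventional types: an explicit if-then-else decision tree and a hand-coded linear formula. The rules and coefficients are made up; the point is that both are explicit and deterministic, returning the same output for the same inputs.

# Two conventional model types, sketched side by side (illustrative only).

def loan_decision_rules(income: float, debt_ratio: float) -> str:
    """Rules-based model: an explicit if-then-else decision tree."""
    if income < 30_000:
        return "deny"
    elif debt_ratio > 0.45:
        return "refer to underwriter"
    else:
        return "approve"

def expected_spend_algorithmic(income: float, age: float) -> float:
    """Algorithmic model: a hand-coded linear regression with assumed
    (made-up) coefficients. Deterministic: same inputs, same output."""
    return 1200.0 + 0.04 * income - 15.0 * age

print(loan_decision_rules(income=52_000, debt_ratio=0.30))   # approve
print(expected_spend_algorithmic(income=52_000, age=41))     # 2665.0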
Rules-based and algorithmic models are effective when the 
underlying processes and systems that they represent can be 
explicitly described and when the outputs for a given set of inputs 
are deterministic  — that is, the same every time. But much of 
the universe is neither easy to describe nor orderly, and there-
fore deterministic models can become unwieldy for encoding the 
behavior of anything other than the most straightforward pro-
cesses and systems. As an example, consider trying to write a set 
of explicit instructions for how to recognize a picture of a cat or 
how to drive an autonomous car.
Another class of models, known as stochastic or statistical models, 
uses probabilities and statistics to capturethe randomness found 
in the real world. Creating these models involves fitting the data 
observed in real systems to probability distributions, which are 
then used to predict the outcomes for new data inputs. Stochastic 
models have also been in use for many years. One common exam-
ple is the actuarial tables used by insurance companies to predict 
life spans and calculate premiums.
The newer classes of AI models that are generating so much 
excitement include the following:
 » Machine learning (ML) models: This category covers a 
broad range of stochastic models that are, in essence, 
algorithms created by training rather than by programming.
 » Natural language processing (NLP) models: NLP models 
are specialized ML models that incorporate the rules of 
language along with ML to enable software programs to 
communicate with people using writing or speech that 
mimics human interactions.
ML represents a major breakthrough in AI technology. With ML, 
it isn’t necessary to explicitly identify and code the complex sta-
tistical relationships between variables. Instead, using the tools 
and techniques of data science, models are created by highly skilled 
data scientists using a process called training, in which pre-selected 
data sets are presented to the model, the outputs are compared 
with known results, and the algorithm is tweaked until it produces 
accurate results. For example, an image recognition model may be 
fed with data from pictures of cats labeled as “cat,” and other 
animals labeled appropriately. With this approach, ML models 
have been trained to recognize cats — and just about everything 
else for which a training data set can be created.
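
The following sketch shows that training loop in miniature, using the open-source scikit-learn library and synthetic data in place of labeled cat pictures. The algorithm choice and the accuracy check are illustrative, not a prescription.

# A minimal sketch of the training loop described above: show labeled
# examples to an algorithm, compare outputs with known results, and keep
# the fitted model if it scores well on held-out data. Synthetic data
# stands in for the labeled "cat / not cat" images in the text.

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Pre-selected, labeled training data (synthetic here).
X, y = make_classification(n_samples=1_000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

model = LogisticRegression(max_iter=1_000)
model.fit(X_train, y_train)                # "training": fit parameters to data

accuracy = accuracy_score(y_test, model.predict(X_test))
print(f"Held-out accuracy: {accuracy:.2%}")  # accept the model or keep tweaking

Real pipelines add feature engineering, hyperparameter searches, and validation against business KPIs, but the show-data, compare-results, tweak-and-repeat loop is the core idea.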
Many types of ML models exist, and their applications go far 
beyond image recognition and other so-called classification use 
cases to include the following examples:
 » Recommendation engines that choose the next item in your 
social media feed or video playlist, or suggest a product to 
purchase
 » Diagnostic models that predict equipment failures or the 
onset of disease
 » Financial models that drive stock and asset trades
 » Behavioral models that predict customer retention or churn
In practice, many applications use composite models, which link 
together the outputs and inputs of multiple AI and conventional 
model types to automate complex decisions.
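
Here’s a small sketch of how a composite model chains decisions: a behavioral model’s output feeds a downstream decision model. Both models and their thresholds are invented for illustration.

# A sketch of a composite model: the output of one model feeds the input of
# another to automate a multi-step decision. Both component models are
# stand-ins with made-up logic.

def churn_risk_model(months_inactive: int, support_tickets: int) -> float:
    """Behavioral model: returns a churn score in [0, 1]."""
    score = 0.05 * months_inactive + 0.10 * support_tickets
    return min(1.0, score)

def retention_offer_model(churn_risk: float, customer_value: float) -> str:
    """Decision model: choose an action based on the upstream model's output."""
    if churn_risk > 0.6 and customer_value > 1_000:
        return "offer 20% discount"
    if churn_risk > 0.6:
        return "send re-engagement email"
    return "no action"

risk = churn_risk_model(months_inactive=8, support_tickets=3)
print(retention_offer_model(churn_risk=risk, customer_value=2_400))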
Understanding the Differences Between 
Models and Software
Software automates processes. Models automate decisions.
Of course, most models are software in the sense that they’re 
implemented on computers by executing a series of instructions. 
The outputs of models, known as scores or inferences, are typi-
cally consumed by software applications that then take actions 
based on the model’s output. Initially, model outputs were used 
to aid humans making the actual decisions, but increasingly, 
model-driven applications are using a fully automated decision-
ing process that minimizes and even bypasses human interven-
tion. The rapid trend toward fully automated decisioning in large 
and complex businesses, a phenomenon called Enterprise AI, is 
powerful and transformative but also raises significant issues 
because models are different from conventional software in many 
key ways:
 » Models encode an organization’s most valuable intellec-
tual property, which includes the behaviors of custom-
ers, employees, suppliers, products, partners, and 
markets. While much of the enterprise software that was 
previously proprietary has been commoditized, many of the 
most valuable models don’t readily commoditize. This makes 
models, even more than conventional software, critical 
corporate assets that require extremely effective curation 
and management.
 » ML models don’t execute deterministic rules. Their 
operation is often opaque to some degree; for example, with 
some types of ML models, it isn’t possible to determine 
precisely how or why a model generated a particular output. 
This issue has many ramifications, especially as models are 
increasingly subject to regulations to ensure their efficacy 
and fairness.
 » Models require direct accountability to the business 
organizations that employ them. Because the decisions 
made by models have direct impacts on the top and bottom 
lines, business owners need real-time visibility and control 
over models in production to a much greater degree than is 
typical with conventional software applications.
 » Models require significant governance and oversight to 
limit organizational exposure and risks. The decisions 
that models automate can significantly impact people’s lives. 
Examples include whether a person qualifies for a loan, a job 
interview, a raise, or a medical procedure, or if an obstacle in 
the road is a cardboard box or a person. In some industries, 
like banking and insurance, models are already subject to 
government regulations, and many countries are working to 
implement broad-based regulation of AI models.
 » Models require strict monitoring to ensure that they’re 
producing valid inferences. If the data being fed to the 
model in production reflects a different world than what was 
reflected in the training data, the model will produce bad 
inferences and cause bad decisions. For example, without 
appropriate monitoring of a municipal bond pricing model, 
an error in market reporting data from an external source or 
an arithmetical error in data preparation of the input to a 
model may cause a pricing model to consume data well 
outside of the allowable range and either fail outright at a 
critical time or — worse — produce an output that’s outside 
of corporate tolerances, causing devastating losses similar to 
those caused by a rogue trader.
 » Unlike conventional software, models go stale. 
Depending on the use case and how fast behaviors can 
change, a model may need to be retrained annually, 
quarterly, monthly, weekly, daily, or even hourly. In situa-
tions where a significant, rapid change occurs in the 
environment — the COVID-19 pandemic being a prime 
example — ML models can rapidly lose their predictive 
efficacy.
Treating models as if they’re conventional software slows model 
deployment, lowers their efficacy, increases costs, and exposes 
the organization to increased risk.
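
To make the monitoring and staleness points concrete, here’s a minimal sketch of a drift check that compares the data arriving in production against the training baseline and raises an alert when they diverge. The statistic and threshold are simple illustrative choices; production ModelOps monitoring tracks many more signals.

# A minimal drift check: if the data feeding a production model no longer
# looks like the training data, flag it before it produces bad inferences.
# Thresholds and values are illustrative assumptions.

from statistics import mean, stdev

def drift_alert(training_values, production_values, z_threshold=3.0):
    """Flag drift when the production mean shifts more than z_threshold
    standard errors away from the training mean (a crude but common check)."""
    mu, sigma = mean(training_values), stdev(training_values)
    n = len(production_values)
    z = abs(mean(production_values) - mu) / (sigma / n ** 0.5)
    return z > z_threshold, z

# Bond-pricing-style example: training saw prices near 100; production
# suddenly sees values far outside that range (e.g., a bad upstream feed).
train_prices = [98.5, 99.1, 100.2, 101.0, 99.8, 100.4, 98.9, 100.9]
prod_prices = [140.2, 139.8, 141.0, 138.7, 140.5]

alert, z_score = drift_alert(train_prices, prod_prices)
print(f"drift alert: {alert} (z = {z_score:.1f})")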
Seeing the Tip of the AI-Iceberg
It’s quite common today for organizations to foresee and plan for 
a world in which hundreds or thousands of models are making 
most or all the actual decisions. But even that level of scale may 
severely underestimate the scope of the coming challenge. Imag-
ine what happens when any app, even any spreadsheet, has the 
ability to create models that need to be monitored, managed, and, 
in many cases, regulated. In that sense, the challenges today are 
just the tip of a looming AI Iceberg.
Even with a relatively small number of models in production, few 
organizations have achieved excellence in their abilities to gov-
ern, deploy, monitor, and continuously manage them. In fact, 
many organizations don’t actually know which models, or even 
how many, are driving which decisions. This is especially con-
cerning for several reasons:
 » Data science tools are becoming more powerful. Data 
scientists are able to create more models in less time.
 » More people will gain the power to create models. New 
tools will enable people without deep technical backgrounds 
in AI — so-called citizen data scientists — to further add to the 
flood of models.
 » Models may gain the power to create models. In a 
concept called generative AI, models can create new models.
 » Mainstream applications will gain the ability to create 
models. Business analytics applications and even spread-
sheets may include the ability to generate ML models.
 » The rapid proliferation of models is contributing to 
“AI-Driven Shadow IT.” As model development outpaces 
deployment, BUs and departments are using Shadow IT to 
get their models into production — in many cases bypassing 
corporate security and governance controls.
If the IT, security, and compliance organizations don’t have vis-
ibility to or control over models, they can’t protect the enterprise 
from their associated risks.
Framing ModelOps for the Exec Team 
and CEO
Models have played important roles for years in many companies 
and industries, but in the coming AI-driven era, models may 
be  the most valuable of all corporate assets. To be successful, 
organizations need to achieve excellence in their ability to gov-
ern, deploy, and manage models at scale.
Models are different from conventional software in many 
significant ways; therefore, the capabilities that organizations 
have developed for managing software can’t simply be extended 
to support models. A new capability called ModelOps is emerging 
to address the unique requirements of governing, operationaliz-
ing and managing AI models, and all decisioning assets across the 
enterprise, at scale.
ModelOps is a foundational capability for Enterprise AI. ModelOps 
ensures that data science investments are operational, working 
properly, generating business value, and being held accountable.
Figure 1-1 shows you an example of an executive-level dashboard 
of the business, operational, and compliance status of all AI ini-
tiatives across an enterprise. This screenshot helps you visualize 
the power of a mature, enterprise ModelOps capability. It displays 
the cumulative value produced by selected models in produc-
tion, which is key to determining the ROI for AI investments. The 
dashboard also highlights operational and compliance metrics 
that may be out of threshold or nearing threshold, which enables 
functional teams and senior executives to identify issues quickly 
and drive them to resolution.
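
To see the kind of roll-up that sits behind a dashboard like this, here’s a minimal sketch that computes cumulative value per model and flags metrics that are out of threshold or nearing it. The model records and limits are invented for illustration, not taken from ModelOp Center.

# A sketch of the roll-up behind an executive dashboard like Figure 1-1:
# cumulative value per model plus flags for metrics that are out of
# threshold or nearing it. The records and thresholds are invented.

models = [
    {"name": "fraud-detect-v3", "monthly_value": [1.2, 1.4, 1.1],  # $M
     "latency_ms": 180, "latency_slo_ms": 200, "compliance_ok": True},
    {"name": "price-optimizer", "monthly_value": [0.8, 0.7, 0.9],
     "latency_ms": 240, "latency_slo_ms": 200, "compliance_ok": True},
    {"name": "churn-predict",   "monthly_value": [0.3, 0.4, 0.5],
     "latency_ms": 95,  "latency_slo_ms": 200, "compliance_ok": False},
]

for m in models:
    cumulative = sum(m["monthly_value"])
    flags = []
    if m["latency_ms"] > m["latency_slo_ms"]:
        flags.append("latency OUT OF THRESHOLD")
    elif m["latency_ms"] > 0.9 * m["latency_slo_ms"]:
        flags.append("latency nearing threshold")
    if not m["compliance_ok"]:
        flags.append("compliance issue")
    print(f"{m['name']}: cumulative value ${cumulative:.1f}M; "
          f"{', '.join(flags) or 'all metrics within thresholds'}")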
Models are very specific to particular use cases and as such are 
typically created or sourced at the BU level. However, models 
have implications for business performance, resource use, and 
risk that are enterprise-level concerns. ModelOps is therefore a 
corporate-level capability that ensures all models, regardless of 
where they’re created or where they execute, are deployed, mon-
itored, managed, and governed appropriately. Like other major 
corporate functions such as IT, Finance, and HR, ModelOps is 
an enterprise-accountable function. For a deeper look at the key 
stakeholders and their needs, check out Chapter 2.
A mature ModelOps capability enables end-to-end governance of 
all AI and model-driven initiatives.
FIGURE 1-1: An executive-level dashboard.
Chapter 2
Understanding the Role Played by ModelOps in Enterprise AI

IN THIS CHAPTER
 » Getting to know the key requirements for ModelOps
 » Seeing models as enterprise assets
 » Delineating the differences between ModelOps and MLOps
 » Identifying the primary stakeholders
The first steps in developing an enterprise ModelOps capability are to detail its requirements and determine how to organize and manage the effort. Doing so can be harder than expected. Discussions about ModelOps often conjure an image of the proverbial elephant and seven blind men, each asserting that the particular part of the elephant that he holds — ear, tusk, tail, trunk, foot, and so on — represents the entire animal. In a similar way, each of the various groups that have a stake in production artificial intelligence (AI) models has particular needs driven by its functional role and perspective. A further complication stems from the tension between the need for speed and flexibility at the business-unit (BU) level and the need for accountability at the enterprise level. Another challenge is that, in most organizations today, no single group has responsibility for models as assets or for ModelOps as an enterprise-wide capability. As a result, many enterprises lack a clear roadmap for implementing ModelOps and also struggle to determine how to structure it organizationally. Addressing these issues is the critical first step in the ModelOps journey.
In this chapter, you discover the key requirements for ModelOps, 
identify the stakeholders and their needs, and see how to struc-
ture ModelOps as an enterprise capability.
Knowing the Key Requirements 
for ModelOps
ModelOps is the discipline by which enterprises scale and gov-
ern their AI initiatives. A well-designed and managed ModelOps 
capability enables an enterprise to do the following:
 » Provide visibility and accountability for all model-driven 
initiatives for all stakeholders.
 » Manage and mitigate model risks.
 » Automate and orchestrate the policies and processes that 
keep models performant and compliant.
 » Get production-ready models of all types deployed into 
applications and delivering value as quickly as possible.
 » Ensure that all models are meeting their statistical, opera-
tional, business, and compliance key performance indicators 
(KPIs).
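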
It’s important to note what’s not included in the list of key 
 ModelOps requirements: Specifically, model creation isn’t a part 
of ModelOps. Of course, data scientists are key stakeholders in 
ModelOps; however, the process of model development isn’t a part 
of ModelOps and is generally a BU-level concern. For this reason, 
rather than standardizing on a single data science platform, a 
variety of data science tools and platforms are being used to make 
models in most organizations.
ModelOps is separate and distinct from data science. The primary 
objective of data science is to create models in response to specific 
business requirements. The primary objective of ModelOps is to 
scale and govern AI initiatives across the enterprise.
ModelOps starts when a business use case is defined and continues until a model is retired. ModelOps envelops but doesn’t include model development. This process is shown in Figure 2-1.

FIGURE 2-1: ModelOps encompasses all elements of the model life cycle (MLC) pre- and post-model development.
Separating ModelOps as a distinct discipline from data science is 
important for several reasons:
 » Data scientists have unique skillsets for building models. 
They typically have years of training in their field and often 
hold PhD degrees. They generally have little or no experi-
ence with or interest in operating an enterprise production 
environment. Using data scientists to do ModelOps is like 
using a Formula 1 car engineer to manage the logistics for 
moving the team and equipment from race to race.
 » Models bear enterprise-level risks. While models are 
typically created for a particular BU, they need to be 
accountable at the enterprise level because of the scale of 
the business impact they provide, the resources they 
consume, and the risks they expose.
 » Models are increasingly subject to internal compliance 
controls and external regulations; therefore, model 
governance is a core pillar of ModelOps. For model 
compliance controls and audits to be effective, they need to 
be implemented by a group distinct from the group that 
developed the model. Indeed, in regulated industries, it’s 
required that the group responsible for compliance is 
separate from the group being audited. Otherwise, it’s like 
having students grade their own papers.
Recognizing that Models Are 
Enterprise Assets
Models are developed to address specific business needs, and 
these needs frequently originate within a particular BU.  In that 
sense, models may be thought of as BU assets. But models are also 
enterprise assets in the following sense:
 » The decisions that models drive at the BU level expose the 
organization to corporate-level brand risks, external regula-
tory compliance risks, and corporate-level financial risks.
 » The infrastructure in which models execute, such as a data 
center, private cloud, or public cloud, are enterprise assets 
managed by the ITOps team.
 » The data consumed by models is provided by the DataOps 
team, which is usually structured as a central, enterprise 
resource.
 » Models expose the organization to enterprise-level risks.
For example, a model that makes financial decisions can lose 
millions if the model isn’t monitored and maintained 
properly. And as concerns about bias and fairness in AI gain 
broader attention, organizations face the possibility of 
significant brand damage if they’re found to employ models 
that don’t meet ethics standards. This is a key reason why risk 
management is nearly always an enterprise-level function.
This interplay between models as highly local to the BU and yet 
also enterprise assets often creates tensions within organiza-
tions that are trying to scale their AI initiatives. BUs are hiring 
data scientists and pressing them to develop and deploy models 
as quickly as possible. Data science tools are evolving and chang-
ing rapidly, and data scientists want the freedom to use the most 
appropriate techniques and tools to address each business chal-
lenge. Understandably then, data scientists and their business 
owners are concerned that a centralized, corporate ModelOps 
function that’s not accountable to the BU will be unresponsive 
and will impose restrictions and bulky processes that limit inno-
vation and slow them down.
The concerns around locating ModelOps as a centralized, 
enterprise-level capability are valid and must be addressed in 
order for ModelOps to deliver on its role in scaling and governing 
AI initiatives. (Check out Chapter 3 to avoid the pitfalls that can 
make ModelOps a hindrance rather than a help.) However, in light 
of the substantial opportunities and risks associated with mod-
els, most organizations treat models as enterprise-accountable 
assets, and this is a key pillar of any ModelOps strategy.
The first step in untangling the ModelOps knot is to recognize 
that models are enterprise assets in the same way that cash, 
employees, and data are enterprise assets. This means that key 
operational questions regarding models — including how many 
are in production, how they’re contributing to the business, and if 
they’re in compliance — are C-level executive concerns.
Understanding the Differences between 
ModelOps and MLOps
The term MLOps is sometimes used interchangeably with 
 ModelOps; however, they’re distinct disciplines that serve differ-
ent purposes and constituencies. Confusing one with the other is 
a detriment to both and to the enterprise as a whole.
The term MLOps stands for Machine Learning Operations, and as the 
name suggests it’s limited to ML models. This is the first key dis-
tinction between MLOps and ModelOps because the latter encom-
passes all types of models, not just ML models. But the differences 
only begin there.
The primary purpose of MLOps is to support data scientists during 
the model development process by automating those parts of the 
development cycle, specifically algorithm selection and training, 
that involve running the model against data sets and analyzing 
the results. However, MLOps has taken on additional significance 
and scope in recent years in response to the problem of model 
operationalization.
Many organizations have found that over 50 percent of the mod-
els they develop never get into production. They also report that 
it can take as much as six months or a year for a model to be 
fully deployed. Additionally, once in production, models may not 
be appropriately monitored, managed, and governed. In response, 
data science and machine learning (DS/ML) platform and cloud 
AI providers have extended their systems with additional deploy-
ment, monitoring, and, in some cases, governance features under 
the category of MLOps.
The challenge with using MLOps tools for ModelOps, beyond the 
fact that it only encompasses ML models, is that the needs of 
data scientists are vastly different from the core constituencies 
for ModelOps, which include the chief information officer, chief 
financial officer, vice president of IT, and the chief risk officer. 
Data scientists are primarily concerned with creating high-value 
models as fast as possible. To do so, they need to be able to make 
changes and try new approaches with maximum speed and flex-
ibility. However, managing production operations requires strict 
adherence to processes and procedures. Approvals, sign-offs, and 
audits, which are critical for operating production applications 
at scale, are unwelcome in the development process. So there’s 
a fundamental conflict between making a great MLOps platform 
and a great ModelOps platform.
Table  2-1 shows you the key differences between MLOps and 
ModelOps.
Introducing the Enterprise AI Team
Enterprise AI is a team sport that requires a carefully orchestrated 
interplay among multiple groups and consideration of their needs. 
This section explains both.
Identifying the key stakeholders
The key stakeholders in your organization include the following:
 » The BU: Every model serves a specific business need, such 
as increasing revenue, reducing cost, increasing customer 
satisfaction, and so on. BUs often sponsor development of 
models and are ultimately accountable for demonstrating 
the return on investment (ROI) for the models’ development 
and operating costs.
TABLE 2-1: The Differences between ModelOps and MLOps

What is it?
  ModelOps: An enterprise capability that provides governance and operations for models in production.
  MLOps: A feature in data science platforms that provides rapid experimentation and deployment of ML models during the data science process.

Who owns it?
  ModelOps: CIO/IT.
  MLOps: Data scientists within the line of business.

Primary users
  ModelOps: Enterprise risk, enterprise IT, or line-of-business operations.
  MLOps: Data scientists.

Primary capabilities
  ModelOps: Enterprise-wide production model inventory; complete model life cycle automation; 360-degree enterprise visibility, monitoring, and auditability for all models in production.
  MLOps: Tight integration with a specific data science platform; rapid deployment of models for experimentation and testing during model development; data science performance-focused monitoring integrated with specific data science platforms.
 » Data scientists: These folks are the model builders. They’re 
highly skilled people with deep backgrounds in data 
analytics, statistics, and AI techniques such as ML. Data 
scientists are often embedded in BUs to be closer to the 
business problems that they’re addressing.
 » DataOps: Data is the lifeblood of models. Because data is a 
corporate asset, DataOps is generally an enterprise capabil-
ity under the chief data officer. The DataOps team collects, 
processes, curates, protects, and feeds data to data 
scientists at the BU level during model creation and to the 
ITOps team back at the enterprise level for models in 
production.
 » DevOps: The DevOps team develops and operates the 
applications that consume the models’ inferences and take 
corresponding actions, such as placing an ad in a user’s 
newsfeed, recommending a product, or approving an 
insurance claim. Enterprise-level DevOps processes such as 
containerization and service-level unit testing are often 
critical and mandatory procedures required to get various 
models into production.
 » Central Technology/IT/ITOps: This team is responsible for
• Enterprise architecture
• Organization-wide digital security
• Enterprise systems — such as databases and identity 
management
• The infrastructure on which the models and applications 
run, whether in the company’s data centers, in private or 
public clouds, in end-use devices such as smartphones or 
Internet of Things (IoT) products, or in some combination
 » Compliance: Also called risk management, the compliance 
team is charged with protecting the organization against 
financial and reputational risks by ensuring that models 
operate within internal guidelines and external regulations.
Figure 2-2 shows you the key groups that have a role in ModelOps.

FIGURE 2-2: ModelOps requires participation and coordination among a diverse set of enterprise stakeholders.
Understanding the needs of each 
stakeholder
Each team has its own perspective, or context, through which 
they view models:
 » The BU is concerned with understanding how the model is 
contributing to its intended business outcome, such as 
higher sales or reduced customer churn, and if the model is 
generating a favorable ROI.
 » Data scientists are concerned with how the model is 
performing technically and if it’s delivering valid statistical 
inferences.
 » DataOps views the model as a data consumer that needs 
data of a particular type, quality, volume, and speed.
 » The DevOps team sees the model as a software component 
with unique requirements, such as periodic retraining, that 
are different from conventional software.
 » The enterprise technology team sees the model as a 
software asset that requires a tightly controlled operating 
environment and an associated allocation of storage, 
compute, and networking resources sufficient to meet 
performance targets. They also see ModelOps platforms as 
highly integrated systems that must integrate with existing 
systems and align with present and future state enterprise 
architectures.
 » The compliance team sees the model as a risk-bearing asset 
that must be proven to operate within company standards 
and regulatory guidelines.
An effective ModelOps capability provides all stakeholders with 
the degree of visibility and control of models appropriate to their 
roles. In Chapter  3, you see how to architect your enterprise 
 ModelOps capability so it meets the requirements of the entire 
organization and avoids the key pitfalls.
Chapter 3
Architecting Your Enterprise ModelOps Capability

IN THIS CHAPTER
 » Defining model life cycles
 » Assigning ownership for ModelOps
 » Identifying the key requirements for an enterprise ModelOps platform
 » Working with your enterprise processes and systems
 » Addressing regulatory and compliance requirements
 » Looking at the foundation for model governance
 » Seeing the end results
Companies that were “born digital,” with businesses built from the ground up around software and data, are often structured in a way that offers a relatively straight path to 
becoming model-driven. But non-digital native companies, espe-
cially in regulated industries, find it much more challenging to 
operationalize their data science investments, not because of 
technical challenges but because of a lack of clear organizing 
principles regarding how ModelOps should be structured, which 
groups should fund and manage it, and how it should relate to 
data science at the business-unit (BU) level and all the other 
enterprise-level functions that it touches.
In this chapter, you see how to implement the most effective 
ModelOps capability by architecting it as an independent, enter-
prise function accountable at the highest level of the company and 
with models at the center.
Understanding Model Life Cycles
Operating a model in production requires a carefully orches-
trated set of processes that touch multiple teams and systems. 
For example, once a model is declared “ready for deployment” by 
data scientists, a number of steps need to take place (a sketch of how these steps might be automated follows the list):
1. Integrate with code repository, ticketing, risk manage-
ment, and other enterprise systems.
2. Prepare a runtime image that can be operated at scale 
by the ITOps team in the target execution environment.
3. Deploy into a quality assurance (QA) runtime environ-
ment for testing and validation.
4. Verify that the model has passed checks for statistical 
accuracy and compliance with standards and 
regulations.
5. Complete security scans.
6. Instrument the model for monitoring.
7. Integrate with production data pipelines.
8. Obtain sign-off from approvers to verify that all steps have been executed.
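To make these hand-offs concrete, here is a minimal sketch of how the steps above might be codified as an ordered, auditable checklist that a ModelOps platform could execute. The step names, the DeploymentStep structure, and the run_deployment function are illustrative assumptions, not features of any particular product.

```python
# Minimal sketch (illustrative only): the post-development steps expressed
# as an ordered, auditable checklist with an owning team per hand-off.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Callable, List

@dataclass
class DeploymentStep:
    name: str
    owner: str                  # team responsible for the hand-off
    action: Callable[[], bool]  # returns True when the step passes
    completed_at: str = ""

def run_deployment(model_id: str, steps: List[DeploymentStep]) -> bool:
    """Execute each step in order; stop and report on the first failure."""
    for step in steps:
        if not step.action():
            print(f"{model_id}: step '{step.name}' failed (owner: {step.owner})")
            return False
        step.completed_at = datetime.now(timezone.utc).isoformat()
        print(f"{model_id}: '{step.name}' completed by {step.owner}")
    return True

# Placeholder actions stand in for real integrations (ticketing, QA, scans).
steps = [
    DeploymentStep("Integrate with enterprise systems", "ModelOps", lambda: True),
    DeploymentStep("Prepare runtime image", "DevOps", lambda: True),
    DeploymentStep("Deploy to QA environment", "ITOps", lambda: True),
    DeploymentStep("Verify accuracy and compliance checks", "Compliance", lambda: True),
    DeploymentStep("Complete security scans", "SecOps", lambda: True),
    DeploymentStep("Instrument for monitoring", "ModelOps", lambda: True),
    DeploymentStep("Integrate with production data pipelines", "DataOps", lambda: True),
    DeploymentStep("Collect approver sign-offs", "Approvers", lambda: True),
]
run_deployment("churn-model-v3", steps)
```

In practice, each placeholder action would call into the relevant enterprise system, and every completion timestamp would be written to the model inventory so the sequence is auditable end to end.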
After the model is in production, it must be monitored continu-
ously and measured against technical, business, and regulatory 
targets. Variations from key performance indicators (KPIs) must 
be flagged and trigger remediation processes that are driven to 
resolution. For example, ML models require periodic retraining 
to maintain efficacy. For high-value models, data science teams 
may continually produce “challenger” models that will replace 
the “champion” model in production if the challenger demon-
strates better results. Computing resources may need to be scaled 
in order to meet operating KPIs for throughput and latency. Mod-
els must also be reviewed against ethical standards for bias and 
fairness.
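As one illustration of this monitoring logic, the sketch below compares a champion model against a challenger on a statistical KPI and flags retraining when a simple drift score crosses a threshold. The metric names, the threshold value, and the toy drift calculation are assumptions for illustration; a real deployment would use established measures such as PSI or a KS test.

```python
# Illustrative monitoring sketch: champion/challenger comparison plus a
# toy data-drift check. Metrics and thresholds are assumptions, not targets.
from statistics import mean

def drift_score(training_scores: list, production_scores: list) -> float:
    """Toy drift measure (absolute difference of means), standing in for a
    real PSI or KS-test computation."""
    return abs(mean(training_scores) - mean(production_scores))

def evaluate(champion_auc: float, challenger_auc: float,
             training_scores: list, production_scores: list,
             drift_threshold: float = 0.1) -> list:
    actions = []
    if challenger_auc > champion_auc:
        actions.append("Promote challenger to champion (pending approvals)")
    if drift_score(training_scores, production_scores) > drift_threshold:
        actions.append("Flag model for retraining and open a remediation ticket")
    return actions or ["No action required"]

print(evaluate(champion_auc=0.81, challenger_auc=0.84,
               training_scores=[0.2, 0.4, 0.6, 0.8],
               production_scores=[0.4, 0.6, 0.8, 1.0]))
```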
These activities span multiple teams including the BU business 
owners, data scientists, DataOps, DevOps, ITOps, and compliance,
and all actions must be exhaustively documented to satisfy inter-
nal and external audits. Any missed steps along the way can result 
in diminished or harmful business results and can also expose 
the business to liabilities. Without automation, the complexity of 
these tasks won’t scale with the desired increase in the number of 
models making decisions in the organization.
The key tool used for managing the complexities of models in 
production is the model life cycle (MLC), which is a blueprint that 
captures the steps required to get a model in production and keep 
it operating within targets. It also codifies the responsibilities of 
each team and identifies the hand-offs between them.
An MLC defines the requirements and processes for operation-
alizing a model at the enterprise level. It includes detailed pro-
cess workflows with well-defined steps for operating, governing, 
and maintaining the model throughout its post-development 
life cycle, until it’s retired, and ensures that all these steps are 
auditable.
A typical MLC can have hundreds of steps and parameters. A 
powerful way to capture and visualize these steps is to use the flowchart format defined by Business Process Model and Notation (BPMN), shown in Figure 3-1.
FIGURE 3-1: A subsection of an MLC showing the steps involved with model 
validation and approval.
Figure 3-1 captures just a small portion of a complete MLC — in 
this case, a section focused on model risk management (MRM). In 
practice, MLCs have multiple sections that cover all the different 
aspects of the model’s life cycle, including the following (a code sketch of these sections follows the list):
 » Use-case definition and documentation initiation 
processes: Captures the business justification for spending 
resources to develop models and identifies the enterprise 
risk classifications that the model or models for the use case 
will carry through their life cycle, consequently dictating any 
appropriate downstream model risk processes for the 
models; associates the profit and loss (P&L) context appro-
priate for assessing model return on investment (ROI)
 » Model registration processes: Captures all artifacts of a 
model, including code, training data, production runtime 
requirements, operational data requirements, approvals, 
and so on in a production model inventory
 » MRM processes: Captures all regulatory requirements for 
the model and implements the periodic testing necessary to 
demonstrate compliance, collects all related data, and 
generates reports to fulfill audit requirements
 » Operationalization processes: Orchestrates the delivery of 
model runtimes into the execution environment, drives and 
tracks change management, and provides the interfaces to 
supporting enterprise IT service management systems
 » Monitoring processes: Ensures that models are performing 
optimally across business, technical, operational, and 
compliance KPIs; triggers remediations as necessary; 
includes champion/challenger testing, data and concept drift 
monitoring, model retraining, interpretability, fairness 
testing, and so on
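One hypothetical way to capture these sections is as a declarative structure that a workflow engine can execute; in practice the same content is usually drawn graphically in BPMN (see Figure 3-1). The section names, owners, and outputs below are illustrative assumptions, not a required schema.

```python
# Hypothetical sketch: MLC sections captured declaratively so a workflow
# engine could drive them. All names, owners, and outputs are illustrative.
MODEL_LIFE_CYCLE = {
    "use_case_definition": {
        "owner": "Business unit",
        "outputs": ["business justification", "risk classification", "P&L context"],
    },
    "model_registration": {
        "owner": "ModelOps",
        "outputs": ["code", "training data", "runtime requirements", "approvals"],
    },
    "model_risk_management": {
        "owner": "Compliance",
        "outputs": ["regulatory requirements", "periodic test results", "audit reports"],
    },
    "operationalization": {
        "owner": "ITOps / DevOps",
        "outputs": ["runtime delivery", "change tickets", "ITSM integration"],
    },
    "monitoring": {
        "owner": "Model operators",
        "outputs": ["KPI dashboards", "drift alerts", "champion/challenger results"],
    },
}

REQUIRED_SECTIONS = set(MODEL_LIFE_CYCLE)

def missing_sections(mlc: dict) -> list:
    """A model should not enter production while any required section is undefined."""
    return sorted(REQUIRED_SECTIONS - mlc.keys())

print(missing_sections({"model_registration": {}}))  # every other section is still missing
```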
The various processes that models must journey through in their 
lifetime have unique MLCs. In fact, the same model deployed in 
multiple applications may traverse different MLCs for each use 
case depending on the business, technical, and compliance condi-
tions that apply. The key is that the model — not the data, appli-
cation, execution environment, or anything else — is always at 
the center of every MLC.
Developing MLCs requires a unique blend of skills covering a wide 
range of disciplines, including business management, data sci-
ence, IT operations, and compliance. This requirement has led to 
the development of a new role: the Enterprise AI architect. This 
person works with all stakeholders to develop and document 
appropriate MLCs for each model and also architects the pro-
cesses and tooling used to automate the MLCs and integrate them 
with the organization’s existing processes and tool stack.
Assigning Enterprise Ownership 
for ModelOps
In many enterprises, key questions remain as to which part of the 
organization has responsibility, and budget, for ModelOps.
AI initiatives frequently start with experimentation in a corporate 
AI Center of Excellence and then move to one or more BUs that 
see potential to use AI to address specific business challenges or 
opportunities. After a successful pilot project, there’s often great 
excitement, and pressure, to get the new models into production 
and producing value. It’s at this point that things can start to go wrong. If the use case, risk, and target business justification aren’t captured prior to the start of data science activity, the post-development problems are dramatically amplified.
In some cases, models are released by the data science team to 
the operations teams with the expectation that they’ll be put into 
production using the same processes used for conventional soft-
ware. In practice, this approach often doesn’t work well because 
the requirements for operationalizing models are more complex 
than those for conventional software. The existing operational 
teams — DataOps, DevOps, ITOps, compliance, and so on — don’t 
have the scope or expertise to address all the requirements for 
model operationalization. They also don’t spontaneously organize themselves into cross-functional groups with clearly defined processes and the tooling to manage them. This
lack of clear roles and accountability contributes to the long 
delays and low success rates experienced with many AI initia-
tives and is also responsible for the rise of AI-driven Shadow IT, 
in which individual BUs take model operationalization into their 
own hands. Read the sidebar “Avoiding AI-driven Shadow IT” for 
more information.
In some cases, the degree of corporate risk or the lack of clear 
business justification isn’t exposed until after a model is in pro-
duction, thereby compounding the impacts of any failures in 
model productionization and ongoing operation. In the worst 
case, a model is deemed to be a liability rather than an asset. 
In that unfortunate case, key stakeholders may find themselves 
making statements like “If everyone understood the business risk 
of this model failing, we would have never authorized the data 
science work to develop it.”
So who owns ModelOps? The most effective structure for ModelOps 
in the enterprise is assigning it under the chief information offi-
cer (CIO). Figure 3-2 shows a high-level view of this structure.
AVOIDING AI-DRIVEN SHADOW IT
Model operationalization challenges reduce the value derived from AI 
initiatives and put pressure on BUs to justify their data science invest-
ments. In response, some BUs put models into production using the 
MLOps capabilities of data science platforms, which can be done rela-
tively easily if models are deployed into a public cloud. The problem is 
that this activity can easily bypass the central IT, security, compliance 
and other organizations, resulting in a kind of AI-driven Shadow IT.
Further, the MLOps capabilities provided by model development plat-
forms and hyperscale vendors are rarely comprehensive and don’t 
orchestrate full MLCs with enterprise-level visibility and governance. 
Using Shadow IT for model deployment is an understandable reaction 
to delays in getting models into production, but it adds significant risk 
to AI deployments.
The key to avoiding Shadow IT is to implement an effective, enterprise- 
level ModelOps capability that gives data scientists the freedom to 
innovate and solve problems with their preferred tools and doesn’t 
require them to manage production and compliance details.
Under the CIO are two new roles:
 » The Enterprise AI architect works across all stakeholders to 
define MLCs and also specifies and implements the tooling 
used to automate MLCs and integrate with the enterprise IT 
stack. The Enterprise AI architect understands the model in 
all its contexts: An asset that drives business value, con-
sumes IT resources, and bears risk for the enterprise. Good 
candidates for the role are often those who have experience 
as Enterprise IT architects because they also require a deep 
understanding of IT systems as well as an ability to commu-
nicate effectively with teams across the organization.
 » Model operators are the technical staff assigned to 
monitoring all models in production to ensure that any 
issues or bottlenecks are addressed quickly by the appropri-
ate stakeholders. Good candidates often have experience in 
ITOps, SecOps, or other highly-responsive operational roles. 
Model operators are the primary users of ModelOps 
automation platforms.
Elevating ModelOps to a corporate-level capability provides 
enterprise-level accountability for AI initiatives and offers the
best structure for ensuring that all stakeholders stay coordinated 
and have the appropriate degree of visibility and control over their 
aspects of model operation and governance.
FIGURE 3-2: A high-level view of ModelOps under the CIO.
Understanding Key Aspects 
of an Enterprise ModelOps Platform
An enterprise ModelOps platform must meet key requirements to 
enable an organization to answer key questions. I cover both in 
this section.
Identifying the key requirements
The term ModelOps platform refers to the software systems used to 
automate ModelOps. The key purposes of an enterprise ModelOps 
platform are to
 » Maintain the authoritative “source of truth” regarding 
everything about every model running in the business, from 
cradle to grave.
 » Provide the ability to define and automate MLCs.
 » Connect every model to its business value, business risk, and 
approval/compliance status.
Answering the key questions
The ModelOps platform enables stakeholders across the enter-
prise to answer the following questions quickly and easily:
 » Which models are running in production? Where are they 
deployed?
 » How much value is each model bringing? Which resources is 
each model consuming? What is the ROI for each model?
 » Are models performing to operational targets? Are their 
results accurate? Do any need to be retrained or replaced?
 » Were all models tested and validated? Which tests were run, 
and who approved them?
 » Are any models failing to perform within compliance requirements or operational thresholds?
 » Can we demonstrate in an audit that all regulatory requirements have been met for all affected models?
To achieve its core purposes, an effective ModelOps platform 
should include all the following core elements (the model inventory entry is sketched in code after the list):
 » A comprehensive and evergreen production model inven-
tory, which is a database of all models in production and all 
their associated artifacts and metadata (code, execution 
requirements, training data, KPIs, performance logs, test 
results, changes, approvals, and so on) covering from model 
inception to retirement
A model inventory is more than just a repository. Fully 
describing a model in all of its contexts requires a number of 
repositories — for example, a code repository that holds the 
model execution code, an artifact repository that holds the 
weights and coefficients associated with the model that were 
generated during training, and others. The model inventory 
includes pointers to all model artifacts and metadata in all 
repositories across the full MLC, from when the model was 
commissioned until after it’s retired.
 » A repository for storing, editing, and managing MLCs
 » A BPM engine for automating the execution and tracking of 
MLCs
 » A production runtime engine with full instrumentation for 
monitoring models in production, and interfaces to models 
that are delivered within a third-party runtime engine
 » Configurable dashboards for showing the real-time status of 
all models across statistical, performance, business, and 
compliance KPIs, individually and in aggregate
 » Interfaces to data science and machine learning (DS/ML) 
platforms and third-party model repositories to support easy 
import of models post creation
 » Interfaces to existing enterprise IT systems for identity and 
access management (IAM), code management, ticketing, 
analytics, and so on
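As a rough illustration of the first element, the sketch below shows what one inventory entry might hold: pointers into the various repositories rather than copies of the artifacts themselves. The field names, repository URLs, and KPI values are hypothetical.

```python
# Hypothetical sketch of a production model inventory entry: pointers to
# artifacts held in other repositories plus life cycle metadata. Names and
# URLs are illustrative placeholders, not a real schema.
from dataclasses import dataclass, field

@dataclass
class ModelInventoryEntry:
    model_id: str
    business_use_case: str
    risk_classification: str
    code_repo_url: str              # pointer to the model execution code
    artifact_store_uri: str         # pointer to trained weights/coefficients
    training_data_ref: str          # pointer to the training data snapshot
    kpis: dict                      # business, statistical, and operational targets
    approvals: list = field(default_factory=list)
    status: str = "registered"      # registered -> validated -> deployed -> retired

entry = ModelInventoryEntry(
    model_id="churn-model-v3",
    business_use_case="Reduce customer churn in retail banking",
    risk_classification="high",
    code_repo_url="https://example.com/git/churn-model",
    artifact_store_uri="s3://example-artifacts/churn-model/v3",
    training_data_ref="warehouse://snapshots/churn/2022-01",
    kpis={"auc_min": 0.78, "latency_ms_max": 200, "monthly_value_usd": 250_000},
)
entry.approvals.append({"step": "QA validation", "approved_by": "model-validation-team"})
print(entry.status, entry.kpis["auc_min"])
```

Because the entry stores pointers, the inventory stays authoritative without duplicating what the code repository, artifact store, and data warehouse already manage.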
The relationship among these elements is shown in Figure 3-3.
A well-architected ModelOps platform provides an authoritative 
source of truth for all data associated with all models used in the 
enterprise, regardless of where they’re sourced or executed.
Integrating with Your Enterprise 
Processes and Systems
The primary users of the ModelOps platform are the Enterprise AI 
architect, who defines MLCs, and model operators, for whom the 
ModelOps platform is their command center, providing full visi-
bility and control for all models in the business. If the platform is 
architected and integrated properly, the other key stakeholders, 
such as Data Science, DataOps, ITOps, DevOps, and Compliance, 
generally don’t need hands-on access to the ModelOps platform, 
but instead they’re able to perform their roles within the context 
of existing enterprise tools.
FIGURE 3-3: A well-architected ModelOps platform.
For example, if a model operator is asked which models didn’t 
pass their data drift constraints and what happened to them when 
they failed (for example, any tickets that were automatically 
generated in systems such as Jira), they can use the ModelOps 
platform to accurately generate both the list of such models and 
exactly where they are in the remediation process.
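A minimal sketch of that kind of query is shown below, assuming the platform exposes monitoring records that link each drift check to any remediation ticket it generated. The record layout and ticket fields are invented for illustration, not an actual API.

```python
# Hypothetical sketch: listing models that failed a data-drift constraint,
# together with the remediation ticket each failure generated.
monitoring_records = [
    {"model_id": "churn-model-v3", "drift_check": "pass", "ticket": None},
    {"model_id": "fraud-score-v7", "drift_check": "fail",
     "ticket": {"id": "OPS-1042", "status": "In remediation"}},
    {"model_id": "pricing-model-v2", "drift_check": "fail",
     "ticket": {"id": "OPS-1043", "status": "Awaiting retraining approval"}},
]

for record in monitoring_records:
    if record["drift_check"] == "fail":
        ticket = record["ticket"]
        print(f'{record["model_id"]}: drift check failed, '
              f'ticket {ticket["id"]} ({ticket["status"]})')
```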
If the data scientist retrains or changes the model, the updated 
model is placed in the enterprise code repository, such as Github. 
Via integration with the ModelOps platform, the code changes 
trigger the MLC to execute additional steps, such as code scanning 
and testing, approvals, and release to ITOps for deployment in, 
say, a cloud execution environment. Each operational team is able 
to work within its existing tools, and the platform orchestrates 
and records all activity.
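The sketch below shows the general shape of such an integration: a handler that reacts to a code-change event from the repository and drives the next MLC steps. The event fields and the step functions are assumptions; a real integration would consume the repository's webhook payload and call the platform's own interfaces.

```python
# Hypothetical sketch: reacting to a model code-change event by triggering
# the next steps of the MLC. Event fields and step functions are illustrative.
def scan_code(model_id: str) -> bool: return True      # placeholder security/code scan
def run_tests(model_id: str) -> bool: return True      # placeholder validation tests
def request_approval(model_id: str) -> None: print(f"{model_id}: approval requested")
def release_to_itops(model_id: str) -> None: print(f"{model_id}: released for deployment")

def on_model_code_change(event: dict) -> None:
    """Called when the code repository reports a change to a registered model."""
    model_id = event["model_id"]
    if not scan_code(model_id) or not run_tests(model_id):
        print(f"{model_id}: change rejected; results recorded in the MLC history")
        return
    request_approval(model_id)
    release_to_itops(model_id)

on_model_code_change({"model_id": "churn-model-v3", "commit": "a1b2c3d"})
```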
A key way to ensure that the enterprise ModelOps capability is embraced by
all stakeholders is to support integrations with the tools that they 
already use. This approach eliminates the need to train most of 
the stakeholders on a new tool and greatly simplifies implemen-
tation of the ModelOps platform.
Understanding and Meeting Regulatory 
and Compliance Requirements
Even before AI models started to see widespread use, MRM had 
been an established function in many companies, especially in 
industries like bankingand financial services. In the wake of the 
2008 financial crisis, when it became clear that weaknesses in 
model governance created huge exposures for financial institu-
tions, various United States regulatory bodies, such as the Fed-
eral Reserve, FINRA, and the SEC, and their counterparts in other 
countries, began to issue increasingly stringent guidelines and 
regulations governing the use of models.
A good example of regulations that impact models is Supervi-
sion and Regulation Letter SR 11-7 and its associated attachments
issued by the United States Federal Reserve in 2011. Specific to the 
banking industry, SR 11-7 sets out comprehensive requirements 
and policies for how models should be developed, used, validated, 
and governed. As a result, MRM has become a core corporate 
function with board-level visibility and authority in the financial 
sector.
Regulatory authorities have levied significant penalties against 
organizations deemed to have inadequate governance. Some 
examples include
 » A major United States-based bank was fined $400 million by 
the Office of the Comptroller of the Currency (OCC) for lack of controls, including around data analytics,
and ordered to establish a new risk management function.
 » A European Union (EU)-based company paid a $240 million 
settlement due to errors in model code that were revealed 
but not corrected.
Regulatory scrutiny of decisioning models is expanding well 
beyond financial services to cover all industries and a wider range 
of issues associated with their use. For example, models are being 
used and proposed for a dizzying variety of use cases, including 
predicting equipment failure, evaluating employees, diagnosing 
disease, piloting autonomous vehicles, and identifying crimi-
nals. A good example is EU 2021, which proposes comprehensive 
requirements and accountability for models across a wide variety 
of contexts. See the nearby sidebar “Seeing the future of AI regu-
lation” for more in-depth information.
In addition to requirements dictated by government regulations, 
the marketplace also imposes requirements on organizations that 
use AI. Consumers increasingly demand that the companies they 
buy from employ business practices that are ethical and fair, and 
enterprises have had to respond. So whether responding to regu-
latory fiat or market pressures, AI-driven enterprises will need to 
establish governance policies and processes regarding the models 
they use that address, at minimum, the following concerns:
 » Efficacy: Decisions driven by models that aren’t built or 
maintained properly can put individuals or whole companies 
at risk. Models must be developed properly and validated 
periodically to ensure that they’re robust and produce good 
results.
 » Transparency: In order to protect individuals and groups 
from unfair treatment, the decisions made by models need 
to be explainable and provably free from bias.
 » Ethical use: Some potential uses of AI models can do 
significant harm. One common example is the use of models 
to do social scoring with the ultimate goal of restricting an 
individual’s access to products, places, or services. Another is 
using models to create deep fakes, which are AI-generated 
audio or video clips of people that are used to damage their 
reputations or subject them to extortion. Some proposed 
regulations explicitly limit or ban these kinds of use cases.
SEEING THE FUTURE OF AI 
REGULATION
An important regulation under development is EU 2021 — New Rules 
and Actions for Excellence and Trust in AI. This set of proposed regu-
lations applies to AI systems produced and used in the EU as well as 
in non-EU countries. The rules focus on so-called “high-risk” AI sys-
tems, which include a wide range of devices and services from 
machinery and elevators to medical devices and toys.
Specific uses subject to the rules include
• Biometric identification and categorization of natural persons
• Management and operation of critical infrastructure
• Education and vocational training
• Employment
• Law enforcement
• Migration
• Asylum and border control
• Administration of justice and democratic processes
The proposed penalties for failing to comply are severe, ranging from 
2 to 6 percent of a company’s annual revenue. This penalty makes 
compliance a board-level concern for any affected organization. If EU 
2021 is adopted in a form similar to the initial proposal, nearly all uses 
of AI models will fall under the regulations.
Regulating the use of models is a global concern. Many key model 
regulations are in place or proposed in the following countries:
 » United States: U.S. SR 11-7 — requires MRM for all models 
in financial services; U.S. 2019 Proposal for Algorithmic 
Accountability Act
 » EU: EU 2021 — New Rules and Actions for Excellence and 
Trust in AI
 » Canada: Directive on Automated Decision-Making and Algorithmic Impact Assessment
 » Singapore: Model AI Governance Framework
 » United Kingdom: AI Public Private Forum
 » Mexico: 2018 — General Principles for AI
Establishing ModelOps as the Foundation 
for Model Governance
While the particulars may vary among different industries and 
use cases, the essentials of good model governance are fairly con-
sistent and include the following (an audit-logging sketch follows the list):
 » Establishing clear policies regarding the standards that 
models must meet across all contexts, including business 
metrics, statistical metrics, internally generated compliance 
metrics, and external regulations
 » Extensively documenting the purpose for each model, its 
metrics, how it was developed, and how it needs to be 
deployed
 » Identifying all necessary approvals and approvers as the 
model moves from concept to development and into 
production
 » Capturing all artifacts and metadata associated with the 
model, including code, training data, test cases, and results
 » Logging all activities that happen with the model from the 
time of release to production through retirement, including 
deviations from KPIs, remediations such as code changes or 
retraining, and all approvals that took place at each step
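To ground the logging requirement, here is a minimal sketch of an append-only audit trail of life cycle events, capturing who did what, when, and under which approval. The event fields are illustrative, not a mandated schema.

```python
# Hypothetical sketch: an append-only audit trail of model life cycle events,
# capturing who did what, when, and under which approval. Fields are illustrative.
from datetime import datetime, timezone

audit_log = []  # in practice this would be durable, tamper-evident storage

def record_event(model_id: str, event: str, actor: str, approval_id: str = "") -> None:
    audit_log.append({
        "model_id": model_id,
        "event": event,
        "actor": actor,
        "approval_id": approval_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })

record_event("churn-model-v3", "released to production", "ITOps", approval_id="APR-2210")
record_event("churn-model-v3", "KPI deviation: AUC below target", "monitoring")
record_event("churn-model-v3", "retraining completed", "data-science", approval_id="APR-2290")

# An auditor can reconstruct the full history for any model from the log.
print([e["event"] for e in audit_log if e["model_id"] == "churn-model-v3"])
```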
Most regulatory bodies require that the personnel who test and 
validate models must be separate from those who develop mod-
els. This generally means models can’t be validated by the data scientists who create them.
A well-designed ModelOps platform includes the continuously 
updated model inventory that captures all the necessary docu-
mentation and metadata for every model, including any changes 
and approvals. MLCs capture all KPIs and codify policies into 
repeatable processes. Automating MLCs enables continuous test-
ing and monitoring, making it easy to flag and remediate issues 
and ensure that approvals are completed quickly. With all the 
information concerning each model captured in a common plat-
form, generating reports to comply with audits also becomes far 
easier.
Visualizing the Results
A key objective for an enterprise ModelOps capability is to visu-
alize the status of all models in production, regardless of where 
they were created or acquired, and independent of where they 
execute. You can accomplish this objective in several ways:
 » With appropriate interfaces and
